CN113041622A - Virtual throwing object throwing method in virtual environment, terminal and storage medium - Google Patents
- Publication number
- CN113041622A CN113041622A CN202110439760.2A CN202110439760A CN113041622A CN 113041622 A CN113041622 A CN 113041622A CN 202110439760 A CN202110439760 A CN 202110439760A CN 113041622 A CN113041622 A CN 113041622A
- Authority
- CN
- China
- Prior art keywords
- virtual
- point
- throwing
- target
- prop
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/57—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
- A63F13/573—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of this application disclose a method, terminal, and storage medium for delivering a virtual projectile in a virtual environment, belonging to the field of computer technology. The method comprises: in response to a triggering operation on a target skill control, controlling a first virtual object to hold a drop-point marking prop; emitting a virtual ray through the drop-point marking prop, and displaying a drop point in the virtual environment based on the virtual ray; and, in response to a delivery operation, controlling a virtual vehicle to deliver the virtual projectile into the virtual environment based on the drop point. In the embodiments of this application, because the drop point is selected with a virtual ray, the user can clearly determine its exact position in the virtual environment, and because delivery is performed by a virtual vehicle, delivery accuracy is improved: the projectile no longer deviates from the intended drop point as it can when the first virtual object is controlled to throw it directly. The practicality of the virtual projectile is thereby improved.
Description
Technical Field
The embodiments of this application relate to the field of computer technology, and in particular to a method, terminal, and storage medium for delivering a virtual projectile in a virtual environment.
Background
A battle game is a game in which multiple user accounts compete in the same scene. A player can control a virtual object in the virtual environment to walk, run, climb, shoot, and perform other actions; multiple players can team up online to cooperatively complete a task in the same virtual environment, and win by defeating the enemy camp.
In the related art, to help a player control a virtual object to attack distant enemies, a throwable virtual prop with strong attack power, long range, and a large area of effect is usually provided, such as a bundled bomb or a bundled missile.
However, in the related art, when throwing such a projectile the player either selects a drop point through a map control or directly controls the virtual object to throw in a given direction. As a result, the player cannot accurately throw the projectile at a distant enemy, and the prop's utilization rate is low.
Disclosure of Invention
The embodiments of this application provide a method, terminal, and storage medium for delivering a virtual projectile in a virtual environment, which can improve delivery accuracy and the practicality of the virtual projectile. The technical solution is as follows:
in one aspect, an embodiment of the present application provides a method for delivering a virtual projectile in a virtual environment, where the method includes:
in response to a triggering operation on a target skill control, controlling a first virtual object to hold a drop-point marking prop, where the drop-point marking prop is used to mark a drop point of a virtual projectile in the virtual environment, and the virtual projectile is used to change attribute values of virtual objects within its area of effect;
emitting a virtual ray through the drop-point marking prop, and displaying the drop point in the virtual environment based on the virtual ray;
and, in response to a delivery operation, controlling a virtual vehicle to deliver the virtual projectile into the virtual environment based on the drop point.
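The three steps above can be sketched end to end in a few lines. This is an illustrative reconstruction, not the patent's implementation: all names (`Vec3`, `DropPointMarker`, `cast_ray`, `deliver`) are hypothetical, and the world is simplified to a flat ground plane.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

class DropPointMarker:
    """Hypothetical drop-point marking prop: casts a ray into the scene
    and records where it hits the ground."""
    def __init__(self):
        self.drop_point = None

    def cast_ray(self, origin, direction, ground_y=0.0):
        # Simplified world: intersect the ray with a horizontal ground plane.
        if direction.y >= 0:
            return None  # a level or upward ray never reaches the ground
        t = (ground_y - origin.y) / direction.y
        self.drop_point = Vec3(origin.x + t * direction.x, ground_y,
                               origin.z + t * direction.z)
        return self.drop_point

def deliver(vehicle, projectile, drop_point):
    # Step 203: the virtual vehicle delivers the projectile at the drop point.
    return (f"{vehicle} delivers {projectile} at "
            f"({drop_point.x:.1f}, {drop_point.z:.1f})")

marker = DropPointMarker()
point = marker.cast_ray(Vec3(0.0, 1.7, 0.0), Vec3(0.6, -0.4, 0.8))
print(deliver("virtual airplane", "bundled bomb", point))
```

In a real engine the ray would be cast against full scene geometry rather than a single plane, but the control flow — mark, display, then deliver — is the same.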
In another aspect, an embodiment of the present application provides a device for launching a virtual projectile in a virtual environment, where the device includes:
a first control module, configured to control, in response to a triggering operation on a target skill control, a first virtual object to hold a drop-point marking prop, where the drop-point marking prop is used to mark a drop point of a virtual projectile in the virtual environment, and the virtual projectile is used to change attribute values of virtual objects within its area of effect;
a first display module, configured to emit a virtual ray through the drop-point marking prop and display the drop point in the virtual environment based on the virtual ray;
and a delivery module, configured to control, in response to a delivery operation, a virtual vehicle to deliver the virtual projectile into the virtual environment based on the drop point.
In another aspect, an embodiment of the present application provides a terminal, where the terminal includes a processor and a memory; the memory has stored therein at least one instruction, at least one program, set of codes, or set of instructions that is loaded and executed by the processor to implement a method of delivering a virtual projectile in a virtual environment as described in the above aspect.
In another aspect, the present application provides a computer-readable storage medium, in which at least one computer program is stored, the computer program being loaded and executed by a processor to implement the method for delivering a virtual projectile in a virtual environment according to the above aspect.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions, the computer instructions being stored in a computer readable storage medium. The processor of the terminal reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to cause the terminal to implement the method for delivering a virtual projectile in a virtual environment provided in the various alternative implementations of the above aspect.
The technical solutions provided in the embodiments of this application bring at least the following beneficial effects:
in the embodiment of the application, the user controls the first virtual object to release the target skill, the stage property is marked through the throwing point capable of emitting the virtual ray, the throwing point of the virtual throwing object is directly selected in the virtual environment, the throwing point is selected by utilizing the virtual ray, the specific position of the throwing point in the virtual environment can be clearly determined by the user, the throwing is carried out by utilizing the virtual carrier, the throwing accuracy rate can be improved, the deviation of the virtual throwing object from the ideal throwing point when the control of the first virtual object is avoided, and the practicability of the virtual throwing object is improved.
In addition, because the drop point is selected in the virtual environment through the ray-emitting drop-point marking prop rather than in a map control, the delivery operation is more interactive and engaging.
Drawings
FIG. 1 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 2 is a flow chart of a method of delivering a virtual projectile in a virtual environment as provided by an exemplary embodiment of the present application;
FIG. 3 is a schematic diagram of a trigger target skill control provided by an exemplary embodiment of the present application;
FIG. 4 is a flow chart of a method of delivering a virtual projectile in a virtual environment provided by another exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of an invalid drop-point identifier provided by an exemplary embodiment of the present application;
FIG. 6 is a schematic illustration of an updated drop point location provided by an exemplary embodiment of the present application;
FIG. 7 is a schematic view of switching viewing angles provided by an exemplary embodiment of the present application;
FIG. 8 is a flow chart of a method of delivering a virtual projectile in a virtual environment as provided by another exemplary embodiment of the present application;
FIG. 9 is a schematic illustration of delivering a virtual projectile provided by an exemplary embodiment of the present application;
FIG. 10 is a schematic illustration of a virtual projectile taking effect, as provided by an exemplary embodiment of the present application;
FIG. 11 is a flow chart of a method of delivering a virtual projectile in a virtual environment as provided by another exemplary embodiment of the present application;
FIG. 12 is a schematic view of a prop equipment interface provided in an exemplary embodiment of the present application;
FIG. 13 is a schematic illustration of a switch target skill control display state as provided by an exemplary embodiment of the present application;
FIG. 14 is a schematic diagram illustrating a prompt corresponding to a target virtual object according to an exemplary embodiment of the present application;
FIG. 15 is a block diagram of a virtual projectile delivery apparatus in a virtual environment as provided in an exemplary embodiment of the present application;
FIG. 16 is a block diagram of a terminal according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Reference herein to "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
First, terms referred to in the embodiments of the present application are described:
1) virtual environment
Refers to the virtual environment that an application displays (or provides) when running on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. It may be any one of a two-dimensional, 2.5-dimensional, or three-dimensional virtual environment; the following embodiments take a three-dimensional virtual environment as an example, but are not limited thereto. Optionally, the virtual environment is also used for battles between at least two virtual characters. Optionally, the virtual environment has virtual resources available to at least two virtual characters.
2) Virtual object
A virtual object refers to a movable object in a virtual scene. The movable object may be at least one of a virtual character, a virtual animal, and an animation character. When the virtual scene is three-dimensional, the virtual object may be a three-dimensional model: each virtual object has its own shape and volume in the three-dimensional virtual scene and occupies a portion of its space. Optionally, the virtual character is a three-dimensional character built on three-dimensional human-skeleton technology and takes on different appearances by wearing different skins. In some implementations, the virtual character can also be implemented with a 2.5-dimensional or two-dimensional model, which is not limited in this application.
3) Virtual prop
A virtual prop is a prop that a virtual object can use in the virtual environment, including at least one of a virtual weapon, a functional prop, and virtual equipment. Illustratively, a virtual prop in this application refers to a virtual weapon used to change attribute values of virtual objects in the virtual environment. For example, virtual weapons include shooting props such as pistols, rifles, sniper rifles, and bows; melee props such as daggers and hammers; and throwable props such as grenades, missiles, flash bombs, and smoke bombs. A throwable virtual prop is thrown by a virtual object or another virtual vehicle toward a certain direction or place, and takes effect after reaching the drop point or colliding.
4) User Interface (UI) controls
Refers to any visual control or element visible on the user interface of the application, such as pictures, input boxes, text boxes, buttons, and tabs; some UI controls respond to user operations.
The method provided in this application may be applied to a virtual reality application, a three-dimensional map program, a military simulation program, a first-person shooter game, a multiplayer online battle arena (MOBA) game, and the like; the following embodiments take games as an example.
A game based on a virtual environment often consists of one or more maps of a game world. The virtual environment in the game simulates a real-world scene, and the user can control a virtual object in the game to walk, run, jump, shoot, fight, drive, switch virtual props, use virtual props to damage other virtual objects, and so on. To help the player attack distant enemies, throwable props such as bundled bombs and bundled missiles are provided: the player controls the virtual object to throw a bundled bomb at a place within the field of view, causing continuous, repeated explosions within the selected area and quickly defeating more opposing players. However, in the related art, when throwing such a projectile the player generally has to select a drop point in a map control or directly control the virtual object to throw in a given direction; the projectile cannot be delivered accurately to a distant enemy, and the prop's utilization rate is low.
To solve this technical problem, an embodiment of this application provides a method for delivering a virtual projectile in a virtual environment. A target skill is added to the first virtual object. When the terminal receives a triggering operation on the target skill control, it controls the first virtual object to hold a drop-point marking prop, which can emit a virtual ray in the virtual environment. The terminal determines the position of the drop point based on the virtual ray and displays both the ray and the drop point in the virtual environment picture, so that the user can visually determine the drop point's position and can change it by controlling the first virtual object to adjust the marking prop. When a delivery operation is received, the terminal controls a virtual vehicle to deliver the virtual projectile based on the drop point, thereby improving delivery accuracy.
FIG. 1 illustrates a schematic diagram of an implementation environment provided by one embodiment of the present application. The implementation environment may include: a first terminal 110, a server 120, and a second terminal 130.
The first terminal 110 has installed and runs an application 111 supporting a virtual environment, and when the first terminal runs the application 111, a user interface of the application 111 is displayed on the screen of the first terminal 110. The application 111 may be any one of a military simulation program, a MOBA game, a battle-royale shooting game, and a simulation strategy game (SLG). In this embodiment, the application 111 is exemplified as a role-playing game (RPG). The first terminal 110 is the terminal used by the first user 112, who uses it to control a first virtual object located in the virtual environment to perform activities; this first virtual object may be referred to as the master virtual object of the first user 112. The activities of the first virtual object include, but are not limited to: adjusting body posture, crawling, walking, running, riding, flying, jumping, driving, picking up, shooting, attacking, throwing, and releasing skills. Illustratively, the first virtual object is a first virtual character, such as a simulated character or an animation character.
The second terminal 130 has installed and runs an application 131 supporting a virtual environment, and when the second terminal 130 runs the application 131, a user interface of the application 131 is displayed on the screen of the second terminal 130. The application may be any one of a military simulation program, a MOBA game, a battle-royale shooting game, and an SLG game; in this embodiment, the application 131 is exemplified as an RPG. The second terminal 130 is the terminal used by the second user 132, who uses it to control a second virtual object located in the virtual environment to perform activities; this second virtual object may be referred to as the master virtual character of the second user 132. Illustratively, the second virtual object is a second virtual character, such as a simulated character or an animation character.
Optionally, the first virtual object and the second virtual object are in the same virtual world. Optionally, the first virtual object and the second virtual object may belong to the same camp, the same team, the same organization, a friend relationship, or a temporary communication right. Alternatively, the first virtual object and the second virtual object may belong to different camps, different teams, different organizations, or have a hostile relationship. In the embodiment of the present application, the first virtual object and the second virtual object belong to the same camp as an example.
Optionally, the applications installed on the first terminal 110 and the second terminal 130 are the same, or are the same type of application on different operating system platforms (Android or iOS). The first terminal 110 may generally refer to one of a plurality of terminals and the second terminal 130 to another; this embodiment is illustrated with only the first terminal 110 and the second terminal 130. The device types of the first terminal 110 and the second terminal 130 are the same or different, and include at least one of a smartphone, a tablet, an e-book reader, an MP3 player, an MP4 player, a laptop, and a desktop computer.
Only two terminals are shown in FIG. 1, but in different embodiments a plurality of other terminals may access the server 120. Optionally, one or more terminals correspond to the developer: a development and editing platform supporting the application in the virtual environment is installed on such a terminal, on which the developer can edit and update the application and transmit the updated installation package to the server 120 over a wired or wireless network; the first terminal 110 and the second terminal 130 can then download the installation package from the server 120 to update the application.
The first terminal 110, the second terminal 130, and other terminals are connected to the server 120 through a wireless network or a wired network.
The server 120 includes at least one of a server, a server cluster composed of a plurality of servers, a cloud computing platform, and a virtualization center. The server 120 is used to provide background services for applications that support a three-dimensional virtual environment. Optionally, the server 120 undertakes primary computational work and the terminals undertake secondary computational work; alternatively, the server 120 undertakes the secondary computing work and the terminal undertakes the primary computing work; alternatively, the server 120 and the terminal perform cooperative computing by using a distributed computing architecture.
In one illustrative example, the server 120 includes a memory 121, a processor 122, a user account database 123, a battle service module 124, and a user-oriented input/output interface (I/O interface) 125. The processor 122 is configured to load instructions stored in the server 120 and to process data in the user account database 123 and the battle service module 124; the user account database 123 stores data of the user accounts used by the first terminal 110, the second terminal 130, and other terminals, such as each account's avatar, nickname, combat-power ranking, and service region; the battle service module 124 provides battle rooms for users to fight in, such as 1v1, 3v3, and 5v5 battles; the user-facing I/O interface 125 establishes communication with the first terminal 110 and/or the second terminal 130 through a wireless or wired network to exchange data.
Fig. 2 shows a flow chart of a method for delivering a virtual projectile in a virtual environment, provided by an exemplary embodiment of the present application. The embodiment is described by taking as an example that the method is used for the first terminal 110 or the second terminal 130 in the implementation environment shown in fig. 1 or other terminals in the implementation environment, and the method includes the following steps:
Step 201: in response to a triggering operation on the target skill control, control the first virtual object to hold the drop-point marking prop. The target skill control is the trigger control corresponding to the target skill. When the triggering operation on the target skill control is received, the terminal controls the first virtual object to use the target skill, that is, to hold the drop-point marking prop; if the first virtual object was holding another virtual prop before the triggering operation was received, it switches from that prop to the drop-point marking prop.
Optionally, the triggering operation on the target skill control includes a click, long-press, or drag operation on the UI control, or a voice-control operation.
The drop-point marking prop is a virtual prop used to select the drop point of a virtual projectile in the virtual environment. The user controls the first virtual object to hold the drop-point marking prop, so that the first virtual object can select a drop point in the virtual environment.
The virtual projectile is a throwable virtual prop used to change attribute values of virtual objects within its area of effect, such as a grenade, bomb, or missile; it can be thrown directly by a virtual object or delivered by a virtual vehicle.
Illustratively, as shown in FIG. 3, a target skill control 301 is displayed in the virtual environment interface (i.e., the game interface). When a triggering operation on the target skill control 301 is received, the terminal controls the first virtual object to put away the currently used virtual prop 302 and, using the target skill, hold the drop-point marking prop 303.
Step 202: emit a virtual ray through the drop-point marking prop, and display the drop point in the virtual environment based on the virtual ray. In one possible implementation, because the virtual projectile is a long-range attack prop, the user usually needs to attack a distant virtual object when using it, so the drop point must be determined at a position far from the first virtual object. To make it easy for the user to select a distant drop point through the first virtual object, the drop-point marking prop can emit a virtual ray (e.g., a virtual light beam), and the terminal displays the drop point in the virtual environment based on the virtual ray. The user can select the drop point by observing the position and orientation of the virtual ray in the virtual environment, making the operation more intuitive and simple.
As shown in fig. 3, the drop point marking prop 303 emits a virtual ray (shown as a dotted line) which is visible in the interface and can be embodied as a light beam.
Step 203: in response to a delivery operation, control the virtual vehicle to deliver the virtual projectile into the virtual environment based on the drop point.
Upon receiving the delivery operation, the terminal determines that the virtual projectile is to be delivered at the current drop point, and accordingly controls the virtual vehicle to deliver it into the virtual environment based on the drop point. The virtual vehicle is a vehicle that delivers the virtual projectile into the virtual environment, such as a virtual airplane, virtual parachute, or virtual tank.
For example, when the terminal receives a triggering operation on the delivery control 304, it controls the virtual vehicle to deliver the virtual projectile into the virtual environment.
Optionally, the terminal controls the virtual vehicle to deliver the virtual projectile once based on the drop point; or the terminal controls the virtual vehicle to deliver it once every preset time interval, for n consecutive deliveries, where n is a positive integer.
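The optional repeated-delivery behavior reduces to a simple schedule of delivery times. A minimal sketch, assuming a fixed `interval` and `n` deliveries as described above (the function name and parameters are illustrative assumptions):

```python
def delivery_schedule(start_time, interval, n):
    """Times (in seconds) at which the virtual vehicle delivers one
    projectile each, n consecutive times, one every preset interval."""
    return [start_time + i * interval for i in range(n)]

print(delivery_schedule(0.0, 2.0, 3))  # three drops, two seconds apart
```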
Optionally, the delivery operation is a triggering operation on a delivery control; or a second triggering operation on the target skill control; or another touch operation in the virtual environment interface (e.g., a tap or swipe operation), which is not limited in the embodiments of this application.
To sum up, in the embodiments of this application, the user controls the first virtual object to release the target skill and, through the ray-emitting drop-point marking prop, selects the drop point of the virtual projectile directly in the virtual environment. Selecting the drop point with a virtual ray lets the user clearly determine its exact position in the virtual environment, and delivering with a virtual vehicle improves delivery accuracy, avoiding the deviation from the intended drop point that occurs when the first virtual object is controlled to throw directly; the practicality of the virtual projectile is thereby improved.
In addition, because the drop point is selected in the virtual environment through the ray-emitting drop-point marking prop rather than in a map control, the delivery operation is more interactive and engaging.
In one possible implementation, the terminal determines and displays the drop point based on the virtual ray emitted by the drop-point marking prop, and the user can change the drop point by adjusting the prop. Referring to FIG. 4, a flow chart of a method for delivering a virtual projectile in a virtual environment according to another exemplary embodiment of the present application is shown. The embodiment is described taking as an example that the method is used by the first terminal 110 or the second terminal 130 in the implementation environment shown in FIG. 1, or by another terminal in that environment; the method includes the following steps:
Step 401: in response to the triggering operation on the target skill control, control the first virtual object to hold the drop-point marking prop. For a specific implementation of step 401, reference may be made to step 201 above; details are not repeated here.
Step 402: control the pointing direction of the virtual ray in the virtual environment to be consistent with the pointing direction of the drop-point marking prop. In one possible implementation, to let the user control the virtual ray by controlling the first virtual object holding the marking prop, and to improve the realism of the interaction, the terminal keeps the virtual ray's direction consistent with the prop's direction: if the prop's direction changes, the ray's direction changes accordingly, so the user can select a drop point by steering the prop.
For example, the drop-point marking prop is displayed as a preset prop model, such as a flashlight or a laser pointer. As shown in FIG. 3, the drop-point marking prop 303 is a virtual flashlight, the virtual ray is the light beam it emits, and the ray points in the same direction as the prop 303.
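Keeping the ray's direction consistent with the prop's orientation amounts to converting the prop's facing angles into a unit direction vector. A sketch under an assumed yaw/pitch parameterization (the patent does not specify one):

```python
import math

def ray_direction(yaw_deg, pitch_deg):
    """Unit direction of the virtual ray, matching the marking prop's
    orientation. The yaw/pitch angle convention is an assumption."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),  # x, horizontal forward
            math.sin(pitch),                  # y, up
            math.cos(pitch) * math.sin(yaw))  # z, horizontal sideways

# Tilting the prop 30 degrees downward tilts the ray the same way.
print(ray_direction(0.0, -30.0))
```

Whenever the user turns the prop, the ray is re-cast with the new direction, which is what makes steering the prop equivalent to moving the drop point.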
And step 403, determining the intersection point of the virtual ray and the virtual object in the virtual environment as a drop point, and displaying a drop point identifier at the drop point.
In the virtual environment, the virtual ray emitted by the drop point marking prop intersects a virtual object (such as the ground, a wall, a tree, or another virtual object) at an intersection point. The terminal determines this intersection point as the drop point and displays a drop point identifier there, prompting the user with the position of the current drop point so that the user can decide whether to adjust it.
Illustratively, as shown in fig. 3, the virtual ray emitted by the drop point marking prop 303 generates an intersection point with the ground in the virtual environment, and the terminal displays a drop point identifier 305 at the intersection point.
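As a concrete illustration of how the intersection in step 403 might be computed, the sketch below intersects the marking prop's ray with a flat ground plane; in practice the terminal would raycast against all scene geometry (walls, trees, virtual objects). The function name and the flat-ground simplification are assumptions for illustration, not the patent's implementation.

```python
def ray_ground_intersection(origin, direction, ground_y=0.0):
    """Intersect a ray (origin + t * direction) with a horizontal ground plane.

    Returns the intersection point as (x, y, z), or None when the ray
    points parallel to or away from the ground. Illustrative only: a real
    engine would test the ray against all scene geometry, not one plane.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy >= 0:          # ray never reaches the ground plane
        return None
    t = (ground_y - oy) / dy
    return (ox + t * dx, ground_y, oz + t * dz)
```

The returned point is where the terminal would display the drop point identifier.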
Since the range observable by the human eye in the real world is limited, to improve the realism of the throw, the first virtual object can only select a drop point within a preset range. In one possible embodiment, step 403 includes the following steps:
and in response to the distance between the intersection point and the position of the first virtual object being smaller than a first distance threshold value, determining the intersection point as a drop point, and displaying a drop point identifier at the drop point.
The terminal determines in real time the distance between the intersection point of the virtual ray with the virtual object and the position of the first virtual object, where the distance refers to distance in the virtual environment. If the distance is smaller than a first distance threshold (for example, 100 meters), the terminal displays a drop point identifier at the intersection point, indicating that the virtual projectile can currently be delivered at that point.
When the distance between the position of the intersection point and the position of the first virtual object exceeds a first distance threshold, the first virtual object cannot throw the virtual throwing object at the intersection point, and the method further comprises the following steps:
and in response to the distance between the intersection point and the position of the first virtual object being larger than a first distance threshold value, displaying an illegal release identifier at the intersection point.
As shown in fig. 5, when the terminal determines that the distance between the intersection point and the position of the first virtual object is greater than the first distance threshold (100 meters), an illegal release identifier 501 is displayed at the intersection point to prompt the user that the virtual projectile cannot be delivered there. In addition, the terminal can also display prompt information 502 through the virtual environment interface to prompt the user to adjust the virtual ray and change the position of the intersection point.
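The threshold check above can be sketched as a single distance comparison; the 100-meter value comes from the text's example, while the function and marker names are illustrative assumptions.

```python
import math

FIRST_DISTANCE_THRESHOLD = 100.0  # metres; example value from the text

def classify_drop_point(object_pos, intersection):
    """Decide which marker to show at the ray/scene intersection.

    Returns 'drop_point' when the intersection lies within the first
    distance threshold of the first virtual object, 'illegal' otherwise.
    Function and marker names are illustrative, not from the patent.
    """
    dist = math.dist(object_pos, intersection)
    return 'drop_point' if dist < FIRST_DISTANCE_THRESHOLD else 'illegal'
```

The terminal would display the drop point identifier for the first result and the illegal release identifier for the second.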
And step 404, receiving a control operation on the drop point marking prop, and adjusting the orientation of the drop point marking prop.
Optionally, the user drags the arm of the first virtual object or the drop point marking prop itself, so that the terminal controls the first virtual object to change the orientation of the drop point marking prop based on the drag operation. Alternatively, the display position of the drop point marking prop relative to the screen remains unchanged, and the user slides on the virtual environment interface; the terminal then changes the virtual environment within the field of view of the first virtual object based on the sliding distance and sliding direction, thereby changing the orientation of the drop point marking prop in the virtual environment. The embodiments of the present application do not limit this.
Illustratively, when the user performs an upward sliding operation in the virtual environment interface, the orientation of the virtual ray approaches the vertical downward direction, that is, the intersection point moves closer to the position of the first virtual object; when the user performs a downward sliding operation, the orientation of the virtual ray approaches the horizontal forward direction, that is, the intersection point moves away from the position of the first virtual object.
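Under a flat-ground assumption, the relationship between the ray's pitch and the intersection distance described above reduces to simple trigonometry; the sketch below (names and geometry are illustrative assumptions, not from the patent) shows why a steeper ray brings the drop point closer.

```python
import math

def intersection_distance(height, pitch_deg):
    """Horizontal distance from the object to the ray/ground intersection
    for a ray angled pitch_deg below the horizontal, emitted from `height`.

    Steeper pitch (sliding up in the text's convention) gives a smaller
    distance; shallower pitch (sliding down) gives a larger distance.
    Flat-ground geometry is assumed for illustration.
    """
    return height / math.tan(math.radians(pitch_deg))
```

For example, from a 1.7 m holding height, a 45-degree pitch places the intersection 1.7 m ahead, while a 30-degree pitch places it roughly 2.9 m ahead.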
And step 405, updating the position of the drop point in the virtual environment based on the orientation of the adjusted drop point marking prop.
And after the terminal receives the control operation, updating the position of the launching point based on the orientation of the launching point mark prop.
In a possible implementation, the user can also control the first virtual object to move in the virtual environment through the terminal, thereby changing the position of the drop point. As shown in fig. 6, the first virtual object releases the target skill at a position far from the virtual building 603 and holds the drop point marking prop, and a drop point identifier 602 is displayed at the corresponding drop point. When a trigger operation on the movement control 601 is received, the terminal controls the first virtual object to move forward toward the virtual building 603 based on the trigger operation, and updates the position of the drop point based on the change in the position of the first virtual object.
And 406, in response to the throwing operation, controlling the virtual carrier to throw the virtual throwing object into the virtual environment based on the throwing point.
For a specific implementation of step 406, reference may be made to step 203 described above, and details of this embodiment are not described herein again.
In the embodiment of the application, the drop point is determined based on the intersection point of the virtual ray and a virtual object, the drop point identifier is displayed at the drop point in real time, and the position of the drop point in the virtual environment is updated promptly based on control operations on the drop point marking prop. The user can thus conveniently determine the position of the drop point by observing the virtual environment interface and judge whether adjustment is needed.
In a possible implementation manner, in order to facilitate a user to select a drop point and avoid a blocked line of sight, when the terminal receives a trigger operation on the target skill control in a display state of a third person perspective view, the method for dropping a virtual projectile in a virtual environment in the embodiment of the present application further includes the following steps:
and responding to the triggering operation of the target skill control, and switching the virtual environment picture from the third person perspective to the first person perspective.
In one possible embodiment, the terminal displays the virtual environment through the virtual environment screen. Alternatively, the virtual environment screen is a screen that observes the virtual environment from the perspective of the virtual object. The perspective refers to an observation angle when observing in the virtual environment at a first person perspective or a third person perspective of the virtual object. Optionally, in an embodiment of the present application, the viewing angle is an angle when a virtual object is observed by a camera model in a virtual environment.
Optionally, the camera model automatically follows the virtual object in the virtual environment, that is, when the position of the virtual object in the virtual environment changes, the camera model changes while following the position of the virtual object in the virtual environment, and the camera model is always within the preset distance range of the virtual object in the virtual environment. Optionally, the relative positions of the camera model and the virtual object do not change during the automatic following process.
The camera model refers to a three-dimensional model located around a virtual object in the virtual environment. When a first-person perspective is adopted, the camera model is located near or at the head of the virtual object; when a third-person perspective is adopted, the camera model may be located behind and bound to the virtual object, or may be located at any position a preset distance away from the virtual object, and the virtual object in the virtual environment can be observed from different angles through the camera model.

Optionally, the perspective includes other perspectives besides the first-person and third-person perspectives, such as a top perspective. When the top perspective is adopted, the camera model may be located above the head of the virtual object; the top perspective observes the virtual environment from an aerial viewpoint. Optionally, the camera model is not actually displayed in the virtual environment, that is, the camera model does not appear in the virtual environment displayed by the user interface.

For the case where the camera model is located at an arbitrary position a preset distance away from the virtual object: optionally, one virtual object corresponds to one camera model, and the camera model can rotate with the virtual object as the rotation center. For example, the camera model rotates with any point of the virtual object as the rotation center; during the rotation, the camera model not only rotates in angle but also shifts in displacement, while the distance between the camera model and the rotation center remains constant. That is, the camera model rotates on the surface of a sphere whose center is the rotation center, where any point of the virtual object may be the head, the trunk, or any point around the virtual object, which is not limited in the embodiment of the present application.
Optionally, when the camera model observes the virtual object, the center of the view angle of the camera model points in a direction in which a point of the spherical surface on which the camera model is located points at the center of the sphere.
As shown in fig. 7, the terminal displays the virtual environment picture at the third-person perspective. When a trigger operation on the target skill control 701 is received, the terminal switches from the third-person perspective to the first-person perspective and zooms the camera model in, so that the user selects a drop point from the first-person perspective. This prevents the virtual ray from being blocked by the first virtual object and simulates the experience of controlling light through a prop in a real environment, thereby improving interactivity and realism.
In a possible implementation manner, the embodiment of the application provides at least two delivery modes, where different delivery modes correspond to different delivery accuracies, and the terminal determines the target delivery mode based on the distance of the drop point, so that delivery of the virtual projectile is more realistic. Fig. 8 shows a flow chart of a method for delivering a virtual projectile in a virtual environment provided by another exemplary embodiment of the present application. The embodiment is described by taking as an example that the method is used for the first terminal 110 or the second terminal 130 in the implementation environment shown in fig. 1, or other terminals in the implementation environment, and the method includes the following steps:
For the specific implementation of steps 801 to 802, reference may be made to steps 201 to 202, which are not described herein again in this embodiment of the present application.
And step 803, in response to the delivery operation, determining a target delivery mode based on the drop point, wherein different delivery modes correspond to different delivery accuracies.
The delivery distance refers to the distance between the drop point and the position of the first virtual object in the virtual environment. In the embodiment of the application, the terminal determines the delivery mode of the virtual projectile based on the delivery distance: for example, when the delivery distance is smaller than a preset distance, the terminal determines the delivery mode with higher delivery accuracy as the target delivery mode, and when the delivery distance is larger than the preset distance, the terminal determines the delivery mode with lower delivery accuracy as the target delivery mode.
In a possible implementation manner, the terminal determines the target delivery manner according to the corresponding relationship between the delivery distance and the delivery manner, and step 803 includes the following steps:
and 803a, in response to that the throwing distance is smaller than the second distance threshold, determining the first throwing mode as a target throwing mode, wherein the throwing distance is the distance between the throwing point and the position of the first virtual object.
And 803b, determining the second release mode as the target release mode in response to the release distance being greater than the second distance threshold.
The delivery accuracy of the first delivery mode is higher than that of the second delivery mode. Delivery accuracy relates to the deviation between the landing position of the virtual projectile in the virtual environment and the drop point when the virtual vehicle delivers it: the smaller the deviation, the higher the delivery accuracy; alternatively, the smaller the probability that a deviation occurs, the higher the delivery accuracy.

In a possible implementation manner, the delivery modes in the embodiment of the present application include a first delivery mode and a second delivery mode. Optionally, the first delivery mode is precise delivery with the drop point as the delivery center point, and the second delivery mode is area delivery in which the delivery area is determined based on the drop point and the delivery distance; alternatively, both delivery modes are area delivery based on the drop point and the delivery distance, with the delivery area of the first mode smaller than that of the second mode. The delivery center point is the reference point used when the virtual vehicle delivers the virtual projectile. For example, when the number of virtual projectiles is 1, the delivery center point is the action point of the virtual projectile; when the number is 2, the delivery center point is the midpoint of the line connecting the action points of the two virtual projectiles; when the number is 3, the lines connecting the action points of the three virtual projectiles form an equilateral triangle, and the delivery center point is the center of that triangle.
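The choice between the two delivery modes in steps 803a and 803b can be sketched as a simple threshold test; the 50-meter value is the example threshold the text uses later, and the function and mode names are illustrative assumptions.

```python
SECOND_DISTANCE_THRESHOLD = 50.0  # metres; example value from the text

def choose_delivery_mode(delivery_distance):
    """Select the target delivery mode from the delivery distance.

    The first mode (higher delivery accuracy) applies within the second
    distance threshold; the second mode (lower accuracy) applies beyond
    it. Names are illustrative, not from the patent.
    """
    if delivery_distance < SECOND_DISTANCE_THRESHOLD:
        return 'first'   # precise delivery, or the smaller delivery area
    return 'second'      # area delivery with a distance-dependent area
```

This mirrors the text's rule that accuracy degrades as the drop point moves farther from the first virtual object.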
And step 804, determining a release center point based on the release point and the target release mode.
And when the terminal receives the releasing operation, determining a releasing central point in the virtual environment based on the target releasing mode and the releasing point. In one possible implementation, step 804 includes steps 804a to 804b, or steps 804c to 804 d:
step 804a, in response to the target delivery mode being the first delivery mode, determining a first delivery area based on the delivery point, the first delivery area being an area centered on the delivery point.
And 804b, determining a delivery central point from the first delivery area.
In one possible embodiment, the area and shape of the first delivery area are fixed: as long as the delivery distance is smaller than the second distance threshold, the first delivery area has the same area and shape regardless of the delivery distance.
Illustratively, the second distance threshold is 50 meters, when the release distance is less than 50 meters, the terminal determines a circular area with a release point as a center and a radius of 5 meters as a first release area, and randomly determines a point from the first release area as a release center point.
And step 804c, in response to the target delivery mode being the second delivery mode, determining a second delivery area based on the drop point and the delivery distance, wherein the second delivery area is centered on the drop point, is larger than the first delivery area, and has an area positively correlated with the delivery distance.
And 804d, determining a delivery central point from the second delivery area.
In a possible embodiment, when the target delivery mode is the second delivery mode, the greater the delivery distance, the greater the area of the second delivery region.
Illustratively, the second delivery area is a circular area centered on the drop point, and its radius is calculated as r = r0 + (s − 50)/a, where r0 is the radius of the first delivery area, s is the delivery distance, and a is a preset attenuation coefficient.
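A minimal sketch of the delivery-area radius and of sampling a delivery center point from the circular area, using the text's example values (r0 = 5 m, second distance threshold 50 m) and an assumed attenuation coefficient; the 2-D ground-plane model and all names are illustrative assumptions.

```python
import math
import random

def delivery_radius(delivery_distance, r0=5.0, a=10.0):
    """Radius of the delivery area: r0 within the second distance
    threshold (50 m), growing as r0 + (s - 50)/a beyond it.
    r0 is the text's example; the attenuation coefficient a is assumed."""
    if delivery_distance < 50.0:
        return r0
    return r0 + (delivery_distance - 50.0) / a

def sample_delivery_center(drop_point, radius, rng=random):
    """Uniformly sample a delivery center point inside a disc centered
    on the drop point (2-D ground-plane model for illustration)."""
    r = radius * math.sqrt(rng.random())   # sqrt gives uniform area density
    theta = rng.random() * 2.0 * math.pi
    x, z = drop_point
    return (x + r * math.cos(theta), z + r * math.sin(theta))
```

Randomly picking the center point inside a larger disc at longer distances is what makes the second delivery mode less accurate.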
In a possible implementation manner, in order to let other virtual objects know in time that a virtual projectile is about to be delivered into the virtual environment, the terminal displays a notification message through the global message display area after determining the delivery center point. The notification message includes the position of the delivery center point, the position of the first virtual object, the time remaining until the virtual vehicle delivers the virtual projectile, and the like, and the message is globally visible. Other virtual objects can reduce damage by lying prone or moving away from the delivery center point, preventing their attribute values from dropping rapidly.
As shown in fig. 9, when a launch operation is received, the terminal determines a launch area 901 based on a target launch mode and a position of a launch point, determines a launch center point from the launch area 901, displays a center point identifier 902 corresponding to the launch center point in a virtual environment picture, displays a distance between the launch center point and a first virtual object, and after a preset duration, controls the virtual vehicle to launch a virtual projectile 903 into the virtual environment based on the launch center point.
And step 806, in response to the virtual projectile taking effect, generating m sub virtual projectiles based on the virtual projectile, wherein m is a positive integer.

In one possible implementation, the virtual projectile bounces off any virtual object in the virtual environment other than the ground until it lands; after landing, a countdown starts, and the projectile takes effect after a preset time (for example, 2 s). When the virtual projectile takes effect, a corresponding special effect (such as a bomb explosion) is displayed in the virtual environment picture.

In a possible implementation, prop fragments are generated after the virtual projectile takes effect, and the terminal treats each fragment as a sub virtual projectile and controls it to be delivered a second time. For example, when the virtual projectile is a bomb, after the bomb explodes based on the delivery center point, the terminal determines the m bomb fragments generated by the bomb as sub virtual projectiles.

Optionally, the number of sub virtual projectiles generated by the virtual projectile is fixed, that is, m is a fixed value; alternatively, the terminal randomly determines the value of m within a certain range, that is, the number of sub virtual projectiles generated after the virtual projectile takes effect may differ each time, which is not limited in the embodiment of the present application.
And step 807, controlling the m sub virtual projectiles to move along preset trajectories.

In one possible implementation, the terminal controls the m sub virtual projectiles to move along their respective preset trajectories. Optionally, the preset trajectories are fixed, that is, the sub virtual projectiles move along the same trajectories each time the virtual projectile takes effect; alternatively, the terminal randomly determines m trajectories from a set of preset trajectories and controls the sub virtual projectiles accordingly. Illustratively, a preset trajectory is a parabola starting from the generation point of the sub virtual projectile.
And step 808, in response to a sub virtual projectile colliding during its motion, controlling the sub virtual projectile to take effect at the collision point, wherein the sub virtual projectile is used for changing the attribute values of virtual objects within its action range.

The terminal controls the sub virtual projectile to move along the preset trajectory; if the sub virtual projectile collides during the motion, for example with a virtual building, a virtual object, or the ground in the virtual environment, the terminal controls it to take effect immediately at the collision point. In a possible implementation, the action range of a sub virtual projectile is smaller than that of the virtual projectile. For example, the virtual projectile is a grenade whose action range is a spherical area centered on the landing point with a radius of 20 m; when it takes effect, the attribute values of all virtual objects in that area are changed. The grenade fragments generated after the explosion move along preset trajectories and explode a second time after colliding, and the action range of a fragment is a spherical area centered on its collision point with a radius of 10 m.
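The fragment motion in steps 807 and 808 can be sketched as a stepped parabola terminated by a ground collision; the flat ground, the time step, the parameter values, and the function name are assumptions for illustration, not the patent's implementation.

```python
def simulate_fragment(start, velocity, gravity=9.8, dt=0.05, ground_y=0.0):
    """Step a sub virtual projectile along a parabolic trajectory until it
    hits the ground, and return the collision point where it takes effect.

    Illustrative sketch: a real engine would test for collisions against
    buildings and virtual objects at every step, not only a flat ground.
    """
    x, y, z = start
    vx, vy, vz = velocity
    while True:
        x += vx * dt
        vy -= gravity * dt   # gravity pulls the fragment into a parabola
        y += vy * dt
        z += vz * dt
        if y <= ground_y:    # collision: take effect here
            return (x, ground_y, z)
```

A fragment launched horizontally from the explosion point thus lands some distance away, producing the secondary explosion described above.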
As shown in fig. 10, after the virtual projectile takes effect at the actual delivery point 1001, the terminal displays its special effect through the virtual environment picture and generates the sub virtual projectile 1002; the terminal controls the sub virtual projectile 1002 to move along the preset trajectory, and after a collision occurs at the collision point 1003, the sub virtual projectile takes effect at the collision point 1003.
In the embodiment of the application, the target delivery mode is determined based on the distance between the drop point and the first virtual object, and the delivery accuracy of the virtual projectile changes with the delivery distance, making delivery of the virtual projectile more realistic and improving the authenticity of delivering a virtual projectile in a virtual environment. In addition, after the virtual projectile takes effect, sub virtual projectiles are generated and take effect again, which mitigates the problem that, due to the time difference between the user selecting the drop point and the virtual projectile taking effect, an accurately delivered virtual projectile may fail to act on other virtual objects.
In one possible implementation, the target skill is not set by default for the virtual object in the game, and the user needs to control the first virtual object to equip the target virtual item, so as to add the target skill to the first virtual object, and in addition, the target virtual item can also add other passive skills to the first virtual object. Figure 11 shows a flow chart of a method of delivering a virtual projectile in a virtual environment provided by another example embodiment of the present application. The embodiment is described by taking as an example that the method is used for the first terminal 110 or the second terminal 130 in the implementation environment shown in fig. 1 or other terminals in the implementation environment, and the method includes the following steps:
And the terminal displays the skill prop equipment interface before entering the game, or displays the skill prop equipment interface when the game starts and the first virtual object does not attack the second virtual object.
Optionally, the first virtual object can be equipped with a predetermined number of skill props in one game. After the first virtual object is provided with the skill prop, the first virtual object always has the attribute or skill of the corresponding skill prop in the subsequent single game-play process.
Referring to fig. 12, a schematic diagram of a skill prop equipment interface is shown. The skill prop equipment interface comprises a skill prop selection column 1201, which contains the skill props owned by the first virtual object corresponding to the current account. When the terminal receives a trigger operation on the target selection control 1202 corresponding to a target skill prop, the target skill prop 1203 and a prop introduction are displayed in the skill prop equipment interface. When a trigger operation on the equipment control 1204 is received, it is determined that a target skill prop selection operation has been received; the terminal then sends a prop equipment instruction to the server and controls the first virtual object to equip the target skill prop 1203.
And 1102, displaying a target skill control corresponding to the target skill through a virtual environment interface.
In one possible embodiment, the terminal displays the target skill control in the virtual environment interface when the first virtual object is equipped with the target skill prop, and does not display the target skill control when the first virtual object is not equipped with the target skill prop.
In one possible implementation, step 1102 includes the steps of:
step 1102a, displaying the target skill control in a non-triggerable state through the virtual environment interface.
In one possible embodiment, the target skill has a skill cooling duration. After the game starts, the target skill is in a skill cooling state, and correspondingly, the target skill control is in a non-triggerable state, as shown in fig. 13, the target skill control 1301 in the non-triggerable state cannot receive and respond to the trigger operation.
And step 1102b, in response to the skill cooldown duration being reached, switching the display state of the target skill control from the non-triggerable state to the triggerable state.

When the skill cooldown duration is reached, the target skill exits the cooldown state for the first time, and the terminal switches the display state of the target skill control from the non-triggerable state to the triggerable state. As shown in fig. 13, the target skill control 1301 in the triggerable state can receive a trigger operation.

In a possible implementation manner, when the terminal receives a trigger operation on the target skill control in the triggerable state, it is determined that the first virtual object releases the target skill, and the first virtual object is controlled to hold the drop point marking prop. At this moment, the target skill enters the cooldown state again, and the terminal switches the display state of the target skill control from the triggerable state back to the non-triggerable state until the cooldown duration is reached again. The cooldown duration may be fixed throughout the game, or may decrease as the number of target skill releases increases.
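The triggerable/non-triggerable cycle described above can be modeled as a small state holder; the cooldown length and the class name are illustrative assumptions.

```python
class SkillControl:
    """Minimal model of the target skill control's two display states.

    The skill starts cooling down when the round begins and again after
    every release; the control is triggerable only when no cooldown
    remains. Names and the cooldown length are illustrative.
    """

    def __init__(self, cooldown=30.0):
        self.cooldown = cooldown
        self.remaining = cooldown   # cooling at round start

    def tick(self, seconds):
        """Advance game time, counting the cooldown down toward zero."""
        self.remaining = max(0.0, self.remaining - seconds)

    @property
    def triggerable(self):
        return self.remaining == 0.0

    def trigger(self):
        """Release the skill if possible, re-entering cooldown on success."""
        if not self.triggerable:
            return False
        self.remaining = self.cooldown
        return True
```

A terminal would render the control as non-triggerable whenever `triggerable` is false.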
And step 1104, emitting a virtual ray through the drop point marking prop, and displaying the drop point in the virtual environment based on the virtual ray.
For specific implementation of steps 1104 to 1105, reference may be made to steps 202 to 203 above, and details are not described herein again in this embodiment.
The target skill prop also has a detection function for the target virtual object. This function is a passive skill and does not require active triggering by the user: it takes effect as soon as the first virtual object equips the target skill prop, and lasts until the first virtual object unloads the target skill prop. The target virtual object is a second virtual object in a wingsuit state, and the second virtual object and the first virtual object belong to different camps.
In one possible implementation, step 1106 includes the steps of:
step 1106a, in response to that the target virtual object is located in a first preset area, displaying a location identifier of the target virtual object in the map display control, where the first preset area is an area with the location of the first virtual object as a center.
When a target virtual object exists in the first preset area, the terminal displays the position identifier of the target virtual object through the map display control. This position identifier differs from those of other virtual objects, for example in color and/or shape, so as to remind the user that a target virtual object is nearby and alertness is needed. As shown in fig. 14, the terminal displays the location identifier 1401 of the target virtual object through the map display control.
Step 1106b, in response to the target virtual object being located in the second preset area, displaying a preset special effect on the edge of the virtual environment interface.
The second preset area is an area with the position of the first virtual object as the center, and the second preset area is in the first preset area. For example, the first predetermined area is a circular area with a radius of 150m centered on the position of the first virtual object, and the second predetermined area is a circular area with a radius of 100m centered on the position of the first virtual object.
In a possible implementation manner, when the target virtual object exists in the second preset area, the terminal displays a preset special effect at an edge of the virtual environment interface to prompt the user that the target virtual object exists in the current short distance. As shown in fig. 14, when the target virtual object exists in the second preset region, the terminal displays a preset special effect (a region surrounded by a curve and an interface edge in the drawing) on an interface edge of the virtual environment, for example, displays a red flickering special effect in a preset range.
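The two-tier detection of steps 1106a and 1106b can be sketched as distance tests against the two preset radii; the 150 m and 100 m values are the text's examples, and the function name is an illustrative assumption.

```python
import math

def proximity_alert(self_pos, target_pos, r_outer=150.0, r_inner=100.0):
    """Alerts for a detected target virtual object.

    A map marker is shown inside the first preset area (r_outer), and an
    edge special effect is additionally shown inside the second preset
    area (r_inner), which lies within the first. Radii are the text's
    example values; the function name is illustrative.
    """
    d = math.dist(self_pos, target_pos)
    return {'map_marker': d <= r_outer, 'edge_effect': d <= r_inner}
```

Because the second area is nested in the first, the edge effect never appears without the map marker.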
The target skill prop also has the function of increasing the reloading speed of a target shooting prop (such as a rocket launcher). This function is a passive skill and does not require active triggering by the user: it takes effect as soon as the first virtual object equips the target skill prop, and lasts until the first virtual object unloads the target skill prop. The target shooting prop is used to reduce the attribute value of a virtual object that is hit.
And step 1108, in response to a reload operation, reloading and resupplying the target shooting prop at a target reloading speed, wherein the target reloading speed is greater than a default reloading speed, and the default reloading speed is the speed at which the target shooting prop is reloaded when the target skill prop is not equipped.

When the target shooting prop runs out of ammunition (for example, the number of shells in the rocket launcher is 0), it can be reloaded and resupplied through a reload operation. If the first virtual object is equipped with the target skill prop at this time, the terminal reloads the target shooting prop at the target reloading speed. For example, the target reloading speed is 120% of the default reloading speed.
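The reloading speed-up can be sketched as a simple multiplier on the default reload time; the 120% figure is the text's example, while the base duration and names are assumptions for illustration.

```python
DEFAULT_RELOAD_TIME = 5.0  # seconds; assumed base duration for illustration

def reload_time(has_skill_prop, speedup=1.2):
    """Reload duration for the target shooting prop.

    With the target skill prop equipped, reloading runs at 120% of the
    default speed (the text's example), so the duration shrinks by the
    same factor. Names and the base duration are illustrative.
    """
    if has_skill_prop:
        return DEFAULT_RELOAD_TIME / speedup
    return DEFAULT_RELOAD_TIME
```

With the assumed 5-second base, the equipped reload takes roughly 4.17 seconds.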
In the embodiment of the application, equipping the first virtual object with the target skill prop adds both the target skill and passive skills to the first virtual object, where the passive skills include the detection skill for the target virtual object and the skill of increasing the filling speed of the target shooting prop. Therefore, even when the user does not trigger the release of the target skill, the long-range combat capability of the first virtual object is still improved, which increases the practicality and utilization rate of the target skill prop.
Fig. 15 is a block diagram of an apparatus for delivering a virtual projectile in a virtual environment, according to an exemplary embodiment of the present application, the apparatus including:
a first control module 1501, configured to control, in response to a trigger operation on a target skill control, a first virtual object to hold a drop point marking prop, where the drop point marking prop is used to mark a drop point of a virtual projectile in the virtual environment, and the virtual projectile is used to change the attribute value of a virtual object within its action range;
a first display module 1502, configured to launch a virtual ray through the drop point marking prop, and display the drop point in the virtual environment based on the virtual ray;
a release module 1503, configured to, in response to a release operation, control a virtual vehicle to release the virtual projectile into the virtual environment based on the drop point.
Optionally, the first display module 1502 includes:
the first display unit is used to emit the virtual ray through the drop point marking prop, where the direction of the virtual ray is consistent with the orientation of the drop point marking prop;
and the second display unit is used to determine the intersection point of the virtual ray and a virtual object in the virtual environment as the release point, and to display a release point identifier at the release point.
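As an illustration, the ray-to-intersection step can be sketched with a simplified test. A real engine would ray-cast against arbitrary scene geometry; the sketch below assumes, purely for the example, that the ray only needs to hit the ground plane y = 0.

```python
def drop_point_on_ground(origin, direction):
    """Intersect the marking prop's aiming ray with the ground plane y = 0.

    `origin` and `direction` are (x, y, z) tuples; returns the intersection
    point, or None when the ray is parallel to or pointing away from the ground.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy >= 0:           # no downward component: the ray never reaches y = 0
        return None
    t = -oy / dy          # ray parameter at which the y coordinate becomes 0
    return (ox + t * dx, 0.0, oz + t * dz)
```

The returned point is where the release point identifier would be displayed.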
Optionally, the second display unit is further configured to:
determining the intersection point as the release point and displaying the release point identifier at the release point in response to the distance between the intersection point and the position of the first virtual object being less than a first distance threshold;
the device further comprises:
and the second display module is used to display an illegal release identifier at the intersection point in response to the distance between the intersection point and the position of the first virtual object being greater than the first distance threshold.
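The threshold check above can be illustrated as a small classifier. The threshold value and the label strings are hypothetical; the patent only specifies the comparison against a first distance threshold.

```python
import math

FIRST_DISTANCE_THRESHOLD = 150.0   # meters; hypothetical example value

def classify_intersection(player_pos, intersection):
    """Return 'release_point' when in range, 'illegal' beyond the threshold."""
    if math.dist(player_pos, intersection) < FIRST_DISTANCE_THRESHOLD:
        return "release_point"
    return "illegal"
```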
Optionally, the apparatus further comprises:
the first receiving module is used to receive a control operation on the drop point marking prop and adjust the orientation of the drop point marking prop;
and the updating module is used to update the position of the release point in the virtual environment based on the adjusted orientation of the drop point marking prop.
Optionally, the releasing module 1503 includes:
a first determining unit, configured to determine, in response to the release operation, a target release mode based on the release point, where different release modes correspond to different release precisions;
the second determining unit is used to determine a release central point based on the release point and the target release mode;
and the release unit is used to control the virtual vehicle to release the virtual projectile into the virtual environment based on the release central point.
Optionally, the first determining unit is further configured to:
determine a first release mode as the target release mode in response to a release distance being less than a second distance threshold, where the release distance is the distance between the release point and the position of the first virtual object;
determine a second release mode as the target release mode in response to the release distance being greater than the second distance threshold;
where the release precision of the first release mode is higher than that of the second release mode.
Optionally, the second determining unit is further configured to:
determine a first release area based on the release point in response to the target release mode being the first release mode, where the first release area is an area centered on the release point; and determine the release central point from the first release area;
and determine, in response to the target release mode being the second release mode, a second release area based on the release point and the release distance, where the second release area is centered on the release point, the second release area is larger than the first release area, and the area of the second release area is positively correlated with the release distance; and determine the release central point from the second release area.
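The two release modes can be sketched as sampling the release central point from a scatter disc whose radius grows with distance in the far mode. The threshold, radii, and the uniform-disc sampling are all illustrative assumptions; the patent only states that the second area is larger and positively correlated with the release distance.

```python
import math
import random

SECOND_DISTANCE_THRESHOLD = 50.0   # meters; hypothetical mode-switch distance
FIRST_AREA_RADIUS = 5.0            # scatter radius of the high-precision mode
RADIUS_PER_METER = 0.2             # extra scatter per meter in the far mode

def release_central_point(release_point, player_pos, rng=random):
    """Sample the release central point inside a mode-dependent scatter disc."""
    dist = math.dist(player_pos, release_point)
    if dist < SECOND_DISTANCE_THRESHOLD:        # first (high-precision) mode
        radius = FIRST_AREA_RADIUS
    else:                                       # second mode: grows with distance
        radius = FIRST_AREA_RADIUS + RADIUS_PER_METER * dist
    r = radius * math.sqrt(rng.random())        # uniform point over the disc
    theta = 2 * math.pi * rng.random()
    x, y, z = release_point
    return (x + r * math.cos(theta), y, z + r * math.sin(theta))
```

Sampling with `sqrt` of a uniform value keeps the points uniformly distributed over the disc rather than clustered at its center.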
Optionally, the apparatus further comprises:
and the switching module is used for responding to the triggering operation of the target skill control and switching the virtual environment picture from the third person view angle to the first person view angle.
Optionally, the apparatus further comprises:
the generating module is used to generate, in response to the virtual throwing object taking effect, m sub-virtual throwing objects based on the virtual throwing object, wherein m is a positive integer;
the second control module is used to control the m sub-virtual throwing objects to move along a preset track;
and the third control module is used to control, in response to a sub-virtual throwing object colliding during its motion, the sub-virtual throwing object to take effect at the collision point, wherein the sub-virtual throwing object is used to change the attribute value of a virtual object within its action range.
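One simple choice of "preset track" for the m sub-virtual throwing objects is to fan them out evenly in the horizontal plane around the point where the parent took effect. The fan pattern and the speed parameter are assumptions for the sketch; the patent does not fix a particular track shape.

```python
import math

def sub_projectile_velocities(m, speed=1.0):
    """Initial velocity vectors for m sub-projectiles fanned evenly around
    the detonation point in the horizontal (y = 0) plane."""
    return [
        (speed * math.cos(2 * math.pi * i / m),
         0.0,
         speed * math.sin(2 * math.pi * i / m))
        for i in range(m)
    ]
```

Each sub-projectile would then be moved along its vector until a collision is detected, at which point it takes effect at the collision point.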
Optionally, the apparatus further comprises:
a second receiving module, configured to receive a selection operation of a target skill prop in at least one skill prop, and equip the target skill prop for the first virtual object, where the target skill prop is used to add a target skill to a virtual object;
and the third display module is used for displaying the target skill control corresponding to the target skill through a virtual environment interface.
Optionally, the third display module includes:
a switching unit, used to switch the display state of the target skill control from the non-triggerable state to a triggerable state in response to the skill cooldown duration being reached;
the first control module 1501 includes:
and the control unit is used for responding to the triggering operation of the target skill control in the triggerable state and controlling the first virtual object to hold the delivery point marking prop.
Optionally, the target skill prop has a detection function on a target virtual object, the target virtual object is a second virtual object in a wing-mounted state, and the second virtual object and the first virtual object belong to different camps;
the device further comprises:
and the fourth display module is used for responding to the target virtual object in the preset area and displaying prompt information in the virtual environment interface.
Optionally, the fourth display module includes:
a third display unit, configured to display, in response to the target virtual object being located in a first preset area, a location identifier of the target virtual object in a map display control, where the first preset area is an area centered on the position of the first virtual object;
and the fourth display unit is used to display a preset special effect at the edge of the virtual environment interface in response to the target virtual object being located in a second preset area, where the second preset area is an area centered on the position of the first virtual object and lies inside the first preset area.
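The two nested detection areas can be sketched as a distance check that selects which prompt to show. The radii reuse the 150m/100m example values given earlier in the description; the label strings are illustrative.

```python
import math

FIRST_PRESET_RADIUS = 150.0   # map-marker zone (example value from the text)
SECOND_PRESET_RADIUS = 100.0  # edge-effect zone, nested inside the first

def detection_prompt(player_pos, target_pos):
    """Which prompt to show for a target virtual object at `target_pos`."""
    d = math.dist(player_pos, target_pos)
    if d <= SECOND_PRESET_RADIUS:
        return "edge_effect"   # e.g. red flickering effect at the interface edge
    if d <= FIRST_PRESET_RADIUS:
        return "map_marker"    # location identifier in the map display control
    return None                # outside both preset areas: no prompt
```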
Optionally, the target skill prop has a function of increasing the filling speed of a target shooting prop, and the target shooting prop is used for reducing the attribute value of the hit virtual object;
the device further comprises:
the fourth control module is used for responding to the selection operation of the target shooting prop and controlling the first virtual object to hold the target shooting prop;
and the filling module is used to fill and resupply the target shooting prop at a target filling speed in response to a filling operation, wherein the target filling speed is greater than a default filling speed, and the default filling speed is the speed at which the target shooting prop is filled when the target skill prop is not equipped.
To sum up, in the embodiment of the present application, a virtual skill is added to the first virtual object. When the user controls the first virtual object to trigger the target skill, the drop point of the virtual projectile is selected directly in the virtual environment through a drop point marking prop that can emit a virtual ray. Selecting the drop point with a virtual ray lets the user clearly determine its exact position in the virtual environment, and delivering the projectile with a virtual vehicle improves delivery accuracy and prevents the virtual projectile from deviating from the intended drop point, as can happen when the first virtual object throws it directly, thereby improving the practicality of the virtual projectile.
In addition, because the drop point is selected within the virtual environment itself through the ray-emitting drop point marking prop, the delivery operation is more interactive and engaging than selecting a drop point in a map control.
Referring to fig. 16, a block diagram of a terminal 1600 according to an exemplary embodiment of the present application is shown. The terminal 1600 may be a portable mobile terminal such as a smart phone, a tablet computer, a Moving Picture Experts Group Audio Layer III (MP3) player, or a Moving Picture Experts Group Audio Layer IV (MP4) player. The terminal 1600 may also be referred to by other names such as user equipment or portable terminal.
Generally, terminal 1600 includes: a processor 1601, and a memory 1602.
In some embodiments, the terminal 1600 may also optionally include: peripheral interface 1603 and at least one peripheral. Specifically, the peripheral device includes: at least one of a radio frequency circuit 1604, a touch screen display 1605, a camera 1606, audio circuitry 1607, a positioning component 1608, and a power supply 1609.
The Radio Frequency circuit 1604 is used for receiving and transmitting Radio Frequency (RF) signals, also known as electromagnetic signals. The radio frequency circuitry 1604 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 1604 converts the electrical signal into an electromagnetic signal to be transmitted, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1604 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1604 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or Wireless Fidelity (WiFi) networks. In some embodiments, the rf circuit 1604 may also include Near Field Communication (NFC) related circuits, which are not limited in this application.
The touch display 1605 is used to display the UI, which may include graphics, text, icons, video, and any combination thereof. The touch display 1605 also has the ability to capture touch signals on or above its surface; a touch signal may be input to the processor 1601 as a control signal for processing. The touch display 1605 is used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one touch display 1605, provided on the front panel of the terminal 1600; in other embodiments, there may be at least two touch displays 1605, respectively disposed on different surfaces of the terminal 1600 or in a folded design; in still other embodiments, the touch display 1605 may be a flexible display disposed on a curved or folded surface of the terminal 1600, and it may even be arranged in a non-rectangular irregular pattern, i.e., an irregularly-shaped screen. The touch display 1605 may be made using materials such as a Liquid Crystal Display (LCD) or an Organic Light-Emitting Diode (OLED) display.
The camera assembly 1606 is used to capture images or video. Optionally, camera assembly 1606 includes a front camera and a rear camera. Generally, a front camera is used for realizing video call or self-shooting, and a rear camera is used for realizing shooting of pictures or videos. In some embodiments, the number of the rear cameras is at least two, and each of the rear cameras is any one of a main camera, a depth-of-field camera and a wide-angle camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize a panoramic shooting function and a Virtual Reality (VR) shooting function. In some embodiments, camera assembly 1606 can also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1607 is used to provide an audio interface between a user and the terminal 1600. The audio circuitry 1607 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1601 for processing or inputting the electric signals to the radio frequency circuit 1604 to achieve voice communication. For stereo sound acquisition or noise reduction purposes, the microphones may be multiple and disposed at different locations of terminal 1600. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1601 or the radio frequency circuit 1604 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1607 may also include a headphone jack.
The positioning component 1608 is configured to locate the current geographic location of the terminal 1600 for navigation or Location Based Services (LBS). The positioning component 1608 may be a positioning component based on the Global Positioning System (GPS) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
In some embodiments, terminal 1600 also includes one or more sensors 1610. The one or more sensors 1610 include, but are not limited to: acceleration sensor 1611, gyro sensor 1612, pressure sensor 1613, fingerprint sensor 1614, optical sensor 1615, and proximity sensor 1616.
Acceleration sensor 1611 may detect acceleration in three coordinate axes of a coordinate system established with terminal 1600. For example, the acceleration sensor 1611 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1601 may control the touch display screen 1605 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1611. The acceleration sensor 1611 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1612 can detect the body orientation and rotation angle of the terminal 1600, and can cooperate with the acceleration sensor 1611 to capture the user's 3D motion of the terminal 1600. Based on the data collected by the gyro sensor 1612, the processor 1601 can implement the following functions: motion sensing (such as changing the UI according to a user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
Pressure sensors 1613 may be disposed on a side bezel of terminal 1600 and/or underlying touch display 1605. When the pressure sensor 1613 is disposed on the side frame of the terminal 1600, a user's holding signal of the terminal 1600 may be detected, and left-right hand recognition or shortcut operation may be performed according to the holding signal. When the pressure sensor 1613 is disposed at the lower layer of the touch display 1605, the operability control on the UI interface can be controlled according to the pressure operation of the user on the touch display 1605. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1614 is used to collect the user's fingerprint so as to identify the user according to the collected fingerprint. Upon recognizing the user's identity as a trusted identity, the processor 1601 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings. The fingerprint sensor 1614 may be disposed on the front, back, or side of the terminal 1600. When a physical key or vendor logo (Logo) is provided on the terminal 1600, the fingerprint sensor 1614 may be integrated with the physical key or vendor logo.
The optical sensor 1615 is used to collect ambient light intensity. In one embodiment, the processor 1601 may control the display brightness of the touch display screen 1605 based on the ambient light intensity collected by the optical sensor 1615. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1605 is increased; when the ambient light intensity is low, the display brightness of the touch display 1605 is turned down. In another embodiment, the processor 1601 may also dynamically adjust the shooting parameters of the camera assembly 1606 based on the ambient light intensity collected by the optical sensor 1615.
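The brightness adjustment described above can be sketched as a clamped linear mapping from ambient light intensity to display brightness. The parameter values and the linear form are assumptions; the text only specifies that brightness rises and falls with ambient intensity.

```python
def display_brightness(ambient_lux, min_b=0.2, max_b=1.0, max_lux=1000.0):
    """Map ambient light intensity to a display brightness in [min_b, max_b]."""
    frac = min(max(ambient_lux / max_lux, 0.0), 1.0)  # clamp to [0, 1]
    return min_b + frac * (max_b - min_b)
```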
A proximity sensor 1616, also referred to as a distance sensor, is typically disposed on the front side of the terminal 1600. The proximity sensor 1616 is used to collect the distance between the user and the front surface of the terminal 1600. In one embodiment, when the proximity sensor 1616 detects that this distance gradually decreases, the processor 1601 controls the touch display 1605 to switch from the screen-on state to the screen-off state; when the proximity sensor 1616 detects that the distance gradually increases, the processor 1601 controls the touch display 1605 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 16 is not intended to limit the terminal 1600, which may include more or fewer components than those shown, combine some components, or employ a different arrangement of components.
The present embodiments also provide a computer-readable storage medium storing at least one instruction, which is loaded and executed by a processor to implement the method for delivering a virtual projectile in a virtual environment according to the above embodiments.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions, the computer instructions being stored in a computer readable storage medium. The processor of the terminal reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to cause the terminal to perform the method of delivering a virtual projectile in a virtual environment provided in the various alternative implementations of the above aspects.
Those skilled in the art will recognize that, in one or more of the examples described above, the functions described in the embodiments of the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable storage medium. Computer-readable storage media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.
Claims (17)
1. A method of delivering a virtual projectile in a virtual environment, the method comprising:
in response to the triggering operation of the target skill control, controlling a first virtual object to hold a drop point marking prop, wherein the drop point marking prop is used for marking a drop point of a virtual throwing object in a virtual environment, and the virtual throwing object is used for changing the attribute value of the virtual object in an action range;
marking a prop through the release point to emit a virtual ray, and displaying the release point in the virtual environment based on the virtual ray;
and responding to a throwing operation, and controlling a virtual carrier to throw the virtual throwing object into the virtual environment based on the throwing point.
2. The method of claim 1, wherein said launching a virtual ray through said drop point marking prop and displaying said drop point in said virtual environment based on said virtual ray comprises:
emitting the virtual ray through the launching point mark prop, wherein the direction of the virtual ray is consistent with the direction of the launching point mark prop;
and determining the intersection point of the virtual ray and the virtual object in the virtual environment as the release point, and displaying a release point identifier at the release point.
3. The method of claim 2, wherein determining the point of intersection of the virtual ray with a virtual object in the virtual environment as the drop point and displaying a drop point identification at the drop point comprises:
determining the intersection point as the release point and displaying the release point identifier at the release point in response to the distance between the intersection point and the position of the first virtual object being less than a first distance threshold;
the method further comprises the following steps:
and in response to the distance between the intersection point and the position of the first virtual object being larger than the first distance threshold, displaying an illegal release identifier at the intersection point.
4. The method of claim 2, wherein after the launching of a virtual ray through the drop point marking prop and displaying the drop point in the virtual environment based on the virtual ray, the method comprises:
receiving control operation on the drop point marked prop, and adjusting the orientation of the drop point marked prop;
and updating the position of the release point in the virtual environment based on the adjusted orientation of the release point marking prop.
5. The method of any of claims 1 to 4, wherein the controlling a virtual carrier to throw the virtual throwing object into the virtual environment based on the drop point in response to a throwing operation comprises:
responding to the throwing operation, determining a target release mode based on the drop point, wherein different release modes correspond to different release precisions;
determining a release central point based on the drop point and the target release mode;
and controlling the virtual carrier to throw the virtual throwing object into the virtual environment based on the release central point.
6. The method of claim 5, wherein the determining a target release mode based on the drop point comprises:
determining a first release mode as the target release mode in response to a release distance being smaller than a second distance threshold, wherein the release distance is the distance between the drop point and the position of the first virtual object;
determining a second release mode as the target release mode in response to the release distance being greater than the second distance threshold;
wherein the release precision of the first release mode is higher than that of the second release mode.
7. The method of claim 6, wherein the determining a release central point based on the drop point and the target release mode comprises:
determining a first release area based on the drop point in response to the target release mode being the first release mode, wherein the first release area is an area centered on the drop point; and determining the release central point from the first release area;
and determining, in response to the target release mode being the second release mode, a second release area based on the drop point and the release distance, wherein the second release area is centered on the drop point, the second release area is larger than the first release area, and the area of the second release area is positively correlated with the release distance; and determining the release central point from the second release area.
8. The method of any of claims 1 to 4, further comprising:
and responding to the triggering operation of the target skill control, and switching the virtual environment picture from the third person view angle to the first person view angle.
9. The method of any of claims 1 to 4, wherein after said controlling a virtual vehicle to launch said virtual projectile into said virtual environment based on said launch point, said method further comprises:
in response to the virtual throwing object taking effect, generating m sub-virtual throwing objects based on the virtual throwing object, wherein m is a positive integer;
controlling the m sub-virtual throwing objects to move along a preset track;
and in response to the collision of the sub-virtual throwing object in the motion process, controlling the sub-virtual throwing object to take effect at a collision point, wherein the sub-virtual throwing object is used for changing the attribute value of the virtual object in the action range.
10. The method according to any one of claims 1 to 4, wherein before controlling the first virtual object to hold the drop point marking prop in response to the triggering operation of the target skill control, the method comprises:
receiving selection operation of a target skill item in at least one skill item, and equipping the target skill item for the first virtual object, wherein the target skill item is used for adding a target skill for the virtual object;
and displaying the target skill control corresponding to the target skill through a virtual environment interface.
11. The method of claim 10, wherein displaying the target skill control corresponding to the target skill through a virtual environment interface comprises:
displaying the target skill control in a non-triggerable state through a virtual environment interface;
in response to reaching a skill cooling duration, switching a display state of the target skill control from the non-triggerable state to a triggerable state;
the step of controlling the first virtual object to hold the drop point marking prop in response to the triggering operation of the target skill control comprises the following steps:
and controlling the first virtual object to hold the drop point marking prop in response to the triggering operation of the target skill control in the triggerable state.
12. The method according to claim 10, wherein the target skill prop has a reconnaissance function on a target virtual object, the target virtual object being a second virtual object in a wing-mounted state, the second virtual object belonging to a different camp than the first virtual object;
after receiving a selection operation of a target skill item of at least one skill item, equipping the first virtual object with the target skill item, the method further comprises:
and responding to the target virtual object in the preset area, and displaying prompt information in a virtual environment interface.
13. The method of claim 12, wherein the displaying a prompt in a virtual environment interface in response to the target virtual object being present within the predetermined area comprises:
responding to the target virtual object being located in a first preset area, and displaying a position identifier of the target virtual object in a map display control, wherein the first preset area is an area with the position of the first virtual object as the center;
and responding to the target virtual object being located in a second preset area, and displaying a preset special effect on the edge of the virtual environment interface, wherein the second preset area is an area with the position of the first virtual object as the center, and the second preset area is located in the first preset area.
14. The method of claim 10, wherein the target skill prop has the function of increasing the packing speed of a target shooting prop used to reduce the value of an attribute of a virtual object being hit;
the method further comprises the following steps:
responding to the selection operation of a target shooting prop, and controlling the first virtual object to hold the target shooting prop;
responding to a filling operation, and filling and supplying the target shooting prop according to a target filling speed, wherein the target filling speed is greater than a default filling speed, and the default filling speed is the speed of filling the target shooting prop when the target skill prop is not equipped.
15. A device for throwing a virtual throwing object in a virtual environment, the device comprising:
a first control module, configured to control, in response to a trigger operation on a target skill control, a first virtual object to hold a throwing point marking prop, wherein the throwing point marking prop is used to mark a throwing point of a virtual throwing object in the virtual environment, and the virtual throwing object is used to change an attribute value of a virtual object within its action range;
a first display module, configured to emit a virtual ray through the throwing point marking prop and display the throwing point in the virtual environment based on the virtual ray;
and a release module, configured to control, in response to a release operation, a virtual carrier to release the virtual throwing object into the virtual environment based on the throwing point.
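The display module's ray-based marking can be illustrated as a ray-plane intersection: cast a ray from the marking prop and take its intersection with the ground as the throwing point. This is a minimal sketch assuming a flat ground plane at a fixed height; the names, the plane assumption, and the vector type are all illustrative and not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def mark_throwing_point(origin: Vec3, direction: Vec3,
                        ground_y: float = 0.0) -> Optional[Vec3]:
    """Cast a virtual ray from the marking prop and intersect it with a
    horizontal ground plane (y = ground_y) to obtain the throwing point.

    Returns None when the ray points away from the ground and never
    reaches the plane.
    """
    if direction.y >= 0.0:  # ray is level or pointing upward: no hit
        return None
    t = (ground_y - origin.y) / direction.y  # parametric distance along ray
    return Vec3(origin.x + t * direction.x, ground_y, origin.z + t * direction.z)

# Ray from 2 units above the ground, angled forward and down.
point = mark_throwing_point(Vec3(0.0, 2.0, 0.0), Vec3(1.0, -1.0, 0.0))
```

In an engine, the ray would instead be tested against scene geometry, but the principle of deriving the throwing point from the ray's first hit is the same.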
16. A terminal, comprising a processor and a memory, wherein the memory stores at least one instruction, at least one program, a code set, or an instruction set that is loaded and executed by the processor to implement the method of throwing a virtual throwing object in a virtual environment as claimed in any one of claims 1 to 14.
17. A computer-readable storage medium having at least one computer program stored thereon, the computer program being loaded and executed by a processor to implement the method of throwing a virtual throwing object in a virtual environment as claimed in any one of claims 1 to 14.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110439760.2A CN113041622B (en) | 2021-04-23 | 2021-04-23 | Method, terminal and storage medium for throwing virtual throwing object in virtual environment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113041622A true CN113041622A (en) | 2021-06-29 |
CN113041622B CN113041622B (en) | 2023-04-28 |
Family
ID=76520039
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110439760.2A Active CN113041622B (en) | 2021-04-23 | 2021-04-23 | Method, terminal and storage medium for throwing virtual throwing object in virtual environment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113041622B (en) |
2021-04-23: Application CN202110439760.2A filed (granted as CN113041622B; status: Active)
Non-Patent Citations (1)
Title |
---|
极客芬达: "bilibili" (video), 2 March 2021 * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023005234A1 (en) * | 2021-07-30 | 2023-02-02 | NetEase (Hangzhou) Network Co., Ltd. | Virtual resource delivery control method and apparatus, computer device, and storage medium |
CN113633972A (en) * | 2021-08-31 | 2021-11-12 | Tencent Technology (Shenzhen) Co., Ltd. | Using method, device, terminal and storage medium of virtual prop |
CN113633972B (en) * | 2021-08-31 | 2023-07-21 | Tencent Technology (Shenzhen) Co., Ltd. | Virtual prop using method, device, terminal and storage medium |
CN114385004A (en) * | 2021-12-15 | 2022-04-22 | Beijing 58 Information Technology Co., Ltd. | Interaction method and device based on augmented reality, electronic equipment and readable medium |
WO2023160068A1 (en) * | 2022-02-28 | 2023-08-31 | Tencent Technology (Shenzhen) Co., Ltd. | Virtual subject control method and apparatus, device, and medium |
WO2024037559A1 (en) * | 2022-08-18 | 2024-02-22 | Beijing Zitiao Network Technology Co., Ltd. | Information interaction method and apparatus, human-computer interaction method and apparatus, electronic device, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN113041622B (en) | 2023-04-28 |
Similar Documents
Publication | Title |
---|---|
CN110694261B (en) | Method, terminal and storage medium for controlling virtual object to attack |
CN110448891B (en) | Method, device and storage medium for controlling virtual object to operate remote virtual prop |
KR102619439B1 (en) | Methods and related devices for controlling virtual objects |
CN110917619B (en) | Interactive property control method, device, terminal and storage medium |
CN112076467B (en) | Method, device, terminal and medium for controlling virtual object to use virtual prop |
CN113041622B (en) | Method, terminal and storage medium for throwing virtual throwing object in virtual environment |
CN112870715B (en) | Virtual item putting method, device, terminal and storage medium |
CN111714893A (en) | Method, device, terminal and storage medium for controlling virtual object to recover attribute value |
CN112316421B (en) | Equipment method, device, terminal and storage medium of virtual item |
CN110507990B (en) | Interaction method, device, terminal and storage medium based on virtual aircraft |
WO2021147496A1 (en) | Method and apparatus for using virtual prop, and device and storage medium |
CN111921190B (en) | Prop equipment method, device, terminal and storage medium for virtual object |
CN112138384A (en) | Using method, device, terminal and storage medium of virtual throwing prop |
CN111589150A (en) | Control method and device of virtual prop, electronic equipment and storage medium |
CN111001159A (en) | Virtual item control method, device, equipment and storage medium in virtual scene |
CN113713382A (en) | Virtual prop control method and device, computer equipment and storage medium |
CN112717410B (en) | Virtual object control method and device, computer equipment and storage medium |
CN111744186A (en) | Virtual object control method, device, equipment and storage medium |
CN112402964B (en) | Using method, device, equipment and storage medium of virtual prop |
CN113117330A (en) | Skill release method, device, equipment and medium for virtual object |
CN112057857A (en) | Interactive property processing method, device, terminal and storage medium |
CN112933601A (en) | Virtual throwing object operation method, device, equipment and medium |
CN113713383A (en) | Throwing prop control method and device, computer equipment and storage medium |
CN114130031A (en) | Using method, device, equipment, medium and program product of virtual prop |
CN112044073A (en) | Using method, device, equipment and medium of virtual prop |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
REG | Reference to a national code | Ref country code: HK; Ref legal event code: DE; Ref document number: 40045971; Country of ref document: HK ||
GR01 | Patent grant | ||