CN113680061B - Virtual prop control method, device, terminal and storage medium - Google Patents


Info

Publication number
CN113680061B
Authority
CN
China
Prior art keywords
target
prop
virtual
virtual environment
controlling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111032763.0A
Other languages
Chinese (zh)
Other versions
CN113680061A (en)
Inventor
刘智洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202111032763.0A
Publication of CN113680061A
Application granted
Publication of CN113680061B
Legal status: Active

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/573 Simulating properties, behaviour or motion of objects in the game world using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A63F13/577 Simulating properties, behaviour or motion of objects in the game world using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface
    • A63F2300/80 Features specially adapted for executing a specific type of game
    • A63F2300/8076 Shooting
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The application discloses a virtual prop control method, device, terminal, and storage medium, belonging to the fields of computer and Internet technology. The method comprises: displaying a user interface in which a virtual environment picture and a prop use control are displayed; in response to an operation on the prop use control, controlling a virtual object to release a target prop in the virtual environment; controlling the target prop to move on the ground of the virtual environment and track a target object; and in response to a target object in the virtual environment satisfying a trigger condition of the target prop, controlling the target prop to launch a range attack on the area where the target object is located. By providing a virtual prop with target tracking and range-damage functions, the prop's autonomous movement and target tracking allow it to automatically reach and lock onto the attack target, replacing the throwing operation, preventing the virtual prop from being discovered by enemies when used, and substantially improving its effectiveness.

Description

Virtual prop control method, device, terminal and storage medium
Technical Field
The present disclosure relates to the field of computers and internet technologies, and in particular, to a method, an apparatus, a terminal, and a storage medium for controlling a virtual prop.
Background
In shooting games, props with a range-damage function are provided, and a user can use such a prop to launch a range attack on a target area.
In the related art, a throwable prop with a range-damage function is provided. When the user controls a virtual object to release the throwable prop, a parabolic throwing process is required: the starting point of the parabola is the position of the user-controlled virtual object, and the landing point is the target area. The user inflicts range damage on the target area by controlling the virtual object to throw the prop toward it.
However, because releasing the throwable prop requires a parabolic throwing process, the virtual object's action is conspicuous and easily noticed, allowing adversaries to prepare for it; the effectiveness of the throwable prop is therefore poor.
Disclosure of Invention
The embodiments of the present application provide a virtual prop control method, device, terminal, and storage medium, which can improve the effectiveness of virtual props. The technical solution is as follows:
according to an aspect of the embodiments of the present application, there is provided a method for controlling a virtual prop, the method including:
displaying a user interface, wherein a virtual environment picture and a prop use control are displayed in the user interface, the virtual environment picture being a picture observing a virtual environment from the perspective of a virtual object;
in response to an operation on the prop use control, controlling the virtual object to release a target prop in the virtual environment, the target prop being a prop with a target tracking function and a range-damage function;
controlling the target prop to move on the ground of the virtual environment and track a target object;
and in response to a target object in the virtual environment satisfying a trigger condition of the target prop, controlling the target prop to launch a range attack on the area where the target object is located.
According to an aspect of the embodiments of the present application, there is provided a control device for a virtual prop, the device including:
a display module, configured to display a user interface, wherein a virtual environment picture and a prop use control are displayed in the user interface, the virtual environment picture being a picture observing a virtual environment from the perspective of a virtual object;
a release module, configured to control the virtual object to release a target prop in the virtual environment in response to an operation on the prop use control, the target prop being a prop with a target tracking function and a range-damage function;
a movement module, configured to control the target prop to move on the ground of the virtual environment and track a target object;
and an attack module, configured to control the target prop to launch a range attack on the area where the target object is located, in response to a target object in the virtual environment satisfying a trigger condition of the target prop.
According to an aspect of the embodiments of the present application, there is provided a terminal, including a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the virtual prop control method described above.
According to an aspect of the embodiments of the present application, there is provided a computer-readable storage medium having stored therein at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the virtual prop control method described above.
According to an aspect of embodiments of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the terminal reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions, so that the terminal executes the control method of the virtual prop.
The technical scheme provided by the embodiment of the application can bring the following beneficial effects:
By providing a virtual prop with target tracking and range-damage functions, the prop's autonomous movement and target tracking allow it to automatically reach and lock onto the attack target, replacing the throwing operation and preventing the virtual prop from being discovered by enemies when used, which substantially improves its effectiveness. Moreover, when the trigger condition is satisfied, the virtual prop launches a range attack on an area in the virtual environment, achieving a better prop-use effect.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of an implementation environment for an embodiment provided herein;
FIG. 2 is a flow chart of a method for controlling a virtual prop provided in one embodiment of the present application;
FIG. 3 is a schematic illustration of a user interface provided in one embodiment of the present application;
FIG. 4 is a schematic illustration of target prop release provided by one embodiment of the present application;
FIG. 5 is a schematic diagram of a target prop selection interface provided by one embodiment of the present application;
FIG. 6 is a flow chart of a method for controlling a virtual prop provided in one embodiment of the present application;
FIG. 7 is a schematic diagram of a target prop movement direction provided by one embodiment of the present application;
FIG. 8 is a schematic diagram of a target prop detection zone provided by one embodiment of the present application;
FIG. 9 is a schematic diagram of a method for determining a target object according to one embodiment of the present application;
FIG. 10 is a schematic diagram of a method for determining a target object according to one embodiment of the present application;
FIG. 11 is an interface schematic diagram of a range attack by a target prop provided in one embodiment of the present application;
FIG. 12 is an interface schematic diagram of a target prop attack range provided by one embodiment of the present application;
FIG. 13 is an interface schematic diagram of a target prop attack range provided by one embodiment of the present application;
FIG. 14 is a block diagram of a control device for virtual props provided in one embodiment of the present application;
FIG. 15 is a block diagram of a control device for virtual props provided in one embodiment of the present application;
Fig. 16 is a block diagram of a terminal according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Referring to fig. 1, a schematic diagram of an implementation environment of an embodiment of the present application is shown. The implementation environment of the scheme can be realized as a control system of the virtual prop. The implementation environment may include: a terminal 10 and a server 20.
The terminal 10 may be an electronic device such as a mobile phone, tablet computer, game console, e-book reader, multimedia playback device, wearable device, or PC (Personal Computer). A client of a target application (e.g., a game application) may be installed in the terminal 10.
In this embodiment of the present application, the target application is an application that provides a shooting scene. The target application can provide a virtual environment in which the virtual character the user substitutes for and operates is active, e.g., walking and shooting. Optionally, the target application is a shooting-class application. Illustratively, shooting-class applications include shooting game applications, Virtual Reality (VR) shooting applications, Augmented Reality (AR) shooting applications, three-dimensional map programs, military simulation programs, social applications, interactive entertainment applications, and so on. Typically, the target application may be a TPS (Third-Person Shooting) game, an FPS (First-Person Shooting) game, a MOBA (Multiplayer Online Battle Arena) game, a multiplayer gunfight survival game, and so on. In addition, the forms and corresponding functions of the virtual objects provided may differ between target applications and may be configured in advance according to actual requirements, which is not limited in this embodiment of the application.
The virtual environment is a scene displayed (or provided) when a client of the target application (such as a game application) runs on the terminal; it is a created scene in which virtual objects carry out activities (such as game competition), e.g., a virtual house, a virtual island, or a virtual map. The virtual environment may be a simulated environment of the real world, a semi-simulated semi-fictional environment, or a purely fictional environment. It may be two-dimensional, 2.5-dimensional, or three-dimensional, which is not limited in this embodiment. Different virtual environments may be displayed (or provided) during different time periods while the client runs on the terminal.
The virtual object is a virtual character controlled by a user account in the target application. Taking a game application as an example, the virtual object is a game character controlled by a user account or by AI (Artificial Intelligence) in the game application. The virtual object may take the form of a person, an animal, a cartoon character, or another form, which is not limited in this embodiment. It may be displayed in three-dimensional or two-dimensional form, likewise not limited here. Optionally, when the virtual environment is three-dimensional, the virtual object is a three-dimensional model created based on skeletal animation technology. Each virtual object has its own shape and volume in the three-dimensional virtual environment, occupying a portion of its space.
In this embodiment of the present application, the virtual object fights using a virtual weapon. Virtual weapons are weapons the system provides for virtual-object combat, including virtual firearms, virtual swords, virtual shells, and the like. A virtual weapon may be displayed in three-dimensional or two-dimensional form, which is not limited in this embodiment. Optionally, when the virtual environment is three-dimensional, the virtual weapon is a three-dimensional volumetric model. Each virtual weapon has its own shape and volume in the three-dimensional virtual environment, occupying a portion of its space.
The server 20 is used to provide background services for clients of target applications in the terminal 10. For example, the server 20 may be a background server of the target application program described above. The server 20 may be a server, a server cluster comprising a plurality of servers, or a cloud computing service center. Alternatively, the server 20 provides background services for target applications in a plurality of terminals 10 at the same time.
The terminal 10 and the server 20 can communicate with each other via a network 30. The network 30 may be a wired network or a wireless network.
In the embodiment of the method, the execution subject of each step may be a terminal, such as a client of the above-mentioned target application running in the terminal. In some embodiments, the target application is an application developed based on a three-dimensional virtual environment engine, for example, the virtual environment engine is a Unity engine, and the virtual environment engine can construct a three-dimensional virtual environment, virtual objects, virtual props and the like, so as to bring more immersive game experience to the user.
Referring to fig. 2, a flowchart of a method for controlling a virtual prop according to an embodiment of the present application is shown. The method may include the following steps (210-240):
step 210, displaying a user interface, wherein a virtual environment picture and a prop use control are displayed in the user interface, and the virtual environment picture is a picture for observing the virtual environment from the perspective of the virtual object.
The virtual environment picture is a picture observing the virtual environment from the virtual object's perspective; optionally, the perspective may be first-person or third-person. Elements in the virtual environment, such as virtual buildings, virtual props, and virtual objects, are displayed in the picture. In addition to the virtual environment picture, the user interface also displays a prop use control, which may be a button, slider, slide bar, or the like, for the user to operate.
In an embodiment of the present application, the client displays a user interface. Optionally, the user interface includes a virtual environment screen, and a control display layer positioned above the virtual environment screen. The virtual environment picture is a display picture corresponding to the virtual environment and is used for displaying the virtual environment and elements in the virtual environment. The control display layer is used for displaying operation controls so as to realize a man-machine interaction function. Alternatively, the operation control may include a button, a slider, a sliding bar, or the like, which is not limited in the embodiment of the present application. In the embodiment of the application, the control display layer displays a prop use control. Illustratively, referring to FIG. 3, the client displays a user interface 310, with a virtual environment screen and prop use controls 320 displayed in the user interface 310. Optionally, a virtual object 330 is displayed in the virtual environment screen.
Step 220, in response to the operation of using the control for the prop, controlling the virtual object to release the target prop in the virtual environment, wherein the target prop is a prop with a target tracking function and a range injury function.
Operations on the prop use control include clicking, long pressing, sliding, and the like, which are not limited in this application. In one example, in response to a click operation on the prop use control, the client controls the virtual object to release the target prop in the virtual environment. In another example, the client does so in response to a long-press operation on the prop use control.
Optionally, the position at which the virtual object releases the target prop in the virtual environment is set by the system. For example, the system may set the release position to be any point within a set area centered on the virtual object, such as a circular area; in that case, in response to an operation on the prop use control, the client controls the virtual object to release the target prop at some point within that circular area. When the virtual environment is three-dimensional, the set area is likewise three-dimensional; for example, the system may set the release position to be any point within a cylindrical region centered on the virtual object.
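The "any point within a circular area centered on the virtual object" release rule above can be sketched as follows. This is a minimal Python illustration, not the patent's implementation; the function name and the choice of uniform sampling over the disc are assumptions, since the patent only requires some point within the set area.

```python
import math
import random

def sample_release_point(center_x, center_z, radius):
    """Pick a release point inside a circular area centered on the
    virtual object (hypothetical helper; sampling scheme assumed)."""
    # Taking sqrt of the radial fraction keeps the distribution uniform
    # over the disc instead of clustering points near the center.
    r = radius * math.sqrt(random.random())
    theta = random.uniform(0.0, 2.0 * math.pi)
    return center_x + r * math.cos(theta), center_z + r * math.sin(theta)

# Example: release somewhere within 5 units of a virtual object at the origin.
x, z = sample_release_point(0.0, 0.0, 5.0)
```

For a cylindrical region in a three-dimensional environment, a height coordinate would be sampled the same way alongside the disc point.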
Optionally, the release position is chosen by the user through the prop use control. For example, when the operation on the prop use control is a sliding operation, the client controls the virtual object to release the target prop at a target position in response to the touch position moving from the prop use control to that target position. The target position may be any point in the virtual environment, which is not limited in this application.
The target prop refers to a prop with a target tracking function and a range injury function, and optionally, the target prop moves on the ground of the virtual environment after being released. Illustratively, referring to FIG. 4, in response to an operation for a prop use control, a client controls virtual object 330 to release a target prop 340 in a virtual environment. Target prop 340 moves on the ground of the virtual environment after release.
Alternatively, the form of the target prop may be a robotic form, a virtual animal form, etc., which is not limited in this application.
Optionally, the morphology of the target prop is controlled by the system. Illustratively, the system randomly selects one of a plurality of target prop configurations, e.g., the system randomly selects one of a robotic configuration, a virtual cat configuration, a virtual snake configuration. Optionally, the morphology of the target prop is set by the user. Illustratively, the user autonomously selects among a plurality of configurations of the target prop provided by the system, e.g., the system provides three different configurations of the robotic configuration, the virtual cat configuration, and the virtual snake configuration of the target prop, one of the three configurations being autonomously selected by the user.
Optionally, the attack mode of the target prop may be explosion, release of toxic gas, or the like, which is not limited in this application. The attack mode may be controlled by the system: for example, the system randomly selects one of the target prop's attack modes, such as between explosion and release of toxic gas. Alternatively, the attack mode is set by the user, who selects among the attack modes the system provides, e.g., between explosion and release of toxic gas.
Optionally, the form of the target prop and its attack mode can be freely combined. Illustratively, if the target prop is in robot form and its attack mode is explosion, the target prop may be called a blasting robot; if the target prop is in virtual-cat form and its attack mode is releasing toxic gas, it may be called a toxic-gas cat.
At step 230, the target prop is controlled to move over the ground of the virtual environment and track the target object.
Optionally, the client controls the target prop to move on the ground of the virtual environment and track the target object. The ground of the virtual environment refers to the virtual ground portion of the virtual scene displayed by the client; for example, if the client displays a virtual house scene, the ground of the virtual environment is the virtual ground in that house scene.
Optionally, the target prop moves on the ground of the virtual environment according to a planned route: the system provides a planned travel route for the target prop, and the prop follows it. Alternatively, the target prop moves on the ground according to the actual situation: if it encounters an obstacle while moving, it bypasses the obstacle and continues; if it encounters no obstacle, it moves straight ahead.
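The "go straight unless blocked, otherwise bypass the obstacle" rule above can be sketched on a grid. This Python fragment is a hedged illustration with hypothetical names; a real engine would use its own pathfinding (e.g., a navigation mesh) rather than this grid walk.

```python
def next_move(position, direction, blocked):
    """One movement tick for the target prop (names hypothetical).
    Try moving straight ahead; if that cell is blocked by an obstacle,
    try turning left, then right, to bypass it."""
    dx, dz = direction
    candidates = [(dx, dz), (-dz, dx), (dz, -dx)]  # straight, left, right
    for cdx, cdz in candidates:
        nxt = (position[0] + cdx, position[1] + cdz)
        if nxt not in blocked:
            return nxt, (cdx, cdz)
    return position, direction  # fully boxed in: stay put this tick

# Example: heading east from the origin with an obstacle directly ahead,
# the prop sidesteps to the adjacent free cell.
pos, d = next_move((0, 0), (1, 0), blocked={(1, 0)})
```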
Optionally, the tracking target of the target prop is a hostile virtual object, which may be controlled by AI or by another user; this application does not limit this. In this embodiment of the application, the target prop moves on the ground of the virtual environment along a detection track. While moving along the detection track, the target prop detects whether a target object satisfying the tracking condition exists within its detection region; if such a target object exists, the client controls the target prop to track the target object's movement.
The detection track is the movement track the target prop derives from detecting the target object's movement within the detection region. Optionally, the detection track is identical to the target object's movement track within the detection region; alternatively, it is the shortest movement track from the target prop to the target object's current position.
The detection region is the region within which the target prop can determine whether a target object satisfying the tracking condition exists. Within the detection region, the target prop can obtain the target object's attribute information, including but not limited to the distance between the target object and the target prop, the target object's health value, moving speed, kill count, level, and owned virtual resources. Optionally, the detection region is a region centered on the target prop's position, such as a circular or rectangular region; alternatively, it is a region radiating outward from the target prop's position, such as a sector or triangular region.
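The circular and sector detection regions described above reduce to simple membership tests. The Python sketch below assumes 2D ground coordinates; the helper names and the half-angle parameterization of the sector are assumptions, not terms from the patent.

```python
import math

def in_circular_region(prop_pos, obj_pos, radius):
    """Circular detection region centered on the target prop's position."""
    dx = obj_pos[0] - prop_pos[0]
    dz = obj_pos[1] - prop_pos[1]
    return dx * dx + dz * dz <= radius * radius

def in_sector_region(prop_pos, facing_deg, obj_pos, radius, half_angle_deg):
    """Sector region radiating outward from the target prop along its facing
    direction, spanning half_angle_deg to either side."""
    if not in_circular_region(prop_pos, obj_pos, radius):
        return False
    angle = math.degrees(math.atan2(obj_pos[1] - prop_pos[1],
                                    obj_pos[0] - prop_pos[0]))
    # Signed angular difference wrapped into [-180, 180).
    diff = (angle - facing_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= half_angle_deg
```

A rectangular or triangular region would follow the same pattern with a different containment test.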
And step 240, in response to the target object in the virtual environment meeting the triggering condition of the target prop, controlling the target prop to perform range attack on the area where the target object is located.
The trigger condition is the condition that triggers the target prop's range-damage function. Optionally, the trigger condition of the target prop is that the straight-line distance between the target prop and the target object is smaller than a threshold. The threshold may be set by the system or customized by the user, which is not limited in this application. Illustratively, in response to the distance between the target prop and a target object in the virtual environment being smaller than the threshold, the client controls the target prop to launch a range attack on the area where the target object is located.
Optionally, the triggering condition of the target prop includes that the moving speed of the target object is smaller than a threshold value. Optionally, the threshold value may be set by the system or may be user-defined, which is not limited in this application.
Optionally, step 220 is preceded by the further step of displaying a target prop selection interface. A target prop selection control is displayed in the target prop selection interface. In response to an operation on the target prop selection control, a target prop information interface is displayed. An equipment control, a camouflage control, and information of the target prop are displayed in the target prop information interface. The equipment control is a control for controlling the virtual object to equip the target prop. The camouflage control is a control for changing the appearance of the target prop. Optionally, the information of the target prop includes an attack value of the target prop, an endurance value of the target prop, an attack range of the target prop, an attack mode of the target prop, and the like, which is not limited in this application.
Illustratively, as shown in FIG. 5, the client displays a target prop selection interface 510, with a blasting robot selection control 520 displayed in the target prop selection interface 510. The user clicks the blasting robot selection control 520, and the client displays a blasting robot information interface 530, in which the equipment control 540, the camouflage control 550, and information of the blasting robot are displayed.
In response to an operation on the equipment control, the virtual object is controlled to equip the target prop. When the client displays the user interface, a target prop use control displayed in the user interface is used for controlling the virtual object to use the target prop.
Illustratively, as shown in fig. 5, the user clicks the equipment control 540, and the client controls the virtual object to equip the blasting robot. When the client displays the user interface, the target prop use control displayed in the user interface is used for controlling the virtual object to use the blasting robot.
In response to an operation on the camouflage control, the target prop is controlled to camouflage itself. Optionally, camouflage includes changing the color, the shape, and the size of the target prop. Illustratively, as shown in FIG. 5, the user clicks the camouflage control 550, and the client controls the blasting robot to camouflage itself. For example, after the user clicks the camouflage control 550, the client changes the color of the blasting robot from red to black.
It should be noted that, depending on the specific implementation of the target application program, the target prop may be selected and equipped before a round (for example, a match) starts, or may be selected and equipped after the round has started, which is not limited in this application.
In summary, in the technical solution provided in the embodiments of the present application, a virtual prop with a target tracking function and a range damage function is provided. Through autonomous movement and target tracking, the virtual prop automatically reaches and locks onto an attack target, which replaces the throwing operation, prevents the use of the prop from being discovered by an enemy, and fully improves the effectiveness of the virtual prop; when the trigger condition is satisfied, the virtual prop launches a range attack on a certain area of the virtual environment, achieving a better prop use effect.
Referring to fig. 6, a flowchart of a method for controlling a virtual prop according to an embodiment of the present application is shown.
And controlling the virtual object to release the target prop in the virtual environment in response to the operation of the prop use control. Illustratively, as shown in FIG. 4, the user clicks on a prop use control and the client controls virtual object 330 to release target prop 340 in the virtual environment. In one example, after the client controls the virtual object to release the target prop, the position of the target prop in the virtual environment is displayed through the virtual map. In another example, the client does not display the location of the target prop after controlling the virtual object to release the target prop.
And controlling the target prop to move on the ground of the virtual environment according to the detection track. Optionally, after the virtual object releases the target prop in the virtual environment, the target prop automatically moves according to the detection track. In the embodiment of the application, the target prop is controlled to move linearly in a target direction on the ground of the virtual environment. Optionally, the initial target direction of the target prop is set by the system. In an exemplary embodiment, the system sets the initial target direction of the target prop to be directly in front of the virtual object, and after the virtual object releases the target prop in the virtual environment, the target prop moves linearly toward the area directly in front of the virtual object. Optionally, the initial target direction of the target prop is selected by the user. In one example, the user selects the initial target direction of the target prop through a direction control. Illustratively, the direction control may be a joystick control, and the user selects the initial direction of the target prop by setting the pointing direction of the joystick. In another example, the user selects the initial target direction of the target prop through a sliding operation on the prop use control. Illustratively, the sliding direction of the user's sliding operation on the prop use control is set as the initial target direction of the target prop. This increases the diversity of the movement track of the target prop, provides multiple tactical choices for the game, and gives the user a better combat experience.
In the embodiment of the application, in response to the target prop colliding with an obstacle in the virtual environment, a reflection direction corresponding to the target direction is determined, and the target prop is controlled to move linearly toward the reflection direction on the ground of the virtual environment. In an exemplary embodiment, as shown in fig. 7, in response to the target prop colliding with an obstacle in the virtual environment, the target direction in which the target prop is currently moving is taken as an incident direction 710, a normal direction 720 of the obstacle is determined, a reflection direction 730 is obtained from the incident direction 710 and the normal direction 720, and the client controls the target prop to move linearly toward the reflection direction 730. The angle between the incident direction 710 and the normal direction 720 is angle 1, the angle between the reflection direction 730 and the normal direction 720 is angle 2, and angle 1 and angle 2 are equal in size.
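The incident/normal/reflection relationship of fig. 7 can be sketched with the standard mirror-reflection formula r = d − 2(d·n)n, which makes angle 1 and angle 2 equal by construction. The sketch below is illustrative only; the function name and 2D representation are assumptions, not part of the embodiment:

```python
def reflect(direction, normal):
    """Reflect a 2D movement direction off an obstacle surface.

    Implements r = d - 2*(d . n)*n, so the angle between the reflection
    direction and the normal equals the angle between the incident
    direction and the normal.
    """
    nx, ny = normal
    length = (nx * nx + ny * ny) ** 0.5
    nx, ny = nx / length, ny / length   # normalize the obstacle normal
    dx, dy = direction
    dot = dx * nx + dy * ny             # projection of d onto n
    return (dx - 2 * dot * nx, dy - 2 * dot * ny)
```

For example, a prop moving toward (1, -1) that hits a floor-like obstacle with normal (0, 1) continues toward (1, 1).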
In the embodiment of the application, in the process that the target prop moves according to the detection track, whether a target object meeting the tracking condition exists in a detection area of the target prop is detected; and if the target object exists, controlling the target prop to track the target object for movement. The detection region is a region in which the target prop can detect whether or not there is a target object satisfying the tracking condition.
Alternatively, the detection region is a region centered on the target prop position, such as a circular region, a rectangular region, or the like centered on the target prop position. Alternatively, the detection region is a region radiating outwardly from the target prop location, such as a sector-shaped region, a triangle-shaped region, or the like radiating outwardly from the target prop location.
Optionally, the detection area is displayed in the virtual environment in a distinguishing manner, for example by color or brightness. Illustratively, as shown in fig. 8, the detection area 350 of the target prop 340 is a sector-shaped area with the position point of the target prop as the origin, and the detection area 350 is displayed in a highlighted form.
In response to a target object in the virtual environment satisfying the triggering condition of the target prop, the target prop is controlled to perform a range attack on the area where the target object is located. The triggering condition refers to a condition that triggers the range damage function of the target prop. Optionally, the triggering condition includes the distance between the target prop and the target object being smaller than a threshold value. Optionally, the threshold value may be set by the system or may be user-defined. Illustratively, the detection area of the target prop is a sector area with the position point of the target prop as the origin, the radius of the sector area is R, and the distance between the position point of the target object and the position point of the target prop is r, where r is smaller than or equal to R. When r is smaller than the threshold value, the client controls the target prop to perform a range attack on the area where the target object is located. For example, if the threshold value is R/2, then when r is smaller than R/2, the client controls the target prop to perform a range attack on the area where the target object is located.
Optionally, in response to the target prop being attacked, a damage value sustained by the target prop is calculated; and in the case that the damage value sustained by the target prop is larger than its endurance value, the target prop is controlled to perform a range attack on its current area. Illustratively, the target object discovers the target prop and attacks it before the target prop satisfies the trigger condition. In response to the target prop being damaged, the client calculates the damage value sustained by the target prop, and in the case that the damage value sustained by the target prop is larger than the endurance value, the client controls the target prop to perform a range attack on the current area. Optionally, the endurance value is set by the system.
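The "detonate once accumulated damage exceeds the endurance value" rule above can be sketched as follows; the class name, the accumulation of damage across hits, and the detonate() callback are illustrative assumptions, not the embodiment's actual implementation:

```python
class TargetProp:
    """Minimal sketch of the self-destruct-on-damage rule."""

    def __init__(self, endurance, detonate):
        self.endurance = endurance   # durability threshold set by the system
        self.damage_taken = 0
        self.detonate = detonate     # callback performing the range attack

    def on_hit(self, damage):
        """Accumulate incoming damage; range-attack the current area
        once the sustained damage exceeds the endurance value."""
        self.damage_taken += damage
        if self.damage_taken > self.endurance:
            self.detonate()
```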
Optionally, damage to a target object is calculated in the case that the target object exists within the attack range of the target prop; in the case that no target object exists within the attack range of the target prop, no damage calculation is performed. Optionally, the damage is calculated from the distance between the position point of the target object and the position point of the target prop. For example, the closer the position point of the target object is to the position point of the target prop, the higher the damage suffered by the target object, and the maximum damage suffered by the target object does not exceed the damage threshold of the target prop. Optionally, the damage threshold of the target prop is set by the system.
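The distance-based damage rule can be sketched as below. The text only states that closer targets take more damage, capped at the damage threshold; the linear falloff used here is an assumption for illustration:

```python
def range_damage(distance, attack_radius, damage_threshold):
    """Damage dealt to a target `distance` away from the detonation point.

    Falls off linearly from the damage threshold at distance 0 to zero
    at the edge of the attack range; targets outside the range take nothing.
    """
    if distance >= attack_radius:
        return 0.0
    return damage_threshold * (1.0 - distance / attack_radius)
```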
In summary, the technical scheme provided by the present application provides a target prop that moves on the ground of the virtual environment according to the detection track. Moving on the ground, the target prop is not easily discovered by an adversary, so the virtual object can achieve a greater damage effect by using a virtual prop with the range damage function. Meanwhile, the target prop has a target tracking function and can automatically track a target object, and after the triggering condition of the target prop is satisfied, it can cause a great deal of damage to the target object, improving the damage effect obtainable by a virtual prop with the range damage function.
Referring to fig. 9, a schematic diagram of a target object determining method according to an embodiment of the present application is shown.
The tracking condition refers to a condition that triggers the target tracking function of the target prop. In one example, the detection area is a sector area with the position point of the target prop as the origin, and a first straight line passing through the origin and bisecting the sector area is determined; if the distance between the position point of the target object and the position point of the target prop is smaller than or equal to a set distance, and the included angle, with the position point of the target prop as the vertex, between the first straight line and a second straight line is smaller than or equal to a set angle, it is determined that the target object satisfies the tracking condition. The second straight line refers to a straight line passing through the position point of the target prop and the position point of the target object. Optionally, the set distance and the set angle are set by the system. Optionally, the set distance and the set angle are set by the user.
Illustratively, as shown in fig. 9, the detection area 350 is a sector area with the position point of the target prop 340 as the origin; the first straight line L passes through the origin and bisects the sector area, the length of L is the set distance, and the magnitude of the angle s is the set angle. When the target object 360 is located at the position 360a, the distance between the position point of the target object 360a and the position point of the target prop 340 is smaller than the set distance; the straight line La passing through the position point of the target prop 340 and the position point of the target object 360a is determined as the second straight line; the included angle between the second straight line La and the first straight line L is the angle a, and the size of the angle a is larger than the size of the angle s, so the tracking condition is not satisfied when the target object 360 is located at the position 360a. When the target object 360 is located at the position 360b, the distance between the position point of the target object 360b and the position point of the target prop 340 is greater than the set distance, so the tracking condition is not satisfied when the target object 360 is located at the position 360b.
When the target object 360 is located at the position 360c, the distance between the position point of the target object 360c and the position point of the target prop 340 is smaller than the set distance, the straight line Lc passing through the position point of the target prop 340 and the position point of the target object 360c is determined to be the second straight line, at this time, the included angle between the second straight line Lc and the first straight line L is the angle c, and the size of the angle c is smaller than the size of the angle s, so when the target object 360 is located at the position 360c, the tracking condition is satisfied, and the target prop 340 is controlled to track the target object 360c for movement.
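The tracking check of fig. 9 — distance at most the set distance, and the angle between the first straight line L and the prop-to-object line at most the angle s — can be sketched in 2D as follows. This is an illustrative sketch under assumed names, not the embodiment's implementation:

```python
import math

def satisfies_tracking(prop_pos, bisector_dir, obj_pos, set_distance, set_angle_deg):
    """True if the object meets the tracking condition of the sector area.

    bisector_dir is the direction of the first straight line L (the line
    bisecting the sector); the prop->object line plays the role of the
    second straight line.
    """
    dx, dy = obj_pos[0] - prop_pos[0], obj_pos[1] - prop_pos[1]
    dist = math.hypot(dx, dy)
    if dist > set_distance:
        return False                  # farther than the set distance (case 360b)
    if dist == 0:
        return True
    bx, by = bisector_dir
    cos_a = (dx * bx + dy * by) / (dist * math.hypot(bx, by))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return angle <= set_angle_deg     # within angle s of L (case 360c)
```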
Optionally, in the case that a plurality of target objects satisfying the tracking condition exist in the detection area, one target object is selected from the plurality of target objects for tracking according to the attribute information respectively corresponding to the plurality of target objects; wherein the attribute information includes at least one of: the distance between the target object and the target prop, the life value of the target object, and the moving speed of the target object. Illustratively, as shown in fig. 10, the detection area 350 of the target prop 340 includes a target object 360m, a target object 360n, and a target object 360p, wherein the life value of the target object 360n is the smallest and the moving speed of the target object 360m is the slowest. In one example, the target object with the shortest distance to the target prop 340 is selected for tracking, so the target object 360p is selected. In another example, the target object with the smallest life value is selected for tracking, so the target object 360n is selected. In another example, the target object with the slowest moving speed is selected for tracking, so the target object 360m is selected.
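The three selection examples above (nearest, lowest life value, slowest) can be sketched as a single priority function; the dictionary keys and strategy names are illustrative assumptions:

```python
def pick_target(candidates, strategy="nearest"):
    """Select one target from several that satisfy the tracking condition,
    according to one piece of attribute information."""
    keys = {
        "nearest": lambda c: c["distance"],   # shortest distance to the prop
        "lowest_hp": lambda c: c["life"],     # smallest life value
        "slowest": lambda c: c["speed"],      # slowest moving speed
    }
    return min(candidates, key=keys[strategy])
```

With the objects of fig. 10 — p nearest, n lowest life, m slowest — the three strategies pick p, n, and m respectively.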
In summary, according to the technical scheme provided by the present application, the target prop moves on the ground of the virtual environment while tracking the target object. Moving on the ground, the target prop is not easily discovered by an adversary; meanwhile, it tracks the target object as it moves and can cause a great deal of damage to the target object after the triggering condition of the target prop is satisfied.
Referring to fig. 11, an interface schematic diagram of range attack performed by a target prop according to an embodiment of the present application is shown.
In response to a target object in the virtual environment satisfying the triggering condition of the target prop, the target prop is controlled to perform a range attack on the area where the target object is located. Optionally, in response to the target object in the virtual environment satisfying the trigger condition of the target prop, the target prop is controlled to move from the ground of the virtual environment into the air of the virtual environment; the target prop is controlled to produce an attack effect in the air of the virtual environment within an attack range based on the position point of the target prop; the attack range includes the position point where the target object is located.
Illustratively, as shown in fig. 11, in response to a target object 360 in the virtual environment satisfying a trigger condition of a target prop 340, the client controls the target prop 340 to move from the ground of the virtual environment to the air of the virtual environment, and controls the target prop 340 to generate an attack effect, such as an explosion effect, within an attack range based on a location point of the target prop 340 in the air of the virtual environment.
The air of the virtual environment refers to the portion above the virtual ground in the virtual scene displayed by the client. Illustratively, the client displays a virtual house scene, and the air of the virtual environment refers to the portion of the virtual house scene above the virtual ground.
Optionally, in response to the target object in the virtual environment satisfying the trigger condition of the target prop, the system controls the target prop to move from the ground of the virtual environment into the air of the virtual environment. Optionally, in response to the target object in the virtual environment satisfying the trigger condition of the target prop, the user controls the movement of the target prop from the ground of the virtual environment into the air of the virtual environment. Illustratively, the user controls the movement of the target prop from the ground of the virtual environment into the air of the virtual environment by triggering a prop control. The trigger prop control may be a button, a slider bar, or the like operated by the user.
Optionally, the position point to which the target prop moves from the ground of the virtual environment into the air of the virtual environment is set by the system. Illustratively, the system sets the target prop to move vertically from the ground of the virtual environment to a point in the air at a height h above the ground of the virtual environment. Illustratively, the system sets the target prop to move along a parabola from the ground of the virtual environment to a point in the air at a height h above the ground of the virtual environment. Optionally, the shape of the parabola may be set by the system or by the user. Optionally, the position point to which the target prop moves from the ground of the virtual environment into the air of the virtual environment is set by the user. Illustratively, the user selects any position point in the air of the virtual environment, and the client controls the movement of the target prop from the ground of the virtual environment into the air of the virtual environment in response to the selection operation on that position point.
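The vertical or parabolic rise to height h can be sketched as an interpolation over a normalized time t in [0, 1]. The parabola shape (a horizontal drift term) is a tunable assumption; drift = 0 gives the vertical case:

```python
def ascent_position(start, height, t, drift=0.0):
    """Position of the prop at normalized time t while rising from its
    ground position `start` (x, y, z with z up) to `height` above it.

    The z coordinate follows the parabola 2t - t^2, which decelerates
    toward the apex; a nonzero drift bends the path into a spatial parabola.
    """
    x, y, z = start
    return (x + drift * t, y, z + height * (2 * t - t * t))
```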
Optionally, the attack range of the target prop is a three-dimensional space region centered on the position point of the target prop. Optionally, the three-dimensional space region may be a rectangular parallelepiped region, a cubic region, a spherical region, or the like, which is not limited in this application. Illustratively, as shown in fig. 12, the attack range of the target prop 340 is a rectangular parallelepiped region 370 centered on the position point of the target prop 340.
Optionally, the attack range of the target prop is a tapered region that takes the position point of the target prop as the origin and radiates toward the ground. Optionally, the tapered region may be a conical region, a pyramidal region, or the like, which is not limited in this application. Illustratively, as shown in FIG. 13, the attack range of the target prop 340 is a conical region 370 with the position point of the target prop 340 as the origin.
In summary, according to the technical scheme provided by the present application, the target prop moves from the ground of the virtual environment into the air of the virtual environment and performs the range attack from the air, which enlarges the attack range of the target prop and achieves a better attack effect.
The following are device embodiments of the present application, which may be used to perform method embodiments of the present application. For details not disclosed in the device embodiments of the present application, please refer to the method embodiments of the present application.
Referring to fig. 14, a block diagram of a control device for a virtual prop according to an embodiment of the present application is shown. The device has the function of implementing the control method of the virtual prop described above, and the function may be implemented by hardware or by hardware executing corresponding software. The apparatus 1400 includes a display module 1410, a release module 1420, a movement module 1430, and an attack module 1440.
The display module 1410 is configured to display a user interface, where a virtual environment screen and a prop use control are displayed, and the virtual environment screen is a screen for observing a virtual environment from a perspective of a virtual object.
A release module 1420 to control the virtual object to release a target prop in the virtual environment in response to an operation of the prop use control, the target prop being a prop having a target tracking function and a range damage function.
And the movement module 1430 is configured to control the target prop to move on the ground of the virtual environment and to track the target object.
In some embodiments, as shown in fig. 15, the movement module 1430 includes a first movement unit 1431, a detection unit 1432, and a tracking unit 1433.
A first movement unit 1431 is configured to control the target prop to move on the ground of the virtual environment according to a detection track.
And the detection unit 1432 is used for detecting whether the target object meeting the tracking condition exists in the detection area of the target prop in the process of moving the target prop according to the detection track.
And the tracking unit 1433 is used for controlling the target prop to track the target object to move if the target object exists.
In some embodiments, the first moving unit 1431 is further configured to control the target prop to perform a linear movement towards a target direction on the ground of the virtual environment; determining a reflection direction corresponding to the target direction in response to the collision of the target prop with an obstacle in the virtual environment; and controlling the target prop to linearly move towards the reflecting direction on the ground of the virtual environment.
In some embodiments, the detection region is a sector region with a location point of the target prop as an origin.
In some embodiments, the detecting unit 1432 is further configured to determine a first straight line passing through the origin and bisecting the sector region; if the distance between the position point of the target object and the position point of the target prop is smaller than or equal to a set distance, and the included angle, with the position point of the target prop as the vertex, between the first straight line and a second straight line is smaller than or equal to a set angle, determine that the target object satisfies the tracking condition; wherein the second straight line is a straight line passing through the position point of the target prop and the position point of the target object.
In some embodiments, as shown in fig. 15, the movement module 1430 further includes a selection unit 1434.
A selection unit 1434, configured to, when there are a plurality of target objects that satisfy the tracking condition in the detection area, select one target object from the plurality of target objects according to attribute information respectively corresponding to the plurality of target objects, and track the target object; wherein the attribute information includes at least one of: the distance between the target object and the target prop, the life value of the target object and the moving speed of the target object.
In some embodiments, as shown in fig. 15, the movement module 1430 further includes a display unit 1435.
And a display unit 1435 configured to display the mark of the detection area in the virtual environment.
And the attack module 1440 is configured to control, in response to a target object in the virtual environment meeting a trigger condition of the target prop, the target prop to perform range attack on an area where the target object is located.
In some embodiments, the triggering condition includes a distance between the target prop and the target object being less than a threshold value.
In some embodiments, as shown in fig. 15, the attack module 1440 includes a second mobile unit 1441 and an attack unit 1442.
A second moving unit 1441 is configured to control the movement of the target prop from the ground of the virtual environment to the air of the virtual environment.
An attack unit 1442, configured to control the target prop to generate an attack effect in an attack range with a location point of the target prop as a reference in the air of the virtual environment; the attack range comprises a position point where the target object is located.
In some embodiments, the attack range is a three-dimensional spatial region centered on a location point of the target prop; alternatively, the attack range is a conical region radiating to the ground with a point of the target prop as an origin.
In some embodiments, as shown in fig. 15, the apparatus 1400 further comprises an injury module 1450.
An injury module 1450, configured to calculate a damage value sustained by the target prop in response to the target prop being attacked; and in the case that the damage value sustained by the target prop is larger than the endurance value, control the target prop to perform a range attack on the current area.
It should be noted that, in the apparatus provided in the foregoing embodiment, when implementing the functions thereof, only the division of the foregoing functional modules is used as an example, in practical application, the foregoing functional allocation may be implemented by different functional modules, that is, the internal structure of the device is divided into different functional modules, so as to implement all or part of the functions described above. In addition, the apparatus and the method embodiments provided in the foregoing embodiments belong to the same concept, and specific implementation processes of the apparatus and the method embodiments are detailed in the method embodiments and are not repeated herein.
Referring to fig. 16, a block diagram of a terminal 1600 provided in one embodiment of the present application is shown. The terminal 1600 may be an electronic device such as a cell phone, tablet computer, game console, electronic book reader, multimedia player device, wearable device, PC, or the like. The terminal is used for implementing the control method of the virtual prop provided in the embodiment. The terminal may be the terminal 10 in the game execution environment shown in fig. 1.
In general, terminal 1600 includes: a processor 1601, and a memory 1602.
Processor 1601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. The processor 1601 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 1601 may also include a main processor and a coprocessor. The main processor is a processor for processing data in an awake state, also referred to as a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering content to be displayed on the display screen. In some embodiments, the processor 1601 may also include an AI processor for processing computing operations related to machine learning.
Memory 1602 may include one or more computer-readable storage media, which may be non-transitory. Memory 1602 may also include high-speed random access memory as well as non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 1602 is used to store at least one instruction, at least one program, a set of codes, or a set of instructions, configured to be executed by one or more processors to implement the control method of the virtual prop described above.
In some embodiments, terminal 1600 may also optionally include: a peripheral interface 1603, and at least one peripheral. The processor 1601, memory 1602, and peripheral interface 1603 may be connected by bus or signal lines. The individual peripheral devices may be connected to the peripheral device interface 1603 by buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1604, a display screen 1605, a camera 1606, audio circuitry 1607, positioning components 1608, and a power supply 1609.
Those skilled in the art will appreciate that the structure shown in fig. 16 is not limiting and that more or fewer components than shown may be included or certain components may be combined or a different arrangement of components may be employed.
In an exemplary embodiment, a computer readable storage medium is also provided, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored, where the at least one instruction, the at least one program, the set of codes, or the set of instructions, when executed by a processor, implement the method for controlling a virtual prop described above.
Alternatively, the computer-readable storage medium may include: ROM (Read-Only Memory), RAM (Random Access Memory), SSD (Solid State Drive), an optical disk, or the like. The random access memory may include ReRAM (Resistive Random Access Memory) and DRAM (Dynamic Random Access Memory), among others.
In an exemplary embodiment, a computer program product or computer program is also provided, comprising computer instructions stored in a computer-readable storage medium. A processor of the terminal reads the computer instructions from the computer-readable storage medium and executes them, causing the terminal to perform the method for controlling a virtual prop described above.
It should be understood that references herein to "a plurality" mean two or more. The term "and/or" describes an association between associated objects and indicates three possible relationships; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. In addition, the step numbers used herein merely illustrate one possible execution order among the steps; in some other embodiments, the steps may be performed out of numerical order, for example two differently numbered steps performed simultaneously, or in the reverse of the order shown, which is not limited by the embodiments of the present application.
The foregoing describes exemplary embodiments of the present application and is not intended to limit it; any modifications, equivalents, or improvements made within the spirit and principles of the present application shall fall within its scope of protection.

Claims (12)

1. A method for controlling a virtual prop, the method comprising:
displaying a user interface, wherein a virtual environment picture and a prop use control are displayed in the user interface, the virtual environment picture being a picture of the virtual environment observed from the perspective of a virtual object;
controlling the virtual object to release a target prop in the virtual environment in response to an operation on the prop use control, wherein the target prop is a prop having a target tracking function and an area damage function;
controlling the target prop to move on the ground of the virtual environment and to track a target object;
in response to the target prop being damaged, calculating the damage value borne by the target prop;
controlling the target prop to attack the current area in a case that the damage value borne by the target prop is greater than or equal to a durability value;
in response to a target object in the virtual environment meeting the trigger condition of the target prop, controlling the target prop to move from the ground of the virtual environment into the air of the virtual environment;
controlling the target prop to produce an attack effect, in the air of the virtual environment, within an attack range defined with reference to the position point of the target prop, wherein the attack range comprises the position point at which the target object is located.
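The control flow recited in claim 1 (a durability check, followed by a distance-based trigger as in claim 8) can be sketched in Python. This is an illustrative, non-claimed sketch; the constant names, threshold values, and return labels are assumptions, not taken from the patent.

```python
import math

# Hypothetical constants for illustration only; the patent fixes no values.
DURABILITY = 100.0       # the prop's durability value (claim 1)
TRIGGER_DISTANCE = 3.0   # trigger threshold distance (claim 8)

def update_prop(damage_taken, prop_pos, target_pos):
    """Return the prop's next action per the claimed control flow."""
    # If accumulated damage >= durability, attack the current area.
    if damage_taken >= DURABILITY:
        return "attack_current_area"
    # If the target meets the trigger condition (distance < threshold),
    # leap from the ground into the air and attack around the prop's position.
    if math.dist(prop_pos, target_pos) < TRIGGER_DISTANCE:
        return "leap_and_attack"
    # Otherwise keep moving on the ground and tracking the target.
    return "keep_tracking"
```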
2. The method of claim 1, wherein the controlling the target prop to move and track the target object on the ground of the virtual environment comprises:
controlling the target prop to move on the ground of the virtual environment according to a detection trajectory;
detecting, while the target prop moves according to the detection trajectory, whether a target object meeting a tracking condition exists in a detection area of the target prop;
and if a target object meeting the tracking condition exists, controlling the target prop to move so as to track the target object.
3. The method of claim 2, wherein the controlling the target prop to move on the ground of the virtual environment according to a detection trajectory comprises:
controlling the target prop to linearly move towards a target direction on the ground of the virtual environment;
determining a reflection direction corresponding to the target direction in response to the collision of the target prop with an obstacle in the virtual environment;
and controlling the target prop to linearly move towards the reflecting direction on the ground of the virtual environment.
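The reflection behavior of claim 3 (straight-line motion redirected when the prop collides with an obstacle) matches the standard vector-reflection formula r = d - 2(d·n)n. A minimal Python sketch, assuming a 2D ground plane and a unit-length obstacle surface normal; the function name and representation are illustrative, not part of the claims:

```python
def reflect(direction, normal):
    """Reflect a 2D movement direction off an obstacle.

    `normal` is the obstacle's unit-length surface normal;
    applies r = d - 2*(d . n)*n.
    """
    dx, dy = direction
    nx, ny = normal
    dot = dx * nx + dy * ny
    return (dx - 2 * dot * nx, dy - 2 * dot * ny)
```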
4. The method of claim 2, wherein the detection area is a sector-shaped area with the position point of the target prop as its origin.
5. The method of claim 4, wherein detecting whether the target object satisfying tracking conditions is present in a detection area of the target prop comprises:
determining a first straight line passing through the origin and bisecting the sector area;
if the distance between the position point of the target object and the position point of the target prop is less than or equal to a set distance, and the included angle between the first straight line and a second straight line is less than or equal to a set angle, determining that the target object meets the tracking condition;
wherein the second straight line is a straight line passing through the position point of the target prop and the position point of the target object.
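The membership test of claim 5 reduces to a distance check plus an angle check against the sector's bisector (the claimed first straight line). A Python sketch under assumed conventions (2D positions, angles in degrees, the facing direction given as the bisector's bearing); all names and parameters are illustrative:

```python
import math

def in_sector(prop_pos, facing_deg, target_pos, max_dist, half_angle_deg):
    """Check whether a target lies in the sector-shaped detection area:
    within the set distance of the prop, and within the set angle of the
    bisector (first straight line) of the sector."""
    dx = target_pos[0] - prop_pos[0]
    dy = target_pos[1] - prop_pos[1]
    # Distance condition.
    if math.hypot(dx, dy) > max_dist:
        return False
    # Angle between the bisector and the prop-to-target line
    # (the second straight line), folded into [0, 180].
    bearing = math.degrees(math.atan2(dy, dx))
    diff = abs((bearing - facing_deg + 180) % 360 - 180)
    return diff <= half_angle_deg
```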
6. The method according to claim 2, wherein the method further comprises:
when a plurality of target objects meeting the tracking condition exist in the detection area, selecting, according to attribute information respectively corresponding to the plurality of target objects, one target object from the plurality of target objects to track;
wherein the attribute information includes at least one of: the distance between the target object and the target prop, the life value of the target object, the moving speed of the target object, the kill count of the target object, the level of the target object, and the virtual resource value owned by the target object.
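The selection step of claim 6 amounts to ranking candidates by their attribute information. A Python sketch under an assumed priority rule (nearest first, ties broken by lowest life value); the claim does not fix a particular rule, so this ordering and the dict schema are illustrative only:

```python
def pick_target(candidates):
    """Choose one target among several candidates that meet the tracking
    condition, preferring the nearest target and, on ties, the one with the
    lowest remaining life value. Each candidate is a dict with 'distance'
    and 'life' keys (an assumed schema)."""
    return min(candidates, key=lambda c: (c["distance"], c["life"]))
```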
7. The method according to claim 2, wherein the method further comprises:
and marking and displaying the detection area in the virtual environment.
8. The method of claim 1, wherein the triggering condition includes a distance between the target prop and the target object being less than a threshold value.
9. The method of claim 1, wherein
the attack range is a three-dimensional spatial region centered on the position point of the target prop;
or,
the attack range is a conical area that takes the position point of the target prop as its origin and radiates toward the ground.
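The two attack-range shapes of claim 9 can be sketched as membership tests: a sphere centered on the prop's position, or a cone whose apex is the prop (in the air) radiating toward the ground. A Python sketch; the coordinate convention (z axis up, ground below the apex) and the function names are assumptions, not part of the claims:

```python
import math

def in_spherical_range(center, point, radius):
    """First form of claim 9: a 3D region centered on the prop's position."""
    return math.dist(center, point) <= radius

def in_downward_cone(apex, point, half_angle_deg):
    """Second form of claim 9: a cone with its apex at the prop's position,
    radiating toward the ground (assumed to be along the -z axis)."""
    dx, dy, dz = (point[i] - apex[i] for i in range(3))
    if dz >= 0:
        return False  # the point is not below the apex
    horizontal = math.hypot(dx, dy)
    # Angle between the straight-down axis and the apex-to-point direction.
    angle = math.degrees(math.atan2(horizontal, -dz))
    return angle <= half_angle_deg
```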
10. A control device for a virtual prop, the device comprising:
a display module, configured to display a user interface, wherein a virtual environment picture and a prop use control are displayed in the user interface, the virtual environment picture being a picture of the virtual environment observed from the perspective of a virtual object;
a release module, configured to control the virtual object to release a target prop in the virtual environment in response to an operation on the prop use control, wherein the target prop is a prop having a target tracking function and an area damage function;
a moving module, configured to control the target prop to move on the ground of the virtual environment and track a target object;
a damage module, configured to calculate, in response to the target prop being damaged, the damage value borne by the target prop, and to control the target prop to attack the current area in a case that the damage value borne by the target prop is greater than or equal to a durability value;
an attack module, configured to control the target prop to move from the ground of the virtual environment into the air of the virtual environment in response to a target object in the virtual environment meeting the trigger condition of the target prop, and to control the target prop to produce an attack effect, in the air of the virtual environment, within an attack range defined with reference to the position point of the target prop, wherein the attack range comprises the position point at which the target object is located.
11. A terminal comprising a processor and a memory, wherein the memory has a program stored therein, the program being loaded and executed by the processor to implement the method of controlling a virtual prop according to any one of claims 1 to 9.
12. A computer-readable storage medium, in which a program is stored, the program being loaded and executed by a processor to implement the method of controlling a virtual prop according to any one of claims 1 to 9.
CN202111032763.0A 2021-09-03 2021-09-03 Virtual prop control method, device, terminal and storage medium Active CN113680061B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111032763.0A CN113680061B (en) 2021-09-03 2021-09-03 Virtual prop control method, device, terminal and storage medium


Publications (2)

Publication Number Publication Date
CN113680061A CN113680061A (en) 2021-11-23
CN113680061B true CN113680061B (en) 2023-07-25

Family

ID=78585264

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111032763.0A Active CN113680061B (en) 2021-09-03 2021-09-03 Virtual prop control method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN113680061B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110448891A (en) * 2019-08-08 2019-11-15 腾讯科技(深圳)有限公司 Control the method, apparatus and storage medium of virtual objects operation remote dummy stage property
CN110585695A (en) * 2019-09-12 2019-12-20 腾讯科技(深圳)有限公司 Method, apparatus, device and medium for using near-war property in virtual environment
CN110694261A (en) * 2019-10-21 2020-01-17 腾讯科技(深圳)有限公司 Method, terminal and storage medium for controlling virtual object to attack
CN110917619A (en) * 2019-11-18 2020-03-27 腾讯科技(深圳)有限公司 Interactive property control method, device, terminal and storage medium
CN111589146A (en) * 2020-04-27 2020-08-28 腾讯科技(深圳)有限公司 Prop operation method, device, equipment and storage medium based on virtual environment
CN111773696A (en) * 2020-07-13 2020-10-16 腾讯科技(深圳)有限公司 Virtual object display method, related device and storage medium
CN112057857A (en) * 2020-09-11 2020-12-11 腾讯科技(深圳)有限公司 Interactive property processing method, device, terminal and storage medium
CN112076467A (en) * 2020-09-17 2020-12-15 腾讯科技(深圳)有限公司 Method, device, terminal and medium for controlling virtual object to use virtual prop
CN112107861A (en) * 2020-09-18 2020-12-22 腾讯科技(深圳)有限公司 Control method and device of virtual prop, storage medium and electronic equipment
CN112107860A (en) * 2020-09-18 2020-12-22 腾讯科技(深圳)有限公司 Control method and device of virtual prop, storage medium and electronic equipment
CN112121414A (en) * 2020-09-29 2020-12-25 腾讯科技(深圳)有限公司 Tracking method and device in virtual scene, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9770664B2 (en) * 2013-04-05 2017-09-26 Gree, Inc. Method and apparatus for providing online shooting game


Also Published As

Publication number Publication date
CN113680061A (en) 2021-11-23

Similar Documents

Publication Publication Date Title
WO2020238592A1 (en) Method and apparatus for generating mark information in virtual environment, electronic device, and storage medium
JP7362191B2 (en) Virtual object control method, device, terminal and computer program
CN111481932B (en) Virtual object control method, device, equipment and storage medium
JP7331124B2 (en) Virtual object control method, device, terminal and storage medium
CN110465087B (en) Virtual article control method, device, terminal and storage medium
CN110478895B (en) Virtual article control method, device, terminal and storage medium
CN110585712A (en) Method, device, terminal and medium for throwing virtual explosives in virtual environment
CN110585731B (en) Method, device, terminal and medium for throwing virtual article in virtual environment
CN111589126A (en) Virtual object control method, device, equipment and storage medium
JP2022539289A (en) VIRTUAL OBJECT AIMING METHOD, APPARATUS AND PROGRAM
CN112717392B (en) Mark display method, device, terminal and storage medium
CN110732135A (en) Virtual scene display method and device, electronic equipment and storage medium
CN111905363B (en) Virtual object control method, device, terminal and storage medium
JP2023164787A (en) Picture display method and apparatus for virtual environment, and device and computer program
CN110801629B (en) Method, device, terminal and medium for displaying virtual object life value prompt graph
CN113633975B (en) Virtual environment picture display method, device, terminal and storage medium
JP7384521B2 (en) Virtual object control method, device, computer equipment and computer program
CN113680061B (en) Virtual prop control method, device, terminal and storage medium
WO2023071808A1 (en) Virtual scene-based graphic display method and apparatus, device, and medium
CN113617030B (en) Virtual object control method, device, terminal and storage medium
CN111905380B (en) Virtual object control method, device, terminal and storage medium
US20220054944A1 (en) Virtual object control method and apparatus, terminal, and storage medium
CN112402965A (en) Position monitoring and anti-monitoring method, device, terminal and storage medium
CN117298580A (en) Virtual object interaction method, device, equipment, medium and program product
CN112426725A (en) Virtual object control method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (Ref country code: HK; Ref legal event code: DE; Ref document number: 40055276)
GR01 Patent grant