CN111589150B - Control method and device of virtual prop, electronic equipment and storage medium - Google Patents


Info

Publication number
CN111589150B
Authority
CN
China
Prior art keywords
target
virtual
launcher
prop
motion track
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010321380.4A
Other languages
Chinese (zh)
Other versions
CN111589150A (en)
Inventor
冯啟垚
刘智洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202010321380.4A
Publication of CN111589150A
Application granted
Publication of CN111589150B


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/55: Controlling game characters or game objects based on the game progress
    • A63F 13/57: Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F 13/573: Simulating properties, behaviour or motion of objects in the game world using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/80: Special adaptations for executing a specific game genre or game mode
    • A63F 13/837: Shooting of targets
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80: Features of games using an electronically generated display having two or more dimensions, specially adapted for executing a specific type of game
    • A63F 2300/8076: Shooting

Abstract

The application discloses a control method and apparatus for a virtual prop, an electronic device, and a storage medium, belonging to the field of computer technology. In the embodiments of the application, in response to a launching operation on the virtual prop, the virtual prop is controlled to launch a launcher; the launcher moves along a target motion track, and when it reaches a target position, a target animation is displayed in the target area corresponding to that position. When the launcher arrives at the target position, it can inflict an electric-shock effect on the virtual objects within an area, so the user does not need to aim precisely at a virtual object: any virtual object within the target area is affected. This simplifies user operation, reduces operation difficulty, and in turn raises the usage frequency of the virtual prop and improves user stickiness.

Description

Control method and device of virtual prop, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for controlling a virtual prop, an electronic device, and a storage medium.
Background
With the development of multimedia technology and the diversification of terminal functions, more and more games can be played on a terminal. Shooting games are among the more popular: the terminal can display a virtual scene in the interface and display a virtual object in the virtual scene, and the virtual object can use virtual props to fight against other virtual objects.
At present, taking a shooting-type virtual prop as an example, the virtual prop is usually controlled as follows: the user taps a fire button; on detecting the operation, the terminal controls a firearm-type virtual prop to fire a bullet at a virtual object, and if the bullet hits an enemy, the enemy is injured.
The above virtual prop can injure an enemy only on a direct hit, so the required operation precision is high and the operation difficulty is large, which reduces the usage frequency of the virtual prop and user stickiness. A virtual prop operation method is therefore needed to reduce operation difficulty and increase user stickiness.
Disclosure of Invention
The embodiment of the application provides a control method and device of a virtual prop, an electronic device, and a storage medium, which can simplify user operation and reduce operation difficulty. The technical scheme is as follows:
in one aspect, a method for controlling a virtual item is provided, where the method includes:
responding to the launching operation of the virtual prop, and controlling a launcher of the virtual prop to move in a virtual scene according to a target motion track;
in response to the launcher reaching a target position on the target motion track, acquiring a target area corresponding to the target position;
displaying a target animation in the target area, wherein the target animation is used for representing that the virtual object in the target area is affected by electric shock.
In one possible implementation, the method further includes:
and in response to the virtual prop being in the target state, setting a launching operation button of the virtual prop to a trigger prohibition state.
In one possible implementation, the controlling of the reduction of the moving speed of the virtual object includes any one of:
acquiring a difference value between the current moving speed of the virtual object and a speed adjusting value, and taking the difference value as the reduced moving speed of the virtual object;
setting a moving speed of the virtual object to zero.
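The two speed-reduction options above can be sketched as follows; the function name and the clamping at zero are illustrative assumptions, not taken from the text:

```python
def reduced_speed(current_speed, adjust_value=None):
    """Apply one of the two options described above: subtract a speed
    adjustment value from the current moving speed, or (when no
    adjustment value is given) set the moving speed to zero.
    Clamping at zero is an added assumption to keep speeds non-negative."""
    if adjust_value is None:
        return 0.0  # option 2: moving speed set to zero
    return max(0.0, current_speed - adjust_value)  # option 1: difference value
```

For example, `reduced_speed(5.0, 2.0)` yields `3.0`, while `reduced_speed(5.0)` yields `0.0`.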
In one possible implementation manner, the controlling, in response to the launching operation of the virtual prop, a launcher of the virtual prop to move in a virtual scene according to a target motion trajectory includes:
responding to launching operation of the virtual prop, and acquiring a target motion track of a launcher of the virtual prop based on a visual angle direction of a current virtual scene;
and controlling the launcher of the virtual prop to move in the virtual scene according to the target motion track.
In one aspect, a control device for a virtual prop is provided, the device including:
the control module is used for responding to the launching operation of the virtual prop and controlling the launcher of the virtual prop to move in a virtual scene according to a target motion track;
and the display module is used for responding to the fact that the launcher reaches a target position on the target motion track, and displaying a target animation in a target area corresponding to the target position, wherein the target animation is used for representing electric shock influence on a virtual object in the target area.
In one possible implementation, the display module is configured to:
acquiring a target area corresponding to the target position;
and displaying the target animation in the target area.
In one possible implementation manner, the display module is configured to acquire, as the target area, an area of a target size centered on the target position.
In one possible implementation, the display module is configured to perform any one of:
acquiring a circular area taking the target position as a center and a target radius as a radius as the target area;
and acquiring a spherical area taking the target position as a center and the target radius as a radius as the target area.
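The circular and spherical variants of the target area above reduce to simple distance checks; the coordinate convention (y as height) and the function names are assumptions for illustration:

```python
import math

def in_circular_area(center, pos, radius):
    """2D variant: distance measured in the horizontal (x, z) plane,
    ignoring height, with the target position as the circle center."""
    return math.hypot(pos[0] - center[0], pos[2] - center[2]) <= radius

def in_spherical_area(center, pos, radius):
    """3D variant: full Euclidean distance from the target position."""
    return math.dist(center, pos) <= radius
```

Note the practical difference: a point directly above the target position is inside the circular area but can be outside the spherical one.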
In one possible implementation, the target position is determined in any one of the following manners:
determining a drop point of the launcher according to the target motion track and the position of the ground in the virtual scene, and determining the drop point as the target position;
determining, according to the target motion track and the time for which the launcher has been in flight, the position on the target motion track at which the flight time reaches a time threshold as the target position;
determining a drop point of the launcher according to the target motion track and the position of the ground in the virtual scene, and, in response to the launcher reaching the drop point while its flight time is still less than a time threshold, determining the drop point as the target position;
determining a drop point of the launcher according to the target motion track and the position of the ground in the virtual scene, and, in response to the launcher not having reached the drop point when its flight time reaches a time threshold, determining the position on the target motion track at which the flight time reaches the time threshold as the target position;
and, in response to an obstacle being present on the target motion track, determining the position at which the launcher collides with the obstacle as the target position.
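The alternative ways of determining the target position listed above (drop point, flight-time threshold, obstacle collision) can be combined into one sketch; the sampling step, the `trajectory(t)` callback, and the obstacle predicates are illustrative assumptions:

```python
def find_target_position(trajectory, ground_y=0.0, time_threshold=3.0,
                         obstacles=(), dt=0.05):
    """Walk sampled points of trajectory(t) -> (x, y, z) and return the
    first of: a collision position with an obstacle, the drop point on
    the ground, or the position where the flight time reaches the
    threshold (detonation in mid-air)."""
    t = 0.0
    while t <= time_threshold:
        point = trajectory(t)
        for hit in obstacles:          # obstacle on the motion track
            if hit(point):
                return point           # collision position
        if point[1] <= ground_y:       # launcher reached its drop point
            return point
        t += dt
    return trajectory(time_threshold)  # flight time reached the threshold
```

With a straight falling trajectory the drop point wins; with a trajectory that never descends, the time threshold wins.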
In one possible implementation, the control module is to:
in response to the start of the launching operation on the virtual prop, setting the state of the virtual prop to a target state, wherein the target state represents that the virtual prop is being filled with energy;
and in response to the duration for which the virtual prop has been in the target state reaching a target duration and the launching operation ending, performing the step of controlling the launcher of the virtual prop to move in the virtual scene according to the target motion track.
In one possible implementation manner, the display module is further configured to display a progress prompt in a graphical user interface in response to the virtual prop being in the target state, where the progress prompt is used for prompting an energy filling progress of the virtual prop.
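The charge-then-launch flow above (target state, target duration, progress prompt, fallback to idle) might look like the following minimal sketch; all names, and the use of explicit timestamps instead of a real clock, are illustrative assumptions:

```python
import enum

class PropState(enum.Enum):
    IDLE = 0
    CHARGING = 1   # the "target state": energy is being filled

class ChargedProp:
    """Minimal state machine for a prop that must charge for a target
    duration before its launcher can be fired."""
    def __init__(self, target_duration=1.5):
        self.target_duration = target_duration
        self.state = PropState.IDLE
        self._start = None

    def press(self, now):
        """Launch operation starts: enter the target (charging) state."""
        self.state = PropState.CHARGING
        self._start = now

    def release(self, now):
        """Launch operation ends. Returns True (fire the launcher) only
        if the prop was charged for at least the target duration;
        otherwise the prop falls back to the idle state without firing."""
        charged = (now - self._start) >= self.target_duration
        self.state = PropState.IDLE
        return charged

    def progress(self, now):
        """Fill ratio in [0, 1] for the on-screen progress prompt."""
        if self.state is not PropState.CHARGING:
            return 0.0
        return min(1.0, (now - self._start) / self.target_duration)
```

Releasing before the target duration returns the prop to idle without launching, matching the idle-state fallback in the text.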
In one possible implementation, the apparatus further includes:
and the setting module is used for setting the state of the virtual prop to be an idle state in response to the fact that the time length of the virtual prop in the target state is less than the target time length and the launching operation is finished.
In one possible implementation, the setting module is further configured to:
and in response to the virtual prop being in the target state, setting a launching operation button of the virtual prop to a trigger prohibition state.
In one possible implementation, the control module is further configured to control the virtual life value of at least one virtual object to decrease in response to the at least one virtual object being included within the target region.
In one possible implementation manner, the control module is further configured to control the moving speed of any virtual object in the target area to be reduced in response to the virtual life value of the virtual object being greater than zero.
In one possible implementation manner, the display module is further configured to display that any virtual object in the target area is in an elimination state in response to the virtual life value of the virtual object being zero.
In one possible implementation, the control module is further configured to perform any one of:
acquiring a difference value between the current moving speed of the virtual object and a speed adjusting value, and taking the difference value as the reduced moving speed of the virtual object;
setting a moving speed of the virtual object to zero.
In one possible implementation, the control module is configured to:
responding to launching operation of the virtual prop, and acquiring a target motion track of a launcher of the virtual prop based on a visual angle direction of a current virtual scene;
and controlling the launcher of the virtual prop to move in the virtual scene according to the target motion track.
In one possible implementation, the display module is further configured to display an electric-shock paralysis animation in a graphical user interface in response to the currently controlled virtual object being within the target area of the launcher of any virtual prop, the electric-shock paralysis animation representing the effect of the currently controlled virtual object being paralyzed by an electric shock.
In one possible implementation, the control module is further configured to, in response to the currently controlled virtual object being located in the target area of the launcher of any virtual prop, ignore any control instruction received for the virtual object within a target time period.
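Ignoring control instructions for a paralyzed object within the target time period reduces to a guard like this; the function and parameter names are illustrative, not from the text:

```python
def handle_control_instruction(instruction, paralyzed_until, now):
    """Return the instruction for normal processing, or None to signal
    that it was received but ignored because the current time is still
    inside the paralysis window."""
    if now < paralyzed_until:
        return None        # instruction received but ignored
    return instruction
```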
In one aspect, an electronic device is provided and includes one or more processors and one or more memories, where at least one program code is stored in the one or more memories and loaded by the one or more processors and executed to implement the operations performed by the method for controlling a virtual item according to any of the above aspects and any possible implementation manner of any of the above aspects.
In one aspect, a storage medium is provided, and at least one program code is stored in the storage medium, and is loaded and executed by a processor to implement the operations performed by the method for controlling a virtual item according to any one of the above aspects and any one of the possible implementations of any one of the above aspects.
The technical solutions provided in the embodiments of the present application bring at least the following beneficial effects:
In the embodiments of the application, in response to a launching operation on the virtual prop, the virtual prop is controlled to launch a launcher; the launcher moves along a target motion track, and when it reaches a target position, a target animation is displayed in the target area corresponding to that position. When the launcher arrives at the target position, it can inflict an electric-shock effect on the virtual objects within an area, so the user does not need to aim precisely at a virtual object: any virtual object within the target area is affected. This simplifies user operation, reduces operation difficulty, and in turn raises the usage frequency of the virtual prop and improves user stickiness.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is an implementation environment schematic diagram of a control method for a virtual item according to an embodiment of the present application;
fig. 2 is a flowchart of a method for controlling a virtual prop according to an embodiment of the present application;
fig. 3 is a flowchart of a method for controlling a virtual prop according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
FIG. 6 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
FIG. 7 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
FIG. 8 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
FIG. 9 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
fig. 10 is a flowchart of a method for controlling a virtual prop according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a terminal interface provided in an embodiment of the present application;
FIG. 12 is a schematic view of a ray provided by an embodiment of the present application;
FIG. 13 is a schematic illustration of an explosive range provided by an embodiment of the present application;
fig. 14 is a schematic structural diagram of a control device of a virtual prop according to an embodiment of the present application;
fig. 15 is a schematic structural diagram of a terminal 1500 according to an embodiment of the present disclosure;
fig. 16 is a schematic structural diagram of a server 1600 according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
In this application, the terms "first", "second", and the like are used to distinguish identical or similar items having substantially the same function. It should be understood that "first", "second", and "nth" imply no logical or temporal dependency and do not limit the number or execution order.
The term "at least one" in this application refers to one or more than one, "at least two" refers to two or more than two, e.g., at least two node devices refers to two or more than two node devices.
Hereinafter, terms related to the present application are explained.
Virtual scene: the scene displayed (or provided) by an application program when it runs on a terminal. The virtual scene may be a simulated environment of the real world, a semi-simulated, semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene; the dimension of the virtual scene is not limited in the embodiments of the present application. For example, a virtual scene may include sky, land, ocean, etc.; the land may include environmental elements such as deserts and cities, and the user can control a virtual object to move in the virtual scene.
Virtual object: a movable object in a virtual scene. The movable object may be a virtual character, a virtual animal, an animation character, etc., such as a character, animal, plant, oil drum, wall, or stone displayed in the virtual scene. The virtual object may be a virtual avatar representing the user in the virtual scene. The virtual scene may include a plurality of virtual objects, each having its own shape and volume in the virtual scene and occupying part of the space in the virtual scene.
Alternatively, the virtual object may be a player character controlled through operations on the client, an Artificial Intelligence (AI) configured in the virtual-scene battle through training, or a Non-Player Character (NPC) configured for interaction in the virtual scene. Alternatively, the virtual object may be a virtual character competing in the virtual scene. Optionally, the number of virtual objects participating in the interaction in the virtual scene may be preset, or may be dynamically determined according to the number of clients participating in the interaction.
Taking a shooting game as an example, the user may control a virtual object to free-fall, glide, or open a parachute to descend in the sky of the virtual scene; to run, jump, crawl, or bend forward on land; or to swim, float, or dive in the ocean. The user may also control a virtual object to move through the virtual scene in a virtual vehicle such as a virtual car, a virtual aircraft, or a virtual yacht; the above scenes are merely examples and are not limiting. The user can also control the virtual object to interact with other virtual objects through virtual props in combat and other modes. The virtual props may be of multiple types, such as throwing-type props (e.g., grenades, cluster mines, smoke bombs, incendiary bottles, or sticky grenades, "sticky mines" for short) or shooting-type props (e.g., machine guns, pistols, and rifles); the type of virtual prop is not specifically limited in this application.
Fig. 1 is a schematic implementation environment diagram of a method for controlling a virtual item provided in an embodiment of the present application, and referring to fig. 1, the implementation environment includes: a first terminal 120, a server 140, and a second terminal 160.
The first terminal 120 is installed and operated with an application program supporting a virtual scene. The application program may be any one of a First-Person shooter game (FPS), a third-Person shooter game, a Multiplayer Online Battle Arena game (MOBA), a virtual reality application program, a three-dimensional map program, or a Multiplayer gunfight type survival game. The first terminal 120 may be a terminal used by a first user, who uses the first terminal 120 to operate a first virtual object located in a virtual scene for activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated persona or an animated persona. Illustratively, the first virtual object may be a first virtual animal, such as a simulated monkey or other animal.
The first terminal 120 and the second terminal 160 are connected to the server 140 through a wireless network or a wired network.
The server 140 may include at least one of a server, a plurality of servers, a cloud computing platform, or a virtualization center. The server 140 is used to provide background services for applications that support virtual scenarios. Alternatively, the server 140 may undertake primary computational tasks and the first and second terminals 120, 160 may undertake secondary computational tasks; alternatively, the server 140 undertakes the secondary computing work and the first terminal 120 and the second terminal 160 undertakes the primary computing work; alternatively, the server 140, the first terminal 120, and the second terminal 160 perform cooperative computing by using a distributed computing architecture.
The second terminal 160 is installed and operated with an application program supporting a virtual scene. The application program can be any one of an FPS, a third person shooting game, an MOBA, a virtual reality application program, a three-dimensional map program or a multi-person gun battle type survival game. The second terminal 160 may be a terminal used by a second user, who uses the second terminal 160 to operate a second virtual object located in the virtual scene for activities including, but not limited to: adjusting at least one of body posture, crawling, walking, running, riding, jumping, driving, picking, shooting, attacking, throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated character or an animated character. Illustratively, the second virtual object may be a second virtual animal, such as a simulated monkey or other animal.
Optionally, the first virtual object controlled by the first terminal 120 and the second virtual object controlled by the second terminal 160 are in the same virtual scene, and the first virtual object may interact with the second virtual object in the virtual scene. In some embodiments, the first virtual object and the second virtual object may be in a hostile relationship, for example, the first virtual object and the second virtual object may belong to different teams and organizations, and the hostile virtual objects may interact with each other in a mutual shooting manner on land.
In other embodiments, the first virtual object and the second virtual object may be in a teammate relationship, for example, the first virtual character and the second virtual character may belong to the same team, the same organization, have a friend relationship, or have temporary communication rights.
Alternatively, the applications installed on the first terminal 120 and the second terminal 160 are the same, or the applications installed on the two terminals are the same type of application on different operating system platforms. The first terminal 120 may generally refer to one of a plurality of terminals, and the second terminal 160 may generally refer to one of a plurality of terminals; this embodiment is only illustrated by the first terminal 120 and the second terminal 160. The device types of the first terminal 120 and the second terminal 160 are the same or different, and include at least one of a smart phone, a tablet computer, an electronic book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop portable computer, and a desktop computer. For example, the first terminal 120 and the second terminal 160 may be smart phones or other handheld portable gaming devices. The following embodiments are illustrated with the terminal comprising a smartphone.
Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only one, or several tens or hundreds of the terminals, or more. The number of terminals and the type of the device are not limited in the embodiments of the present application.
Fig. 2 is a flowchart of a method for controlling a virtual item provided in an embodiment of the present application, where the method is applied to the terminal, and referring to fig. 2, the method may include the following steps:
201. and the terminal responds to the launching operation of the virtual prop and controls the launcher of the virtual prop to move in the virtual scene according to the target motion track.
The virtual prop is a virtual item that can interact with virtual objects. By operating the terminal, the user can control a virtual object to use or otherwise control the virtual prop in the virtual scene. In the embodiments of this application, the virtual prop can launch a launcher, through which damage is inflicted on virtual objects.
The target motion track of the launcher of the virtual prop can be related to the visual angle direction during launching operation, and a user can adjust the visual angle direction of a virtual scene through visual angle adjusting operation during launching operation, so that the target motion track of the launcher is adjusted. Specifically, when the user can perform a launching operation and the terminal detects the launching operation, the target motion trajectory of the launcher of the virtual prop can be acquired based on the view direction of the current virtual scene in response to the launching operation on the virtual prop, so that the launcher of the virtual prop is controlled to move in the virtual scene according to the target motion trajectory.
In one specific example, the virtual prop can launch a projectile such as an electric ball. The user selects the virtual prop, holds down the fire button, and releases it after adjusting to a suitable launch angle; the terminal then launches the electric ball at the current launch angle, and the ball flies through the air along the target motion track corresponding to that angle.
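The arc the electric ball follows for a given launch angle can be sketched with plain ballistic motion; the patent does not fix the trajectory model, so the gravity value, time step, and function names here are assumptions:

```python
import math

def sample_trajectory(speed, pitch_deg, gravity=9.8, dt=0.1, steps=30):
    """Sample (x, y) points of a simple ballistic arc launched from the
    origin at the given pitch angle, using semi-implicit Euler steps."""
    pitch = math.radians(pitch_deg)
    vx, vy = speed * math.cos(pitch), speed * math.sin(pitch)
    points, x, y = [], 0.0, 0.0
    for _ in range(steps):
        points.append((x, y))
        x += vx * dt
        vy -= gravity * dt   # gravity only affects the vertical velocity
        y += vy * dt
    return points
```

A steeper pitch trades range for height; the sampled points could serve both as the motion track of the ball and as the aiming-arc preview shown while the fire button is held.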
202. And the terminal responds to the launcher reaching a target position on the target motion track, and displays a target animation in a target area corresponding to the target position, wherein the target animation is used for representing that the virtual object in the target area is affected by electric shock.
The target position is the position on the target motion track at which the launcher affects its surroundings. After being launched, the launcher moves along the target motion track to the target position, where it can affect the target area corresponding to that position.
The target area is an area around the target position, and the terminal can acquire the corresponding target area when the launcher reaches the target position, and display the target animation in the target area, that is, the launcher can cause electric shock influence on the virtual object in the target area. For example, if one virtual object is included within the target region, the emissions can cause damage to the one virtual object. If multiple virtual objects are included in the target area, the emissions can cause damage to the multiple virtual objects.
In the embodiment of the present application, the virtual prop does not need to hit a virtual object directly in order to damage it; the launcher of the virtual prop can cause damage over an area, which can also be referred to as a range attack. Therefore, the terminal can acquire the target area corresponding to the target position, that is, determine the attack range of the launcher when it performs the range attack, so as to execute the following steps and display the animation of the effect on the target area.
In this process, the launcher of the virtual prop is configured to deliver a range attack, so the user does not need to aim precisely at a virtual object; the virtual object merely needs to be within the target area. This simplifies user operation, reduces the operation difficulty, and in turn increases the usage frequency of the virtual prop and improves user stickiness.
In the embodiment of the present application, in response to a launching operation on the virtual prop, the virtual prop is controlled to launch a launcher; the launcher moves along the target motion trajectory, and when it reaches the target position, a target animation is displayed in the target area corresponding to that position. When the launcher reaches the target position, it can cause an electric shock effect on the virtual objects in an area, so the user does not need to aim precisely at a virtual object; any virtual object within the target area is affected by the electric shock. This simplifies user operation, reduces the operation difficulty, and in turn increases the usage frequency of the virtual prop and improves user stickiness.
Fig. 3 is a flowchart of a method for controlling a virtual item, provided in an embodiment of the present application, and is applied to the terminal, referring to fig. 3, where the method may include the following steps:
301. The terminal, in response to the start of a launching operation on the virtual item, sets the state of the virtual item to a target state, where the target state represents that the virtual item is being filled with energy.
The user can control the virtual object to obtain the virtual prop through operation, and the terminal can display the virtual prop at a position corresponding to the virtual object in the virtual scene. For example, the virtual item is a shooting type virtual item, the virtual item can launch a carrier, the user can control the virtual object to obtain the shooting type virtual item, and the terminal can display the shooting type virtual item on the hand or back of the virtual object.
The virtual prop can be acquired in two ways, specifically as follows:
Acquisition mode one: acquisition through pickup by the virtual object.
In the first obtaining mode, the terminal may display a plurality of virtual items in a virtual scene, where the virtual items may be items that can interact with a virtual object, for example, shooting type virtual items or throwing type virtual items. When the user sees the virtual prop, the virtual object can be controlled to pick up the virtual prop through the picking-up operation.
Specifically, the virtual prop may be displayed on the ground or on a virtual article in the virtual scene. When the distance between the virtual object corresponding to the terminal and the virtual prop is smaller than a target threshold, a pickup option for the virtual prop is displayed in the virtual scene. When a trigger operation on the pickup option is detected, the terminal can control the virtual object to pick up the virtual prop; after the pickup is completed, the virtual prop is displayed on a target portion of the virtual object in the virtual scene, indicating that the virtual object is equipped with the virtual prop.
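The pickup condition described above can be sketched as a simple distance check. This is a minimal illustration, not the patent's implementation: the threshold value, coordinate convention, and function names are all assumptions.

```python
import math

# Hypothetical value; the patent leaves the target threshold to the implementer.
PICKUP_DISTANCE_THRESHOLD = 2.0  # in scene units

def distance(a, b):
    """Euclidean distance between two (x, y, z) positions in the virtual scene."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def should_show_pickup_option(object_pos, prop_pos):
    """Display the pickup option only when the virtual object is close enough."""
    return distance(object_pos, prop_pos) < PICKUP_DISTANCE_THRESHOLD

# The option appears as the virtual object approaches the prop.
print(should_show_pickup_option((0, 0, 0), (5, 0, 0)))  # False: too far
print(should_show_pickup_option((0, 0, 0), (1, 0, 1)))  # True: within threshold
```

A real engine would typically run this check every frame, or replace it with the collision-box contact test described in the example near the end of this section.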
The target portion may be a hand, a shoulder, a back, or the like, and the target portion is not limited in the embodiments of the present application. The target threshold may be set by a person skilled in the art according to requirements, and is not limited in this embodiment of the application.
Acquisition mode two: acquisition through a call by the virtual object.
In the second obtaining mode, the terminal may display a call control in the virtual scene. When the user wants to call the virtual prop, the user can perform a trigger operation on the call control; the terminal then receives the trigger signal for the call control, generates a creation instruction, and creates the virtual prop in response to the creation instruction. The call control is used to call the virtual prop into the virtual scene, and may take the form of a button displayed floating in the virtual scene.
After the virtual object is equipped with the virtual prop, the user can select the virtual prop and perform a launching operation while the virtual object controls it. Unlike other shooting-type virtual props, this virtual prop does not fire as soon as the launching operation starts; instead, it enters a target state in which it is filled with energy. The target state may also be referred to herein as the power-storage state. It can be understood that the virtual prop is used by first storing power and then launching the projectile.
In a possible implementation manner, the terminal may further display a launch control of the virtual prop on the graphical user interface, where the launch control is configured to detect the user's launching operation, so that the user can, through the launch control, control the virtual object to make the virtual prop launch its launcher.
In one possible implementation, when the virtual prop is filled with energy, a progress prompt may also be displayed to prompt the energy filling progress. Specifically, the terminal may display a progress prompt in the graphical user interface in response to the virtual item being in the target state, where the progress prompt is used to prompt an energy filling progress of the virtual item. Through the display of the progress prompt, the user can intuitively know the energy filling progress of the virtual prop, so that whether to continue to accumulate the power or give up accumulating the power and fight with the virtual object through other virtual props is determined according to the progress and the current environment. The progress prompt is displayed visually and clearly, and is beneficial to assisting a user to compete with other virtual objects, so that the display quantity of the related information of the virtual prop is increased, and the display effect of the related information of the virtual prop is improved.
In one possible implementation, the virtual item cannot launch the projectile while in the target state; the projectile can be launched only after power storage is complete, that is, when the item is no longer in the target state. Specifically, the terminal may set the launch button of the virtual item to a trigger-prohibited state in response to the virtual item being in the target state. Because the power of the virtual prop is large, this target-state restriction prevents it from being fired continuously like other virtual props, which effectively balances the gap between this virtual prop and the others and ensures the fairness of the electronic competition.
302. In response to the duration for which the virtual prop is in the target state reaching the target duration and the launching operation ending, the terminal controls the launcher of the virtual prop to move in the virtual scene according to the target motion track.
When the launching operation lasts for the target duration, the time for which the virtual prop is in the target state reaches the target duration; at this point the virtual prop has finished filling with energy, and the launcher can be launched.
The target motion track of the virtual prop may be related to the view angle direction at the time of the launching operation, and the user can adjust the view angle direction of the virtual scene through a view angle adjustment operation during the launching operation, thereby adjusting the target motion track of the virtual prop. Specifically, when the time for which the virtual prop has been in the target state reaches the target duration and the launching operation ends, the terminal may obtain the target motion trajectory of the launcher of the virtual prop based on the view angle direction of the current virtual scene, and control the launcher of the virtual prop to move in the virtual scene according to the target motion trajectory.
In the process of acquiring the target motion trajectory, the included angle between the view angle direction of the current virtual scene and the horizontal direction in the virtual scene can be used as the launch angle, and the target motion trajectory can be acquired according to the launch angle and the force information of the virtual prop. For example, the force information may be only a vertically downward gravity; as another example, the forces may be a vertically downward gravity together with an air resistance opposite to the movement direction of the virtual prop. The embodiment of the application does not specifically limit the force information of the virtual prop. In one possible implementation, the projectile may have an electric ball special effect, and the target motion trajectory is a parabola.
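The trajectory acquisition just described — a launch angle plus a vertically downward gravity and no air resistance — can be sketched as sampling a parabola. The speed, gravity value, time step, and function names below are illustrative assumptions.

```python
import math

def trajectory_points(launch_angle_deg, speed, gravity=9.8, steps=50, dt=0.1):
    """Sample the parabolic target motion trajectory of the projectile.

    Assumes only a vertically downward gravity (no air resistance); the launch
    angle is the included angle between the view direction and the horizontal.
    """
    theta = math.radians(launch_angle_deg)
    vx = speed * math.cos(theta)   # horizontal velocity component
    vy = speed * math.sin(theta)   # initial vertical velocity component
    points = []
    for i in range(steps):
        t = i * dt
        x = vx * t
        y = vy * t - 0.5 * gravity * t * t
        points.append((x, y))
        if y < 0:  # the projectile has dropped below the launch height
            break
    return points

path = trajectory_points(45.0, 20.0)
# The sampled heights rise and then fall: a parabola, as in the embodiment.
```

The sampled points could then be drawn in the scene as the candidate motion trajectory shown to the user while aiming.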
It should be noted that steps 301 and 302 above together constitute a process of controlling the launcher of a virtual item to move in the virtual scene along the target motion trajectory in response to a launching operation on the virtual item; the above description merely takes a virtual item that has a target state as an example.
Correspondingly, the process may specifically be: the terminal responds to the launching operation of the virtual prop, obtains the target motion track of the launcher of the virtual prop based on the visual angle direction of the current virtual scene, and controls the launcher of the virtual prop to move in the virtual scene according to the target motion track.
The launching process of the virtual item is described below through a specific example. As shown in fig. 4, a virtual item is displayed on the ground in a virtual scene, and the user controls the virtual object to approach it. When the distance between the virtual object and the virtual item is smaller than the target threshold, the terminal may display a pickup option 401 for the virtual item. The pickup option may include the name 402 of the virtual item (e.g., XX) and may also display a pickup priority 403; for example, if the virtual item is more powerful, its priority is set high, and a recommendation to pick it up may be displayed (e.g., the pickup option is highlighted). The user can perform a pickup operation, and in response, the terminal picks up the virtual prop and equips it on the virtual object. As shown in fig. 5, the terminal may display the launch control 404 in the graphical user interface; the number of launch controls 404 may be one, two, or more, which is not limited in this embodiment of the application. When the user presses the launch control 404 while the virtual object controls the virtual item, the virtual item enters the power-storage state (i.e., the target state). The terminal can display the progress prompt 405, which may be a progress circle, in the graphical user interface; the current and total accumulation amounts can be read from the progress circle. During the power-storage process, the terminal may also display the launch control 404 in a target display style, such as a disabled-operation style. The terminal can display the candidate motion trajectory according to the launch angle corresponding to the current view angle.
If the user wants to adjust it, the user can perform a view angle adjustment operation, and the terminal adjusts the candidate motion trajectory accordingly; once satisfied, the user stops the trigger operation on the launch control 404, thereby controlling the virtual prop to launch the launcher. The candidate motion trajectory at the moment of launch is the target motion trajectory. Specifically, as shown in fig. 6, when power storage is complete, the progress prompt 405 (progress circle) reaches one hundred percent and the projectile can be launched. As shown in fig. 7, the launcher may be an electric ball special effect 406, the target motion trajectory may be a parabola, and the terminal may display the electric ball effect moving along the parabola.
It should be noted that before the user selects this virtual item, the launch control may serve as the trigger control of another virtual item or of another action. For example, if the user selects a firearm-class virtual item, the launch control may also be referred to as a firing control. If the user has not selected any virtual prop, the launch control may be a punch control. And if the user selects a throwing-type virtual prop, the launch control may be referred to as a throwing control.
In a possible implementation manner, the terminal may display the launch control in the display style corresponding to the state of the currently controlled virtual item. For example, when the launch control is a firing control, its display style is: a bullet shown at the center of the button. If the launch control is a throwing control, its display style is: a grenade shown at the center of the button. If the launch control is a punch control, its display style is: a fist shown at the center of the button.
In a possible implementation manner, in this step 302, the terminal may further display a target motion trajectory of the launcher of the virtual item, so that the user can observe the motion condition of the launcher of the virtual item more intuitively and clearly. Specifically, the terminal may respond to the launching operation of the virtual prop, obtain and display a target motion trajectory of a launcher of the virtual prop, and display that the launcher moves in a virtual scene along the target motion trajectory.
It should be noted that step 302 covers the case in which the launching operation lasts for the target duration and the virtual item finishes filling with energy. There is also another possible case: when the virtual item has not completed power storage (i.e., the energy filling is incomplete), the terminal may, in response to the duration for which the virtual item has been in the target state being less than the target duration and the launching operation ending, set the state of the virtual item to the idle state. That is, if the user does not want to continue using the virtual prop, the user can stop the launching operation; the virtual prop then leaves the target state and returns to the idle state. The user can thus decide, according to their needs, whether to continue storing power or to give it up and perform other operations, which provides a flexible operation mode with low operation difficulty.
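The power-storage logic of steps 301 and 302, including the early-release case just described, can be sketched as a small state machine. The state names, the target duration value, and the method names are illustrative assumptions.

```python
# A minimal sketch of the power-storage (target state) behavior, under
# assumed names and timings; the patent does not fix a concrete API.
TARGET_DURATION = 3.0  # assumed seconds needed to fill the prop with energy

IDLE, CHARGING = "idle", "charging"

class VirtualProp:
    def __init__(self):
        self.state = IDLE
        self.charge_time = 0.0

    def press_launch(self):
        """The launching operation starts: enter the target state."""
        self.state = CHARGING
        self.charge_time = 0.0

    def hold(self, dt):
        """Accumulate energy while the launch control stays pressed."""
        if self.state == CHARGING:
            self.charge_time += dt

    def release_launch(self):
        """The launching operation ends: fire only if fully charged."""
        fired = self.state == CHARGING and self.charge_time >= TARGET_DURATION
        self.state = IDLE  # either way, the prop returns to the idle state
        return fired

prop = VirtualProp()
prop.press_launch()
prop.hold(1.0)
print(prop.release_launch())  # False: released before the target duration
prop.press_launch()
prop.hold(3.5)
print(prop.release_launch())  # True: energy filling complete, projectile fires
```

The progress prompt of the earlier implementation could simply display `charge_time / TARGET_DURATION` as the fill percentage.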
303. In response to the launcher reaching the target position on the target motion track, the terminal acquires the target area corresponding to the target position.
Wherein the target position may be determined based on a target motion trajectory of the virtual prop. The target area may be determined based on the target location.
For the acquisition of the target area, the terminal may use an area including the target position as the target area, where the target area is the influence area of the launcher, and the launcher can cause an electric shock effect on the virtual objects in the target area. In one possible implementation, the terminal may acquire an area of a target size centered on the target position as the target area.
The target size of the target area may be set by a related technician as required, and the target area may be a circle, a square, or another shape.
In a specific possible embodiment, the target area may be a circular area or a spherical area. Specifically, the terminal may acquire, as the target area, a circular area centered on the target position with the target radius as its radius, or a spherical area centered on the target position with the target radius as its radius.
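Membership in such a spherical target area reduces to a squared-distance comparison, as sketched below; the coordinates and function name are illustrative assumptions.

```python
def in_target_area(obj_pos, target_pos, target_radius):
    """True when a virtual object lies inside the spherical target area
    centered on the target position with the target radius."""
    # Compare squared distances to avoid an unnecessary square root.
    d2 = sum((p - q) ** 2 for p, q in zip(obj_pos, target_pos))
    return d2 <= target_radius ** 2

target = (10.0, 0.0, 10.0)
print(in_target_area((11.0, 0.0, 10.0), target, 3.0))  # True: inside
print(in_target_area((20.0, 0.0, 10.0), target, 3.0))  # False: outside
```

The circular (2D) variant is identical with the vertical coordinate dropped.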
For the determination of the target position, the launcher may include a plurality of trigger modes for its transformation. For example, when the launcher lands on the ground, the transformation can have a shock effect on the virtual objects in the target area; as another example, when the elapsed time since the launcher was launched reaches a time threshold, the transformation can have a shock effect on the virtual objects in the target area. As a further example, the launcher transforms upon colliding with another object and has a shock effect on the virtual objects in the target area. Five possible implementations of determining the target position, corresponding to the different trigger modes, are provided below:
in the first mode, the terminal determines the falling point of the launcher according to the target motion track and the position of the ground in the virtual scene, and determines the falling point as the target position. In the first mode, the projectile may explode when landing, causing an electric shock to the virtual object in the target area. The target location is also the drop point of the projectile.
And secondly, the terminal determines, according to the target motion track and the elapsed launch time of the launcher, the position on the target motion track at which the elapsed launch time reaches a time threshold as the target position. In the second mode, when the elapsed launch time of the projectile reaches the time threshold, the explosion shocks the virtual objects in the target area. The terminal can count the elapsed launch time of the launcher, compare it with the time threshold, and take the position of the launcher as the target position when the two are equal.
And thirdly, the terminal determines the falling point of the launcher according to the target motion track and the position of the ground in the virtual scene, and determines the falling point as the target position in response to the launcher reaching the falling point of the launcher and the time of the launcher being less than a time threshold.
And fourthly, the terminal determines the falling point of the launcher according to the target motion track and the position of the ground in the virtual scene, and determines the position of the launcher on the target motion track, at which the time of launching the launcher reaches the time threshold, as the target position in response to the launcher not reaching the falling point and the time of launching the launcher reaching the time threshold.
In the third and fourth modes, the launcher has two trigger modes: it can transform when it lands on the ground, and it can also transform when its elapsed launch time reaches the time threshold. Thus, the server can determine which happens first: the launcher landing, or the elapsed launch time reaching the time threshold before landing. Once either condition is satisfied, the target position can be determined.
In the first, third, and fourth modes, the process of determining the drop point of the launcher by the server may specifically be: and the server acquires the intersection point of the target motion track and the ground in the virtual scene, and takes the intersection point as the falling point of the launcher.
And fifthly, in response to the target motion trail including an obstacle, the terminal determines the collision position at which the launcher collides with the obstacle on the target motion trail as the target position. In the fifth mode, the projectile transforms when it collides with the obstacle, causing damage to the virtual objects in the target area. The obstacle may be of multiple types, such as a virtual object or a virtual item in the virtual scene.
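The race between landing and the time threshold in modes three and four can be sketched as follows. The `position_at` mapping and `landing_time` stand in for the trajectory computation and the trajectory/ground intersection, and all names here are illustrative assumptions.

```python
def determine_target_position(position_at, landing_time, time_threshold):
    """Pick the target position for modes three/four.

    position_at: maps elapsed launch time -> position on the target motion
    track; landing_time: when the projectile would reach its drop point.
    """
    if landing_time <= time_threshold:
        # Mode three: the projectile reaches its drop point first.
        return position_at(landing_time)
    # Mode four: the elapsed launch time reaches the threshold before landing.
    return position_at(time_threshold)

# Toy trajectory: the projectile covers 5 horizontal units per second.
position_at = lambda t: (5.0 * t, 0.0)

print(determine_target_position(position_at, landing_time=2.0, time_threshold=4.0))
# -> (10.0, 0.0): it lands before the threshold, so the drop point is used
print(determine_target_position(position_at, landing_time=6.0, time_threshold=4.0))
# -> (20.0, 0.0): the threshold fires mid-flight
```

Mode five would add a third candidate, the earliest collision with an obstacle, and take whichever of the three events occurs first.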
In the above-mentioned modes two, three, and four, the launcher may also have a fixed flight duration; when the elapsed launch time of the launcher reaches the time threshold, the virtual prop transforms. Specifically, after the terminal launches the projectile, the remaining flight duration of the projectile can be displayed in the graphical user interface.
304. The terminal displays a target animation in the target area, wherein the target animation is used for representing that the virtual object in the target area is affected by electric shock.
After the terminal obtains the target area, the target animation can be displayed in the target area; the target animation may show the effect of the projectile exploding and the electric energy dispersing. For example, as shown in fig. 8, after the projectile lands on the ground, the effect of the exploding electric energy is displayed in an area 801 (i.e., the target area) around the landing position.
It should be noted that steps 303 and 304 above are the steps of displaying the target animation in the target area corresponding to the target position in response to the projectile reaching the target position on the target motion trajectory, where the projectile can cause a shock effect on the target area within a preset range of the target position.
If at least one virtual object is included in the target area, a shock effect can be caused to the at least one virtual object. Different effects may also be provided for this shock effect.
In one possible implementation, the shock impact is to lower the virtual life value of the virtual object. Specifically, the terminal may control the virtual life value of at least one virtual object to decrease in response to the target area including the at least one virtual object. For example, the shock effect may cause a loss of virtual life value of 50 points. The target area includes two virtual objects, wherein one virtual object with a virtual life value of 200 originally is affected by an electric shock, and the virtual life value is changed to 150. Another virtual object with a virtual life value of 40 originally is affected by the electric shock, the virtual life value becomes 0, and the virtual object is eliminated.
In one possible implementation, in addition to the loss of virtual life value described above, the shock effect may also cause other effects, for example, the speed of movement of the virtual object can be reduced. Specifically, the terminal may control the moving speed of any virtual object in the target area to be decreased in response to the virtual life value of the virtual object being greater than zero. For example, after a certain virtual object is affected by an electric shock, the virtual life value becomes 150, the original moving speed is 20, but due to the effect of the electric shock, the moving speed becomes 0, so that the paralysis effect of the virtual object after the electric shock is truly simulated, the simulation effect is vivid, and the user experience can be effectively improved.
When the moving speed of the virtual object is controlled to be reduced, the moving speed can be reduced by a fixed value, which indicates that the moving speed is influenced to a certain extent after the electric shock is received. Accordingly, the process of controlling the reduction of the moving speed of the virtual object may be: and the terminal acquires the difference value between the current moving speed of the virtual object and the speed adjusting value, and takes the difference value as the reduced moving speed of the virtual object.
When the moving speed is reduced, it can also be reduced to zero, indicating that the virtual object is paralyzed after receiving the electric shock and cannot move. Accordingly, the process of controlling the reduction of the moving speed of the virtual object may be: the terminal sets the moving speed of the virtual object to zero.
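The shock effect described above — deducting the virtual life value, eliminating objects that reach zero, and slowing or halting survivors — can be sketched as follows. The damage value matches the 50-point example given earlier; the class and function names are illustrative assumptions.

```python
# Sketch of the shock effect under assumed names; the 50-point damage
# follows the example in the text, other values are arbitrary.
SHOCK_DAMAGE = 50

class VirtualObject:
    def __init__(self, life, speed):
        self.life = life        # virtual life value
        self.speed = speed      # moving speed
        self.eliminated = False

def apply_shock(objects, speed_adjust=None):
    """Apply the electric shock to every virtual object in the target area."""
    for obj in objects:
        obj.life = max(0, obj.life - SHOCK_DAMAGE)
        if obj.life == 0:
            obj.eliminated = True       # life value reached zero: eliminated
        elif speed_adjust is None:
            obj.speed = 0               # full paralysis: speed drops to zero
        else:
            obj.speed = max(0, obj.speed - speed_adjust)

a = VirtualObject(life=200, speed=20)
b = VirtualObject(life=40, speed=20)
apply_shock([a, b])
print(a.life, a.speed)        # 150 0: survives but is paralyzed
print(b.life, b.eliminated)   # 0 True: eliminated
```

Passing a `speed_adjust` value instead selects the fixed-decrement variant of the speed reduction described above.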
Of course, only a virtual object whose virtual life value remains greater than zero has its moving speed reduced. If the electric shock causes the virtual life value to become zero, the virtual object is eliminated, and the terminal may display the virtual object in the eliminated state in response to the virtual life value of any virtual object in the target area being zero.
The terminal corresponding to a virtual object affected by the electric shock can display an electric-shock paralysis animation in the graphical user interface to represent the paralyzing effect of the electric shock on the currently controlled virtual object. In one possible implementation, the virtual object may also be inoperable while electrocuted, so the terminal corresponding to the affected virtual object may ignore any control operations detected during the electrocution.
If the virtual object controlled by the current terminal is also within the target area of the launcher of any virtual prop, that virtual object is also affected by the electric shock. Specifically, the terminal may display an electric-shock paralysis animation in the graphical user interface in response to the currently controlled virtual object being within the target area of the launcher of any virtual prop, the animation representing the paralyzing effect of the electric shock on the currently controlled virtual object. For example, as shown in fig. 9, the terminal may display the electric-shock paralysis animation in the graphical user interface, with electric sparks 901 across the screen.
In one possible implementation, the electric-shock paralysis effect may last for a target time period, during which the virtual object may be inoperable. Specifically, in response to the currently controlled virtual object being within the target area of the launcher of any virtual item, the terminal may receive any control instruction for that virtual object within the target time period and ignore the control instruction.
In the embodiment of the present application, in response to a launching operation on the virtual prop, the virtual prop is controlled to launch a launcher; the launcher moves along the target motion trajectory, and when it reaches the target position, a target animation is displayed in the target area corresponding to that position. When the launcher reaches the target position, it can cause an electric shock effect on the virtual objects in an area, so the user does not need to aim precisely at a virtual object; any virtual object within the target area is affected by the electric shock. This simplifies user operation, reduces the operation difficulty, and in turn increases the usage frequency of the virtual prop and improves user stickiness.
A specific example is provided below by way of the embodiment shown in fig. 10, and is specifically as follows.
1. As shown in fig. 10, a player (i.e., the user) finds and picks up a firearm weapon (i.e., the virtual prop); each weapon model on the ground has a collision box attached for detecting collision targets. For example, as shown in fig. 11, the firearm may have a collision box 1101, which may be square; when the collision box on the character 1102 (i.e., the virtual object) comes into contact with it, the pickup logic is triggered, and the player can then obtain the firearm.
2. After the player obtains the weapon and presses and holds the fire key (i.e., the launch control 404), the weapon enters a power-storage state (i.e., the target state). The weapon cannot be fired while power storage is incomplete; releasing the key at that point cancels the power storage, whereas releasing the key after power storage is complete fires the weapon.
3. When the player releases the key, an electric ball (i.e., the launcher, an electric carrier) is launched from the muzzle position. The electric ball forms a parabolic track (i.e., the target motion track) according to the launch direction, the speed, and the configured gravitational acceleration; the larger the gravity, the shorter the distance the ball can fly.
4. During the electric ball's flight, a very short line segment can be cast for ray detection, which determines whether a collision will occur or whether the ball will land. Specifically, the ray may take the current position of the electric ball as its starting point, with its length and direction determined by the target distance and the movement direction of the electric ball. For example, see ray AB shown in fig. 12.
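The short-segment ray detection can be sketched in two dimensions as a segment/ground-plane intersection test run once per simulation step. Treating the ground as the plane y = 0 and the function names are assumptions; a real engine would instead query its physics scene with a raycast.

```python
def segment_hits_ground(start, direction, length):
    """Return the hit point if segment AB crosses the ground plane y = 0.

    start: current (x, y) position of the electric ball (point A);
    direction: normalized movement direction; length: segment length.
    """
    ax, ay = start
    bx, by = ax + direction[0] * length, ay + direction[1] * length
    if ay > 0 >= by:                    # the segment crosses the ground plane
        t = ay / (ay - by)              # interpolation factor of the crossing
        return (ax + (bx - ax) * t, 0.0)
    return None                         # no landing during this step

# Ball at height 0.5 moving forward and downward: it lands this step.
print(segment_hits_ground((10.0, 0.5), (0.8, -0.6), 1.0))
# Ball still high up: no hit detected this step.
print(segment_hits_ground((10.0, 5.0), (0.8, -0.6), 1.0))  # None
```

Collision with obstacles (the fifth trigger mode) would use the same segment against obstacle collision boxes rather than the ground plane.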
5. The electric ball explodes when it lands; the explosion range is the area (i.e., the target area) with the explosion center as the origin and radius R, as shown by range 1300 in fig. 13.
6. If a virtual object is within the explosion range, it takes damage. If it is not eliminated by the damage, then in addition to the deduction of its virtual life value it is paralyzed, that is, its movement is limited for a certain time, for example by deceleration. When the effect time (i.e., the target time period) ends, it returns to the normal state.
All the above optional technical solutions may be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
Fig. 14 is a schematic structural diagram of a control device for a virtual prop according to an embodiment of the present application, please refer to fig. 14, where the device includes:
a control module 1401, configured to control a launcher of the virtual item to move in a virtual scene according to a target motion trajectory in response to a launching operation of the virtual item;
a display module 1402, configured to, in response to the projectile reaching the target position on the target motion trajectory, display a target animation in a target area corresponding to the target position, where the target animation is used to represent that a shock is applied to a virtual object in the target area.
In one possible implementation, the display module 1402 is configured to:
acquiring a target area corresponding to the target position;
and displaying the target animation in the target area.
In one possible implementation, the display module 1402 is configured to obtain a target area with a target size centered on the target position as the target area.
In one possible implementation, the display module 1402 is configured to perform any one of the following:
acquiring a circular area with the target position as the center and the target radius as the target area;
and acquiring a spherical area with the target position as the center and the target radius as the target area.
In one possible implementation, the determining of the target position includes any one of:
determining a drop point of the launcher according to the target motion track and the position of the ground in the virtual scene, and determining the drop point as the target position;
determining, according to the target motion track and the launched time of the launcher, the position on the target motion track at which the launched time of the launcher reaches a time threshold as the target position;
determining a drop point of the launcher according to the target motion track and the position of the ground in the virtual scene, and determining the drop point as the target position in response to the launcher reaching the drop point of the launcher and the launched time of the launcher being less than a time threshold;
determining a drop point of the launcher according to the target motion track and the position of the ground in the virtual scene, and determining the position on the target motion track, at which the time of launching the launcher reaches a time threshold, as the target position in response to the launcher not reaching the drop point and the time of launching the launcher reaching the time threshold;
and in response to an obstacle being included on the target motion track, determining the collision position at which the launcher collides with the obstacle as the target position.
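The five branches above share one pattern: whichever terminating event occurs first along the trajectory (landing, the launched time reaching the time threshold, or an obstacle collision) fixes the target position. A hedged sketch over a precomputed, uniformly sampled trajectory; the sampling interval and parameter names are assumptions:

```python
def determine_target_position(trajectory, dt, time_threshold,
                              ground_y=0.0, obstacle_hit=None):
    """trajectory: list of (x, y, z) points sampled every dt seconds.
    Returns the first point at which the projectile collides with an
    obstacle, reaches its drop point, or exceeds the launched-time
    threshold -- whichever happens first along the track."""
    for i, point in enumerate(trajectory):
        elapsed = i * dt
        if obstacle_hit is not None and obstacle_hit(point):
            return point          # collision position on the track
        if point[1] <= ground_y:
            return point          # drop point reached before the threshold
        if elapsed >= time_threshold:
            return point          # launched time reached the threshold
    return trajectory[-1]
```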
In one possible implementation, the control module 1401 is configured to:
in response to the start of the launching operation on the virtual prop, setting the state of the virtual prop to a target state, wherein the target state is used for representing that the virtual prop is being filled with energy;
and in response to the duration of the virtual prop in the target state reaching a target duration and the launching operation ending, performing the step of controlling the launcher of the virtual prop to move in the virtual scene according to the target motion track.
In one possible implementation, the display module 1402 is further configured to display a progress prompt in a graphical user interface in response to the virtual prop being in the target state, the progress prompt prompting an energy filling progress of the virtual prop.
In one possible implementation, the apparatus further includes:
and the setting module is used for setting the state of the virtual prop to be an idle state in response to the fact that the duration of the virtual prop in the target state is less than the target duration and the launching operation is finished.
In one possible implementation, the setting module is further configured to:
and setting a launching operation button of the virtual prop to a trigger-forbidden state in response to the virtual prop being in the target state.
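The charging flow described by the implementations above (target state on press, progress prompt while charging, launch on release only after the target duration, idle state otherwise, button disabled while charging) can be sketched as a small state machine. Class and method names, and the 0.0–1.0 progress convention for the progress circle, are assumptions for illustration:

```python
IDLE, CHARGING = "idle", "target"   # "target" state = energy being filled


class VirtualProp:
    def __init__(self, target_duration):
        self.target_duration = target_duration
        self.state = IDLE
        self.charge_time = 0.0

    def press_fire(self):
        """Start of the launching operation: enter the target state."""
        self.state = CHARGING
        self.charge_time = 0.0

    def tick(self, dt):
        if self.state == CHARGING:
            self.charge_time += dt

    def progress(self):
        """Energy-filling progress for the progress circle (0.0 to 1.0)."""
        return min(self.charge_time / self.target_duration, 1.0)

    def fire_button_enabled(self):
        return self.state != CHARGING   # trigger-forbidden while charging

    def release_fire(self):
        """End of the launching operation: launch only if the target
        duration was reached; either way the prop returns to idle."""
        fired = self.charge_time >= self.target_duration
        self.state = IDLE
        return fired
```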
In one possible implementation, the control module 1401 is further configured to control the virtual life value of at least one virtual object to be decreased in response to the at least one virtual object being included in the target area.
In one possible implementation, the control module 1401 is further configured to control the moving speed of any virtual object in the target area to be decreased in response to the virtual life value of the virtual object being greater than zero.
In one possible implementation, the display module 1402 is further configured to display any virtual object in the target area in an eliminated state in response to the virtual life value of the virtual object being zero.
In one possible implementation, the control module 1401 is further configured to perform any one of:
obtaining a difference value between the current moving speed of the virtual object and a speed adjusting value, and taking the difference value as the reduced moving speed of the virtual object;
the moving speed of the virtual object is set to zero.
In one possible implementation, the control module 1401 is configured to:
responding to the launching operation of the virtual prop, and acquiring a target motion track of a launcher of the virtual prop based on the visual angle direction of the current virtual scene;
and controlling the launcher of the virtual prop to move in the virtual scene according to the target motion track.
In one possible implementation, the display module 1402 is further configured to display an electric-shock paralysis animation in the graphical user interface in response to the currently controlled virtual object being within the target area of the launcher of any of the virtual props, the electric-shock paralysis animation being used to represent the effect of the currently controlled virtual object being paralyzed by an electric shock.
In one possible implementation, the control module 1401 is further configured to, in response to the currently controlled virtual object being within the target area of the launcher of any virtual prop, receive any control instruction for the virtual object within a target time period and ignore the control instruction.
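Ignoring control instructions during the target time period amounts to gating input on a paralysis timer. A minimal sketch; the class and method names are illustrative, not from the patent:

```python
class ControlledObject:
    def __init__(self):
        self.paralyzed_until = 0.0
        self.position = 0.0

    def shock(self, now, target_period):
        """Enter the shocked state: control instructions received before
        now + target_period are ignored."""
        self.paralyzed_until = now + target_period

    def handle_move(self, now, delta):
        """Apply a movement instruction unless it falls inside the target
        time period; returns whether the instruction took effect."""
        if now < self.paralyzed_until:
            return False            # instruction received but ignored
        self.position += delta
        return True
```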
In the embodiments of the present application, in response to a launching operation on the virtual prop, the virtual prop is controlled to launch a projectile. The projectile moves along the target motion track, and when it reaches the target position, a target animation is displayed in the target area corresponding to the target position. Because the projectile, upon reaching the target position, applies an electric-shock effect to every virtual object within an area, the user does not need to aim precisely at a virtual object: any virtual object within the target area is affected by the electric shock. This simplifies user operation, reduces the difficulty of operation, and thereby increases the usage frequency of the virtual prop and improves user stickiness.
It should be noted that the division into the above functional modules is merely illustrative of how the control device of the virtual prop provided in the above embodiment controls the virtual prop. In practical applications, the functions may be distributed among different functional modules as needed; that is, the internal structure of the electronic device is divided into different functional modules to complete all or part of the functions described above. In addition, the control device of the virtual prop and the control method of the virtual prop provided in the above embodiments belong to the same concept; for the specific implementation process, refer to the control method of the virtual prop, which is not described again here.
The electronic device may be provided as a terminal shown in fig. 15 described below, or may be provided as a server shown in fig. 16 described below, which is not limited in this embodiment of the present application.
Fig. 15 is a schematic structural diagram of a terminal 1500 according to an embodiment of the present application. The terminal 1500 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 1500 may also be referred to as user equipment, a portable terminal, a laptop terminal, a desktop terminal, or by other names.
In general, terminal 1500 includes: a processor 1501 and memory 1502.
Processor 1501 may include one or more processing cores, such as a 4-core processor, an 8-core processor, or the like. The processor 1501 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 1501 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, processor 1501 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
The memory 1502 may include one or more computer-readable storage media, which may be non-transitory. The memory 1502 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1502 is used to store at least one program code for execution by processor 1501 to implement method steps at the terminal side of the control methods of virtual props provided by various embodiments herein.
In some embodiments, the terminal 1500 may further include: a peripheral interface 1503 and at least one peripheral. The processor 1501, memory 1502, and peripheral interface 1503 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1503 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of a radio frequency circuit 1504, a touch screen display 1505, a camera assembly 1506, an audio circuit 1507, and a power supply 1509.
The peripheral interface 1503 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, memory 1502, and peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral interface 1503 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The Radio Frequency circuitry 1504 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuitry 1504 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1504 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1504 can communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1505 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1505 is a touch display screen, the display screen 1505 also has the ability to capture touch signals on or over its surface. The touch signal may be input to the processor 1501 as a control signal for processing. In this case, the display screen 1505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1505, providing the front panel of the terminal 1500; in other embodiments, there may be at least two display screens 1505, each disposed on a different surface of the terminal 1500 or in a folded design; in still other embodiments, the display screen 1505 may be a flexible display disposed on a curved or folded surface of the terminal 1500. The display screen 1505 may even be configured in a non-rectangular irregular shape, i.e., a shaped screen. The display screen 1505 may be an LCD (Liquid Crystal Display) screen, an OLED (Organic Light-Emitting Diode) screen, or the like.
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1506 may also include a flash. The flash lamp can be a single-color temperature flash lamp or a double-color temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 1507 may include a microphone and speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1501 for processing or inputting the electric signals to the radio frequency circuit 1504 to realize voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of the terminal 1500. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1501 or the radio frequency circuit 1504 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 1507 may also include a headphone jack.
Power supply 1509 is used to power the various components in terminal 1500. The power supply 1509 may be alternating current, direct current, disposable or rechargeable. When the power supply 1509 includes a rechargeable battery, the rechargeable battery may support wired charging or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 1500 also includes one or more sensors 1510. The one or more sensors 1510 include, but are not limited to: acceleration sensor 1511, gyro sensor 1512, pressure sensor 1513, optical sensor 1515, and proximity sensor 1516.
The acceleration sensor 1511 may detect the magnitude of acceleration on three coordinate axes of the coordinate system established with the terminal 1500. For example, the acceleration sensor 1511 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1501 may control the touch screen display 1505 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1511. The acceleration sensor 1511 may also be used for acquisition of motion data of a game or a user.
The gyroscope sensor 1512 can detect the body direction and the rotation angle of the terminal 1500, and the gyroscope sensor 1512 and the acceleration sensor 1511 cooperate to collect the 3D motion of the user on the terminal 1500. The processor 1501 may implement the following functions according to the data collected by the gyro sensor 1512: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensor 1513 may be disposed on a side bezel of terminal 1500 and/or underneath touch display 1505. When the pressure sensor 1513 is disposed on the side frame of the terminal 1500, the holding signal of the user to the terminal 1500 may be detected, and the processor 1501 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1513. When the pressure sensor 1513 is disposed at a lower layer of the touch display 1505, the processor 1501 controls the operability control on the UI interface according to the pressure operation of the user on the touch display 1505. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The optical sensor 1515 is used to collect ambient light intensity. In one embodiment, processor 1501 may control the brightness of the display on touch screen 1505 based on the intensity of ambient light collected by optical sensor 1515. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1505 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1505 is turned down. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 based on the ambient light intensity collected by the optical sensor 1515.
A proximity sensor 1516, also known as a distance sensor, is typically provided on the front panel of the terminal 1500. The proximity sensor 1516 is used to collect the distance between the user and the front surface of the terminal 1500. In one embodiment, when the proximity sensor 1516 detects that the distance between the user and the front surface of the terminal 1500 gradually decreases, the processor 1501 controls the touch display 1505 to switch from the bright-screen state to the screen-off state; when the proximity sensor 1516 detects that the distance between the user and the front surface of the terminal 1500 gradually increases, the processor 1501 controls the touch display 1505 to switch from the screen-off state to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 15 does not constitute a limitation of terminal 1500, and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components may be employed.
Fig. 16 is a schematic structural diagram of a server 1600 provided in an embodiment of the present application. The server 1600 may vary considerably in configuration or performance, and may include one or more processors (CPUs) 1601 and one or more memories 1602, where the memory 1602 stores at least one program code, and the at least one program code is loaded and executed by the processor 1601 to implement the server-side method steps of the control method for the virtual prop provided in the foregoing embodiments. Of course, the server 1600 may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface for performing input and output, and the server 1600 may also include other components for implementing device functions, which are not described here.
In an exemplary embodiment, there is also provided a computer-readable storage medium, such as a memory including at least one program code, which is executable by a processor in an electronic device to perform the method for controlling a virtual prop in the above embodiments. For example, the computer-readable storage medium may be a ROM (Read-Only Memory), a RAM (Random-Access Memory), a CD-ROM (Compact Disc Read-Only Memory), a magnetic tape, a floppy disk, an optical data storage device, and the like.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (20)

1. A control method of a virtual prop is characterized by comprising the following steps:
in a virtual scene, a calling control is displayed in a suspended mode, and a virtual prop is created in response to the triggering operation of the calling control, wherein the calling control is used for calling the virtual prop to enter the virtual scene;
in response to a firing button of the virtual prop being pressed, setting the state of the virtual prop to a target state, and displaying a candidate motion track according to a launching angle of a launcher of the virtual prop, wherein the firing button is a trigger control of another virtual prop or of another action before the virtual prop is acquired, the display style of the firing button is determined according to the state of the currently controlled virtual prop, the display styles of the firing button differ when the states of the virtual prop differ, and the target state is used for representing that the virtual prop is being filled with energy; and displaying, in response to a visual angle adjustment operation, a candidate motion track corresponding to the adjusted launching angle;
in response to the duration of the virtual prop in the target state reaching a target duration and the firing button of the virtual prop being released, taking the candidate motion track corresponding to the current launching angle as a target motion track, displaying the target motion track, and controlling the launcher to move in the virtual scene according to the target motion track; wherein, during movement of the launcher, a line segment is cast with the current position of the launcher as a starting point to perform ray detection, the direction of the ray is determined according to the moving direction of the launcher, and the ray detection is used to determine whether the launcher collides or lands on the ground;
responding to the launcher to reach a target position on the target motion track, displaying a target animation in a target area corresponding to the target position, wherein the target animation is used for representing electric shock influence on each virtual object in the target area, the target animation comprises the effect that the launcher explodes so that electric energy is dispersed, and the electric shock influence comprises the reduction of the moving speed of the virtual object with the virtual life value larger than zero in the target area;
the method further comprises: in response to the duration of the virtual prop in the target state being less than the target duration and the firing button of the virtual prop being released, setting the state of the virtual prop to an idle state;
the method further comprises the following steps: responding to the virtual prop being in the target state, displaying a progress prompt in a user graphical interface, wherein the progress prompt is used for prompting the energy filling progress of the virtual prop, the progress prompt comprises a progress circle, and the progress circle is used for indicating the current power storage amount and the power not stored;
the method further comprises: in response to the currently controlled virtual object being within the target area of the launcher of any of the virtual props, displaying an electric-shock paralysis animation in the user graphical interface, the electric-shock paralysis animation being used to represent the effect of the currently controlled virtual object being paralyzed by an electric shock; and in response to the currently controlled virtual object being within the target area of the launcher of any virtual prop, receiving any control instruction for the virtual object within a target time period, and ignoring the control instruction.
2. The method of claim 1, wherein displaying a target animation in a target area corresponding to the target location comprises:
acquiring a target area corresponding to the target position;
and displaying the target animation in the target area.
3. The method of claim 2, wherein the obtaining a target area corresponding to the target position comprises:
and acquiring a region with the target position as the center and the target size as the target region.
4. The method according to claim 3, wherein the acquiring a region of a target size centered on the target position as the target region comprises:
acquiring a circular area with the target position as the center and the target radius as the target area; alternatively, the first and second electrodes may be,
and acquiring a spherical area with the target position as the center and the target radius as the target area.
5. The method of claim 1, wherein the target location is determined by any one of:
determining a drop point of the launcher according to the target motion track and the position of the ground in the virtual scene, and determining the drop point as the target position;
determining, according to the target motion track and the launched time of the launcher, the position on the target motion track at which the launched time of the launcher reaches a time threshold as the target position;
determining a landing point of the launcher according to the target motion track and the position of the ground in a virtual scene, and determining the landing point as the target position in response to the launcher reaching the landing point of the launcher and the time of the launcher being transmitted being less than a time threshold;
determining a drop point of the launcher according to the target motion track and the position of the ground in the virtual scene, and determining the position on the target motion track, at which the time of launching the launcher reaches a time threshold, as the target position in response to the launcher not reaching the drop point and the time of launching the launcher reaching the time threshold;
and in response to an obstacle being included on the target motion track, determining the collision position at which the launcher collides with the obstacle as the target position.
6. The method of claim 1, further comprising:
controlling a virtual life value of at least one virtual object to decrease in response to the at least one virtual object being included within the target region.
7. The method of claim 6, further comprising:
and in response to the virtual life value of any virtual object in the target area being zero, displaying the virtual object in an eliminated state.
8. The method of claim 1, further comprising:
and in response to the virtual prop being in the target state, setting a firing button of the virtual prop to a trigger-forbidden state.
9. The method according to claim 1, wherein the controlling of the reduction of the moving speed of the virtual object with the virtual vital value larger than zero in the target area comprises any one of the following steps:
acquiring a difference value between the current moving speed of the virtual object and a speed adjusting value, and taking the difference value as the reduced moving speed of the virtual object;
setting a moving speed of the virtual object to zero.
10. An apparatus for controlling a virtual prop, the apparatus comprising:
means for performing the steps of: in a virtual scene, displaying a calling control in a floating manner, and creating a virtual prop in response to the triggering operation of the calling control, wherein the calling control is used for calling the virtual prop to enter the virtual scene;
the control module is used for: in response to a firing button of the virtual prop being pressed, setting the state of the virtual prop to a target state, and displaying a candidate motion track according to a launching angle of a launcher of the virtual prop, wherein the firing button is a trigger control of another virtual prop or of another action before the virtual prop is acquired, the display style of the firing button is determined according to the state of the currently controlled virtual prop, the display styles of the firing button differ when the states of the virtual prop differ, and the target state is used for representing that the virtual prop is being filled with energy; displaying, in response to a visual angle adjustment operation, a candidate motion track corresponding to the adjusted launching angle; and in response to the duration of the virtual prop in the target state reaching a target duration and the firing button of the virtual prop being released, taking the candidate motion track corresponding to the current launching angle as a target motion track, and controlling the launcher to move in the virtual scene according to the target motion track; wherein, during movement of the launcher, a line segment is cast with the current position of the launcher as a starting point to perform ray detection, the direction of the ray is determined according to the moving direction of the launcher, and the ray detection is used to determine whether the launcher collides or lands on the ground;
means for performing the steps of: displaying the target motion track;
the display module is used for responding to the fact that the launcher reaches a target position on the target motion track, displaying a target animation in a target area corresponding to the target position, wherein the target animation is used for representing that electric shock influence is caused on each virtual object in the target area, the target animation comprises the effect that the launcher explodes to enable electric energy to be dispersed, and the electric shock influence comprises the effect of controlling the moving speed of the virtual object with the virtual life value larger than zero in the target area to be reduced;
the setting module is used for, in response to the duration of the virtual prop in the target state being less than the target duration and the firing button of the virtual prop being released, setting the state of the virtual prop to an idle state;
the display module is further configured to: responding to the virtual prop being in the target state, displaying a progress prompt in a user graphical interface, wherein the progress prompt is used for prompting the energy filling progress of the virtual prop, the progress prompt comprises a progress circle, and the progress circle is used for indicating the current power storage amount and the power not stored;
the display module is further configured to: in response to the currently controlled virtual object being within the target area of the launcher of any virtual prop, display a shock-paralysis animation in the graphical user interface, the shock-paralysis animation representing an effect of the currently controlled virtual object being paralyzed by the electric shock;
the control module is further configured to: in response to the currently controlled virtual object being within the target area of the launcher of any virtual prop, receive a control instruction for the currently controlled virtual object within a target time period and ignore the control instruction.
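The charge-and-release mechanic of claim 10 (press to enter the target state, release early to cancel, release after the target duration to launch) can be sketched as a small state machine. This is an illustrative Python sketch, not code from the patent; names such as `VirtualProp` and `TARGET_DURATION` are hypothetical.

```python
from dataclasses import dataclass

TARGET_DURATION = 1.5  # hypothetical charge time (seconds) for the target state


@dataclass
class VirtualProp:
    state: str = "idle"       # "idle" or "charging" (the claimed target state)
    charge_time: float = 0.0

    def press_fire(self):
        # Pressing the firing button puts the prop into the target
        # (energy-charging) state and resets the charge timer.
        self.state = "charging"
        self.charge_time = 0.0

    def hold(self, dt):
        # While the button is held, accumulate time in the target state.
        if self.state == "charging":
            self.charge_time += dt

    def release_fire(self):
        # Released before the target duration: back to idle, no launch.
        if self.charge_time < TARGET_DURATION:
            self.state = "idle"
            return None
        # Held long enough: launch along the currently previewed track.
        self.state = "idle"
        return "launch"
```

A release before `TARGET_DURATION` returns `None` (the idle-state branch of claim 10), while a long enough hold returns `"launch"`, at which point the candidate track for the current launching angle would become the target motion track.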
11. The apparatus of claim 10, wherein the display module is configured to:
acquiring a target area corresponding to the target position;
displaying the target animation in the target area.
12. The apparatus of claim 11, wherein the display module is configured to obtain a target area with a target size centered on the target position.
13. The apparatus of claim 12, wherein the display module is configured to perform any one of:
acquiring, as the target area, a circular area centered on the target position and having a target radius;
and acquiring, as the target area, a spherical area centered on the target position and having the target radius.
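The two alternatives of claim 13 differ only in whether the height axis counts toward the distance test. A minimal membership check (illustrative Python, assuming a y-up coordinate convention not stated in the patent):

```python
def in_target_area(pos, center, radius, spherical=False):
    # pos and center are (x, y, z) tuples; radius is the target radius.
    # Circular area: ignore the vertical (y) axis; spherical: full 3D distance.
    dx = pos[0] - center[0]
    dz = pos[2] - center[2]
    if spherical:
        dy = pos[1] - center[1]
        return dx * dx + dy * dy + dz * dz <= radius * radius
    return dx * dx + dz * dz <= radius * radius
```

An object hovering high above the drop point is inside a circular target area but outside a spherical one of the same radius.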
14. The apparatus of claim 10, wherein the target location is determined by any one of:
determining a drop point of the launcher according to the target motion track and the position of the ground in the virtual scene, and determining the drop point as the target position;
determining, according to the target motion track and the time for which the launcher has been launched, the position on the target motion track at which that time reaches a time threshold as the target position;
determining a drop point of the launcher according to the target motion track and the position of the ground in the virtual scene, and, in response to the launcher reaching the drop point while the time for which it has been launched is less than the time threshold, determining the drop point as the target position;
determining a drop point of the launcher according to the target motion track and the position of the ground in the virtual scene, and, in response to the launcher not having reached the drop point when the time for which it has been launched reaches the time threshold, determining the position on the target motion track at which that threshold is reached as the target position;
and, in response to an obstacle being included on the target motion track, determining the position at which the launcher collides with the obstacle as the target position.
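The alternatives of claim 14 can be combined into one stepping routine: advance the launcher along its track and stop at whichever comes first — an obstacle hit, the drop point, or the time threshold. The segment from the current position to the next position plays the role of the claimed ray detection. This is an illustrative Python sketch under assumed physics (simple ballistic arc, y-up); none of the names or constants come from the patent.

```python
def find_target_position(start, velocity, ground_y=0.0, time_threshold=3.0,
                         gravity=9.8, dt=0.02, obstacles=()):
    # start, velocity: (x, y, z) tuples. obstacles: callables taking (x, y, z)
    # and returning True on collision. Returns (position, reason).
    x, y, z = start
    vx, vy, vz = velocity
    t = 0.0
    while True:
        # Step along the motion direction; the (pos -> next pos) segment is
        # the "line segment emitted from the current position" of claim 10.
        nx, ny, nz = x + vx * dt, y + vy * dt, z + vz * dt
        for hit in obstacles:
            if hit(nx, ny, nz):            # obstacle reached first
                return (nx, ny, nz), "obstacle"
        if ny <= ground_y:                 # drop point reached before threshold
            return (nx, ground_y, nz), "ground"
        t += dt
        if t >= time_threshold:            # still airborne at the threshold
            return (nx, ny, nz), "timeout"
        vy -= gravity * dt                 # apply gravity for the next step
        x, y, z = nx, ny, nz
```

A flat throw from one unit above the ground lands ("ground") well before a 3-second threshold; a lobbed throw checked against a very short threshold instead triggers "timeout".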
15. The apparatus of claim 10, wherein the control module is further configured to control the virtual life value of at least one virtual object to decrease in response to the at least one virtual object being included in the target region.
16. The apparatus of claim 15, wherein the control module is further configured to display the virtual object in an obsolete state in response to a virtual life value of any virtual object within the target area being zero.
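Claims 15 and 16 together amount to: decrement the virtual life value of every object in the target area, and mark any object that reaches zero as eliminated. A minimal sketch in Python (the dict keys `life` and `state` are hypothetical):

```python
def apply_shock_damage(objects_in_area, damage):
    # Reduce the virtual life value of each object in the target area
    # (claim 15); objects whose life reaches zero are shown as eliminated
    # (claim 16). Returns the list of newly eliminated objects.
    eliminated = []
    for obj in objects_in_area:
        obj["life"] = max(0, obj["life"] - damage)
        if obj["life"] == 0:
            obj["state"] = "eliminated"
            eliminated.append(obj)
    return eliminated
```

Clamping the life value at zero is an implementation choice here, not something the claims specify.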
17. The apparatus of claim 10, wherein the setting module is further configured to set the firing button of the virtual prop to a trigger-disabled state in response to the virtual prop being in the target state.
18. The apparatus of claim 10, wherein the control module is further configured to perform any of:
acquiring a difference between the current moving speed of the virtual object and a speed adjustment value, and taking the difference as the reduced moving speed of the virtual object;
setting a moving speed of the virtual object to zero.
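The two slow-down alternatives of claim 18 are a subtraction or a full stop. A one-function sketch in Python (the zero clamp on the subtraction branch is an assumption, since the claim only speaks of the difference):

```python
def reduced_speed(current_speed, adjust_value=None):
    # Alternative 2: no adjustment value given -> freeze the object (speed 0).
    if adjust_value is None:
        return 0.0
    # Alternative 1: subtract the speed adjustment value from the current
    # moving speed; clamped at zero here so speed never goes negative.
    return max(0.0, current_speed - adjust_value)
```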
19. An electronic device, comprising one or more processors and one or more memories having stored therein at least one program code, the at least one program code being loaded and executed by the one or more processors to implement the operations performed by the method of controlling a virtual prop according to any one of claims 1 to 9.
20. A storage medium, characterized in that at least one program code is stored in the storage medium, and the at least one program code is loaded and executed by a processor to implement the operations executed by the control method of the virtual prop according to any one of claims 1 to 9.
CN202010321380.4A 2020-04-22 2020-04-22 Control method and device of virtual prop, electronic equipment and storage medium Active CN111589150B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010321380.4A CN111589150B (en) 2020-04-22 2020-04-22 Control method and device of virtual prop, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010321380.4A CN111589150B (en) 2020-04-22 2020-04-22 Control method and device of virtual prop, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111589150A CN111589150A (en) 2020-08-28
CN111589150B true CN111589150B (en) 2023-03-24

Family

ID=72183515

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010321380.4A Active CN111589150B (en) 2020-04-22 2020-04-22 Control method and device of virtual prop, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111589150B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112148187A (en) * 2020-09-23 2020-12-29 北京市商汤科技开发有限公司 Interaction method and device for augmented reality scene, electronic equipment and storage medium
CN112426725A (en) * 2020-11-23 2021-03-02 腾讯科技(深圳)有限公司 Virtual object control method, device, terminal and storage medium
CN112619164B (en) * 2020-12-22 2023-05-02 上海米哈游天命科技有限公司 Method, device, equipment and storage medium for determining flying height of transmission target
CN112717410B (en) * 2021-01-21 2023-03-14 腾讯科技(深圳)有限公司 Virtual object control method and device, computer equipment and storage medium
CN113384885B (en) * 2021-06-22 2024-04-09 网易(杭州)网络有限公司 Game object control method and device, storage medium and electronic equipment
CN113769385B (en) * 2021-09-17 2023-07-14 腾讯科技(深圳)有限公司 Virtual object transfer method and related device
CN113750530B (en) * 2021-09-18 2023-07-21 腾讯科技(深圳)有限公司 Prop control method, device, equipment and storage medium in virtual scene
CN113888724A (en) * 2021-09-30 2022-01-04 北京字节跳动网络技术有限公司 Animation display method, device and equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8834245B2 (en) * 2007-08-17 2014-09-16 Nintendo Co., Ltd. System and method for lock on target tracking with free targeting capability
CN109568963B (en) * 2017-09-29 2021-12-03 腾讯科技(深圳)有限公司 Virtual resource data processing method and device, computer equipment and storage medium
CN109764758B (en) * 2018-12-26 2022-03-18 安杰特(深圳)智能安全技术有限公司 Electric shock trigger, safety management method and device thereof, and readable storage medium
CN110465098B (en) * 2019-08-08 2020-09-25 腾讯科技(深圳)有限公司 Method, device, equipment and medium for controlling virtual object to use virtual prop
CN110585710B (en) * 2019-09-30 2020-12-25 腾讯科技(深圳)有限公司 Interactive property control method, device, terminal and storage medium
CN110841277B (en) * 2019-11-07 2021-08-06 腾讯科技(深圳)有限公司 Control method and device of virtual operation object based on touch screen and storage medium

Also Published As

Publication number Publication date
CN111589150A (en) 2020-08-28

Similar Documents

Publication Publication Date Title
CN111589150B (en) Control method and device of virtual prop, electronic equipment and storage medium
CN110917619B (en) Interactive property control method, device, terminal and storage medium
CN111282275B (en) Method, device, equipment and storage medium for displaying collision traces in virtual scene
CN110721468B (en) Interactive property control method, device, terminal and storage medium
CN110585710B (en) Interactive property control method, device, terminal and storage medium
CN111475573B (en) Data synchronization method and device, electronic equipment and storage medium
CN110538459A (en) Method, apparatus, device and medium for throwing virtual explosives in virtual environment
CN110507990B (en) Interaction method, device, terminal and storage medium based on virtual aircraft
CN112221141B (en) Method and device for controlling virtual object to use virtual prop
CN112057857B (en) Interactive property processing method, device, terminal and storage medium
CN111330274B (en) Virtual object control method, device, equipment and storage medium
CN111760284A (en) Virtual item control method, device, equipment and storage medium
CN110585706B (en) Interactive property control method, device, terminal and storage medium
CN112933601A (en) Virtual throwing object operation method, device, equipment and medium
CN112704875B (en) Virtual item control method, device, equipment and storage medium
CN112717410B (en) Virtual object control method and device, computer equipment and storage medium
CN112402964B (en) Using method, device, equipment and storage medium of virtual prop
CN111659122B (en) Virtual resource display method and device, electronic equipment and storage medium
CN111589137B (en) Control method, device, equipment and medium of virtual role
CN110960849B (en) Interactive property control method, device, terminal and storage medium
CN111111181A (en) Method, device and equipment for setting props in virtual environment and readable storage medium
CN114100128B (en) Prop special effect display method, device, computer equipment and storage medium
CN112755518B (en) Interactive property control method and device, computer equipment and storage medium
CN111744188B (en) Interactive prop processing method and device, terminal and storage medium
CN113117333B (en) Control method, device, terminal and storage medium of virtual flight vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40027330

Country of ref document: HK

GR01 Patent grant