CN111589145B - Virtual article display method, device, terminal and storage medium - Google Patents

Virtual article display method, device, terminal and storage medium

Info

Publication number
CN111589145B
Authority
CN
China
Prior art keywords
throwing
animation
virtual
time length
line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010324213.5A
Other languages
Chinese (zh)
Other versions
CN111589145A (en)
Inventor
冯啟垚
刘智洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010324213.5A priority Critical patent/CN111589145B/en
Publication of CN111589145A publication Critical patent/CN111589145A/en
Application granted granted Critical
Publication of CN111589145B publication Critical patent/CN111589145B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a virtual article display method, device, terminal and storage medium, and relates to the field of computer technology. The method comprises the following steps: displaying a user interface; playing a pre-throwing animation in response to the virtual article being in a pre-throwing state; displaying a throwing line in response to the end of playing of the pre-throwing animation; playing a throwing animation in response to the virtual article being switched from the pre-throwing state to a throwing state; and canceling the display of the throwing line in response to the playing duration of the throwing animation reaching a first duration. In the related art, the throwing line is displayed as soon as the player clicks the throwing control and disappears as soon as the player releases the throwing control. In the technical scheme provided by the embodiments of the application, by contrast, the appearance and disappearance of the throwing line are configured so that they better conform to realistic logic, improving the accuracy of the appearance and disappearance of the throwing line.

Description

Virtual article display method, device, terminal and storage medium
Technical Field
The embodiments of the application relate to the field of computer technology, and in particular to a virtual article display method, device, terminal and storage medium.
Background
In some mobile shooting games, a player can control a virtual object to throw a virtual article, such as a virtual grenade, a virtual bomb, or a virtual smoke bomb, in a virtual scene provided by a game match.
In the related art, the game match interface includes a throwing control of a virtual article. When the player clicks the throwing control, a throwing line is displayed in the game match interface to indicate the motion trajectory from the starting point to the landing point after the virtual article is thrown. When the player lets go and stops touching the throwing control, the throwing animation of the virtual object is played, and the virtual object throws the virtual article.
In the related art, during the process from the appearance to the disappearance of the throwing line, situations occur in which the throwing line does not conform to realistic logic, so the appearance and disappearance of the throwing line are not accurate enough.
Disclosure of Invention
The embodiments of the application provide a virtual article display method, device, terminal and storage medium, which can improve the accuracy of the appearance and disappearance of a throwing line. The technical scheme is as follows:
in one aspect, an embodiment of the present application provides a method for displaying a virtual article, where the method includes:
displaying a user interface, wherein the user interface comprises a display picture corresponding to a virtual environment, and the virtual environment comprises a virtual article;
in response to the virtual item being in a pre-cast state, playing a pre-cast animation;
in response to the completion of the playing of the pre-cast animation, displaying a cast line, the cast line being used to indicate a motion trajectory of the virtual article after being cast;
playing a throwing animation in response to the virtual item being switched from the pre-throwing state to a throwing state;
canceling the display of the throwing line in response to the playing duration of the throwing animation reaching a first duration.
In another aspect, an embodiment of the present application provides a display device for a virtual article, where the device includes:
the interface display module is used for displaying a user interface, the user interface comprises a display picture corresponding to a virtual environment, and the virtual environment comprises a virtual article;
the first playing module is used for responding to the virtual article in a pre-cast state and playing a pre-cast animation;
the throwing line display module is used for responding to the end of playing of the pre-throwing animation and displaying a throwing line, and the throwing line is used for indicating the motion track of the thrown virtual article;
the second playing module is used for responding to the virtual article switched from the pre-throwing state to the throwing state and playing the throwing animation;
and the cancellation display module is used for canceling the display of the throwing line in response to the playing time length of the throwing animation reaching a first time length.
In yet another aspect, an embodiment of the present application provides a terminal, where the terminal includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or a set of instructions, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by the processor to implement the display method of the virtual article.
In yet another aspect, an embodiment of the present application provides a computer-readable storage medium, where at least one instruction, at least one program, a code set, or a set of instructions is stored in the computer-readable storage medium, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by a processor to implement the display method of the virtual article.
In still another aspect, an embodiment of the present application provides a computer program product which, when executed by a processor, implements the above display method for a virtual article.
The technical scheme provided by the embodiment of the application can have the following beneficial effects:
the throwing line is displayed after the playing of the pre-throwing animation is finished, and the display of the throwing line is canceled when the playing duration of the throwing animation reaches the first duration. In the related art, by contrast, the throwing line is displayed as soon as the player clicks the throwing control and disappears as soon as the player releases the throwing control. In the technical scheme provided by the embodiments of the application, the appearance and disappearance of the throwing line are configured so that they better conform to realistic logic, improving the accuracy of the appearance and disappearance of the throwing line.
Drawings
FIG. 1 is a schematic illustration of an implementation environment provided by an embodiment of the present application;
fig. 2 is a schematic structural diagram of a terminal according to an embodiment of the present application;
FIG. 3 is a flow chart of a method for displaying a virtual item provided by one embodiment of the present application;
FIG. 4 is a schematic diagram illustrating a combat-type item of the present application;
fig. 5 is a schematic diagram illustrating a tactical type of article of the present application;
FIG. 6 is a schematic diagram illustrating a virtual good rig of the present application;
FIG. 7 is a schematic diagram illustrating a pre-cast animation of the present application;
FIG. 8 is a schematic diagram illustrating a throwing line display of the present application;
FIG. 9 is a schematic diagram illustrating the disappearance of a cast line of the present application;
FIG. 10 is a schematic diagram illustrating a throw special effects display of the present application;
FIG. 11 is a schematic diagram illustrating a user interface of the present application;
FIG. 12 illustrates a flow chart of a method for displaying a virtual item provided by an embodiment of the present application;
FIG. 13 is a schematic diagram illustrating another user interface of the present application;
FIG. 14 is a block diagram of a display device for a virtual article provided by one embodiment of the present application;
FIG. 15 is a block diagram of a display device for a virtual article provided in another embodiment of the present application;
fig. 16 is a block diagram illustrating a terminal according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, the related terms referred to in the present application will be explained.
1. Virtual environment
A virtual environment is a scene displayed (or provided) by a client of an application program (e.g., a game application) when running on a terminal, and refers to a scene created for a virtual object to perform an activity (e.g., a game match), such as a virtual house, a virtual island, or a virtual map. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional scene, or a purely fictional scene. The virtual environment may be a two-dimensional virtual environment, a 2.5-dimensional virtual environment, or a three-dimensional virtual environment, which is not limited in the embodiments of the present application.
2. Virtual object
A virtual object refers to a virtual character controlled by a user account in the application program. Taking a game application as an example, the virtual object is the game character controlled by the user account in the game application. The virtual object may take the form of a person, an animal, a cartoon figure, or another form, which is not limited in this application. The virtual object may be displayed in three-dimensional or two-dimensional form, which is not limited in the embodiments of the present application.
The operations that a user account can perform to control a virtual object may also vary from game application to game application. For example, in a shooting-type game application, the user account may control the virtual object to perform shooting, running, jumping, picking up a firearm, replacing a firearm, adding bullets to a firearm, and the like.
Of course, in addition to game applications, other types of applications may present virtual objects to a user and provide corresponding functionality to the virtual objects. For example, an AR (Augmented Reality) application, a social application, an interactive entertainment application, and the like, which are not limited in this embodiment. In addition, for different applications, the forms of the virtual objects provided by the applications may also be different, and the corresponding functions may also be different, which may be configured in advance according to actual requirements, and this is not limited in the embodiments of the present application.
Referring to fig. 1, a schematic diagram of an implementation environment provided by an embodiment of the application is shown. The implementation environment may include: a terminal 10 and a server 20.
The terminal 10 may be a device such as a mobile phone, a PC (Personal Computer), a tablet computer, an e-book reader, an electronic game machine, a Moving Picture Experts Group Audio Layer IV (MP4) player, and the like.
The terminal 10 may have a client of a game application installed therein, such as a client of a shooting game application, where the shooting game application may be any one of an FPS (First Person Shooting) game application, a TPS (Third Person Shooting) game application, a Multiplayer Online Battle Arena (MOBA) game application, a multiplayer gun-battle survival game application, and the like. Alternatively, the game application may be a stand-alone application, such as a stand-alone 3D game application, or a network-enabled version of the application.
The server 20 is used to provide background services for clients of applications (e.g., game applications) in the terminal 10. For example, the server 20 may be a backend server for the above-described applications (e.g., gaming applications). The server 20 may be a server, a server cluster composed of a plurality of servers, or a cloud computing service center.
The terminal 10 and the server 20 can communicate with each other through the network 30. The network 30 may be a wired network or a wireless network.
In the embodiment of the method, the execution subject of each step may be a terminal. Please refer to fig. 2, which illustrates a schematic structural diagram of a terminal according to an embodiment of the present application. The terminal 10 may include: a main board 110, an external input/output device 120, a memory 130, an external interface 140, a touch system 150, and a power supply 160.
The main board 110 has integrated therein processing elements such as a processor and a controller.
The external input/output device 120 may include a display component (e.g., a display screen), a sound playing component (e.g., a speaker), a sound collecting component (e.g., a microphone), various keys, and the like.
The memory 130 has program codes and data stored therein.
The external interface 140 may include a headset interface, a charging interface, a data interface, and the like.
The touch system 150 may be integrated into a display component or a key of the external input/output device 120, and the touch system 150 is used to detect a touch operation performed by a user on the display component or the key.
The power supply 160 is used to power various other components in the terminal 10.
In this embodiment, the processor in the motherboard 110 may generate a user interface (e.g., a game interface) by executing or calling program codes and data stored in the memory, and expose the generated user interface (e.g., the game interface) through the external input/output device 120. In the process of presenting a user interface (e.g., a game interface), a touch operation performed when a user interacts with the user interface (e.g., the game interface) may be detected by the touch system 150 and responded to.
Referring to fig. 3, a flowchart of a display method of a virtual article according to an embodiment of the present application is shown. The method may be applied in the terminal described above, such as in a client of an application of the terminal (e.g. a shooting-type game application). The method may include the steps of:
step 301, displaying a user interface.
The user may run a client of an application installed in the terminal, and the client may display the user interface. The user interface includes a virtual environment screen, which is a display screen corresponding to the virtual environment, and the virtual environment includes a virtual item, and optionally also includes a virtual object holding the virtual item. In addition, elements such as virtual buildings, virtual weapons, virtual props, etc. can also be included in the virtual environment. Details regarding the virtual environment and the virtual objects are described above and will not be repeated here.
In the embodiment of the present application, the virtual article may be a throwing type article, that is, the virtual article is thrown in the virtual environment to trigger the target function. For example, the virtual object may be a throwing-type object such as a virtual grenade, a viscous grenade, a virtual smoke bomb, a virtual flash bomb, etc.
Alternatively, the virtual articles may be divided into multiple types, with different types of virtual articles used to trigger different target functions. Illustratively, the virtual articles may include combat-type articles and tactical-type articles. A combat-type article is thrown to cause damage to virtual objects, such as a virtual grenade or a viscous grenade (a viscous grenade has an adhesive effect and sticks to the first virtual object it touches); when the function trigger duration is reached after it is thrown, virtual objects within its blast range take damage. A tactical-type article is thrown to interfere with virtual objects, such as a virtual smoke bomb or a virtual flash bomb; when the function trigger duration is reached after it is thrown, an interference effect (for example, a smoke diffusion effect) is triggered so that the affected virtual objects cannot see the virtual environment clearly.
Illustratively, as shown in fig. 4, a schematic diagram of combat-type items is shown. The combat-type items may include a laser trip mine 41, a combat hatchet 42, a fragment grenade 43, and a viscous grenade 44.
Illustratively, as shown in fig. 5, a schematic diagram of tactical-type items is shown. The tactical items may include a smoke bomb 51, a flash bomb 52, an explosion-prevention device 53, a cold bomb 54, a shock bomb 55, and an electromagnetic pulse 56.
Optionally, each virtual object may be equipped with at least one virtual item (e.g., 2); further, optionally, each virtual object may be equipped with different types of virtual items, at most one of each type of virtual item.
Illustratively, as shown in fig. 6, the virtual object may be equipped with two types of virtual articles, such as a combat-type article and a tactical-type article, and only one virtual article of each type can be equipped, such as the fragment grenade 43 among the combat-type articles and the smoke bomb 51 among the tactical-type articles.
Optionally, some operation controls are also included in the user interface, and the operation controls are controls for the user to operate, and may include, for example, buttons, sliders, icons, and the like.
Optionally, the user interface includes a first viewing layer and a second viewing layer. The viewing layer is a layer for displaying interface content. The display level of the first view layer is higher than that of the second view layer, that is, the first view layer is located on the upper layer of the second view layer. The first view layer can be used for displaying an operation control for human-computer interaction of a user, and the second view layer can be used for displaying a virtual environment picture. Because the display level of the first view layer is higher than that of the second view layer, the operation control is displayed on the upper layer of the virtual environment picture, and therefore the operation control can be guaranteed to respond to the touch operation of the user. It should be noted that although the first view layer is located on the upper layer of the second view layer, the display of the content in the second view layer may not be blocked, for example, a part or all of the operation controls in the first view layer may be displayed in a semi-transparent state. As shown in fig. 7, the user interface includes a first view layer and a second view layer, the display content in the first view layer includes operation controls such as a virtual joystick 71, an attack button 72, a posture control button 73, a throwing object switching button 74, and a map thumbnail 75, and the display content in the second view layer includes a virtual environment screen 76 including a three-dimensional virtual environment and some people or objects in the virtual environment.
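For illustration only, the layering described above can be sketched as follows; the layer fields and the draw function are placeholders chosen for this sketch, not an actual engine interface.

    # Sketch: the first view layer (controls) is drawn above the second view layer (scene).
    layers = [
        {"name": "second view layer", "z": 0, "content": "virtual environment picture", "alpha": 1.0},
        {"name": "first view layer",  "z": 1, "content": "operation controls",          "alpha": 0.5},
    ]

    def draw(layer):
        print(f"draw {layer['name']} ({layer['content']}, alpha={layer['alpha']})")

    def render(layers):
        # Drawing in ascending z order puts the semi-transparent controls on top, so they
        # receive touch operations first without fully blocking the scene beneath them.
        for layer in sorted(layers, key=lambda l: l["z"]):
            draw(layer)

    render(layers)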
Step 302, in response to the virtual item being in a pre-cast state, playing a pre-cast animation.
When detecting that a virtual item held by the virtual object is in a pre-cast state, the client can play a pre-cast animation of the virtual item.
The pre-throwing state is the state in which the virtual article is held by the virtual object before being thrown. When the client receives a pre-throwing instruction for the virtual article, the client controls the virtual article to enter the pre-throwing state and starts to play the pre-throwing animation. The pre-throwing instruction is an operation instruction for controlling the virtual article to enter the pre-throwing state; for example, it can be triggered by a pressing operation signal acting on a throwing control in the user interface. The pre-throwing animation is used to show the operation of the virtual article in the pre-throwing state. Some virtual articles require the pull ring of the article to be pulled manually first, also called a pin-pulling process; in shooting games, this process is displayed by playing the pre-throwing animation.
Illustratively, as shown in FIG. 7, a schematic diagram of a pre-cast animation is shown. The virtual object holds a virtual article such as the fragment grenade 43 in its right hand and completes the pin-pulling process of the fragment grenade 43 by pulling the ring with its left hand.
It should be noted that the contents of the pre-cast animation may be different for different virtual articles, and the embodiment of the present application is not limited to this.
Step 303, in response to the playing of the pre-cast animation being finished, displaying the cast line.
When the pre-cast animation finishes playing, a cast line can be displayed in the user interface. The throwing line is used to display the motion trajectory of the thrown virtual article, where the motion trajectory refers to the path along which the client controls the virtual article to move in the virtual environment. By displaying the throwing line in the user interface before the virtual article is actually thrown, the user can preview the motion trajectory and landing point of the virtual article. The user can determine from the throwing line whether the motion trajectory and landing point meet expectations; if so, the throw of the virtual article can be triggered, and if not, the throwing line can be adjusted through operation, thereby adjusting the motion trajectory and landing point of the virtual article.
In the related art, the throwing line is displayed as soon as the user clicks the throwing control, at which point the operation before the virtual article is thrown has not yet been completed (i.e., the pre-throwing animation has not finished playing). Taking a pin-pulling animation as the pre-throwing animation as an example, the related art displays the throwing line immediately when the pin-pulling animation starts playing. According to realistic logic, however, the user can only aim and throw based on the throwing line after the pre-throw operation (such as pulling the pin before throwing a grenade) is completed. Therefore, displaying the throwing line after the pre-throwing animation finishes playing conforms better to realistic logic than the related art.
Optionally, the starting point of the throwing line is the position at which the virtual article leaves the hand of the virtual object when the virtual object throws it; the end point of the throwing line is the landing point after the virtual article is thrown out. Optionally, identification information may be displayed at the end point of the throwing line to indicate the landing point position of the virtual article, allowing the user to aim the throw more accurately.
Illustratively, as shown in FIG. 8, a schematic diagram of a throwing line display is illustratively shown. After the pre-cast animation of the virtual item is finished playing, a cast line 81 may be displayed in the user interface.
Optionally, the above-mentioned throwing line displaying method may include the steps of: acquiring throwing line parameters; and displaying the throwing line according to the throwing line parameters.
Wherein, the throwing line parameters are used for determining the trajectory of the throwing line; the throwing line parameters may include at least one of: a throwing starting point, a throwing direction, a throwing initial speed and a throwing acceleration.
Wherein, the throwing starting point is the starting point of the throwing line; the throwing direction refers to the throwing direction of the virtual article, and can include the throwing direction in the horizontal direction and the throwing direction in the gravity direction, the throwing direction in the horizontal direction corresponds to the facing direction of the virtual object in the virtual environment, and the throwing direction in the gravity direction corresponds to the throwing height of the virtual prop; the initial throwing speed refers to the speed of the virtual article thrown from the starting throwing point, and may include the initial throwing speed in the horizontal direction or the initial throwing speed in the gravity direction; the throwing acceleration is an acceleration at the time when the virtual article is thrown from the throwing start point, and may include a throwing acceleration in the horizontal direction or a throwing acceleration in the gravity direction (i.e., a gravitational acceleration).
Using the throwing line parameters, a parabola can be calculated according to the relevant physical formulas. The parabola is computed in a framewise manner: a point is sampled at intervals, a set of waypoints is finally obtained, and the waypoints are then passed to a special-effect line. The special-effect line forms a parabola from the obtained position information, namely the throwing line.
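Purely as an illustration, the waypoint sampling described above can be sketched as follows; the parameter names, the fixed time step and the ground plane are assumptions made for this sketch, not the actual implementation of the embodiment.

    # Sketch: sample waypoints of the throwing line from the throwing-line parameters.
    def sample_throw_line(start, velocity, acceleration=(0.0, -9.8, 0.0),
                          step=0.05, max_points=60, ground_y=0.0):
        waypoints = []
        sx, sy, sz = start                 # throwing start point
        vx, vy, vz = velocity              # initial throwing speed (horizontal and vertical parts)
        ax, ay, az = acceleration          # throwing acceleration (here only gravity)
        for i in range(max_points):
            t = i * step
            # projectile formula: p = p0 + v*t + 0.5*a*t^2
            px = sx + vx * t + 0.5 * ax * t * t
            py = sy + vy * t + 0.5 * ay * t * t
            pz = sz + vz * t + 0.5 * az * t * t
            waypoints.append((px, py, pz))
            if py <= ground_y:             # assumed ground plane: this waypoint is the landing point
                break
        return waypoints                   # handed to the special-effect line to draw the parabola

    # Example: thrown forward and upward from shoulder height.
    points = sample_throw_line(start=(0.0, 1.6, 0.0), velocity=(6.0, 4.0, 0.0))

The last waypoint returned corresponds to the landing point at which the identification information mentioned above would be displayed.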
Optionally, the throw line parameters may be adjusted by control of an interface control. For example, the user interface may include a throwing control of a virtual article, and the throwing direction may be adjusted by performing a long-press operation on the throwing control and sliding the throwing control in different directions (e.g., up, down, left, and right).
Optionally, before the throwing line is displayed, the method further includes: calling an animation state machine corresponding to the pre-cast animation to obtain animation duration of the pre-cast animation; and determining whether the playing of the pre-throwing animation is finished or not through an animation state machine corresponding to the pre-throwing animation.
The animation state machine corresponding to the pre-throwing animation is used for managing the pre-throwing animation, including basic information (such as animation duration) of the pre-throwing animation, and playing control and other related processing of the pre-throwing animation. The animation duration of the pre-throwing animation refers to the total playing duration of the pre-throwing animation.
Through the animation state machine corresponding to the pre-cast animation, the client can know the animation duration of the pre-cast animation and monitor the playing process of the pre-cast animation so as to judge whether the playing of the pre-cast animation is finished.
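As a rough illustration, such an animation state machine can be sketched as follows; the class and method names are assumptions made for this sketch, not the actual interface used by the client or any particular engine.

    # Sketch: track a pre-cast animation through a tiny state machine.
    class AnimationStateMachine:
        def __init__(self, animation_duration):
            self.animation_duration = animation_duration   # total play length, in seconds
            self.elapsed = 0.0
            self.playing = False

        def play(self):
            self.elapsed = 0.0
            self.playing = True

        def update(self, delta_time):                      # called once per frame by the client
            if self.playing:
                self.elapsed += delta_time
                if self.elapsed >= self.animation_duration:
                    self.playing = False                   # playback has finished

        def is_finished(self):
            return (not self.playing) and self.elapsed >= self.animation_duration

    # The client displays the throwing line once is_finished() of the pre-cast
    # animation's state machine returns True (step 303).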
And step 304, responding to the virtual article switched from the pre-throwing state to the throwing state, and playing the throwing animation.
When the client detects that the virtual article is switched from the pre-throwing state to the throwing state, the throwing animation of the virtual article may be played. When the client receives a throwing instruction for the virtual article, it controls the virtual article to switch from the pre-throwing state to the throwing state and starts playing the throwing animation. The throwing instruction is an operation instruction for controlling the virtual article to enter the throwing state; for example, it can be triggered by canceling the pressing operation signal acting on the throwing control.
The throwing state refers to the state in which the virtual article is being thrown, and the throwing animation is used to show the process in which the virtual article is thrown. A throwing-type virtual article needs to be thrown to trigger its target function, such as explosion, smoke release, or another attack or interference function. The throwing animation may show the action process in which the virtual object swings its arm to throw out the virtual article and withdraws the arm after the throw.
It should be noted that, for different virtual articles, the corresponding throwing animations may be the same or different, and this is not limited in the embodiment of the present application.
Optionally, the user interface further includes a throwing control for controlling the virtual object to throw the virtual article. In this case, the client controls the virtual article to enter the pre-throwing state in response to receiving the operation signal corresponding to the throwing control, and controls the virtual article to switch from the pre-throwing state to the throwing state in response to detecting that the operation signal disappears.
That is, when the throwing control is included in the user interface, the user may operate the throwing control to trigger the virtual article to enter a pre-throwing state; correspondingly, when the client receives an operation signal corresponding to the throwing control, the client controls the virtual article to enter a pre-throwing state. When the user releases the hand and stops operating the throwing control, the virtual article is controlled to be switched from the pre-throwing state to the throwing state; correspondingly, when the client detects that the operation signal disappears, the client controls the virtual article to be switched from the pre-throwing state to the throwing state.
The operation signal can be generated by clicking the throwing control piece. For example, for a mobile phone end user configured with a touch screen, the user clicks the throwing control with a finger to generate an operation signal. For another example, for the PC end, the user may click the throwing control through a mouse to generate an operation signal; alternatively, the user generates the operation signal by pressing a key (e.g., R key) associated with the throwing control, and it should be noted that the key may be set by the user according to personal habits.
It should be noted that the operation on the throwing control may be a long-press operation, a single-click operation, a press operation, a slide operation, or the like, which is not limited in this embodiment of the present application.
Additionally, in some other embodiments, a cancel toss control may also be included in the user interface for the user to cancel tossing the virtual item.
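A minimal sketch of this signal handling is given below; the state names and the function signature are assumptions made only for illustration.

    # Sketch: map the throwing-control operation signal onto the virtual article's states.
    IDLE, PRE_CAST, CAST = "idle", "pre_cast", "cast"

    def next_item_state(state, control_pressed, cancel_tapped):
        if state == IDLE and control_pressed:
            return PRE_CAST      # operation signal received -> enter the pre-throwing state
        if state == PRE_CAST and cancel_tapped:
            return IDLE          # cancel-throw control: the user gives up the throw
        if state == PRE_CAST and not control_pressed:
            return CAST          # operation signal disappears -> switch to the throwing state
        return state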
Step 305, responsive to the play duration of the throwing animation reaching the first duration, canceling the display of the throwing line.
The client can monitor the playing process of the throwing animation, and cancel the display of the throwing line when the playing time of the throwing animation reaches the first time.
In the related art, when the user releases the throwing control and stops touching it, that is, when the client receives the throwing instruction, the throwing line disappears immediately; in other words, the throwing line has already disappeared before the throwing animation even starts playing. According to realistic logic, however, a throwing motion (such as an arm swing) needs to be completed before the article actually leaves the hand, so the disappearance of the throwing line should be delayed until the throwing motion is completed. Canceling the throwing line only after the throwing animation has played for a certain duration is therefore more realistic and conforms better to realistic logic than the related art.
Illustratively, as shown in FIG. 9, a schematic diagram of the disappearance of a throwing line is illustratively shown. After the playing time period of the throwing animation reaches a certain time period, the displayed throwing line 81 is cancelled in the user interface.
The game designer can set a throwing-line cancellation moment: when the playing duration of the throwing animation reaches the first duration, the throwing line displayed in the user interface disappears. For example, assuming the animation duration of the throwing animation is 1 s and the first duration set by the game designer is 0.2 s, the throwing line displayed in the user interface disappears when the throwing animation has played for 0.2 s.
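For illustration, the cancellation check reduces to a simple comparison; the helper name is an assumption, and the 0.2 s value follows the example above.

    # Sketch: the throwing line stays on screen only until the throwing animation
    # has played for first_duration seconds.
    def throw_line_visible(play_time, first_duration=0.2):
        return play_time < first_duration

    print(throw_line_visible(0.1))   # True  - arm still swinging, line still shown
    print(throw_line_visible(0.25))  # False - first duration reached, line cancelled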
In summary, in the technical scheme provided by the embodiments of the application, the throwing line is displayed after the playing of the pre-throwing animation is finished, and the display of the throwing line is canceled when the playing duration of the throwing animation reaches the first duration. In the related art, by contrast, the throwing line is displayed as soon as the player clicks the throwing control and disappears as soon as the player releases the throwing control. In the technical scheme provided by the embodiments of the application, the appearance and disappearance of the throwing line are configured so that they better conform to realistic logic, improving the accuracy of the appearance and disappearance of the throwing line.
In an alternative embodiment provided based on the above-mentioned fig. 3 embodiment, after the step 304, in response to the virtual article being switched from the pre-cast state to the cast state and the cast animation being played, the following steps may be further performed: and responding to the playing time of the throwing animation reaching the second time, and displaying the throwing special effect.
In the process of playing the throwing animation, the client can monitor the playing progress of the throwing animation and display the throwing special effect when the playing duration of the throwing animation reaches the second duration. The throwing special effect refers to the special effect of the movement of the virtual article after it is thrown out. Optionally, the second duration is greater than or equal to the first duration; for example, the first duration is 0.2 seconds and the second duration is 0.2 or 0.3 seconds. In this way, the throwing line disappears when the virtual article leaves the hand of the virtual object (i.e., is thrown by the virtual object), and after the throwing line disappears, the throwing special effect starts to be displayed to show the motion of the virtual article after being thrown, which conforms better to realistic logic and appears more realistic.
Illustratively, as shown in fig. 10, a schematic diagram of a throw special effects display is illustratively shown. When the playing time of the throwing animation reaches the second time, a throwing special effect 101, namely a special effect of the movement of the virtual article after throwing is displayed in the user interface. The second duration may also be set by the game designer, for example, the game designer may set the second duration based on the animation duration of the throwing animation and the first duration, which may be greater than or equal to the first duration and less than the animation duration of the throwing animation. For example, the animation time period of the throwing animation is 1 second, the first time period is set to 0.2s, and the second time period is set to 0.2s or 0.3 s.
Optionally, before displaying the special throwing effect, the method may further include the following steps: acquiring animation duration and a zooming value of the throwing animation; and determining the second time length according to the animation time length and the zooming value of the throwing animation.
The client can call an animation state machine corresponding to the throwing animation; this animation state machine is used to manage the throwing animation, including basic information about the throwing animation (such as its animation duration), play control, and other related processing. The animation duration of the throwing animation refers to its total playing duration. The zoom value, which may also be referred to as a scaling value, is used for setting the display moment of the throwing special effect and represents the ratio of the second duration to the animation duration. Optionally, the scaling value is greater than 0 and less than 1. For example, the second duration is the product of the animation duration of the throwing animation and the scaling value, so the display moment of the throwing special effect can be determined from the animation duration and the scaling value. The scaling value can be set by the game designer. For example, assuming the animation duration of the throwing animation is 2 s in total and the scaling value is 0.5, the throwing special effect may be displayed when the throwing animation has played to 2 × 0.5 = 1 s. Thus, no matter how long the throwing animation is, the display moment scales with the animation duration via the scaling value, and the configuration does not need to be modified when the animation duration changes. It should be noted that the first duration can also be determined by configuring a scaling value in a similar manner. In order for the second duration to be greater than or equal to the first duration, the scaling value used to determine the second duration should be greater than or equal to the scaling value used to determine the first duration.
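The duration arithmetic described above reduces to a product; a short sketch follows, in which the function and parameter names are assumptions, the 2 s / 0.5 values follow the example above, and the 0.2 line-scale value is assumed only for illustration.

    # Sketch: derive both trigger times from the animation duration and scaling values.
    def trigger_times(throw_animation_duration, line_scale, effect_scale):
        # Scaling values are fractions of the animation duration (0 < scale < 1),
        # so the configuration survives changes to the animation's length.
        assert 0.0 < line_scale <= effect_scale < 1.0
        first_duration = throw_animation_duration * line_scale     # cancel the throwing line
        second_duration = throw_animation_duration * effect_scale  # show the throwing special effect
        return first_duration, second_duration

    # Example from the text: a 2 s throwing animation and a 0.5 scaling value give
    # a special-effect display moment of 2 * 0.5 = 1 s.
    print(trigger_times(2.0, 0.2, 0.5))   # -> (0.4, 1.0)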
To sum up, in the technical scheme provided by the embodiments of the application, the throwing special effect is displayed when the playing duration of the throwing animation reaches the second duration, so that the appearance of the throwing special effect conforms better to realistic logic and the accuracy of its appearance is improved.
In an optional embodiment provided based on the embodiment of fig. 3, the user interface further includes n article slot positions and a type switching control, the virtual articles assembled in the n article slot positions belong to different types, and n is a positive integer.
An article slot is a position used to equip a virtual article.
In this case, the method for controlling display of a virtual article may further include:
in response to receiving a selection signal corresponding to a target article slot position of the n article slot positions, controlling the virtual object to use a first virtual article assembled in the target article slot position;
and in response to receiving a trigger signal corresponding to the type switching control, controlling the virtual object to switch the held first virtual article to the second virtual article.
The first virtual article and the second virtual article are two different types of virtual articles.
The virtual object in the user interface can hold a virtual weapon, and when a user selects to use a target article slot position in the n article slot positions, the client can receive a selection signal corresponding to the target article slot position and control the virtual object to use a first virtual article assembled in the target article slot position; then, when the user wants to use another type of virtual article, the user can touch the type switching control to trigger and generate the trigger signal corresponding to the type switching control; correspondingly, the client controls the virtual object to switch the held first virtual article into the second virtual article when receiving the trigger signal corresponding to the type switching control. The second virtual item is a different type of virtual item than the first virtual item.
Illustratively, as shown in FIG. 11, a schematic diagram of a user interface is shown. The user interface 110 can include an article slot 111, and the article slot 111 can be equipped with a combat-type article, such as the fragment grenade 43. The user can click the article slot 111 to control the virtual object to use the fragment grenade 43; thereafter, the user can click the type switching control 112 to control the virtual object to switch from the held fragment grenade 43 of the combat-type articles to the smoke bomb 51 of the tactical-type articles.
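A minimal sketch of the slot selection and type switching described above is given below; the data layout is an assumption made for this sketch, not the actual implementation.

    # Sketch: n article slots, each holding one virtual article of a different type.
    slots = [
        {"type": "combat",   "item": "fragment grenade 43"},
        {"type": "tactical", "item": "smoke bomb 51"},
    ]

    def use_slot(index):
        return slots[index]                      # selection signal on the target article slot

    def switch_type(held):
        # The type-switching control swaps the held article for one of the other type.
        for slot in slots:
            if slot["type"] != held["type"]:
                return slot
        return held

    held = use_slot(0)        # the virtual object uses the fragment grenade 43
    held = switch_type(held)  # the held article is switched to the smoke bomb 51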
To sum up, according to the technical scheme provided by the embodiment of the application, the user can quickly switch between different types of virtual articles through the type switching control in the user interface, so that the switching efficiency of the virtual articles is improved, and the user experience is improved.
Referring to fig. 12, a flowchart of a display method of a virtual article according to an embodiment of the present application is exemplarily shown. The method may be applied in the terminal described above, such as in a client of an application of the terminal (e.g. a shooting-type game application). The method may include the steps of:
step 1201, controlling the virtual object to equip the virtual article.
Step 1202, detecting whether a touch operation corresponding to the article slot position is received.
Step 1203, in response to receiving the touch operation corresponding to the article slot, controlling the virtual object to use the virtual article.
Illustratively, as shown in FIG. 13, a schematic diagram of another user interface is illustrated. The user may click on the article slot 111, and correspondingly, the client controls the virtual object to use the virtual article, such as the fragment grenade 43, in response to receiving the touch operation corresponding to the article slot 111.
In step 1204, it is detected whether an operation signal corresponding to the throwing control is received.
And step 1205, responding to the received operation signal corresponding to the throwing control, and playing the pre-throwing animation.
In step 1206, it is detected whether the playing of the pre-throwing animation is finished.
Step 1207, in response to the end of the pre-cast animation play, displaying the cast line.
In step 1208, it is detected whether the operation signal disappears.
In response to detecting the disappearance of the operation signal, a throwing animation is played 1209.
Step 1210, detecting whether the playing time of the throwing animation reaches a first time.
Step 1211, in response to the play duration of the throwing animation reaching the first duration, canceling the display of the throwing line.
In step 1212, it is detected whether the playing time length of the throwing animation reaches the second time length.
Step 1213, responding to the playing time of the throwing animation reaching the second time, displaying the throwing special effect.
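For illustration only, steps 1201 to 1213 can be collapsed into a single per-frame update; the state layout and the threshold values in the sketch below are assumptions (the durations reuse the example values from the earlier embodiments).

    # Sketch: the flow of fig. 12 as one per-frame update over a plain state dict.
    PRE_CAST_LEN, FIRST_DURATION, SECOND_DURATION = 1.0, 0.2, 0.3

    def on_frame(state, pressed, released, dt):
        if state["item"] == "idle" and pressed:                # steps 1204-1205
            state["item"], state["t"] = "pre_cast", 0.0
        elif state["item"] == "pre_cast":
            state["t"] += dt
            if state["t"] >= PRE_CAST_LEN:                     # steps 1206-1207
                state["line"] = True
            if released:                                       # steps 1208-1209
                state["item"], state["t"] = "cast", 0.0
        elif state["item"] == "cast":
            state["t"] += dt
            if state["t"] >= FIRST_DURATION:                   # steps 1210-1211
                state["line"] = False
            if state["t"] >= SECOND_DURATION:                  # steps 1212-1213
                state["effect"] = True
        return state

    state = {"item": "idle", "t": 0.0, "line": False, "effect": False}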
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Referring to fig. 14, a block diagram of a display device of a virtual article according to an embodiment of the present application is shown. The device has the function of realizing the display method example of the virtual article, and the function can be realized by hardware or by hardware executing corresponding software. The device may be the terminal described above, or may be provided on the terminal. The apparatus 1400 may include: an interface display module 1401, a first play module 1402, a cast line display module 1403, a second play module 1404, and a cancel display module 1405.
An interface display module 1401, configured to display a user interface, where the user interface includes a display screen corresponding to a virtual environment, and the virtual environment includes a virtual article.
A first playing module 1402, configured to play a pre-cast animation in response to the virtual item being in a pre-cast state.
A throwing line display module 1403, configured to display a throwing line in response to the end of playing the pre-throwing animation, where the throwing line is used to indicate a motion trajectory of the virtual object after throwing.
A second playing module 1404, configured to play a throwing animation in response to the virtual item being switched from the pre-throwing state to a throwing state.
A cancellation display module 1405, configured to cancel display of the throwing line in response to a playing time length of the throwing animation reaching a first time length.
In summary, in the technical scheme provided by the embodiments of the application, the throwing line is displayed after the playing of the pre-throwing animation is finished, and the display of the throwing line is canceled when the playing duration of the throwing animation reaches the first duration. In the related art, by contrast, the throwing line is displayed as soon as the player clicks the throwing control and disappears as soon as the player releases the throwing control. In the technical scheme provided by the embodiments of the application, the appearance and disappearance of the throwing line are configured so that they better conform to realistic logic, improving the accuracy of the appearance and disappearance of the throwing line.
In some possible designs, the cast line display module 1403 is for obtaining cast line parameters for determining a trajectory of the cast line; and displaying the throwing line according to the throwing line parameters.
In some possible designs, the cast line parameters include at least one of: a throwing starting point, a throwing direction, a throwing initial speed and a throwing acceleration.
In some possible designs, as shown in fig. 15, the apparatus 1400 further comprises: a special effects display module 1406.
The special effect display module 1406 is used for responding to the playing time length of the throwing animation reaching a second time length and displaying a throwing special effect; wherein, the throwing special effect refers to the special effect of the movement of the virtual article after being thrown.
In some possible designs, as shown in fig. 15, the apparatus 1400 further comprises: a parameter acquisition module 1407 and a duration determination module 1408.
A parameter obtaining module 1407, configured to obtain an animation duration and a zoom value of the throwing animation, where the zoom value is used to set a display time of the throwing special effect.
A duration determining module 1408, configured to determine the second duration according to the animation duration of the throwing animation and the zoom value.
In some possible designs, as shown in fig. 15, the apparatus 1400 further comprises: a state machine call module 1409, and a play detection module 1410.
The state machine calling module 1409 is configured to call an animation state machine corresponding to the pre-cast animation, and obtain an animation duration of the pre-cast animation.
And the play detection module 1410 is configured to determine whether the playing of the pre-cast animation is finished through an animation state machine corresponding to the pre-cast animation.
In some possible designs, the user interface further includes n article slot positions and a type switching control, the virtual articles assembled in the n article slot positions belong to different types, and n is a positive integer; as shown in fig. 15, the apparatus 1400 further includes: an item usage module 1411 and an item switching module 1412.
An article use module 1411, configured to control the virtual object to use the first virtual article assembled in the target article slot in response to receiving a selection signal corresponding to a target article slot of the n article slots.
An article switching module 1412, configured to control the virtual object to switch the held first virtual article to a second virtual article in response to receiving a trigger signal corresponding to the type switching control; wherein the first virtual article and the second virtual article are two different types of virtual articles.
In some possible designs, a throwing control is also included in the user interface; as shown in fig. 15, the apparatus 1400 further includes: a state entry module 1413 and a state switch module 1414.
A state entry module 1413 for controlling the virtual item to enter the pre-cast state in response to receiving an operation signal corresponding to the casting control.
A state switching module 1414, configured to control the virtual article to switch from the pre-throwing state to the throwing state in response to detecting that the operation signal disappears.
It should be noted that, when the apparatus provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Referring to fig. 16, a block diagram of a terminal according to an embodiment of the present application is shown. Generally, terminal 1600 includes: a processor 1601, and a memory 1602.
Processor 1601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 1601 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 1601 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1601 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1602 may include one or more computer-readable storage media, which may be non-transitory. The memory 1602 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1602 is used to store at least one instruction, at least one program, a set of codes, or a set of instructions for execution by the processor 1601 to implement a method of displaying a virtual article provided by method embodiments of the present application.
In some embodiments, the terminal 1600 may also optionally include: peripheral interface 1603 and at least one peripheral. Processor 1601, memory 1602 and peripheral interface 1603 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1603 via buses, signal lines, or circuit boards. Specifically, the peripheral device may include: at least one of a communication interface 1604, a display 1605, audio circuitry 1606, a camera assembly 1607, and a power supply 1609.
Those skilled in the art will appreciate that the configuration shown in fig. 16 is not intended to be limiting of terminal 1600, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be employed.
In an exemplary embodiment, a computer device is also provided. The computer device may be a terminal or a server. The computer device comprises a processor and a memory, wherein at least one instruction, at least one program, a code set or an instruction set is stored in the memory, and the at least one instruction, the at least one program, the code set or the instruction set is loaded and executed by the processor to implement the above display method of the virtual article.
In an exemplary embodiment, a computer-readable storage medium is also provided, in which at least one instruction, at least one program, a code set, or an instruction set is stored, which, when loaded and executed by a processor, implements the above virtual article display method.
In an exemplary embodiment, a computer program product is also provided, which, when executed by a processor, implements the above virtual article display method.
It should be understood that reference herein to "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (7)

1. A method for displaying a virtual item, the method comprising:
displaying a user interface, wherein the user interface comprises a display picture corresponding to a virtual environment, a throwing control, and a throwing canceling control, the virtual environment comprises a virtual article, and the throwing canceling control is used for enabling a user to cancel throwing the virtual article;
in response to receiving an operation signal corresponding to the throwing control, controlling the virtual article to enter a pre-throwing state, wherein the pre-throwing state refers to a state before the virtual article, which is held by a virtual object, is thrown out;
in response to the virtual item being in the pre-cast state, playing a pre-cast animation;
in response to the playing of the pre-throwing animation ending, displaying a throwing line, wherein the throwing line is used for indicating a motion track of the virtual article after being thrown out, identification information is displayed at an end point of the throwing line, and the identification information is used for indicating a landing point position of the virtual article;
in response to detecting that the operation signal disappears, controlling the virtual article to be switched from the pre-throwing state to a throwing state;
in response to the virtual article being in the throwing state, playing a throwing animation, wherein different virtual articles correspond to different throwing animations;
canceling the display of the throwing line in response to the playing duration of the throwing animation reaching a first duration; the first duration is determined based on an animation duration of the throwing animation and a scaling value used to determine the first duration, the first duration is the duration used, in the throwing animation, to demonstrate the action course of the virtual object swinging an arm to throw the virtual article and retracting the arm after the throw, and the animation duration of the throwing animation does not affect the scaling value used to determine the first duration;
acquiring the animation duration of the throwing animation and a scaling value for determining a second duration; the scaling value for determining the second duration represents the proportion of the second duration in the animation duration of the throwing animation, and the animation duration of the throwing animation does not affect the scaling value for determining the second duration; the scaling value for determining the second duration is used for setting a display moment of a throwing special effect, wherein the throwing special effect refers to a special effect of the movement of the virtual article after being thrown out, the second duration is longer than the first duration and shorter than the animation duration of the throwing animation, and the scaling value for determining the second duration is greater than the scaling value for determining the first duration;
calculating a product of the animation duration of the throwing animation and the scaling value for determining the second duration, and determining the product as the second duration; and
in response to the playing duration of the throwing animation reaching the second duration, displaying the throwing special effect.
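For readability, the timing relationship recited in claim 1 can be pictured with the following sketch. It is illustrative only and not the patented implementation; the names (ThrowTiming, hide_throwing_line, show_throwing_effect) and the sample values are assumptions.

```python
class ThrowTiming:
    """Illustrative sketch of the first/second duration logic in claim 1."""

    def __init__(self, animation_duration, first_scale, second_scale):
        # The scaling values are fixed fractions of the throwing animation and
        # do not change when the animation duration changes.
        assert 0.0 < first_scale < second_scale < 1.0
        self.animation_duration = animation_duration
        self.first_duration = animation_duration * first_scale    # hide throwing line
        self.second_duration = animation_duration * second_scale  # show throwing effect

    def update(self, play_time, hide_throwing_line, show_throwing_effect):
        """Called with the elapsed play time of the throwing animation."""
        if play_time >= self.first_duration:
            hide_throwing_line()
        if play_time >= self.second_duration:
            show_throwing_effect()


# Example: a 1.2 s throwing animation with scaling values 0.4 and 0.75,
# giving first_duration ≈ 0.48 s and second_duration ≈ 0.9 s.
timing = ThrowTiming(animation_duration=1.2, first_scale=0.4, second_scale=0.75)
timing.update(0.95,
              hide_throwing_line=lambda: print("throwing line hidden"),
              show_throwing_effect=lambda: print("throwing effect shown"))
```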
2. The method of claim 1, wherein the displaying of the throwing line comprises:
acquiring a throwing line parameter for determining a trajectory of the throwing line, the throwing line parameter comprising at least one of: a throwing start point, a throwing direction, a throwing initial speed, and a throwing acceleration; and
displaying the throwing line according to the throwing line parameter.
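As a non-authoritative illustration of claim 2, the sketch below samples a throwing line from the four parameters named above using simple projectile kinematics; the function name, the ground-plane stop condition, and the numeric values are assumptions rather than the patent's implementation.

```python
def sample_throwing_line(start, direction, speed, acceleration,
                         step=0.05, max_time=3.0):
    """Return (x, y, z) points approximating the motion track after the throw."""
    length = sum(c * c for c in direction) ** 0.5
    d = tuple(c / length for c in direction)          # normalized throwing direction
    points, t = [], 0.0
    while t <= max_time:
        # p = start + direction * speed * t + 0.5 * acceleration * t^2, per axis
        p = tuple(start[i] + d[i] * speed * t + 0.5 * acceleration[i] * t * t
                  for i in range(3))
        points.append(p)
        if p[1] <= 0.0:                               # stop once the line reaches the ground
            break
        t += step
    return points


line = sample_throwing_line(start=(0.0, 1.6, 0.0),          # hand height of the virtual object
                            direction=(0.0, 0.5, 1.0),      # throwing direction
                            speed=12.0,                     # throwing initial speed
                            acceleration=(0.0, -9.8, 0.0))  # gravity as throwing acceleration
landing_point = line[-1]   # where the identification information would be displayed
```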
3. The method of claim 1, wherein, before the displaying of the throwing line, the method further comprises:
calling an animation state machine corresponding to the pre-throwing animation to obtain an animation duration of the pre-throwing animation; and
determining, through the animation state machine corresponding to the pre-throwing animation, whether the playing of the pre-throwing animation is finished.
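A minimal sketch of the state-machine query in claim 3 follows, assuming a hypothetical AnimationStateMachine wrapper; a real game engine would expose the clip length and playback progress through its own animation API.

```python
class AnimationStateMachine:
    """Hypothetical wrapper around an engine's animation state for one clip."""

    def __init__(self, clip_length):
        self.clip_length = clip_length   # animation duration of the pre-throwing animation
        self.elapsed = 0.0

    def tick(self, dt):
        self.elapsed = min(self.elapsed + dt, self.clip_length)

    def duration(self):
        return self.clip_length

    def is_finished(self):
        # playing is finished once the elapsed time reaches the clip length
        return self.elapsed >= self.clip_length


pre_throw = AnimationStateMachine(clip_length=0.6)
pre_throw.tick(0.6)
if pre_throw.is_finished():
    pass  # at this point the throwing line may be displayed
```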
4. The method according to any one of claims 1 to 3, wherein the user interface further comprises n article slots and a type switching control, wherein virtual articles assembled in the n article slots belong to different types, and n is a positive integer;
the method further comprises the following steps:
in response to receiving a selection signal corresponding to a target article slot position of the n article slot positions, controlling the virtual object to use a first virtual article assembled in the target article slot position;
in response to receiving a trigger signal corresponding to the type switching control, controlling the virtual object to switch the held first virtual article to a second virtual article;
wherein the first virtual item and the second virtual item are two different types of virtual items.
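The slot selection and type switching described in claim 4 can be pictured with the following sketch; the inventory class, the slot contents, and the wrap-around switching order are illustrative assumptions only.

```python
class ThrowableInventory:
    """Illustrative model of n article slots holding different types of virtual articles."""

    def __init__(self, slots):
        self.slots = slots   # one virtual article per slot, each of a different type
        self.index = 0

    def select_slot(self, index):
        """Selection signal on a target slot: the virtual object uses that article."""
        self.index = index
        return self.slots[self.index]

    def switch_type(self):
        """Trigger signal on the type switching control: hold an article of another type."""
        self.index = (self.index + 1) % len(self.slots)
        return self.slots[self.index]


inventory = ThrowableInventory(slots=["frag_grenade", "smoke_grenade", "flashbang"])
first_article = inventory.select_slot(0)    # "frag_grenade"
second_article = inventory.switch_type()    # "smoke_grenade", a different type
```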
5. A display device for a virtual article, the device comprising:
the interface display module is used for displaying a user interface, wherein the user interface comprises a display picture corresponding to a virtual environment, a throwing control, and a throwing canceling control, the virtual environment comprises a virtual article, and the throwing canceling control is used for enabling a user to cancel throwing the virtual article;
the state entering module is used for controlling the virtual article to enter a pre-throwing state in response to receiving an operation signal corresponding to the throwing control, wherein the pre-throwing state refers to a state before the virtual article, which is held by a virtual object, is thrown out;
the first playing module is used for playing the pre-throwing animation in response to the virtual article being in the pre-throwing state;
the throwing line display module is used for displaying a throwing line in response to the playing of the pre-throwing animation ending, wherein the throwing line is used for indicating a motion track of the virtual article after being thrown out, identification information is displayed at an end point of the throwing line, and the identification information is used for indicating a landing point position of the virtual article;
the state switching module is used for responding to the detection that the operation signal disappears and controlling the virtual article to be switched from the pre-throwing state to the throwing state;
the second playing module is used for playing a throwing animation in response to the virtual article being in the throwing state, wherein different virtual articles correspond to different throwing animations;
the display canceling module is used for canceling the display of the throwing line in response to the playing duration of the throwing animation reaching a first duration; the first duration is determined based on an animation duration of the throwing animation and a scaling value used to determine the first duration, the first duration is the duration used, in the throwing animation, to demonstrate the action course of the virtual object swinging an arm to throw the virtual article and retracting the arm after the throw, and the animation duration of the throwing animation does not affect the scaling value used to determine the first duration;
the parameter acquisition module is used for acquiring the animation duration of the throwing animation and a scaling value for determining a second duration; the scaling value for determining the second duration represents the proportion of the second duration in the animation duration of the throwing animation, and the animation duration of the throwing animation does not affect the scaling value for determining the second duration; the scaling value for determining the second duration is used for setting a display moment of a throwing special effect, wherein the throwing special effect refers to a special effect of the movement of the virtual article after being thrown out, the second duration is longer than the first duration and shorter than the animation duration of the throwing animation, and the scaling value for determining the second duration is greater than the scaling value for determining the first duration;
the duration determining module is used for calculating a product of the animation duration of the throwing animation and the scaling value for determining the second duration, and determining the product as the second duration; and
the special effect display module is used for displaying the throwing special effect in response to the playing duration of the throwing animation reaching the second duration.
6. A terminal, characterized in that it comprises a processor and a memory, wherein at least one instruction, at least one program, a code set, or an instruction set is stored in the memory and is loaded and executed by the processor to implement the method according to any one of claims 1 to 4.
7. A computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the method of any of claims 1 to 4.
CN202010324213.5A 2020-04-22 2020-04-22 Virtual article display method, device, terminal and storage medium Active CN111589145B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010324213.5A CN111589145B (en) 2020-04-22 2020-04-22 Virtual article display method, device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010324213.5A CN111589145B (en) 2020-04-22 2020-04-22 Virtual article display method, device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN111589145A CN111589145A (en) 2020-08-28
CN111589145B true CN111589145B (en) 2023-03-24

Family

ID=72185478

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010324213.5A Active CN111589145B (en) 2020-04-22 2020-04-22 Virtual article display method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111589145B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112044071B (en) * 2020-09-04 2021-10-15 腾讯科技(深圳)有限公司 Virtual article control method, device, terminal and storage medium
CN112245917B (en) * 2020-11-13 2022-11-25 腾讯科技(深圳)有限公司 Virtual object control method, device, equipment and storage medium
CN112402960B (en) * 2020-11-19 2022-11-04 腾讯科技(深圳)有限公司 State switching method, device, equipment and storage medium in virtual scene
CN112546624A (en) * 2020-12-15 2021-03-26 竞技世界(北京)网络技术有限公司 Method, device and equipment for controlling special effect release and computer readable storage medium
CN113617028B (en) * 2021-08-13 2023-10-10 腾讯科技(深圳)有限公司 Control method, related device, equipment and storage medium for virtual prop

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014190216A1 (en) * 2013-05-22 2014-11-27 Thompson David S Fantasy sports interleaver
JP6788327B2 (en) * 2015-02-27 2020-11-25 株式会社ソニー・インタラクティブエンタテインメント Display control program, display control device, and display control method
CN108320322B (en) * 2018-02-11 2021-06-08 腾讯科技(成都)有限公司 Animation data processing method, animation data processing device, computer equipment and storage medium
CN109200582A (en) * 2018-08-02 2019-01-15 腾讯科技(深圳)有限公司 The method, apparatus and storage medium that control virtual objects are interacted with ammunition
CN110427111B (en) * 2019-08-01 2022-09-06 腾讯科技(深圳)有限公司 Operation method, device, equipment and storage medium of virtual prop in virtual environment
CN110478895B (en) * 2019-08-23 2020-08-11 腾讯科技(深圳)有限公司 Virtual article control method, device, terminal and storage medium
CN110585712A (en) * 2019-09-20 2019-12-20 腾讯科技(深圳)有限公司 Method, device, terminal and medium for throwing virtual explosives in virtual environment
CN110585706B (en) * 2019-09-30 2021-10-29 腾讯科技(深圳)有限公司 Interactive property control method, device, terminal and storage medium
CN110694273A (en) * 2019-10-18 2020-01-17 腾讯科技(深圳)有限公司 Method, device, terminal and storage medium for controlling virtual object to use prop
CN110935172B (en) * 2019-12-30 2021-03-16 腾讯科技(深圳)有限公司 Virtual object processing method, device, system and storage medium thereof

Also Published As

Publication number Publication date
CN111589145A (en) 2020-08-28

Similar Documents

Publication Publication Date Title
CN111589145B (en) Virtual article display method, device, terminal and storage medium
US12017141B2 (en) Virtual object control method and apparatus, device, and storage medium
CN110465087B (en) Virtual article control method, device, terminal and storage medium
CN110548288B (en) Virtual object hit prompting method and device, terminal and storage medium
CN110585712A (en) Method, device, terminal and medium for throwing virtual explosives in virtual environment
CN110585731B (en) Method, device, terminal and medium for throwing virtual article in virtual environment
CN109529356B (en) Battle result determining method, device and storage medium
CN111359206B (en) Virtual object control method, device, terminal and storage medium
CN111282284A (en) Virtual object control method, device, terminal and storage medium
CN110478895A (en) Control method, device, terminal and the storage medium of virtual objects
WO2022257653A1 (en) Virtual prop display method and apparatus, electronic device and storage medium
CN111905363B (en) Virtual object control method, device, terminal and storage medium
CN113546422A (en) Virtual resource delivery control method and device, computer equipment and storage medium
CN111318015B (en) Virtual article control method, device, terminal and storage medium
CN111318020A (en) Virtual object control method, device, equipment and storage medium
CN112057859B (en) Virtual object control method, device, terminal and storage medium
CN113694515B (en) Interface display method, device, terminal and storage medium
CN111905380B (en) Virtual object control method, device, terminal and storage medium
CN111298438B (en) Virtual object control method, device, equipment and storage medium
JP2024508682A (en) Virtual object control method, virtual object control device, computer equipment, and computer program
CN111643895A (en) Operation response method, device, terminal and storage medium
CN112843682B (en) Data synchronization method, device, equipment and storage medium
CN113617030B (en) Virtual object control method, device, terminal and storage medium
CN112245917B (en) Virtual object control method, device, equipment and storage medium
EP3984608A1 (en) Method and apparatus for controlling virtual object, and terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40027331

Country of ref document: HK

GR01 Patent grant