CN111905380A - Virtual object control method, device, terminal and storage medium - Google Patents

Virtual object control method, device, terminal and storage medium

Info

Publication number
CN111905380A
CN111905380A (application CN202010848012.5A)
Authority
CN
China
Prior art keywords
touch operation
virtual object
control
touch
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010848012.5A
Other languages
Chinese (zh)
Other versions
CN111905380B (en)
Inventor
杨金昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010848012.5A priority Critical patent/CN111905380B/en
Publication of CN111905380A publication Critical patent/CN111905380A/en
Application granted granted Critical
Publication of CN111905380B publication Critical patent/CN111905380B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/837: Shooting of targets
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214: Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a virtual object control method, device, terminal, and storage medium, belonging to the technical field of computers and the Internet. The method comprises the following steps: displaying a virtual environment picture and a first operation control; in response to receiving a first touch operation, controlling the first virtual object to switch from an unexecuted state of the first operation to a ready-to-execute state of the first operation; and in response to receiving a second touch operation, controlling the first virtual object to switch from the ready-to-execute state of the first operation back to the unexecuted state of the first operation. With this technical scheme, the user can flexibly control the first virtual object to cancel execution of the first operation, which provides a remedy for accidental triggering of the first operation and reduces the probability of its mistaken execution; moreover, because the first touch operation and the second touch operation are performed with the same finger, the user does not need to change fingers when switching between them, which reduces the operation difficulty.

Description

Virtual object control method, device, terminal and storage medium
Technical Field
The present application relates to the field of computer and internet technologies, and in particular, to a method, an apparatus, a terminal, and a storage medium for controlling a virtual object.
Background
Currently, in mobile shooting games, a user can control a virtual object to shoot using a virtual firearm in a virtual environment.
In the related art, a user may control a first virtual object to shoot using a virtual firearm through a firing button in the user interface. Before controlling the first virtual object to shoot through the firing button, the user clicks the open-mirror (aim-down-sights) button with the right thumb to bring the first virtual object into the open-mirror state for shooting, and adjusts the shooting direction of the first virtual object by sliding a finger over a blank area on the right side of the interface to switch the viewing angle; meanwhile, the user can operate a virtual joystick in the user interface with the left thumb so that the first virtual object moves while the shooting direction is being adjusted. During this process of adjusting the shooting direction while the first virtual object moves, the user can click the open-mirror button with the index finger of the left or right hand to bring the virtual firearm used by the first virtual object into the closed-mirror state and cancel the shot.
However, in the related art, if the user needs to bring the virtual firearm used by the first virtual object into the closed-mirror state while adjusting the shooting direction of the first virtual object with the right thumb and controlling its movement with the left thumb, the user must click the open-mirror button with the index finger of the left or right hand, which makes the operation difficult.
Disclosure of Invention
The embodiment of the application provides a control method, a control device, a control terminal and a storage medium of a virtual object, which can reduce operation difficulty. The technical scheme is as follows:
according to an aspect of an embodiment of the present application, there is provided a method for controlling a virtual object, the method including:
displaying a virtual environment picture and a first operation control, wherein the virtual environment picture is a display picture used for showing a virtual environment provided by a shooting application, and the first operation control is used for controlling a first virtual object in the virtual environment to execute a first operation;
in response to receiving a first touch operation, controlling the first virtual object to be switched from an unexecuted state of the first operation to a ready-to-execute state of the first operation; wherein the first touch operation is a touch operation for the first operation control;
in response to receiving a second touch operation, controlling the first virtual object to be switched from a ready-to-execute state of the first operation to an unexecuted state of the first operation; wherein the second touch operation and the first touch operation are executed by the same finger.
According to an aspect of an embodiment of the present application, there is provided an apparatus for controlling a virtual object, the apparatus including:
a first control display module, configured to display a virtual environment picture and a first operation control, wherein the virtual environment picture is used for displaying a virtual environment provided by a shooting application, and the first operation control is used for controlling a first virtual object in the virtual environment to execute a first operation;
the preparation state switching module is used for responding to the receiving of a first touch operation and controlling the first virtual object to be switched from the unexecuted state of the first operation to the preparation execution state of the first operation; wherein the first touch operation is a touch operation for the first operation control;
the unexecuted state switching module is used for responding to the reception of a second touch operation and controlling the first virtual object to be switched from the ready-to-execute state of the first operation to the unexecuted state of the first operation; wherein the second touch operation and the first touch operation are executed by the same finger.
According to an aspect of the embodiments of the present application, there is provided a terminal, including a processor and a memory, where at least one instruction, at least one program, a code set, or a set of instructions is stored in the memory, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by the processor to implement the control method of the virtual object.
According to an aspect of the embodiments of the present application, there is provided a computer-readable storage medium having at least one instruction, at least one program, a code set, or a set of instructions stored therein, which is loaded and executed by a processor to implement the control method of the above-mentioned virtual object.
According to an aspect of embodiments herein, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the control method of the virtual object.
The technical scheme provided by the embodiment of the application can bring the following beneficial effects:
the first virtual object is controlled to enter the ready-to-execute state of the first operation through the first touch operation, and is controlled to leave that state and return to the unexecuted state of the first operation through the second touch operation. Thus, when the first virtual object is preparing to execute the first operation, the user can flexibly cancel its execution, which provides a remedy for accidental triggering of the first operation and reduces the probability of its mistaken execution. Moreover, the first touch operation and the second touch operation are performed by the same finger, i.e., they are continuous operations: the user does not need to change fingers when switching between them, which reduces the operation difficulty. In other words, while the user adjusts the execution direction of the first operation with the right thumb and controls the movement of the first virtual object with the left thumb, the user can cancel execution of the first operation simply by continuing to use the right thumb to perform the second touch operation. This provides a feasible scheme for simultaneously moving the first virtual object, executing operations, and cancelling operations with only two fingers, reducing the operation difficulty.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings based on them without creative effort.
FIG. 1 is a schematic diagram of an application execution environment provided by one embodiment of the present application;
FIG. 2 is a flowchart of a method for controlling a virtual object according to an embodiment of the present application;
FIG. 3 is a diagram illustrating a method for determining whether a touch operation is a first touch operation;
FIGS. 4 and 5 are schematic diagrams illustrating a state switching interface for two first virtual objects;
FIGS. 6 and 7 are diagrams illustrating two display manners of the second operation control;
FIG. 8 is a diagram illustrating another manner of displaying a second operational control;
FIGS. 9 and 10 are diagrams illustrating control flows of two kinds of first virtual objects;
FIG. 11 is a block diagram of a control apparatus for a virtual object according to an embodiment of the present application;
FIG. 12 is a block diagram of a control apparatus for a virtual object according to another embodiment of the present application;
FIG. 13 is a block diagram of a terminal according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Referring to fig. 1, a schematic diagram of an application execution environment according to an embodiment of the present application is shown. The application execution environment may include: a terminal 10 and a server 20.
The terminal 10 may be an electronic device such as a mobile phone, a tablet computer, a game console, an electronic book reader, a multimedia playing device, or a wearable device. A client of the application may be installed in the terminal 10. The application program may be one that needs to be downloaded and installed, or one that is used on demand without installation, which is not limited in this embodiment of the application.
In the embodiment of the present application, the application may be any application that can provide a virtual environment in which a virtual object, substituted for and operated by the user, can perform activities. Typically, the application is a game application, such as a Multiplayer Online Battle Arena (MOBA) game, a Battle Royale (BR) game, a Third-Person Shooting (TPS) game, a First-Person Shooting (FPS) game, or a multiplayer gunfight survival game. Of course, in addition to game applications, other types of applications may present virtual objects to a user and provide corresponding functionality to them. For example, the application program may be a Virtual Reality (VR) application, an Augmented Reality (AR) application, a three-dimensional map program, a military simulation program, a social application, an interactive entertainment application, and the like, which is not limited in this embodiment of the present application. In addition, the forms of the virtual objects provided by different applications may differ, as may their corresponding functions, which may be configured in advance according to actual requirements; this is not limited in the embodiments of the present application. Optionally, a client of the above application program runs in the terminal 10. In some embodiments, the application is developed based on a three-dimensional virtual environment engine, for example the Unity engine, which can construct a three-dimensional virtual environment, virtual objects, virtual props, and the like, bringing a more immersive game experience to the user.
The virtual environment is a scene displayed (or provided) by a client of an application program (such as a game application) when the client runs on a terminal; it refers to a scene created for a virtual object to perform activities (such as a game match), for example a virtual house, a virtual island, a virtual map, or a virtual building. The virtual environment may be a simulation of the real world, a semi-simulated, semi-fictional environment, or a purely fictional environment. It may be a two-dimensional, 2.5-dimensional, or three-dimensional virtual environment, which is not limited in this embodiment of the present application.
The virtual object may be a virtual character controlled by the user account in the application program, or a virtual character controlled by a computer program in the application program. Taking a game application as an example, the virtual object may be a game character controlled by the user account, or a game monster controlled by a computer program in the game application. The virtual object may take the form of a human character, an animal, a cartoon figure, or another form, which is not limited in this application. It may be displayed in three-dimensional or two-dimensional form, which is likewise not limited in this embodiment. Optionally, when the virtual environment is three-dimensional, the virtual object is a three-dimensional model created based on skeletal animation technology. Each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies a portion of its space.
In one possible embodiment, the virtual objects include virtual objects in the same camp and virtual objects in different camps; that is, in the application, the virtual objects are divided into different camps. Illustratively, taking a MOBA game as an example, ten users play a match divided into a red team and a blue team of five players each. In this case, the virtual objects on the red side are in the same camp and have a teammate relationship with one another, as do the virtual objects on the blue side, while virtual objects on the red side and the blue side are in different camps and have a hostile relationship.
The server 20 is used to provide background services for clients of applications in the terminal 10. For example, the server 20 may be a backend server for the application described above. The server 20 may be a server, a server cluster composed of a plurality of servers, or a cloud computing service center. Optionally, the server 20 provides background services for applications in multiple terminals 10 simultaneously.
Alternatively, the terminal 10 and the server 20 may communicate with each other through the network 30.
Referring to fig. 2, a flowchart of a control method for a virtual object according to an embodiment of the present application is shown. The method is applicable to a terminal, and the execution subject of each step may be the terminal 10 (hereinafter referred to as "client") in the application execution environment shown in fig. 1. The method comprises the following steps (201-203):
step 201, displaying a virtual environment picture and a first operation control.
The virtual environment screen refers to a screen corresponding to a virtual environment displayed to a user in a user interface. The virtual environment picture may be a picture obtained by a virtual camera from a virtual environment, and the virtual environment refers to a scene for a virtual object to move. In one possible embodiment, the virtual camera captures a view of the virtual environment from a third perspective of the first virtual object. Optionally, the virtual camera is disposed obliquely above the first virtual object, and the client observes the virtual environment with the first virtual object as a center through the virtual camera to obtain a virtual environment picture with the first virtual object as a center. In another possible embodiment, the virtual camera captures a view of the virtual environment from a first perspective of the first virtual object. Optionally, the virtual camera is disposed right in front of the first virtual object, and the client observes the virtual environment from the perspective of the first virtual object through the virtual camera to obtain a virtual environment picture with the first virtual object as the first perspective. Of course, in other possible embodiments, the placement position of the virtual camera is adjustable in real time. Optionally, the user may adjust the position of the virtual camera through a control operation for the user interface, thereby acquiring a virtual environment picture at different positions. For example, the user adjusts the position of the virtual camera by dragging the virtual environment screen; for another example, the user clicks a certain position in the map display control, and the position of the virtual camera is adjusted by using the certain position as the adjusted position of the virtual camera. The map display control is a User Interface (UI) control for displaying a global map of the virtual environment. 
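The two camera placements described above (third-person view from obliquely above the first virtual object, and first-person view from the object's own perspective) can be summarized as a simple positional offset from the controlled object. The following is a hypothetical sketch; the `Vec3` type, the concrete offset values, and the `camera_position` helper are illustrative assumptions, not details from the patent.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def __add__(self, other: "Vec3") -> "Vec3":
        return Vec3(self.x + other.x, self.y + other.y, self.z + other.z)

# Assumed offsets: obliquely above and behind the object for the
# third-person view, roughly at eye height for the first-person view.
THIRD_PERSON_OFFSET = Vec3(0.0, 5.0, -4.0)
FIRST_PERSON_OFFSET = Vec3(0.0, 1.75, 0.25)

def camera_position(object_pos: Vec3, first_person: bool) -> Vec3:
    """Place the virtual camera relative to the first virtual object."""
    offset = FIRST_PERSON_OFFSET if first_person else THIRD_PERSON_OFFSET
    return object_pos + offset
```

In an actual engine the offset would also rotate with the object's facing direction; the fixed offsets here only illustrate the two placement modes.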
The first virtual object is a virtual object controlled by a user account corresponding to the client. It should be noted that the user account may control one or more virtual objects, which is not limited in this embodiment of the present application.
In this embodiment of the present application, the virtual environment screen is a display screen for displaying a virtual environment provided by a shooting application, and the client displays the virtual environment screen in a user interface, where the user interface is a display interface of the shooting application displayed to a user. The shooting type application is an application for providing a virtual gun for shooting for a first virtual object controlled by a user account, such as a third-person shooting game, a first-person shooting game, a multi-person gun battle type survival game, and the like. Optionally, after receiving the start instruction of the shooting application, the client may acquire a virtual environment screen through the virtual camera, and display the virtual environment screen in the user interface. It should be noted that, in the embodiment of the present application, the user interface further includes a first operation control.
In a possible implementation manner, the first operation control is directly displayed in a user interface, and the client displays the first operation control while displaying the virtual environment screen. Namely, after receiving the opening instruction of the shooting application, the client simultaneously displays the virtual environment picture and the first operation control.
In another possible implementation manner, the first operation control is initially hidden in the user interface, and the client displays it after receiving a display instruction for the first operation control. Optionally, the display instruction may be generated by a user trigger in a display area of the first operation control, which may be any area in the user interface. In another possible implementation, the display instruction is generated by a user trigger in a touch area of the first operation control; that is, when the client detects a touch operation in the touch area of the first operation control, it generates the display instruction and then displays the first operation control in the user interface. The touch area may be the surrounding area of the first operation control.
In an embodiment of the present application, the first operation control is used to control a first virtual object in the virtual environment to execute the first operation. The first operation is any one of: shooting, open-mirror shooting, or throwing a virtual article. Open-mirror shooting means that, when shooting with the virtual firearm, the first virtual object first enters the open-mirror state and then shoots. The open-mirror state refers to a state of observing the virtual environment under magnification through a virtual sighting telescope, which may be collected by the first virtual object in the virtual environment or allocated to the user account by the application program. Optionally, the user may click the first operation control in the user interface, or press a key (e.g., the R key) corresponding to the first operation control on the keyboard, to control the first virtual object to perform the first operation.
In a possible embodiment, the first operation is shooting. Optionally, after detecting the touch operation on the first operation control, the client controls the first virtual object to shoot using the virtual firearm based on the touch operation. The virtual gun may be a virtual gun currently held by the first virtual object, or may be any virtual gun owned by the first virtual object, which is not limited in this embodiment of the present application.
In another possible embodiment, the first operation is open-mirror shooting. Optionally, after detecting the touch operation on the first operation control, the client controls the first virtual object, based on the touch operation, to enter the open-mirror state and aim; when it detects that the touch operation on the first operation control disappears, it further controls the first virtual object to shoot and close the mirror.
In yet another possible embodiment, the first operation is throwing a virtual article, such as a virtual explosive. Optionally, after detecting the touch operation for the first operation control, the client controls the first virtual object to throw the virtual article based on the touch operation. Of course, in practical applications, the user interface may include a plurality of operation controls for throwing the virtual object, and different operation controls are used for controlling the first virtual object to throw different virtual objects.
It should be noted that, in the embodiment of the present application, other operation controls, for example, an operation control for controlling the release technique of the first virtual object, may also be displayed in the user interface.
In step 202, in response to receiving the first touch operation, the first virtual object is controlled to be switched from the unexecuted state of the first operation to the ready-to-execute state of the first operation.
The first touch operation refers to an operation generated by user triggering, such as a single-click operation, a double-click operation, a long-press operation, a short-press operation, a drag operation, a light-press operation, a heavy-press operation, and the like. In the PC, the first touch operation may be an operation of a certain key on the keyboard by the user. In this embodiment of the application, the first touch operation may be a touch operation for the first operation control. Optionally, the user may click an icon corresponding to the first operation control in the user interface to generate the first touch operation; alternatively, the user may generate the first touch operation by pressing a key (e.g., an R key) corresponding to the first operation control.
Optionally, after displaying the user interface, the client may monitor the user interface, obtain a trigger position of the touch operation after detecting the touch operation in the user interface, and determine that the touch operation is the first touch operation when the trigger position is located in the trigger area of the first operation control.
In a possible implementation manner, the position of the first operation control in the user interface is fixed, and correspondingly, a fixed trigger area for the first operation control is provided in the user interface. The fixed trigger area may be any position in the user interface, for example, the fixed trigger area of the first operation control is a peripheral area of the first operation control. Optionally, the client may determine whether the touch operation is located in the fixed trigger area of the first operation control according to the trigger position. And if the touch operation is located in the fixed trigger area of the first operation control, determining that the touch operation is the first touch operation.
In another possible implementation manner, the position of the first operation control in the user interface is adjustable; that is, the user may adjust its position according to the actual situation, and correspondingly the user interface provides a movable trigger area for the first operation control. The movable trigger area may be defined relative to the position of the first operation control; for example, it may be the area whose distance from the center position of the first operation control is smaller than a target value. Optionally, the client may determine the distance between the trigger position and the center position of the first operation control, and if the distance is smaller than the target value, determine that the touch operation is the first touch operation. For example, referring to fig. 3, when the touch position of the touch operation is the first position 31, since the distance between the first position 31 and the center position 33 of the first operation control 32 is greater than the target value, the touch operation is determined not to be the first touch operation of the first operation control 32; when the touch position is the second position 34, since the distance between the second position 34 and the center position 33 of the first operation control 32 is smaller than the target value, the touch operation is determined to be the first touch operation of the first operation control 32.
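The distance test illustrated by fig. 3 can be sketched in a few lines. The function name and the tuple-based point representation below are assumptions for illustration only:

```python
import math

def is_first_touch_operation(touch_pos, control_center, target_value):
    """Return True when a touch lands inside the control's movable
    trigger area, defined (as in the fig. 3 example) as all points
    whose distance to the control's center position is smaller than
    the target value."""
    dx = touch_pos[0] - control_center[0]
    dy = touch_pos[1] - control_center[1]
    return math.hypot(dx, dy) < target_value
```

With a target value between the two distances of fig. 3, a touch at the first position 31 would be rejected and a touch at the second position 34 accepted.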
In this embodiment of the application, when the client detects the first touch operation, the client controls the first virtual object to perform state switching based on the first touch operation, and the unexecuted state of the first operation is converted into the ready-to-execute state of the first operation. The unexecuted state of the first operation refers to an unexecuted state of the first virtual object with respect to the first operation, and it should be noted that, in this embodiment of the application, an executed state of another operation, a ready-to-execute state of another operation, a static state of the first virtual object, and the like are all unexecuted states of the first operation; the preparation state of the first operation refers to a state in which the first virtual object is prepared to perform the first operation, and for example, a pre-aiming state or an open mirror state in a shooting game is a preparation state for performing a shooting operation.
Taking open-mirror shooting as an example of the first operation, with reference to fig. 4: when the first virtual object is in the unexecuted state of the first operation, the unexecuted interface 40 of the first operation is displayed; after the client detects a click operation (i.e., a first touch operation) on the right-side shooting button 41, the client controls the first virtual object to switch from the unexecuted state of the first operation to the ready-to-execute state of the first operation, and displays the ready-to-execute interface 42 of the first operation.
Step 203, in response to receiving the second touch operation, controlling the first virtual object to switch from the ready-to-execute state of the first operation to the non-execute state of the first operation.
The second touch operation, like the first touch operation, is an operation generated by a trigger action of the user, such as a single-click operation, a double-click operation, a long-press operation, a short-press operation, a drag operation, a light-press operation, or a heavy-press operation. Optionally, the user may click a corresponding position in the user interface or press a corresponding key to generate the second touch operation. In this embodiment, when receiving the second touch operation, the client controls the first virtual object to switch states, from the ready-to-execute state of the first operation back to the unexecuted state of the first operation.
In one possible implementation, the second touch operation and the first touch operation are executed by the same finger; that is, the second touch operation and the first touch operation are continuous operations. Optionally, in this embodiment of the application, the second touch operation is a sliding operation that moves from the first operation control to a second operation control, and the second operation control is used to cancel execution of the first operation. Optionally, the client monitors the user interface after detecting the first touch operation; when detecting a continuation of the first touch operation, if the continuation is a sliding operation that moves from the first operation control to the second operation control, the client determines that the sliding operation is the second touch operation, and then controls the first virtual object to switch from the ready-to-execute state of the first operation back to the unexecuted state of the first operation based on the second touch operation. Of course, if the client does not detect a continuation of the first touch operation and instead detects that the first touch operation disappears, the client controls the first virtual object to switch from the ready-to-execute state of the first operation to the execution state of the first operation. The execution state of the first operation refers to a state in which the first virtual object is executing the first operation.
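The decision made when the continuous touch ends — cancel if the finger slid onto the second operation control, execute if it lifted elsewhere — can be sketched as follows; the function name, the tuple coordinates, and the circular trigger zone are all illustrative assumptions:

```python
import math

def resolve_touch_end(end_pos, second_control_center, cancel_radius):
    """Resolve the end of the continuous first/second touch operation.

    If the finger lifted inside the second operation control's trigger
    zone, the slide counts as the second touch operation and the first
    operation is cancelled; otherwise the disappearance of the first
    touch operation triggers execution (illustrative sketch)."""
    dist = math.hypot(end_pos[0] - second_control_center[0],
                      end_pos[1] - second_control_center[1])
    if dist < cancel_radius:
        return "UNEXECUTED"   # second touch operation: cancel the first operation
    return "EXECUTING"        # first touch disappeared: execute the first operation
```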
In another possible implementation, the second touch operation and the first touch operation are discontinuous operations. Optionally, the second touch operation is a touch operation for a second operation control, where the second operation control is used to control the first virtual object to cancel execution of the first operation. Optionally, the client monitors the user interface after detecting the first touch operation; upon detecting a touch operation for the second operation control, the client determines that this touch operation is the second touch operation, and controls the first virtual object to switch from the ready-to-execute state of the first operation back to the unexecuted state of the first operation based on the second touch operation. Of course, in another possible implementation manner, the second touch operation is a touch operation for the first operation control, and the operation attribute of the second touch operation is different from the operation attribute of the first touch operation. Optionally, after detecting that the first touch operation disappears, the client controls the first virtual object to maintain the ready-to-execute state of the first operation and monitors the user interface; upon detecting a touch operation on the first operation control, the client acquires the operation attribute of that touch operation, and when determining based on the operation attribute that it is the second touch operation, controls the first virtual object to switch from the ready-to-execute state of the first operation back to the unexecuted state of the first operation. Of course, if the client detects a third touch operation after the first touch operation disappears, the client controls the first virtual object to switch from the ready-to-execute state of the first operation to the execution state of the first operation.
The execution state of the first operation refers to a state in which the first virtual object is executing the first operation, and the third touch operation is a touch operation for the first operation control. Optionally, the operation attribute of the third touch operation may be the same as the operation attribute of the first touch operation; alternatively, the operation attribute of the third touch operation may be different from both the operation attribute of the first touch operation and the operation attribute of the second touch operation, which is not limited in this embodiment of the application.
Taking throwing a virtual article as an example of the first operation, with reference to fig. 5: when the first virtual object is in the ready-to-execute state of the first operation, the ready-to-execute interface 50 of the first operation is displayed; after the client detects the second touch operation, the client controls the first virtual object to switch from the ready-to-execute state of the first operation back to the unexecuted state of the first operation, and displays the unexecuted interface 51 of the first operation. The second touch operation may be a touch operation on the first operation control 52, or a touch operation on the second operation control 53.
It should be noted that, in the embodiment of the present application, in a case that the operation attribute of the first touch operation is different from the operation attribute of the second touch operation, a user or a designer may set the two operation attributes according to the actual situation. In one possible implementation, the operation attribute includes the number of clicks. Optionally, the first touch operation is a single-click operation, and the second touch operation is a double-click operation. The single-click operation refers to an operation of clicking a certain action point once within a preset time period, and the double-click operation refers to an operation of clicking the action point twice within the preset time period. In another possible implementation, the operation attribute includes an acting pressure value. Optionally, the first touch operation is a light-press operation, and the second touch operation is a heavy-press operation. The light-press operation refers to an operation whose acting pressure value at a certain action point is smaller than a threshold value, and the heavy-press operation refers to an operation whose acting pressure value at the action point is larger than the threshold value. In yet another possible implementation, the operation attribute includes an acting duration. Optionally, the first touch operation is a short-press operation, and the second touch operation is a long-press operation. The short-press operation refers to an operation whose acting duration at a certain action point is smaller than a threshold value, and the long-press operation refers to an operation whose acting duration at the action point is larger than the threshold value.
Of course, in actual application, the operation attribute may include other attribute information, which is not limited in the embodiment of the present application.
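The three attribute checks above — number of clicks, acting pressure value, and acting duration — can each be sketched as a small predicate; the threshold values and function names are illustrative assumptions, not values from the source:

```python
def is_double_click(click_times, window=0.3):
    """Two clicks on the same action point within the preset time
    period (window, in seconds; illustrative default)."""
    return (len(click_times) >= 2
            and click_times[-1] - click_times[-2] <= window)

def is_heavy_press(pressure, threshold=0.5):
    """Heavy press: acting pressure value larger than the threshold."""
    return pressure > threshold

def is_long_press(duration, threshold=0.5):
    """Long press: acting duration larger than the threshold (seconds)."""
    return duration > threshold
```

A client could use whichever predicate matches the configured attribute to distinguish the second touch operation from the first on the same control.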
To sum up, in the technical solution provided in this embodiment of the application, the first virtual object is controlled to enter the ready-to-execute state of the first operation through the first touch operation, and is controlled to leave the ready-to-execute state and enter the unexecuted state of the first operation through the second touch operation. Therefore, when the first virtual object is ready to execute the first operation, the user can flexibly control the first virtual object to cancel the execution of the first operation, which provides a remedy for false triggering of the first operation and reduces the probability of mistaken execution. Moreover, the first touch operation and the second touch operation are executed by the same finger; that is, they are continuous operations, so the user does not need to change fingers when switching from the first touch operation to the second touch operation, which reduces the operation difficulty. In other words, while the user adjusts the execution direction of the first operation with the right thumb and controls the movement of the first virtual object with the left thumb, the user can continue to use the right thumb to execute the second touch operation if the first operation needs to be cancelled. This provides a feasible scheme for simultaneously realizing movement, operation execution, and operation cancellation of the first virtual object with only two fingers, reducing the operation difficulty.
Next, the second operation control will be described.
In this embodiment of the application, the second operation control is configured to control the first virtual object to cancel execution of the first operation.
In one possible embodiment, the second operation control appears in the user interface simultaneously with the first operation control. Optionally, after receiving the opening instruction of the shooting application, the client displays a virtual environment screen, the first operation control, and the second operation control in the user interface.
In another possible implementation manner, the second operation control is displayed in the user interface after receiving the touch operation for the first operation control, and optionally, the client displays the second operation control in the user interface when receiving the first touch operation.
Optionally, in a case that the second touch operation is a sliding operation moving from the first operation control to the second operation control, the second operation control and the first operation control are displayed on the same side of the user interface, so as to facilitate the user's operation by reducing the sliding distance. With reference to fig. 6 and 7, and taking the vertical center line of the user interface as the boundary: when the user interface 60 receives the first touch operation and the first operation control 61 corresponding to the first touch operation is located on the left side of the user interface, the second operation control 62 is also displayed on the left side of the user interface 60; when the user interface 70 receives the first touch operation and the first operation control 71 corresponding to the first touch operation is on the right side of the user interface, the second operation control 72 is also displayed on the right side of the user interface 70.
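The same-side placement rule — split the interface at its vertical center line and place the second operation control on the side holding the first operation control — can be sketched as follows, with the function name and string return values being illustrative assumptions:

```python
def second_control_side(first_control_x, ui_width):
    """Place the second operation control on the same side of the
    vertical center line as the first operation control, shortening
    the sliding distance (illustrative sketch)."""
    return "left" if first_control_x < ui_width / 2 else "right"
```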
In a possible embodiment, the second operation control may be an operation control dedicated to cancel the first operation. Optionally, after receiving the first touch operation, the client displays the second operation control, such as the second operation control 62 shown in fig. 6 or the second operation control 72 shown in fig. 7, in the user interface.
In another possible implementation, the second operation control is an operation control for executing a second operation, where the second operation is an operation that cannot be executed in the ready-to-execute state of the first operation. Optionally, after receiving the first touch operation, the client acquires the operation control corresponding to the second operation, uses it as the second operation control, and changes its display style to the display style of the second operation control. Taking the shooting game as an example, as shown in fig. 4, in the unexecuted interface 40 of the first operation, after the client receives the first touch operation, if the first touch operation is directed to the right-side shooting button 41 and the right-side shooting button 41 is used to control the first virtual object to shoot in the open-mirror state, then when the first virtual object enters the ready-to-execute state of the first operation, both the left-side shooting button 43 and the open mirror button 44 can serve as the second operation control. Further, in the ready-to-execute interface 42 of the first operation, the left-side shooting button 43 and the open mirror button 44 are used as second operation controls, and their display styles are changed. Of course, when the number of operation controls corresponding to the second operation is not unique, the client may randomly select one of them as the second operation control, or may select the operation control closest to the first operation control as the second operation control, which is not limited in this embodiment of the application.
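The "closest operation control" selection mentioned above amounts to a minimum-distance search over the candidate controls; the sketch below assumes controls are represented by their center coordinates, and all names are illustrative:

```python
import math

def pick_second_control(first_center, candidate_centers):
    """From the operation controls corresponding to the second
    operation, select the one closest to the first operation control
    as the second operation control (illustrative sketch)."""
    return min(candidate_centers,
               key=lambda c: math.hypot(c[0] - first_center[0],
                                        c[1] - first_center[1]))
```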
Of course, in actual use, the user or the designer may set a fixed operation control among the operation controls corresponding to the second operation as the second operation control. As shown in fig. 8, in the ready-to-execute interface 42 of the first operation, the open mirror button 44 is used as the second operation control and its display style is changed, while the display style of the left-side shooting button 43 remains unchanged.
It should be noted that, in the embodiment of the present application, the user may also set the display mode of the second operation control according to the actual situation.
Taking a shooting game as an example, assuming that a user sets a second operation control to be displayed on a different side from a first operation control, and the second operation control is an operation control corresponding to the second operation, as shown in fig. 9, the technical scheme provided by the present application includes the following steps:
in step 901, the client detects a touch operation corresponding to the shooting button.
In step 902, the client determines whether the shooting button is a right shooting button. If yes, go to step 903; if not, go to step 904.
In step 903, the client changes the display style of the left shooting button into the display style of the cancel shooting button.
In step 904, the client changes the display style of the right shoot button to the display style of the cancel shoot button.
In step 905, the client controls the first virtual object to switch from the non-execution state of the shooting operation to the preparation execution state of the shooting operation.
In step 906, the client determines whether the user clicks the cancel shooting button. If yes, go to step 907; if not, go to step 908.
In step 907, the client controls the first virtual object to switch from the ready-to-execute state of the shooting operation to the non-execute state of the shooting operation.
Assuming that a user sets a second operation control to be displayed on the same side as the first operation control, and the second operation control is not an operation control corresponding to the second operation, as shown in fig. 10, the technical solution provided by the present application includes the following steps:
step 1001, the client detects a touch operation corresponding to the shooting button.
In step 1002, the client determines whether the shooting button is a right shooting button. If yes, go to step 1003; if not, go to step 1004.
In step 1003, the client displays a cancel shooting button on the right side of the user interface.
In step 1004, the client displays a cancel shooting button on the left side of the user interface.
In step 1005, the client controls the first virtual object to switch from the non-execution state of the shooting operation to the preparation execution state of the shooting operation.
In step 1006, the client determines whether the user's finger has slid off the right shooting button without being released. If yes, go to step 1007; if not, go to step 1010.
Step 1007, the client keeps the first virtual object in a ready-to-execute state for the shooting operation.
In step 1008, the client determines whether the user's finger enters the trigger zone of the cancel shooting button and is then released. If yes, go to step 1009; if not, return to step 1008.
In step 1009, the client controls the first virtual object to be switched from the ready-to-execute state of the shooting operation to the non-execute state of the shooting operation.
In step 1010, the client controls the first virtual object to switch from the ready-to-execute state of the shooting operation to the execute state of the shooting operation.
In step 1011, the client controls the first virtual object to return to the unexecuted state of the shooting operation.
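The fig. 10 flow above — press the shooting button to enter the ready-to-execute state, then either release inside the cancel button's trigger zone to cancel or release on the shooting button to fire — can be sketched as an event-driven loop; the event names and state strings are illustrative assumptions, not from the source:

```python
def shooting_flow(events):
    """Minimal sketch of the fig. 10 flow: a press on the shooting
    button readies the shot, releasing on the cancel shooting button
    returns to the unexecuted state, and releasing on the shooting
    button switches to the execution state (illustrative names)."""
    state = "UNEXECUTED"
    for event in events:
        if event == "press_shoot":
            state = "READY"                     # steps 1001-1005
        elif event == "release_on_cancel" and state == "READY":
            state = "UNEXECUTED"                # steps 1008-1009
        elif event == "release_on_shoot" and state == "READY":
            state = "EXECUTING"                 # step 1010
    return state
```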
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Referring to fig. 11, a block diagram of a control apparatus for a virtual object according to an embodiment of the present application is shown. The device has the function of realizing the control method of the virtual object, and the function can be realized by hardware or by hardware executing corresponding software. The device can be a terminal or be arranged in the terminal. The apparatus 1100 may include: a first control display module 1101, a preparation state switching module 1102 and an unexecuted state switching module 1103.
The first control display module 1101 is configured to display a virtual environment picture and a first operation control, where the virtual environment picture is a display picture used for displaying a virtual environment provided by a shooting application, and the first operation control is used for controlling a first virtual object in the virtual environment to execute a first operation.
A preparation state switching module 1102, configured to, in response to receiving a first touch operation, control the first virtual object to switch from an unexecuted state of the first operation to a preparation executed state of the first operation; wherein the first touch operation is a touch operation for the first operation control.
An unexecuted state switching module 1103, configured to, in response to receiving a second touch operation, control the first virtual object to switch from a ready-to-execute state of the first operation to an unexecuted state of the first operation; wherein the second touch operation and the first touch operation are executed by the same finger.
In an exemplary embodiment, the second touch operation is a sliding operation that moves from the first operation control to a second operation control, and the second operation control is used to cancel execution of the first operation.
In an exemplary embodiment, the second touch operation is a touch operation for a second operation control, and the second operation control is configured to cancel execution of the first operation.
In an exemplary embodiment, the second touch operation is a touch operation for the first operation control, and an operation attribute of the second touch operation is different from an operation attribute of the first touch operation.
In an exemplary embodiment, the first touch operation is a single-click operation, and the second touch operation is a double-click operation; or the first touch operation is a light press operation, and the second touch operation is a heavy press operation.
In an exemplary embodiment, as shown in fig. 12, the apparatus 1100 further comprises: a readiness state retention module 1104.
A ready state holding module 1104, configured to, in response to disappearance of the first touch operation, control the first virtual object to hold a ready execution state of the first operation.
In an exemplary embodiment, as shown in fig. 12, the apparatus 1100 further comprises: the status switch module 1105 is executed.
An execution state switching module 1105, configured to, in response to receiving a third touch operation, control the first virtual object to switch from a ready-to-execute state of the first operation to an execute state of the first operation; wherein the third touch operation is a touch operation for the first operation control.
In an exemplary embodiment, as shown in fig. 12, the apparatus 1100 further comprises: a second control display module 1106.
A second control display module 1106, configured to display the second operation control in response to receiving the first touch operation; wherein the second operation control and the first operation control are displayed on the same side of the user interface.
In an exemplary embodiment, the first operation is any one of: open-mirror shooting and throwing a virtual article.
To sum up, in the technical solution provided in this embodiment of the application, the first virtual object is controlled to enter the ready-to-execute state of the first operation through the first touch operation, and is controlled to leave the ready-to-execute state and enter the unexecuted state of the first operation through the second touch operation. Therefore, when the first virtual object is ready to execute the first operation, the user can flexibly control the first virtual object to cancel the execution of the first operation, which provides a remedy for false triggering of the first operation and reduces the probability of mistaken execution. Moreover, the first touch operation and the second touch operation are executed by the same finger; that is, they are continuous operations, so the user does not need to change fingers when switching from the first touch operation to the second touch operation, which reduces the operation difficulty. In other words, while the user adjusts the execution direction of the first operation with the right thumb and controls the movement of the first virtual object with the left thumb, the user can continue to use the right thumb to execute the second touch operation if the first operation needs to be cancelled. This provides a feasible scheme for simultaneously realizing movement, operation execution, and operation cancellation of the first virtual object with only two fingers, reducing the operation difficulty.
It should be noted that, when the apparatus provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Referring to fig. 13, a block diagram of a terminal 1300 according to an embodiment of the present application is shown. The terminal 1300 may be an electronic device such as a mobile phone, a tablet computer, a game console, an electronic book reader, a multimedia player, a wearable device, a PC, etc. The terminal is used for implementing the control method of the virtual object provided in the above embodiment. The terminal may be the terminal 10 in the game play environment of fig. 1. Specifically, the method comprises the following steps:
in general, terminal 1300 includes: a processor 1301 and a memory 1302.
Processor 1301 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. The processor 1301 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1301 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1301 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, processor 1301 may further include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
Memory 1302 may include one or more computer-readable storage media, which may be non-transitory. The memory 1302 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1302 is used to store at least one instruction, at least one program, set of codes, or set of instructions configured to be executed by one or more processors to implement the control method of a virtual object described above.
In some embodiments, terminal 1300 may further optionally include: a peripheral interface 1303 and at least one peripheral. Processor 1301, memory 1302, and peripheral interface 1303 may be connected by a bus or signal line. Each peripheral device may be connected to the peripheral device interface 1303 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1304, display screen 1305, camera assembly 1306, audio circuitry 1307, positioning assembly 1308, and power supply 1309.
Those skilled in the art will appreciate that the configuration shown in fig. 13 is not intended to be limiting with respect to terminal 1300 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be employed.
In an exemplary embodiment, there is also provided a computer-readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which when executed by a processor, implements the control method of the above-described virtual object.
Optionally, the computer-readable storage medium may include: ROM (Read Only Memory), RAM (Random Access Memory), SSD (Solid State drive), or optical disc. The Random Access Memory may include a ReRAM (resistive Random Access Memory) and a DRAM (Dynamic Random Access Memory).
In an exemplary embodiment, a computer program product or computer program is also provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the control method of the virtual object.
It should be understood that reference to "a plurality" herein means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. In addition, the step numbers described herein only exemplarily show one possible execution sequence among the steps, and in some other embodiments, the steps may also be executed out of the numbering sequence, for example, two steps with different numbers are executed simultaneously, or two steps with different numbers are executed in a reverse order to the order shown in the figure, which is not limited by the embodiment of the present application.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (12)

1. A method of controlling a virtual object, the method comprising:
displaying a virtual environment picture and a first operation control, wherein the virtual environment picture is a display picture used for showing a virtual environment provided by a shooting application, and the first operation control is used for controlling a first virtual object in the virtual environment to execute a first operation;
in response to receiving a first touch operation, controlling the first virtual object to be switched from an unexecuted state of the first operation to a ready-to-execute state of the first operation; wherein the first touch operation is a touch operation for the first operation control;
in response to receiving a second touch operation, controlling the first virtual object to be switched from a ready-to-execute state of the first operation to an unexecuted state of the first operation; wherein the second touch operation and the first touch operation are executed by the same finger.
2. The method according to claim 1, wherein the second touch operation is a slide operation that moves from the first operation control to a second operation control, and the second operation control is used to cancel execution of the first operation.
3. The method according to claim 1, wherein the second touch operation is a touch operation for a second operation control, and the second operation control is used for cancelling the execution of the first operation.
4. The method according to claim 1, wherein the second touch operation is a touch operation for the first operation control, and an operation attribute of the second touch operation is different from an operation attribute of the first touch operation.
5. The method of claim 4,
the first touch operation is a single-click operation, and the second touch operation is a double-click operation;
or,
the first touch operation is a light-press operation, and the second touch operation is a heavy-press operation.
6. The method according to any one of claims 3 to 5, wherein after the controlling the first virtual object to switch from the unexecuted state of the first operation to the ready-to-execute state of the first operation, the method further comprises:
in response to disappearance of the first touch operation, controlling the first virtual object to maintain the ready-to-execute state of the first operation.
7. The method according to claim 6, wherein after the controlling the first virtual object to maintain the ready-to-execute state of the first operation, the method further comprises:
in response to receiving a third touch operation, controlling the first virtual object to switch from the ready-to-execute state of the first operation to an execution state of the first operation;
wherein the third touch operation is a touch operation for the first operation control.
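Claims 6-7 together describe a ready state that survives the finger being lifted, with a later third touch on the same control triggering execution. A minimal sketch, with illustrative names (the throw example is an assumption drawn from claim 9's virtual-item operation):

```python
class ThrowControl:
    """Sketch of claims 6-7: the ready state persists after touch-up,
    and a subsequent touch on the same control executes the operation."""

    def __init__(self):
        self.ready = False
        self.executed = False

    def on_touch_down(self):
        if not self.ready:
            self.ready = True       # first touch: enter ready state
        else:
            self.executed = True    # third touch: execute the operation

    def on_touch_up(self):
        # Claim 6: the disappearance of the first touch operation
        # does NOT leave the ready-to-execute state.
        pass

ctrl = ThrowControl()
ctrl.on_touch_down()   # first touch operation
ctrl.on_touch_up()     # finger lifted; still ready
ctrl.on_touch_down()   # third touch operation: execute
print(ctrl.ready, ctrl.executed)   # True True
```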
8. The method according to claim 2 or 3, wherein the method further comprises:
displaying the second operation control in response to receiving the first touch operation;
wherein the second operation control and the first operation control are displayed on the same side of the user interface.
9. The method according to any one of claims 1 to 5, wherein the first operation is any one of: scoped shooting (aiming down sights) and throwing a virtual item.
10. An apparatus for controlling a virtual object, the apparatus comprising:
a first control display module, configured to display a virtual environment picture and a first operation control, wherein the virtual environment picture is used for displaying a virtual environment provided by a shooting application, and the first operation control is used for controlling a first virtual object in the virtual environment to execute a first operation;
a ready-state switching module, configured to control, in response to receiving a first touch operation, the first virtual object to switch from an unexecuted state of the first operation to a ready-to-execute state of the first operation, wherein the first touch operation is a touch operation for the first operation control;
an unexecuted-state switching module, configured to control, in response to receiving a second touch operation, the first virtual object to switch from the ready-to-execute state of the first operation to the unexecuted state of the first operation, wherein the second touch operation and the first touch operation are performed by the same finger.
11. A terminal, comprising a processor and a memory, wherein the memory stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the method for controlling a virtual object according to any one of claims 1 to 9.
12. A computer-readable storage medium, having stored therein at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the method for controlling a virtual object according to any one of claims 1 to 9.
CN202010848012.5A 2020-08-21 2020-08-21 Virtual object control method, device, terminal and storage medium Active CN111905380B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010848012.5A CN111905380B (en) 2020-08-21 2020-08-21 Virtual object control method, device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN111905380A true CN111905380A (en) 2020-11-10
CN111905380B CN111905380B (en) 2024-04-30

Family

ID=73279306

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010848012.5A Active CN111905380B (en) 2020-08-21 2020-08-21 Virtual object control method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111905380B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113318434A (en) * 2021-06-10 2021-08-31 网易(杭州)网络有限公司 Game information processing method and device and storage medium

Citations (2)

Publication number Priority date Publication date Assignee Title
CN107422938A (en) * 2017-06-21 2017-12-01 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN111475029A (en) * 2020-04-16 2020-07-31 腾讯科技(深圳)有限公司 Operation method, device, equipment and storage medium of virtual prop

Non-Patent Citations (1)

Title
HONG, Handa: "How to cancel a grenade throw in Peace Elite (《和平精英》): tutorial on the grenade throw cancellation method", pages 1 - 2, Retrieved from the Internet <URL:https://www.9game.cn/wzry/3950455.html> *

Also Published As

Publication number Publication date
CN111905380B (en) 2024-04-30

Similar Documents

Publication Publication Date Title
CN110548288B (en) Virtual object hit prompting method and device, terminal and storage medium
CN111589148B (en) User interface display method, device, terminal and storage medium
CN110585731B (en) Method, device, terminal and medium for throwing virtual article in virtual environment
CN110681152B (en) Virtual object control method, device, terminal and storage medium
CN111905363B (en) Virtual object control method, device, terminal and storage medium
CN111821691A (en) Interface display method, device, terminal and storage medium
CN110597449B (en) Prop using method, device, terminal and storage medium based on virtual environment
CN110801629B (en) Method, device, terminal and medium for displaying virtual object life value prompt graph
CN113633975B (en) Virtual environment picture display method, device, terminal and storage medium
CN111921188B (en) Virtual object control method, device, terminal and storage medium
US20230330537A1 (en) Virtual object control method and apparatus, terminal and storage medium
JP2024512582A (en) Virtual item display methods, devices, electronic devices and computer programs
CN111589114B (en) Virtual object selection method, device, terminal and storage medium
CN111318015B (en) Virtual article control method, device, terminal and storage medium
CN111905380B (en) Virtual object control method, device, terminal and storage medium
CN111643895B (en) Operation response method, device, terminal and storage medium
CN112057859B (en) Virtual object control method, device, terminal and storage medium
CN111298438B (en) Virtual object control method, device, equipment and storage medium
CN113633974A (en) Method, device, terminal and storage medium for displaying real-time game-checking information of user
CN113617030B (en) Virtual object control method, device, terminal and storage medium
KR102648210B1 (en) Virtual object control method and device, terminal, and storage medium
CN115040863A (en) Skill control method, device, equipment and storage medium for virtual object
CN112426725A (en) Virtual object control method, device, terminal and storage medium
CN113680061A (en) Control method, device, terminal and storage medium of virtual prop
CN112402965A (en) Position monitoring and anti-monitoring method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant