CN113952728A - Method and device for controlling virtual character in game and electronic equipment


Info

Publication number
CN113952728A
Authority
CN
China
Prior art keywords
scene
user interface
graphical user
game
target
Prior art date
Legal status
Pending
Application number
CN202111288261.4A
Other languages
Chinese (zh)
Inventor
曹金旭 (Cao Jinxu)
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202111288261.4A priority Critical patent/CN113952728A/en
Publication of CN113952728A publication Critical patent/CN113952728A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/56 Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/422 Mapping input signals into game commands automatically for the purpose of assisting the player, e.g. automatic braking in a driving game
    • A63F13/426 Mapping input signals into game commands involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/6045 Methods for processing data for mapping control signals received from the input arrangement into game commands
    • A63F2300/6054 Mapping control signals into game commands by generating automatically game commands to assist the player, e.g. automatic braking in a driving game

Abstract

The invention provides a method and a device for controlling a virtual character in a game, and an electronic device; it relates to the technical field of games and solves the technical problem that the teleport operation of a virtual character is inconvenient. The method includes the following steps: in response to a specified operation on the graphical user interface meeting a preset condition, displaying a scene map corresponding to the game scene in the graphical user interface; in response to a selection operation on a target position in the scene map while the specified operation is maintained, determining a target scene position corresponding to the target position in the game scene; and in response to the specified operation ending after the selection operation, controlling the first virtual character to reach the target scene position.

Description

Method and device for controlling virtual character in game and electronic equipment
Technical Field
The present application relates to the field of game technologies, and in particular, to a method and a device for controlling a virtual character in a game, and an electronic device.
Background
In current games, a player can control a virtual character to quickly reach a target position in a game scene, i.e., the in-game teleport function. For example, a player can open the scene map by pressing a map-open key on the keyboard, press the up/down/left/right keys to select a teammate's position, and then press the "A" key to confirm and teleport to that teammate.
However, with this existing control method, the player's operation process is cumbersome: the teleport operation is inconvenient, the operation is inefficient, and the player's game experience suffers.
Disclosure of Invention
The application aims to provide a method and a device for controlling a virtual character in a game, and an electronic device, so as to solve the prior-art technical problem that the teleport operation of a virtual character is inconvenient.
In a first aspect, embodiments of the present application provide a method for controlling a virtual character in a game. A graphical user interface is provided through a terminal device; the content displayed by the graphical user interface at least partially includes a game scene of the game, and the game scene includes at least a first virtual character controlled by the terminal device. The method includes the following steps:
in response to a specified operation on the graphical user interface meeting a preset condition, displaying a scene map corresponding to the game scene in the graphical user interface;
in response to a selection operation on a target position in the scene map while the specified operation is maintained, determining a target scene position corresponding to the target position in the game scene;
and in response to the specified operation ending after the selection operation, controlling the first virtual character to reach the target scene position.
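The three steps above can be sketched as a small state machine. This is a hypothetical illustration, not the patent's implementation; the class name, method names, and the coordinate-mapping callback are all assumptions:

```python
from typing import Callable, Optional, Tuple

Pos = Tuple[float, float]

class TeleportController:
    """Hold opens the map, selecting picks a target, releasing teleports."""

    def __init__(self, start: Pos = (0.0, 0.0)):
        self.map_open = False
        self.target: Optional[Pos] = None
        self.character_pos = start

    def on_hold_satisfied(self) -> None:
        # Step 1: the specified operation has met the preset condition.
        self.map_open = True

    def on_select(self, map_pos: Pos, to_scene: Callable[[Pos], Pos]) -> None:
        # Step 2: a selection made while the specified operation is maintained.
        if self.map_open:
            self.target = to_scene(map_pos)

    def on_release(self) -> None:
        # Step 3: the specified operation ends; teleport if a target was chosen.
        if self.map_open and self.target is not None:
            self.character_pos = self.target
        self.map_open = False
        self.target = None
```

Releasing without having selected a target simply closes the map and leaves the character where it was.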
In one possible implementation, the step of displaying a scene map corresponding to the game scene in the graphical user interface in response to the specified operation on the graphical user interface meeting a preset condition includes:
in response to the specified operation on the graphical user interface meeting the preset condition, displaying a scene map corresponding to the game scene in the graphical user interface in a specified display manner.
In one possible implementation, the specified display manner includes any one or more of the following:
a teleport display mode with a specified display transparency, a specified display position, and a specified display duration.
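For illustration, the listed options might be grouped in a configuration object such as the following sketch, where every default value is an assumption rather than anything the patent specifies:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TeleportDisplayMode:
    """Assumed container for the teleport display mode's options."""
    transparency: float = 0.5            # specified display transparency (0 = opaque)
    position: Tuple[int, int] = (0, 0)   # specified display position on screen, in px
    duration: Optional[float] = None     # specified display duration; None = until release
```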
In one possible implementation, the method further includes:
hiding the scene map in the graphical user interface in response to the specified operation ending.
In one possible implementation, meeting the preset condition includes the duration of the specified operation being greater than a preset duration. The method further includes:
in response to the duration of a specified operation on the graphical user interface being less than or equal to the preset duration, displaying the scene map in the graphical user interface in a preset open display manner;
where the display transparency of the preset open display manner is zero.
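The branch between the teleport display mode and the conventional open mode might look like the following sketch; the 1-second threshold follows the example used later in the embodiment, while the non-zero transparency value is an assumption:

```python
PRESET_DURATION = 1.0  # seconds; the embodiment uses 1 s as its example

def choose_display_mode(press_duration: float) -> dict:
    """Pick a display mode from how long the specified operation lasted."""
    if press_duration > PRESET_DURATION:
        # Teleport-selection mode, e.g. a semi-transparent overlay.
        return {"mode": "teleport", "transparency": 0.5}
    # Conventional open mode: display transparency of zero (fully opaque).
    return {"mode": "open", "transparency": 0.0}
```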
In one possible implementation, after the step of displaying the scene map in the graphical user interface in the preset open display manner in response to the duration of the specified operation on the graphical user interface being less than or equal to the preset duration, the method further includes:
hiding the scene map in the graphical user interface in response to the duration of a further specified operation on the graphical user interface being less than or equal to the preset duration.
In one possible implementation, the step of determining a target scene position corresponding to the target position in the game scene in response to a selection operation on a target position in the scene map while the specified operation is maintained includes:
in response to a selection operation on any position in the scene map while the specified operation is maintained, determining the target position selected by the selection operation and displaying identification information of the target position in the graphical user interface;
and determining a target scene position corresponding to the target position in the game scene.
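How a map position corresponds to a scene position is left open by the text; a simple proportional mapping, shown here purely as an assumption, is one way to realize the correspondence:

```python
from typing import Tuple

def map_to_scene(map_pos: Tuple[float, float],
                 map_size: Tuple[float, float],
                 scene_size: Tuple[float, float]) -> Tuple[float, float]:
    """Proportionally map a minimap coordinate to a scene coordinate.
    A linear mapping is assumed; the patent only requires that a
    correspondence between the two exist."""
    return (map_pos[0] / map_size[0] * scene_size[0],
            map_pos[1] / map_size[1] * scene_size[1])
```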
In one possible implementation, the specified operation includes any one or more of:
a touch operation on a map-open control, and a press operation on a first designated key;
where the first designated key is used to open the scene map.
In one possible implementation, the selection operation includes any one or more of:
a touch operation on the scene map, and a press operation on a second designated key;
where the second designated key is used to select a position in the scene map.
In one possible implementation, after the step of determining a target scene position corresponding to the target position in the game scene in response to a selection operation on a target position in the scene map while the specified operation is maintained, the method further includes:
canceling the determination of the target position and the target scene position in response to an operation other than the selection operation and the specified operation.
In one possible implementation, the other operations include any one or more of:
an operation on a region of the graphical user interface outside the scene map and the map-open control, and a press operation on a key other than the first designated key and the second designated key;
where the first designated key is used to open the scene map, and the second designated key is used to select a position in the scene map.
In one possible implementation, the step of canceling the determination of the target position and the target scene position in response to an operation other than the selection operation and the specified operation includes:
hiding the scene map in the graphical user interface and canceling the determination of the target position and the target scene position in response to an operation other than the selection operation and the specified operation.
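The cancel-on-other-operation rule can be sketched as an event filter; the event names here are invented for illustration:

```python
# Events corresponding to the specified and selection operations (hypothetical names).
ALLOWED_EVENTS = {"hold_map_control", "press_first_key",
                  "press_second_key", "touch_scene_map"}

def on_event(state: dict, event: str) -> dict:
    """Any operation outside the specified and selection operations
    hides the map and cancels the pending target; otherwise the
    teleport-selection state is kept unchanged."""
    if event in ALLOWED_EVENTS:
        return state
    return {**state, "map_open": False, "target": None}
```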
In one possible implementation, the target position includes a static position in the scene map and/or a dynamic position in the scene map.
In a second aspect, a device for controlling a virtual character in a game is provided. A graphical user interface is provided through a terminal device; the content displayed by the graphical user interface at least partially includes a game scene of the game, and the game scene includes at least a first virtual character controlled by the terminal device. The device includes:
a display module, configured to display a scene map corresponding to the game scene in the graphical user interface in response to a specified operation on the graphical user interface meeting a preset condition;
a determination module, configured to determine a target scene position corresponding to a target position in the game scene in response to a selection operation on the target position in the scene map while the specified operation is maintained;
and a control module, configured to control the first virtual character to reach the target scene position in response to the specified operation ending after the selection operation.
In a third aspect, an embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the memory stores a computer program that is executable on the processor, and the processor implements the steps of the method according to the first aspect when executing the computer program.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium storing computer-executable instructions which, when invoked and executed by a processor, cause the processor to perform the method of the first aspect.
The embodiment of the application brings the following beneficial effects:
according to the method and the device for controlling the virtual character in the game and the electronic equipment, the scene map corresponding to the game scene can be displayed in the graphical user interface in response to the fact that the designated operation for the graphical user interface meets the preset condition, the target scene position corresponding to the target position is determined in the game scene in response to the selection operation for the target position in the scene map while the designated operation is kept, and the first virtual character is controlled to reach the target scene position in response to the fact that the operation of the designated operation is finished after the selection operation. According to the scheme, the scene map capable of selecting the target position is triggered when the designated operation meets the preset condition, the designated operation after the target position is selected is finished, the target position is quickly transmitted to the target position, the quick and convenient operation of transmission is realized by using the combined keys, the original multiple operation process is simplified, the efficiency of transmission operation is improved, and the technical problem that the transmission operation convenience degree of the virtual character is lower in the prior art is solved.
Drawings
To illustrate the embodiments of the present application or the prior-art technical solutions more clearly, the drawings needed in their description are briefly introduced below. The drawings described below show some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 shows a schematic structural diagram of a touch terminal provided in an embodiment of the present application;
fig. 3 is a schematic view of a usage scenario of a touch terminal according to an embodiment of the present application;
fig. 4 is a schematic flowchart of a method for controlling a virtual character in a game according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an electronic device displaying a graphical user interface provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of an electronic device displaying another graphical user interface provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of an electronic device displaying another graphical user interface provided by an embodiment of the present application;
fig. 8 is a structural schematic diagram of a device for controlling a virtual character in a game according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the present application will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "comprising" and "having," and any variations thereof, as referred to in the embodiments of the present application, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements but may alternatively include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In existing games, a player can control a virtual character to perform a teleport action in a game scene. For example, a player can open the map by pressing a map-open key on the keyboard, press the up/down/left/right keys to select a teammate's position, and then press the A key to confirm and teleport to that teammate.
However, with this existing control method, the player cannot observe the surrounding battlefield during the operation and cannot move or fight, and so is easily ambushed and killed by the enemy. The operation process is cumbersome, time-consuming, and costly; the teleport operation is inconvenient, cannot satisfy the player's need for quick teleporting, and harms the player's game experience.
Based on this, the embodiment of the application provides a method and a device for controlling a virtual character in a game and an electronic device, and the method can solve the technical problem that the transmission operation convenience of the virtual character is low in the prior art.
In one embodiment of the present application, the method for controlling the virtual character in the game may be executed on a local terminal device or a server. When the control method of the virtual character in the game runs on the server, the method can be implemented and executed based on a cloud interaction system, wherein the cloud interaction system comprises the server and the client device.
In an optional embodiment, various cloud applications, such as cloud games, may run under the cloud interaction system. Taking cloud games as an example: a cloud game is a game mode based on cloud computing. In this mode, the body that runs the game program is separated from the body that presents the game picture. The storage and execution of the method for controlling a virtual character in the game are completed on the cloud game server, while the client device receives and sends data and presents the game picture; for example, the client device may be a display device with data-transmission capability close to the user side, such as a mobile terminal, a television, a computer, or a palmtop computer, whereas the information processing is done by the cloud game server in the cloud. During play, the player operates the client device to send operation instructions to the cloud game server; the server runs the game according to those instructions, encodes and compresses data such as the game pictures, and returns them to the client device over the network; finally, the client device decodes the data and outputs the game pictures.
In an optional implementation, taking a locally run game as an example, the local terminal device stores the game program and presents the game picture. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed, and run on an electronic device. The local terminal device may provide the graphical user interface to the player in various ways: for example, it may be rendered on the display screen of the terminal, or provided to the player through holographic projection. For instance, the local terminal device may include a display screen for presenting a graphical user interface that includes the game picture, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen.
In a possible implementation manner, an embodiment of the present application provides a method for controlling a virtual character in a game, where a graphical user interface is provided through a terminal device, where the terminal device may be the aforementioned local terminal device, and may also be the aforementioned client device in a cloud interaction system.
For example, as shown in fig. 1, fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application. The application scenario may include a touch terminal (e.g., a cell phone 102) and a server 101, and the touch terminal may communicate with the server 101 through a wired or wireless network. The touch terminal runs a virtual desktop, through which it can interact with the server 101 to control a virtual character on the server 101.
The touch terminal of this embodiment is described taking the mobile phone 102 as an example. The handset 102 includes Radio Frequency (RF) circuitry 210, memory 220, a touch screen 230, a processor 240, and the like. Those skilled in the art will appreciate that the handset configuration shown in fig. 2 is not limiting: it may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. Those skilled in the art will also appreciate that the touch screen 230 is part of the User Interface (UI), and that the cell phone 102 may include fewer or different user-interface elements than illustrated.
The RF circuitry 210 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, and Short Messaging Service (SMS).
The memory 220 may be used to store software programs and modules; the processor 240 performs the various functional applications and data processing of the cellular phone 102 by running the software programs and modules stored in the memory 220. The memory 220 may mainly include a program storage area and a data storage area: the program storage area may store an operating system, the application program required for at least one function, and the like; the data storage area may store data created through use of the handset 102, and the like. Further, the memory 220 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The touch screen 230 may be used to display a graphical user interface and receive user operations on that interface. Specifically, the touch screen 230 may include a display panel and a touch panel. The display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may collect contact or non-contact operations by the user on or near it (for example, as shown in fig. 3, operations performed with any suitable object or accessory such as a finger 301 or a stylus) and generate preset operation instructions. The touch panel may include a touch detection device and a touch controller: the touch detection device detects the user's touch direction and gestures, detects the signals produced by the touch operation, and passes them to the touch controller; the touch controller receives this touch information, converts it into information the processor can handle, sends it to the processor 240, and receives and executes commands sent back by the processor 240. The touch panel may be implemented as a resistive, capacitive, infrared, or surface-acoustic-wave type, or with any technology developed in the future. Further, the touch panel may cover the display panel; the user can operate on or near the touch panel according to the graphical user interface displayed beneath it, the touch panel detects the operation and passes it to the processor 240 to determine the user input, and the processor 240 provides the corresponding visual output on the display panel in response.
The touch panel and the display panel can be implemented either as two independent components or integrated into one.
The processor 240 is the control center of the handset 102, connects various parts of the entire handset using various interfaces and lines, and performs various functions and processes of the handset 102 by running or executing software programs and/or modules stored in the memory 220 and calling data stored in the memory 220, thereby performing overall monitoring of the handset.
Embodiments of the present application are further described below with reference to the accompanying drawings.
Fig. 4 is a flowchart of a method for controlling a virtual character in a game according to an embodiment of the present application. The method may be applied to a terminal device capable of presenting a graphical user interface (e.g., the mobile phone 102 shown in fig. 2); the graphical user interface is provided by the terminal device, the content it displays at least partially includes a game scene of the game, and the game scene includes at least a first virtual character controlled by the terminal device. As shown in fig. 4, the method includes:
Step S410: in response to a specified operation on the graphical user interface meeting a preset condition, display a scene map corresponding to the game scene in the graphical user interface.
It should be noted that the specified operation may take various forms, for example a touch operation, a press operation, or a click operation on the graphical user interface. Likewise, there are various ways the specified operation can meet the preset condition: for example, the touch duration of a touch operation exceeds a preset duration, the pressure of a press operation exceeds a preset pressure, or the click count of a click operation reaches a preset count. The preset duration is not limited, e.g. 1 second or 2 seconds; this embodiment uses 1 second as an example. Likewise, the preset pressure and the preset count may be set in advance without limitation.
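The example conditions above could be checked roughly as follows. This is a sketch; the field names and the non-duration thresholds are assumptions, while the 1-second duration follows the embodiment's example:

```python
def meets_preset_condition(op: dict) -> bool:
    """Check the example preset conditions for each operation form."""
    if op["type"] == "touch":
        return op["duration"] > 1.0              # preset duration: 1 second
    if op["type"] == "press":
        return op["pressure"] > op["threshold"]  # preset pressure (assumed field)
    if op["type"] == "click":
        return op["count"] >= op["required"]     # preset click count (assumed field)
    return False
```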
In the embodiment of the present application, the control device that responds to the player's operations may include, but is not limited to, any device capable of receiving player input, such as a keyboard, a gamepad, or a touch-screen mobile phone. Illustratively, the system displays the scene map on the graphical user interface once the player has pressed the open-scene-map key in the graphical user interface for 1 second.
Step S420: in response to a selection operation on a target position in the scene map while the specified operation is maintained, determine a target scene position corresponding to the target position in the game scene.
The target position can be selected in various ways. As one example, the target position is a static position in the scene map, e.g., a specific coordinate in the scene map, the fixed location of a virtual building, the fixed location of a virtual plant, and so on. As another example, the target position is a dynamic position in the scene map, e.g., the position of a moving target NPC, or the position of a virtual character other than the first virtual character, such as a teammate. Note that if a dynamic position is selected, the position at the moment the touch ends is used as the teleport target.
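The static/dynamic distinction amounts to when the target coordinate is sampled; one way to model it, shown here purely as an assumption, is to represent a dynamic target as a callable that is sampled at the moment the specified operation ends:

```python
from typing import Callable, Tuple, Union

Pos = Tuple[float, float]
Target = Union[Pos, Callable[[float], Pos]]

def resolve_target(target: Target, release_time: float) -> Pos:
    """A static target returns its fixed coordinate; a dynamic target
    (e.g. a moving teammate, modelled here as a function of time) is
    sampled at the moment the hold is released."""
    return target(release_time) if callable(target) else target
```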
In practical applications, a player may select a target position in a scene map through a keyboard, a joystick, a direction key on a touch-screen mobile phone, and other devices, for example, select one teammate, and then determine a target scene position corresponding to the teammate position in a game scene.
While selecting the target position in the scene map, the player keeps holding the designation operation of step S410, for example, keeping the long-press operation active. For example, the player may hold the designation operation with one finger and perform the selection operation on the target position in the scene map with another finger.
In step S430, in response to the end of the designation operation after the selection operation, the first virtual character is controlled to reach the target scene position.
For example, after the player selects a teammate's position in the scene map and then releases the open-scene-map key in the graphical user interface, the system transmits the first virtual character to the target scene position corresponding to that teammate's position in the game scene.
In the embodiment of the application, a scene map in which a target position can be selected is triggered when the designated operation meets the preset condition, and ending the designated operation after the target position is selected completes the transmission. This combined-key interaction makes transmission quick and convenient, simplifies the originally repetitive operation flow, improves the efficiency of the transmission operation, and thereby solves the technical problem in the prior art that transmitting a virtual character is inconvenient.
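The three steps S410 to S430 can be sketched as a small state machine. The following Python sketch is purely illustrative; the callback names and the map-to-scene mapping are hypothetical and not part of the embodiment:

```python
class TeleportFlow:
    """Hold the designated operation to open the map, select a target
    while holding, and release to transmit the first virtual character."""

    def __init__(self, start_pos):
        self.map_open = False
        self.pending_target = None   # target scene position awaiting release
        self.character_pos = start_pos

    def on_designated_operation_met(self):        # step S410
        self.map_open = True

    def on_select(self, map_pos, map_to_scene):   # step S420
        if self.map_open:
            self.pending_target = map_to_scene(map_pos)

    def on_designated_operation_end(self):        # step S430
        self.map_open = False
        if self.pending_target is not None:
            self.character_pos = self.pending_target  # transmit
            self.pending_target = None
        # if no target was selected, the map simply closes
```

Releasing without selecting merely closes the map, matching the hide-on-release behavior described later in step b).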
The above steps are described in detail below.
In some embodiments, the scene map may be displayed in a designated display manner that differs from the conventional manner of opening the scene map, so that the map-only open function and the transfer selection function can be distinguished. As an example, the step S410 may specifically include the following steps:
step a), responding to that the designated operation aiming at the graphical user interface meets the preset condition, and displaying a scene map corresponding to the game scene in the graphical user interface according to a designated display mode.
Illustratively, as shown in fig. 5, after the player has pressed the open-scene-map key in the graphical user interface for 1 second, the system displays a scene map 501 on the graphical user interface. According to the designated display manner, the scene map 501 may be displayed semi-transparently, so that the scene map 501 does not obstruct the first virtual character 502 and other parts of the game scene (e.g., the trees in fig. 5).
By displaying the scene map in the designated display manner, the scene map is prevented from blocking the game picture after the player opens it, so that the player can observe the surrounding battlefield information while viewing the scene map, avoiding being ambushed and killed by an enemy, and improving the player's game experience.
Based on step a), the designated display manner may include several options, so that the scene map can be displayed flexibly. Illustratively, the designated display manner includes any one or more of the following:
a transmission display manner with a designated display transparency, a designated display position, and a designated display duration.
As an example, the scene map 501 may be displayed with a designated display transparency. As shown in fig. 5, the system displays the scene map 501 semi-transparently, so that the scene map 501 does not block the game scene picture and the player can observe both the scene map 501 and the surrounding battlefield information.
As another example, the scene map 501 may be displayed at a designated display position. As shown in fig. 5, the system displays the scene map 501 in the middle of the graphical user interface, so that the player's line of sight can focus on the scene map 501 while peripheral vision still takes in the surrounding battlefield information, again preventing the scene map from blocking the game scene picture.
As another example, the scene map 501 may be displayed for a designated display duration. As shown in fig. 5, after the scene map 501 has been displayed for 5 seconds, the system automatically hides it, preventing the scene map 501 from occupying the graphical user interface for a long time. The display duration is not limited, for example, 5 seconds or 7 seconds; this embodiment takes 5 seconds as an example.
Because the designated display manner covers multiple options, the player does not lose sight of the battlefield when the scene map is open; the scene map does not block the game picture, the player can observe the surrounding battlefield information while viewing the scene map, and the player's game experience is improved.
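The three parameters of the designated display manner (transparency, position, duration) could be grouped in a configuration object like the following Python sketch; the field names, value conventions, and the 5-second auto-hide check are illustrative assumptions, not the embodiment's implementation:

```python
from dataclasses import dataclass

@dataclass
class MapDisplayMode:
    """Designated display manner: transparency in [0, 1] (0 = opaque),
    an on-screen anchor, and an auto-hide duration in seconds."""
    transparency: float = 0.5        # semi-transparent, as in fig. 5
    position: str = "center"         # middle of the graphical user interface
    display_duration: float = 5.0    # auto-hide after 5 seconds

    def should_auto_hide(self, shown_at, now):
        """True once the map has been on screen for the full duration."""
        return now - shown_at >= self.display_duration
```

A game would poll `should_auto_hide` each frame (or schedule a timer) and hide the map when it fires.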
Based on step a), if the player does not wish to perform a selection operation on a target position in the game scene, the player may simply end the designated operation to close the scene map quickly and flexibly. As an example, the method may further comprise the following step:
step b), hiding the scene map in the graphical user interface in response to the end of the designated operation.
For example, as shown in fig. 5, after the player has pressed the open-scene-map key in the graphical user interface for 1 second, the system displays the scene map 501 on the graphical user interface; if the player then releases the open-scene-map key without performing a selection operation on any virtual target object, the system stops displaying the scene map 501.
When the player has opened the scene map by operating the corresponding scene-map key, the player can close it simply by releasing that key. This enables a quick close operation, simplifies the process of closing the scene map, improves operation efficiency, and improves the player's game experience.
Based on step a), when the player's operation on the graphical user interface does not meet the preset condition, such as a click operation, the system may display the scene map in a preset open display manner different from the designated display manner, so that the player can flexibly choose between the map-only open function and the transmission function. As an example, meeting the preset condition includes the duration of the designated operation being greater than a preset duration; the method may further comprise the following step:
step c), in response to the duration of the designated operation for the graphical user interface being less than or equal to the preset duration, displaying the scene map in the graphical user interface in a preset open display manner.
For step c), the display transparency of the preset open display manner is zero.
The preset open display manner represents the conventional way of opening the map, in which the scene map is displayed opaquely, that is, simply opened without any display processing such as transparency or blurring.
In practical applications, as shown in fig. 6, when the player operates the open-scene-map key in the graphical user interface for 1 second or less, which can be understood as quickly clicking the key, the system displays a scene map 601 on the graphical user interface and blurs the first virtual character 602 and other parts of the game scene (e.g., the trees in fig. 5). According to the preset open display manner, the scene map 601 may be displayed opaquely.
By displaying the scene map in the preset open display manner, the scene map is shown opaquely and the game scene is blurred, so that the scene map is presented clearly and completely to the player. This makes it convenient for the player to flexibly choose the open function or the transmission function, allows the player to study the game situation in detail with the map open and plan the next operation accordingly, and improves the player's game experience.
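The split between the designated (transfer) display manner and the preset open display manner can be decided from the press duration alone. A minimal sketch, with the 1-second threshold and the mode names assumed for illustration:

```python
def choose_display_mode(press_duration, preset_duration=1.0):
    """A press held longer than the preset duration opens the map in the
    semi-transparent transfer mode (step a); a shorter press or quick click
    opens it in the conventional opaque mode (step c)."""
    if press_duration > preset_duration:
        return {"mode": "transfer", "transparency": 0.5}
    return {"mode": "plain_open", "transparency": 0.0}  # opaque, scene blurred
```

Note the boundary: the preset condition requires the duration to be strictly greater than the threshold, so a press of exactly the preset duration falls into the plain-open branch.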
Based on step c), when the map is open, clicking again closes the scene map, enabling flexible operation of the scene map. As an example, after step c), the method may further include the following step:
step d), hiding the scene map in the graphical user interface in response to a further operation for the graphical user interface whose duration is less than or equal to the preset duration.
Illustratively, as shown in fig. 6, when the player again operates the open-scene-map key in the graphical user interface for 1 second or less, that is, quickly clicks the key again, the system hides the scene map 601 in the graphical user interface.
When the scene map is displayed in the preset open display manner, the player can close it by clicking the corresponding scene-map key again. This enables a quick close operation, simplifies the process of closing the scene map, improves operation efficiency, and improves the player's game experience.
Based on step a), when the player selects a target position in the scene map, identification information of the currently selected target position may be displayed, so as to promptly and effectively indicate the current selection to the player. The system may also reduce the transparency of the scene map to improve its clarity, so that the player can better observe the target position to be transmitted to. As an example, the designated display manner includes a transmission display manner with a designated display transparency; step S420 may specifically include the following steps:
and e), responding to the selection operation aiming at any position in the scene map while keeping the specified operation, determining the target position selected by the selection operation, and displaying the identification information of the target position in the graphical user interface.
And f), determining a target scene position corresponding to the target position in the game scene.
In practical applications, when the player selects a target position on the scene map, the identification information of the target position can be visualized, for example, by highlighting the currently selected target position or marking it with a point; prompt information about the target position may also be displayed.
Displaying the identification information of the currently selected target position while the player makes the selection indicates the current selection to the player promptly and effectively.
In addition, the transparency of the scene map may be adjusted so that it does not obscure the player's view of the scene information. Illustratively, as shown in fig. 7, when the player selects a target position in the scene map through the direction keys on a keyboard, the transparency of the scene map 701 is reduced, improving its clarity. The position of the first virtual character 702 controlled by the player is shown in the scene map 701 as a figure 703, and the player can select the target position by moving an arrow icon 704. For example, the player may select the teammate 705 in the scene map 701, whereupon the target scene position corresponding to the teammate 705 is determined in the game scene. Reducing the display transparency of the scene map during the selection operation improves its clarity, so that the player can observe the map details and the game situation more clearly and select an appropriate transmission destination.
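The selection step described above, with direction keys moving a cursor while the hovered position is highlighted and the map becomes clearer, could be sketched as follows; the step table, the highlight representation, and the transparency values are assumptions for illustration only:

```python
class MapSelector:
    """Moves a selection cursor (like arrow icon 704) over the scene map
    and reports the highlighted position; the map's transparency drops
    while a selection is in progress, making the map clearer."""

    STEPS = {"up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0)}

    def __init__(self, start=(0, 0), idle_transparency=0.5):
        self.cursor = start
        self.transparency = idle_transparency

    def move(self, direction):
        dx, dy = self.STEPS[direction]
        self.cursor = (self.cursor[0] + dx, self.cursor[1] + dy)
        self.transparency = 0.2  # reduced transparency during selection
        return {"highlight": self.cursor}  # identification info (step e)
```

The returned highlight would drive whatever marker or prompt the interface uses to show the currently selected position.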
In some embodiments, the designated operation may take multiple forms, enabling multiple flexible ways of operating the scene map. As an example, the designated operation includes any one or more of the following:
a touch operation on a map-open control, and a press operation on a first designated key.
The first appointed key is used for opening the scene map.
In practical applications, the control device receiving the player's operations may include, but is not limited to, any device capable of receiving player input, such as a keyboard, a gamepad, or a touch-screen mobile phone. As an example, one second after the player presses and holds the "X" key on the keyboard (the first designated key, i.e., the open-scene-map key), the system displays the scene map on the graphical user interface. As another example, one second after the player presses and holds the "X" key on a gamepad, the system displays the scene map on the graphical user interface. As another example, one second after the player presses a map-open control in the graphical user interface provided by a touch-screen mobile phone, the system displays the scene map on that interface.
By covering multiple designated operations, the player can open the scene map through various control devices and operations, accommodating the different operating habits of different players and improving the game experience.
In some embodiments, the selection operation may take multiple forms, enabling multiple flexible ways of selecting the target position. Illustratively, the selection operation includes any one or more of the following:
a touch operation on the scene map, and a press operation on a second designated key; the second designated key is used to select a position in the scene map.
As one example, the player may perform the selection operation on the target position by pressing a direction key (the second designated key) on the keyboard. As another example, the player may perform the selection operation by pressing a direction key (the second designated key) on a gamepad. As another example, the player may perform the selection operation by tapping a direction control on a touch-screen mobile phone.
By covering multiple selection operations, the player can select the target position through various control devices and operations, accommodating the different operating habits of different players and improving the game experience.
In some embodiments, the player may cancel the selection operation on the target position, which improves operational fault tolerance and prevents the player from falling into a disadvantage in the game because of a misoperation. As an example, after step S420, the method may further include the following step:
and g), in response to other operations except the selection operation and the designation operation, canceling the determination of the target position and the target scene position.
For example, as shown in fig. 7, after selecting the teammate 705 in the scene map 701 and thereby determining the target scene position corresponding to the teammate 705 in the game scene, the player may decide to stay in place and not transmit; the player can then cancel the determination of the target position and the target scene position by pressing any key on the keyboard other than the open-scene-map key and the direction keys.
By providing a cancel operation distinct from the selection operation and the designation operation, the player can cancel the selection of the target position, preventing a misoperation from putting the player at a disadvantage in the game.
Based on step g), the cancel operation may take various forms, enabling flexible cancellation of the target position selection. Illustratively, the other operations include any one or more of the following:
an operation in the graphical user interface outside the scene map and the map-open control, and a press operation on a key other than the first designated key and the second designated key.
The first appointed key is used for opening a scene map; the second designation key is used to select a location in the scene map.
As one example, the player may cancel the determination of the target position and the target scene position by pressing any key on the keyboard other than the open-scene-map key (the first designated key) and the direction keys (the second designated key).
As another example, the player may cancel the determination of the target position and the target scene position by pressing any key on the gamepad other than the open-scene-map key (the first designated key) and the direction keys (the second designated key).
As another example, the player may cancel the determination of the target position and the target scene position by tapping any control on the touch-screen mobile phone other than the map-open control and the direction control.
By covering various other operations, the player can cancel the determination of the target position and the target scene position through various control devices and operations, accommodating the different operating habits of different players and improving the game experience.
Based on step g), the player may simultaneously dismiss the scene map when canceling the selection of the target position, so that the map does not block the view of the scene. Illustratively, step g) may specifically include the following step:
and h), in response to other operations except the selection operation and the designation operation, hiding the scene map in the graphical user interface and canceling the determination of the target position and the target scene position.
As one example, the player may cancel the determination of the target position and the target scene position by pressing any key on the keyboard other than the open-scene-map key (the first designated key) and the direction keys (the second designated key), whereupon the system hides the scene map in the graphical user interface.
As another example, the player may cancel the determination of the target position and the target scene position by pressing any key on the gamepad other than the open-scene-map key (the first designated key) and the direction keys (the second designated key), whereupon the system hides the scene map in the graphical user interface.
As another example, the player may cancel the determination of the target position and the target scene position by tapping any control on the touch-screen mobile phone other than the map-open control and the direction control, whereupon the system hides the scene map in the graphical user interface.
Because the system hides the scene map in the graphical user interface when the player cancels the selection of the target position, canceling the target and closing the scene map happen in one quick operation, simplifying the operation flow and improving the player's game experience.
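Steps g) and h) together amount to: any input outside the designated and selection inputs clears the pending target and hides the map. A hypothetical Python sketch (the key bindings and state-dictionary shape are assumptions):

```python
FIRST_DESIGNATED_KEY = "X"                                # assumed map-open binding
SECOND_DESIGNATED_KEYS = {"up", "down", "left", "right"}  # direction keys

def handle_key_during_selection(state, key):
    """Mutates and returns the UI state: any key other than the first and
    second designated keys cancels the target and hides the map (step h)."""
    if key != FIRST_DESIGNATED_KEY and key not in SECOND_DESIGNATED_KEYS:
        state["map_visible"] = False
        state["target"] = None
        state["target_scene"] = None
    return state
```

On a touch-screen device the same check would compare the tap location against the map and map-open control regions instead of key identities.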
In some embodiments, the target position in the scene map may take various forms, enabling flexible selection of transmission destinations. As an example, the target position includes: a static position in the scene map, and/or a dynamic position in the scene map.
The static position may take various forms, such as a specific coordinate position in the scene map, the fixed position of a virtual building, or the fixed position of a virtual plant.
The dynamic position may likewise take various forms, for example, the dynamic position of a moving target NPC, or the dynamic position of another virtual character other than the first virtual character, such as a teammate. When a dynamic position is selected, the position at the moment the operation ends is used as the target position for transmission.
For example, as shown in fig. 7, the target position may be the position of a target virtual character, e.g., the position of the teammate 705; a specific coordinate position, e.g., the coordinate position (65, 24) of the transmission point 706; or the position of a scene model, e.g., the position of the building 707.
Covering target positions of various kinds increases the transmission destinations available to the player, so that the player can transmit to different target scene positions according to the game situation, improving the game experience.
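Static and dynamic target positions differ only in when the scene coordinate is sampled: a dynamic target (a moving NPC or teammate) is resolved at the moment the designated operation ends. A sketch under that assumption, representing a dynamic target as a position function of time (the representation is hypothetical):

```python
def resolve_transfer_position(target, end_time):
    """A static target is a fixed coordinate; a dynamic target is sampled
    at the moment the designated operation ends (end_time)."""
    if callable(target):              # dynamic: e.g. a moving teammate or NPC
        return target(end_time)
    return target                     # static: e.g. a transfer point

# examples of both kinds of target
static_point = (65, 24)                        # like transmission point 706
moving_teammate = lambda t: (100 + 2 * t, 50)  # walks east over time
```

Sampling at release time matches the note above that, for a dynamic position, the position at the moment the operation ends is used for transmission.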
Fig. 8 provides a schematic structural diagram of a control device for a virtual character in a game. The device can be applied to terminal equipment capable of presenting a graphical user interface, the graphical user interface is provided through the terminal equipment, the content displayed by the graphical user interface at least partially comprises a game scene of a game, and the game scene of the game at least comprises a first virtual character controlled through the terminal equipment. As shown in fig. 8, the apparatus 800 for controlling a virtual character in a game includes:
a display module 801, configured to display a scene map corresponding to a game scene in a graphical user interface in response to that a specified operation for the graphical user interface meets a preset condition;
a determining module 802, configured to determine, in response to a selection operation for a target position in the scene map while maintaining the specifying operation, a target scene position corresponding to the target position in the game scene;
and a control module 803, configured to control the first virtual character to reach the target scene position in response to an end of the operation of the specified operation after the selection operation.
In some embodiments, the display module 801 is specifically configured to:
in response to the specified operation for the graphical user interface meeting the preset condition, displaying a scene map corresponding to the game scene in the graphical user interface in a specified display manner.
In some embodiments, the specified display mode includes any one or more of:
a transmission display manner with a specified display transparency, a specified display position, and a specified display duration.
In some embodiments, the apparatus further comprises:
and the first hiding module is used for hiding the scene map in the graphical user interface in response to the end of the operation of the specified operation.
In some embodiments, the condition that the preset condition is met comprises that the duration of the specified operation is greater than a preset duration; the device also includes:
a second display module, configured to display the scene map in the graphical user interface in a preset open display manner in response to the duration of the specified operation for the graphical user interface being less than or equal to the preset duration, wherein the display transparency of the preset open display manner is zero.
In some embodiments, the apparatus further comprises:
and a second hiding module, configured to hide the scene map in the graphical user interface in response to a further specified operation for the graphical user interface whose duration is less than or equal to the preset duration.
In some embodiments, the determining module 802 is specifically configured to:
in response to a selection operation for an arbitrary position in the scene map while maintaining the designation operation, determining a target position selected by the selection operation, and displaying identification information of the target position in the graphical user interface;
a target scene location corresponding to the target location is determined in the game scene.
In some embodiments, the specified operation comprises any one or more of:
a touch operation on a map-open control, and a press operation on a first designated key;
the first appointed key is used for opening the scene map.
In some embodiments, the selecting operation comprises any one or more of:
a touch operation on the scene map, and a press operation on a second designated key;
and the second designated key is used for selecting a position in the scene map.
In some embodiments, the apparatus further comprises:
and a cancellation module, configured to, after the target scene position corresponding to the target position has been determined in the game scene in response to the selection operation for the target position in the scene map while the designation operation is maintained, cancel the determination of the target position and the target scene position in response to an operation other than the selection operation and the designation operation.
In some embodiments, the other operations include any one or more of:
an operation in the graphical user interface outside the scene map and the map-open control, and a press operation on a key other than the first designated key and the second designated key;
the first appointed key is used for opening a scene map; the second designation key is used to select a location in the scene map.
In some embodiments, the cancellation module is specifically configured to:
in response to an operation other than the selection operation and the designation operation, the scene map is hidden in the graphical user interface, and the determination of the target position and the target scene position is cancelled.
In some embodiments, the target location comprises: a static location in the scene map, and/or a dynamic location in the scene map.
The control device for the virtual character in the game provided by the embodiment of the application has the same technical characteristics as the control method for the virtual character in the game provided by the embodiment, so that the same technical problems can be solved, and the same technical effects can be achieved.
Corresponding to the above control method of the virtual character in the game, an embodiment of the present application further provides a computer-readable storage medium storing computer-executable instructions which, when called and executed by a processor, cause the processor to execute the steps of the control method of the virtual character in the game.
The control device of the virtual character provided in the embodiment of the present application may be specific hardware on the device, or software or firmware installed on the device. The device provided by the embodiment of the present application has the same implementation principle and technical effect as the foregoing method embodiments; for brevity, where the device embodiments do not mention a detail, reference may be made to the corresponding content in the foregoing method embodiments. It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the systems, apparatuses, and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
For another example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments provided in the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method for controlling a virtual character in a game according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus once an item is defined in one figure, it need not be further defined and explained in subsequent figures, and moreover, the terms "first", "second", "third", etc. are used merely to distinguish one description from another and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the above-described embodiments are merely specific embodiments of the present application, provided to illustrate rather than limit its technical solutions, and the scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art may still, within the technical scope disclosed in the present application, modify the technical solutions described in the foregoing embodiments, readily conceive of changes thereto, or make equivalent substitutions for some of their technical features; such modifications, changes, or substitutions do not depart from the scope of the embodiments of the present application and are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (16)

1. A method for controlling a virtual character in a game, wherein a terminal device provides a graphical user interface, the content displayed by the graphical user interface at least partially comprises a game scene of the game, and the game scene of the game comprises at least a first virtual character controlled through the terminal device; the method comprising:
in response to a specified operation on the graphical user interface meeting a preset condition, displaying a scene map corresponding to the game scene in the graphical user interface;
in response to a selection operation on a target position in the scene map while the specified operation is maintained, determining a target scene position corresponding to the target position in the game scene; and
in response to an end of the specified operation after the selection operation, controlling the first virtual character to move to the target scene position.
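The hold–select–release interaction of claim 1 can be read as a small state machine. The sketch below is not part of the claims; it is a minimal illustration assuming Python, a hypothetical 0.5-second hold threshold, and a caller-supplied `map_to_scene` mapping from map coordinates to scene coordinates.

```python
from enum import Enum, auto

class Phase(Enum):
    IDLE = auto()
    MAP_SHOWN = auto()      # specified operation held past the preset duration
    TARGET_CHOSEN = auto()  # selection made while the press is maintained

class MapMoveController:
    """Illustrative sketch of the claim-1 interaction: hold to show the map,
    select a map position while holding, release to move the character.
    Threshold and mapping are assumptions, not taken from the patent."""

    def __init__(self, hold_threshold=0.5, map_to_scene=lambda p: p):
        self.hold_threshold = hold_threshold  # preset duration (assumed)
        self.map_to_scene = map_to_scene      # map position -> scene position
        self.phase = Phase.IDLE
        self.target_scene_pos = None

    def on_hold_elapsed(self, held_seconds):
        # Specified operation meets the preset condition: show the scene map.
        if self.phase is Phase.IDLE and held_seconds > self.hold_threshold:
            self.phase = Phase.MAP_SHOWN
        return self.phase is not Phase.IDLE  # True when the map is displayed

    def on_select(self, map_pos):
        # Selection operation while the specified operation is maintained.
        if self.phase is Phase.MAP_SHOWN:
            self.target_scene_pos = self.map_to_scene(map_pos)
            self.phase = Phase.TARGET_CHOSEN
        return self.target_scene_pos

    def on_release(self):
        # End of the specified operation: return the destination, if any,
        # to which the first virtual character should be moved.
        dest = self.target_scene_pos if self.phase is Phase.TARGET_CHOSEN else None
        self.phase, self.target_scene_pos = Phase.IDLE, None
        return dest
```

A release without a prior selection yields no destination, matching the claim's requirement that movement follows a selection made while the specified operation is maintained.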
2. The method according to claim 1, wherein the step of displaying a scene map corresponding to the game scene in the graphical user interface in response to the specified operation on the graphical user interface meeting a preset condition comprises:
in response to the specified operation on the graphical user interface meeting the preset condition, displaying the scene map corresponding to the game scene in the graphical user interface in a specified display mode.
3. The method of claim 2, wherein the specified display mode comprises any one or more of:
a display mode with a specified display transparency, a specified display position, and a specified display duration.
4. The method of claim 2, further comprising:
in response to an end of the specified operation, hiding the scene map in the graphical user interface.
5. The method according to claim 2, wherein meeting the preset condition comprises the duration of the specified operation being greater than a preset duration; the method further comprising:
in response to the duration of a specified operation on the graphical user interface being less than or equal to the preset duration, displaying the scene map in the graphical user interface in a preset open display mode;
wherein the display transparency of the preset open display mode is zero.
6. The method of claim 5, wherein, after the step of displaying the scene map in the graphical user interface in the preset open display mode in response to the duration of the specified operation on the graphical user interface being less than or equal to the preset duration, the method further comprises:
in response to the duration of the specified operation performed again on the graphical user interface being less than or equal to the preset duration, hiding the scene map in the graphical user interface.
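Claims 5 and 6 together describe a short-press toggle: a press at or below the preset duration opens the map in the preset open display mode (display transparency zero, i.e. fully opaque in the claims' convention), and a second short press hides it. This is an illustrative sketch, not the patent's implementation; the 0.5-second threshold is an assumption.

```python
class MapToggle:
    """Sketch of claims 5-6: short presses toggle a fully opaque map.
    Long presses (above the preset duration) are handled by the separate
    hold-select-release flow of claim 1 and are ignored here."""

    def __init__(self, preset_duration=0.5):
        self.preset_duration = preset_duration  # assumed threshold, seconds
        self.map_visible = False
        self.transparency = None  # 0 = fully opaque per the claims' convention

    def on_press_end(self, held_seconds):
        # Only presses at or below the preset duration toggle the map.
        if held_seconds <= self.preset_duration:
            self.map_visible = not self.map_visible
            self.transparency = 0 if self.map_visible else None
        return self.map_visible
```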
7. The method according to claim 2, wherein the step of determining a target scene position corresponding to the target position in the game scene in response to the selection operation on the target position in the scene map while the specified operation is maintained comprises:
in response to a selection operation on an arbitrary position in the scene map while the specified operation is maintained, determining the target position selected by the selection operation, and displaying identification information of the target position in the graphical user interface; and
determining the target scene position corresponding to the target position in the game scene.
8. The method of claim 1, wherein the specified operation comprises any one or more of:
a touch operation on a map opening control, and a pressing operation on a first designated key;
wherein the first designated key is used to open the scene map.
9. The method of claim 1, wherein the selection operation comprises any one or more of:
a touch operation on the scene map, and a pressing operation on a second designated key;
wherein the second designated key is used to select a position in the scene map.
10. The method according to claim 1, wherein, after the step of determining a target scene position corresponding to the target position in the game scene in response to the selection operation on the target position in the scene map while the specified operation is maintained, the method further comprises:
canceling the determination of the target position and the target scene position in response to an operation other than the selection operation and the specified operation.
11. The method of claim 10, wherein the other operation comprises any one or more of:
an operation on a region of the graphical user interface other than the scene map and the map opening control, and a pressing operation on a key other than the first designated key and the second designated key;
wherein the first designated key is used to open the scene map, and the second designated key is used to select a position in the scene map.
12. The method of claim 10, wherein the step of canceling the determination of the target position and the target scene position in response to an operation other than the selection operation and the specified operation comprises:
in response to an operation other than the selection operation and the specified operation, hiding the scene map in the graphical user interface and canceling the determination of the target position and the target scene position.
13. The method of claim 1, wherein the target position comprises a static position in the scene map and/or a dynamic position in the scene map.
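Claim 13 allows the target to be either a fixed map position or a dynamic one, such as a marker that moves over time. One way to model this (an illustrative assumption, not the patent's implementation) is to represent a dynamic target as a function of time and re-resolve it when needed:

```python
def resolve_target(target, now):
    """Sketch of claim 13: a target is either a static (x, y) map position
    or a dynamic position modeled as a callable of the current time, e.g.
    a marker tracking a moving object. The callable form is an assumption
    made for illustration only."""
    return target(now) if callable(target) else target
```

For example, a character moving toward a dynamic target would call `resolve_target` each update tick so that the destination follows the moving marker, whereas a static target resolves to the same position every time.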
14. A device for controlling a virtual character in a game, wherein a terminal device provides a graphical user interface, the content displayed by the graphical user interface at least partially comprises a game scene of the game, and the game scene of the game comprises at least a first virtual character controlled through the terminal device; the device comprising:
a display module, configured to display a scene map corresponding to the game scene in the graphical user interface in response to a specified operation on the graphical user interface meeting a preset condition;
a determination module, configured to determine a target scene position corresponding to a target position in the game scene in response to a selection operation on the target position in the scene map while the specified operation is maintained; and
a control module, configured to control the first virtual character to move to the target scene position in response to an end of the specified operation after the selection operation.
15. An electronic device comprising a memory and a processor, wherein the memory stores a computer program operable on the processor, and the processor implements the steps of the method of any one of claims 1 to 13 when executing the computer program.
16. A computer-readable storage medium having stored thereon computer-executable instructions which, when invoked and executed by a processor, cause the processor to perform the method of any one of claims 1 to 13.
CN202111288261.4A 2021-11-02 2021-11-02 Method and device for controlling virtual character in game and electronic equipment Pending CN113952728A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111288261.4A CN113952728A (en) 2021-11-02 2021-11-02 Method and device for controlling virtual character in game and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111288261.4A CN113952728A (en) 2021-11-02 2021-11-02 Method and device for controlling virtual character in game and electronic equipment

Publications (1)

Publication Number Publication Date
CN113952728A 2022-01-21

Family

ID=79468914

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111288261.4A Pending CN113952728A (en) 2021-11-02 2021-11-02 Method and device for controlling virtual character in game and electronic equipment

Country Status (1)

Country Link
CN (1) CN113952728A (en)

Similar Documents

Publication Publication Date Title
US11290543B2 (en) Scene switching method based on mobile terminal
US10398977B2 (en) Information processing method, terminal, and computer storage medium
CN110955370B (en) Switching method and device of skill control in game and touch terminal
US11623142B2 (en) Data processing method and mobile terminal
US10850196B2 (en) Terminal device
EP2820528B1 (en) Systems and methods for presenting visual interface content
CN109513208B (en) Object display method and device, storage medium and electronic device
CN111111190A (en) Interaction method and device for virtual characters in game and touch terminal
CN107741819A (en) Information processing method, device, electronic equipment and storage medium
CN111840988B (en) Game skill triggering method, game skill triggering device, game client and medium
JP2023552772A (en) Virtual item switching method, device, terminal and computer program
CN113457157A (en) Method and device for switching virtual props in game and touch terminal
CN111905371B (en) Method and device for controlling target virtual character in game
CN115089959A (en) Direction prompting method and device in game and electronic terminal
CN116531754A (en) Method and device for controlling virtual characters in game and electronic terminal
CN113952728A (en) Method and device for controlling virtual character in game and electronic equipment
CN115105831A (en) Virtual object switching method and device, storage medium and electronic device
CN113926186A (en) Method and device for selecting virtual object in game and touch terminal
KR101953781B1 (en) Method for controlling mobile terminal in online game
CN114849226A (en) Game function control method and device and electronic terminal
CN113663326B (en) Aiming method and device for game skills
CN116726485A (en) Method and device for controlling skills in game and electronic terminal
CN116832433A (en) Method and device for controlling virtual characters in game and electronic terminal
CN115779429A (en) Method and device for controlling virtual character in game and electronic terminal
CN115645924A (en) Virtual article processing method and device and electronic terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination