CN113577763B - Game character control method and device

Game character control method and device

Info

Publication number
CN113577763B
Authority
CN
China
Prior art keywords
target game
operation mode
game character
motion state
character
Prior art date
Legal status
Active
Application number
CN202110866632.6A
Other languages
Chinese (zh)
Other versions
CN113577763A
Inventor
曹金旭
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202110866632.6A
Publication of CN113577763A
Application granted
Publication of CN113577763B

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/53 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/533 - Controlling the output signals based on the game progress involving additional visual information provided to the game scene for prompting the player, e.g. by displaying a game menu
    • A63F13/55 - Controlling game characters or game objects based on the game progress
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 - Features of games characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 - Details of the user interface
    • A63F2300/60 - Methods for processing data by generating or executing the game program
    • A63F2300/6045 - Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • A63F2300/65 - Methods for processing data by generating or executing the game program for computing the condition of a game character

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the invention provide a game character control method and device. The method includes: in response to a first motion state transition instruction for a target game character, determining that the target game character is in a motion state that satisfies a preset motion condition; displaying an operation key position according to the position of the target game character, the operation key position informing the player that the operation mode for the target game character has been switched from a first operation mode to a second operation mode matching the preset motion condition; and, in response to a movement instruction for the target game character, controlling the target game character to move accordingly. The displayed operation key position guides the player's operation in the special scene and the resulting behavior, and this visual feedback avoids the game experience problem caused by conflicting operation feedback when the same interactive input is used in different environments.

Description

Game character control method and device
Technical Field
The present invention relates to the field of game technologies, and in particular, to a method and apparatus for controlling a game character.
Background
A horizontal (side-scrolling) game requires the user's attention to be focused on the center of the screen and occupies both hands for operation, so it places higher demands on the player. In return, it allows more complex actions and tactics, for example using up-and-down movement to control the virtual game character to dodge skills, or controlling the virtual game character to climb a wall surface.
In a horizontal game, in an ordinary game environment the user can control a virtual game character in a PC (Personal Computer) game to move with the "W", "S", "A" and "D" keys of the keyboard, while in a mobile game the user can move the character with the up/down/left/right virtual joystick or the up/down/left/right keys of a console gamepad. The left and right keys (corresponding to the "A" and "D" keys on the PC) control the character to advance and retreat, respectively, and the up and down keys (corresponding to the "W" and "S" keys on the PC) control the character to approach and move away, respectively. In another game environment, for example when the user controls a virtual game character to hang on or climb somewhere, the left and right keys may instead control the character to climb left and right, and the up and down keys to climb up and down. Because the same input from the interactive buttons/controls can produce different operation feedback in different game environments, the user cannot clearly anticipate the result of a manipulation, which leads to a poor control experience.
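To make the conflict concrete, here is a minimal sketch (key-to-action names are assumed for illustration and are not taken from the patent) of how the same "W"/"S"/"A"/"D" input resolves to different actions depending on whether the character is hanging/climbing:

```python
# Illustrative sketch of the mapping conflict described above: the same
# directional keys drive different character actions depending on the
# current game environment.

ORDINARY_MAPPING = {
    "W": "approach",    # up key: move the character closer
    "S": "move_away",   # down key: move the character further away
    "A": "advance",     # left key
    "D": "retreat",     # right key
}

CLIMB_MAPPING = {
    "W": "climb_up",
    "S": "climb_down",
    "A": "climb_left",
    "D": "climb_right",
}

def resolve_action(key: str, hanging_or_climbing: bool) -> str:
    """Resolve a key press to an action; the result depends on the environment."""
    mapping = CLIMB_MAPPING if hanging_or_climbing else ORDINARY_MAPPING
    return mapping.get(key, "none")
```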
Disclosure of Invention
In view of the above problems, embodiments of the present invention provide a game character control method and a corresponding game character control device that overcome, or at least partially solve, the above problems.
An embodiment of the invention discloses a game character control method. A graphical user interface is provided through a terminal, and the graphical user interface includes a target game character. The method includes the following steps:
in response to a first motion state transition instruction for the target game character, determining that the target game character is in a motion state that satisfies a preset motion condition;
displaying an operation key position according to the position of the target game character, wherein the operation key position informs the player that the operation mode for the target game character has been switched from a first operation mode to a second operation mode matching the preset motion condition;
in response to a movement instruction for the target game character, controlling the target game character to move accordingly according to the movement instruction, wherein the movement instruction is generated from the interactive input corresponding to the second operation mode.
Optionally, the graphical user interface includes a virtual game environment, and before responding to the first motion state transition instruction for the target game character, the method further includes:
identifying that the target game character has entered an interactable range of the virtual game environment, the interactable range being bounded by the part of the virtual game environment that can satisfy the preset motion condition;
detecting a preset interactive input, and generating the first motion state transition instruction according to the preset interactive input.
Optionally, displaying the operation key position according to the position of the target game character includes:
displaying the operation key position within a preset range of the target game character, and/or displaying the operation key position in the area of the game scene corresponding to the second operation mode.
Optionally, controlling the target game character to move accordingly according to the movement instruction includes:
controlling the game character to perform a second movement operation according to the interactive input in the second operation mode.
Optionally, after displaying the operation key position, the method further includes:
detecting an interactive input for controlling the target game character to exit the motion state that satisfies the preset motion condition, and generating a second motion state transition instruction;
canceling the display of the operation key position in response to the second motion state transition instruction for the target game character, wherein canceling the display of the operation key position informs the player that the operation mode for the target game character has been switched from the second operation mode back to the first operation mode.
Optionally, the method further includes:
when the operation mode for the target game character has been switched from the second operation mode to the first operation mode, responding to a movement instruction generated from the interactive input corresponding to the first operation mode, and controlling the game character to perform a first movement operation according to the interactive input in the first operation mode.
Optionally, for the same interactive input, the movement operations performed by the target game character differ between the first operation mode and the second operation mode.
An embodiment of the invention further discloses a game character control device. A graphical user interface is provided through a terminal, and the graphical user interface includes a target game character. The device includes:
a first motion state transition instruction response module, configured to determine, in response to a first motion state transition instruction for the target game character, that the target game character is in a motion state satisfying a preset motion condition;
an operation key position display module, configured to display an operation key position according to the position of the target game character, wherein the operation key position informs the player that the operation mode for the target game character has been switched from a first operation mode to a second operation mode matching the preset motion condition;
a first movement instruction response module, configured to respond to a movement instruction for the target game character and control the target game character to move accordingly according to the movement instruction, wherein the movement instruction is generated from the interactive input corresponding to the second operation mode.
Optionally, the graphical user interface includes a virtual game environment, and the device further includes, operating before the response to the first motion state transition instruction for the target game character:
an interactable range identification module, configured to identify that the target game character has entered an interactable range of the virtual game environment, the interactable range being bounded by the part of the virtual game environment that can satisfy the preset motion condition;
a first motion state transition instruction generation module, configured to detect a preset interactive input and generate the first motion state transition instruction according to the preset interactive input.
Optionally, the operation key position display module includes:
a first operation key position display sub-module, configured to display the operation key position within a preset range of the target game character;
a second operation key position display sub-module, configured to display the operation key position in the area of the game scene corresponding to the second operation mode.
Optionally, the first movement instruction response module includes:
a movement instruction response sub-module, configured to control the game character to perform a second movement operation according to the interactive input in the second operation mode.
Optionally, after the operation key position is displayed, the device further includes:
a second motion state transition instruction generation module, configured to detect an interactive input for controlling the target game character to exit the motion state satisfying the preset motion condition, and generate a second motion state transition instruction;
a second motion state transition instruction response module, configured to cancel the display of the operation key position in response to the second motion state transition instruction for the target game character, wherein canceling the display of the operation key position informs the player that the operation mode for the target game character has been switched from the second operation mode back to the first operation mode.
Optionally, the device further includes:
a second movement instruction response module, configured to, when the operation mode for the target game character has been switched from the second operation mode to the first operation mode, respond to a movement instruction generated from the interactive input corresponding to the first operation mode and control the game character to perform a first movement operation according to the interactive input in the first operation mode.
Optionally, for the same interactive input, the movement operations performed by the target game character differ between the first operation mode and the second operation mode.
An embodiment of the invention further discloses an electronic device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of any of the above game character control methods.
An embodiment of the invention further discloses a computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of any of the above game character control methods.
The embodiments of the invention have the following advantages:
In the embodiments of the invention, based on a first motion state transition instruction for a target game character, the target game character is determined to be in a motion state that satisfies a preset motion condition. An operation key position is then displayed based on the position of the target game character; the displayed operation key position informs the player that the operation mode for the target game character has been switched from a first operation mode to a second operation mode matching the preset motion condition. A movement instruction generated from the interactive input corresponding to the second operation mode can then be responded to, and the target game character is controlled to move accordingly. When the game character is in a motion state satisfying the preset motion condition, the displayed operation key position guides the user's operation in the special scene and the resulting behavior, and this visual feedback avoids the game experience problem caused by conflicting operation feedback when the same interactive input is used in different environments.
Drawings
FIG. 1 is a flow chart of steps of an embodiment of a method for controlling a game character of the present invention;
FIGS. 2A-2B are schematic illustrations of operation key positions displayed in an embodiment of the present invention;
FIG. 3 is a flowchart of steps of another embodiment of a method for controlling a game character of the present invention;
FIG. 4 is a schematic diagram of a control target game character movement in accordance with an embodiment of the present invention;
FIG. 5 is a schematic diagram of another embodiment of the present invention for controlling movement of a target game character;
FIG. 6 is a schematic diagram of yet another embodiment of the present invention for controlling movement of a target game character;
FIG. 7 is a structural block diagram of an embodiment of a game character control device according to the present invention.
Detailed Description
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
Based on the layout of the control-area keys, games can be divided into vertical games and horizontal games. For mobile games, a vertical game refers to a game design that favors one-handed control by the user: in portrait orientation, the control-area keys are concentrated in the center of the game screen or directly in its lower part, and no event occurs before the user operates. The control areas of a horizontal game are generally located at the two sides of the screen; the game requires the user's attention to be focused on the center of the screen, occupies both hands for control, and places higher operational demands on the player.
Horizontal games can further be divided into purely planar horizontal games and horizontal games with depth. A purely planar horizontal game usually gives the user an intuitive sense of progressing from left to right, but it is rather limited: three-dimensional space cannot be well utilized, and virtual game characters easily overlap when multiple players play together. A horizontal game with depth, compared with a purely planar one, can implement more complex actions and tactics, such as dodging skills with up-and-down movement, and is suitable for multiplayer play. However, the operation feedback produced in different scenes by the same interactive input from the interactive buttons/controls may conflict, and the game experience is easily harmed by misoperation when the player is not familiar with the controls.
One of the core ideas of the invention is to provide a mechanism for detecting whether a character is hanging on or climbing somewhere, together with a mechanism for displaying the related 3D UI logic around the character. Based on the displayed operation key position, visual feedback guides the player's operation in the special scene and the resulting behavior, avoiding the game experience problem caused by conflicting operation feedback when the same interactive input is used in different environments.
The game character control method in an embodiment of the invention may run on a terminal device or on a server. The terminal device may be a local terminal device. When the method runs on a server, it can be implemented and executed based on a cloud interaction system, which includes the server and a client device.
In an optional embodiment, various cloud applications can run on the cloud interaction system, for example cloud games. Taking a cloud game as an example, a cloud game refers to a gaming mode based on cloud computing. In the cloud game mode, the entity that runs the game program is separated from the entity that presents the game picture: the storage and execution of the game character control method are completed on the cloud game server, while the client device is used for sending and receiving data and presenting the game picture. The client device may be a display device with data transmission capability close to the user side, such as a mobile terminal, a television, a computer or a handheld computer, but the terminal device that performs the method is the cloud game server in the cloud. When playing, the player operates the client device to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as the game pictures, and returns them to the client device over the network; finally the client device decodes the data and outputs the game picture.
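A rough sketch of that round trip, with assumed message and method names (the patent does not specify a transport or codec):

```python
# Illustrative cloud-gaming loop: the client forwards the player's operation
# instruction, the server runs the game step and returns an encoded frame.
# All object and method names here are assumptions for illustration.

def cloud_game_step(client, server):
    op = client.read_player_input()          # player operates the client device
    client.send(server, {"op": op})          # operation instruction goes to the cloud server
    frame = server.run_game_step(op)         # the cloud game server runs the game logic
    encoded = server.encode(frame)           # the game picture is encoded and compressed
    client.receive_and_decode(encoded)       # the client decodes and displays the picture
```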
In an optional embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores the game program and is used to present the game picture. The local terminal device interacts with the player through the graphical user interface, i.e. the game program is conventionally downloaded, installed and run on the electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways, for example by rendering it on a display screen of the terminal, or by holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game picture, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen.
Referring to FIG. 1, there is shown a flowchart of the steps of an embodiment of a method for controlling a game character of the present invention. A graphical user interface is provided through a terminal, and the graphical user interface includes a target game character. This embodiment focuses on controlling the target game character in a special scene, and may specifically include the following steps:
Step 101, in response to a first motion state transition instruction for a target game character, determining that the target game character is in a motion state that satisfies a preset motion condition.
It should be noted that the terminal may be the aforementioned local terminal device (including a mobile terminal or a PC), or may be the aforementioned client device in the cloud interaction system. The operating system of the terminal may include Android, iOS, Windows Phone, Windows, etc., and can generally support the running of various game applications.
By running the game application on the terminal and rendering a graphical user interface on the display of the terminal, the displayed graphical user interface at least partially contains a partial or complete game scene, and the specific form of the game scene may be square or another shape (such as circular).
In one embodiment of the present invention, a first motion state transition instruction applied to the target game character in a graphical user interface containing the game scene can be responded to, where the target game character refers to the game character that the current player can control. It can then be determined that the current motion state of the target game character satisfies the preset condition, where the motion state satisfying the preset condition corresponds to a preset special scene, for example a climbing state and a game scene suitable for climbing. In other words, it can be determined that the game character controlled by the player is currently in the special scene, so that corresponding movement control is performed on the game character according to the operation mode attached to that special scene.
The first motion state transition instruction is used to switch the motion state of the target game character in the game scene, where the motion state refers to the actions the target game character can perform in the game scene, including walking, running, climbing, crossing, hanging and other states. In practical applications, the motion state of the target game character when entering the special scene, for example the hanging/climbing state, may have corresponding trigger detection and interactive input before and after it, and the first motion state transition instruction may be generated based on the interactive input of an interactive button/control in the special scene.
Specifically, while the player controls the target game character to move, whether the target game character enters the interactable range of the virtual game environment can be identified, where the identified interactable range is bounded by the part of the virtual game environment that can satisfy the preset motion condition. After the target game character is identified as entering the interactable range, if interactive input based on the interactive buttons/controls is detected, a first motion state transition instruction can be generated according to the current interactive input, and the current motion state of the target game character is switched according to the special scene. The interactive input performed here must match a control instruction for the target game character in the current special scene, that is, an instruction related to controlling the motion state of the target game character.
It should be noted that if the target game character is identified as having entered the interactable range satisfying the preset motion condition and a first motion state transition instruction can be generated based on the interactive input, it can be directly determined that the target game character is currently in the motion state satisfying the preset motion condition, without additional detection and judgment of the motion state of the target game character.
As an example, assume the preset motion condition to be satisfied is a condition for hanging/climbing. In the designed game scene, terrain where hanging/climbing is possible (e.g. a ladder or cliff) is distinguished from ordinary terrain (e.g. grass or flat ground); that is, there is a boundary between the flat grass and the ladder of the hanging/climbing scene, while the cliff belonging to that scene and the ladder on the cliff are the virtual game environment within the interactable range. When the player controls the target game character to the boundary between the ordinary terrain and the hanging/climbing terrain, the game terminal can recognize that the target game character has entered the interactable range. The player then performs an interactive operation generated from interactive input, for example a climbing instruction issued by clicking the right mouse button; after the game terminal determines that the operation succeeds, the target game character has entered the motion state of the preset motion condition and can be regarded as hanging on or climbing somewhere. The climbing instruction issued by clicking the right mouse button may be the first motion state transition instruction used to switch the motion state.
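As a concrete illustration of this detection flow, the sketch below assumes a simple rectangular interactable region and a right-click event; the data structures and names are illustrative, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class Region:
    """Area of the scene where the preset motion condition (e.g. climbing) can be met."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def in_interactable_range(character_pos, climbable_regions) -> bool:
    # The interactable range is bounded by the virtual game environment
    # (e.g. ladders, cliffs) that can satisfy the preset motion condition.
    x, y = character_pos
    return any(r.contains(x, y) for r in climbable_regions)

def on_player_input(event, character_pos, climbable_regions):
    # A right-click inside the interactable range generates the first
    # motion state transition instruction (names here are illustrative).
    if event == "mouse_right_click" and in_interactable_range(character_pos, climbable_regions):
        return {"type": "first_motion_state_transition", "target_state": "climbing"}
    return None
```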
Step 102, displaying an operation key position according to the position of the target game character.
In order to guide the player regarding the operation mode corresponding to movement control in the special scene and its resulting behavior, when it is determined that the game character the player controls is currently in the special scene, i.e. the target game character is in a motion state satisfying the preset motion condition (for example, suitable for hanging/climbing), the operation key position can be displayed according to the position of the target game character.
In one embodiment of the present invention, the displayed operation key position informs the player that the operation mode for the target game character has been switched from the first operation mode to the second operation mode matching the preset motion condition, for example matching hanging/climbing, so that the player is guided by visual feedback, avoiding the game experience problem caused by conflicting operation feedback when the same interactive key input is used in different environments.
In an example, the first operation mode may refer to controlling the game character with up/down/left/right key input in an ordinary scene, producing the operation results of approaching, moving away, advancing and retreating; the second operation mode corresponding to the special scene may refer to controlling the game character with the same up/down/left/right key input, producing the operation results of climbing up, down, left and right.
For the display process of the operation key position, as shown in FIG. 2A, when the target game character is not in the special scene, i.e. not in a motion state satisfying the preset motion condition, the operation key position is not displayed in the graphical user interface. While the player controls the target game character to move, as shown in FIG. 2B, when the target game character is in the special scene, i.e. in a motion state satisfying the preset motion condition, an operation key position can be displayed in the graphical user interface to guide the player's movement-control operation mode in the special scene and its resulting behavior. For example, when the target game character is on terrain where hanging/climbing is possible, such as a wall surface or cliff, the target game character can be controlled to climb up, down, left and right while the operation key position is displayed.
The operation key position is displayed based on the position of the target game character, which includes displaying the operation key position within a preset range of the target game character and/or displaying the operation key position in the area of the game scene corresponding to the second operation mode (i.e. the special scene area where the target game character is located, such as a cliff). In the first case, when the operation key position is displayed within the preset range of the target game character, mainly around the character, the character section area of the target game character can first be acquired so that the operation key position is displayed in that section area.
In a specific implementation, the key content of the operation key position (the up, down, left and right keys shown in FIG. 2B) and the character model skeleton of the target game character can be identified. The key content of the operation key position is then placed on the surface layer of a preset triangular patch; this surface layer is equivalent to a rendering layer in an interface, and the key content can be processed into a part model that can be attached to or combined with other rendered models. The positional relationship between the surface layer of the preset triangular patch containing the key content and the character model skeleton of the target game character can then be adjusted based on the character section area of the target game character. In this adjustment, the preset triangular patch can be connected to the character model skeleton of the target game character, and based on that connection its position is adjusted to be parallel to the tangent plane of the character model skeleton, achieving the desired display effect.
In the second case, when the operation key position is displayed in the area of the game scene corresponding to the second operation mode, the display is realized mainly based on the terrain on which the target game character currently is in the special scene; that is, the operation key position can be generated and displayed according to the terrain of the place the target game character is hanging on or climbing. The current terrain of the target game character in the virtual game environment can be obtained, and the operation key position can be attached directly to the surface of that terrain. As an example, when a character is hanging on or climbing over some terrain, the current terrain is detected; assuming the current terrain has a ladder for hanging/climbing, a key prompt 3D UI can be displayed on the surface of the ladder.
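The sketch below illustrates both placement options (near the character, or pinned to the climbable terrain surface); the engine-style API names (create_patch, attach_to, etc.) are assumptions for illustration only:

```python
# Illustrative sketch only: placing the operation key position (a small 3D UI
# patch) either near the character or on the surface of the climbable terrain,
# as described above. Engine API names are assumed, not taken from the patent.

def show_operation_keys(ui, character, terrain, mode="near_character"):
    prompt = ui.create_patch(contents=["up", "down", "left", "right"])
    if mode == "near_character":
        # Attach the patch to the character model skeleton and keep it
        # parallel to the tangent plane of the skeleton (character section area).
        prompt.attach_to(character.skeleton_root)
        prompt.align_parallel_to(character.section_plane())
    else:
        # Alternatively, pin the prompt onto the surface of the terrain
        # being climbed (e.g. a ladder or cliff face).
        prompt.attach_to_surface(terrain.climb_surface())
    prompt.set_visible(True)
    return prompt
```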
Step 103, in response to a movement instruction for the target game character, controlling the target game character to move accordingly according to the movement instruction.
In one embodiment of the present invention, the displayed operation key position informs the player that the operation mode for the target game character has been switched from the first operation mode to the second operation mode matching the preset motion condition, for example matching hanging/climbing.
For the same interactive input, the movement operations performed by the target game character differ between the first operation mode and the second operation mode.
In practical applications, before the operation mode is switched, the game character is controlled to perform a first movement operation according to the interactive input in the first operation mode; after the operation mode is switched, the game character is controlled to perform a second movement operation according to the interactive input in the second operation mode. As an example, assume the first operation mode refers to controlling the game character with up/down/left/right key input in an ordinary scene, producing the operation results of approaching, moving away, advancing and retreating, while the second operation mode corresponding to the special scene refers to controlling the game character with the same up/down/left/right key input, producing the operation results of climbing up, down, left and right; that is, for the same up-key input, the generated movement instructions and the performed movement operations are different. Before the operation mode is switched, an approach instruction can be generated from the up-key input, and the first movement operation is expressed as controlling the target game character to move closer; after the operation mode is switched, an upward-climbing movement instruction can be generated from the up-key input, and the second movement operation is expressed as controlling the target game character to climb upward.
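The following is a minimal sketch, with assumed action names, of such mode-dependent instruction generation; it is one possible implementation, not the patent's prescribed one:

```python
# Sketch of mode-dependent movement handling: the same "up" input yields an
# approach instruction in the first operation mode and an upward-climbing
# instruction in the second operation mode. Names are illustrative.

class OperationModeController:
    def __init__(self):
        self.mode = "first"  # switched to "second" when the preset motion condition is met

    def switch_to_second(self):
        self.mode = "second"

    def switch_to_first(self):
        self.mode = "first"

    def movement_instruction(self, direction: str) -> str:
        first = {"up": "approach", "down": "move_away", "left": "advance", "right": "retreat"}
        second = {"up": "climb_up", "down": "climb_down", "left": "climb_left", "right": "climb_right"}
        table = second if self.mode == "second" else first
        return table[direction]

# Example: the same "up" input before and after the mode switch
controller = OperationModeController()
assert controller.movement_instruction("up") == "approach"
controller.switch_to_second()
assert controller.movement_instruction("up") == "climb_up"
```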
It should be noted that the interactive input performed after the operation key position is displayed may still control the movement of the game character through the same interactive keys, i.e. only the operation mode behind those keys has changed. For a mobile terminal, the displayed operation key position can not only provide visual feedback to the user but also accept input on its key content to generate movement instructions matching the special motion state. The invention is not limited in this regard.
In the embodiment of the invention, based on the first motion state transition instruction for the target game character, the target game character is determined to be in a motion state satisfying the preset motion condition. The operation key position can then be displayed based on the position of the target game character; the displayed operation key position informs the player that the operation mode for the target game character has been switched from the first operation mode to the second operation mode matching the preset motion condition. A movement instruction generated from the interactive input corresponding to the second operation mode can then be responded to, and the target game character is controlled to move accordingly. When the game character is in a motion state satisfying the preset motion condition, the displayed operation key position guides the user's operation in the special scene and the resulting behavior, and this visual feedback avoids the game experience problem caused by conflicting operation feedback when the same interactive input is used in different environments.
Referring to FIG. 3, there is shown a flowchart of the steps of another embodiment of a method for controlling a game character of the present invention. A graphical user interface is provided through a terminal, and the graphical user interface includes a target game character. The method may specifically include the following steps:
Step 301, displaying an operation key position when responding to a first motion state transition instruction for the target game character.
In one embodiment of the invention, in response to the first motion state transition instruction for the target game character, it can be determined that the target game character is currently in the special scene, i.e. its current motion state satisfies the motion state of the special scene, and an operation key position indicating that the target game character is in the special scene can be displayed at this time so as to guide the player through visual feedback.
Step 302, canceling the display of the operation key position in response to a second motion state transition instruction for the target game character.
With the operation key position displayed according to the position of the target game character in the special scene, the current operation mode has been switched from the first operation mode to the second operation mode matching the preset motion condition (for example, matching hanging/climbing). At this point, the operation mode for the target game character can be switched back in response to a second motion state transition instruction applied to the target game character in the graphical user interface containing the game scene.
In one embodiment of the invention, in response to the second motion state transition instruction for the target game character, the switch back of the operation mode is expressed as canceling the display of the operation key position; that is, canceling the display of the operation key position informs the player that the operation mode for the target game character has changed from the second operation mode matching the preset motion condition back to the first operation mode.
The second motion state transition instruction may be generated based on the interactive input of an interactive button/control. Specifically, an interactive input for controlling the target game character to exit the motion state satisfying the preset motion condition is detected, the second motion state transition instruction is generated, and the target game character is no longer in the motion state satisfying the preset motion condition. As an example, assume the interactive input for exiting that motion state is input related to the space key. When the target game character operated by the player is in the hanging/climbing motion state of the special scene, if the player's interactive input through the space key is detected, the target game character can be controlled to leave the current hanging/climbing motion state. The instruction issued via the space key may be the second motion state transition instruction used to switch the motion state.
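A minimal sketch of this exit path, assuming a space-key event and illustrative helper names (none of these are prescribed by the patent):

```python
# Sketch of the exit path described above: the space key leaves the
# hanging/climbing state, cancels the key prompt and restores the first
# operation mode. Object and method names are assumptions.

def on_exit_input(event, character, key_prompt):
    if event == "space_pressed" and character.state == "climbing":
        instruction = {"type": "second_motion_state_transition", "target_state": "normal"}
        character.state = "normal"          # leave the hanging/climbing motion state
        key_prompt.set_visible(False)       # cancel display of the operation key position
        character.operation_mode = "first"  # fall back to the first operation mode
        return instruction
    return None
```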
Step 303, in response to a movement instruction generated from the interactive input corresponding to the first operation mode, controlling the game character to perform a first movement operation according to the interactive input in the first operation mode.
After the target game character is controlled to exit the motion state satisfying the preset motion condition, the operation mode for the target game character is switched again; for example, the target game character in the hanging/climbing state shown in FIG. 2B is controlled to leave the hanging/climbing terrain (i.e. the ladder) and returns to the state shown in FIG. 2A, at which point the displayed operation key position disappears.
In one embodiment of the invention, canceling the display of the operation key position informs the player that the operation mode for the target game character has been switched from the second operation mode matching the preset motion condition (for example, matching hanging/climbing) back to the first operation mode. After the operation mode is switched, a corresponding movement instruction can be generated from the user's interactive input, and the target game character is controlled to move accordingly according to the movement instruction.
In practical applications, when the operation mode for the target game character is switched from the second operation mode to the first operation mode, the game character can be controlled to perform the first movement operation according to the interactive input in the first operation mode by responding to a movement instruction generated from the interactive input corresponding to the first operation mode.
For the same interactive input, the movement operations performed by the target game character differ between the first operation mode and the second operation mode. Specifically, before the operation mode is switched back, the game character is controlled to perform the second movement operation according to the interactive input in the second operation mode; after the switch, the game character is controlled to perform the first movement operation according to the interactive input in the first operation mode.
As an example, assume the first operation mode refers to controlling the game character with up/down/left/right key input in an ordinary scene, producing the operation results of approaching, moving away, advancing and retreating, while the second operation mode corresponding to the special scene refers to controlling the game character with the same keys, producing the operation results of climbing up, down, left and right; that is, for the same down-key input, the generated movement instruction and the performed movement operation are different. Before the operation mode is switched back, a downward-climbing instruction can be generated from the down-key input, and the second movement operation is expressed as controlling the target game character to climb downward; after the switch, a moving-away instruction can be generated from the down-key input, and the first movement operation is expressed as controlling the target game character to move away.
It should be noted that detecting whether the up and down keys currently control approaching and moving away or control climbing up and down can be implemented by the terminal detecting the position of the target game character in real time. Specifically, when the game character reaches a certain position, the system can learn the current mapping by simulating a press of a certain operation key (for example, the W key); this step is purely an internal detection mechanism, is not displayed in the graphical user interface, and is not perceived by the player. After the simulated press of the operation key (for example, the W key), the x, y and z coordinate values of the target game character in the game coordinate system can be detected: if the y-axis coordinate changes, the W key currently controls displacement in the upward direction; if the z-axis coordinate changes, the W key currently controls displacement in the depth (away) direction. The embodiments of the invention are not limited with respect to how the operation mode is detected.
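A rough sketch of this internal detection mechanism, under the assumption of an illustrative character API that exposes its (x, y, z) coordinates and a way to simulate a key press:

```python
# Internal detection sketch (not shown to the player): simulate pressing an
# operation key (e.g. "W"), compare the character's coordinates before and
# after, and infer whether the key currently drives vertical (climbing) or
# depth movement. Function names are illustrative.

def detect_w_key_effect(character, simulate_press):
    before = character.position()            # (x, y, z) in the game coordinate system
    simulate_press("W")                      # system-internal test press, invisible to the player
    after = character.position()
    if abs(after[1] - before[1]) > 1e-4:     # y changed: W currently moves the character upward
        return "vertical_movement"
    if abs(after[2] - before[2]) > 1e-4:     # z changed: W currently moves the character in depth
        return "depth_movement"
    return "no_movement"
```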
To help those skilled in the art understand the game character control method of the embodiments of the invention, the following description is given with reference to application scenarios for controlling the movement of a target game character.
As an example, referring to FIG. 4, which is a schematic diagram of controlling the movement of a target game character in an embodiment of the present invention applied to a mobile terminal, the graphical user interface displayed on the mobile terminal may include a virtual game environment, the target game character and an interactive key. The displayed interactive key refers to a virtual joystick for controlling the game character in the graphical user interface; the virtual joystick may be presented as a joystick with up, down, left and right controls, or as a draggable displacement joystick in a constructed remote-control interface.
Assume the target game character in the graphical user interface is currently in a hanging/climbing state, and an operation key position informing the player that the operation mode of the interactive key (i.e. the virtual joystick) has been switched is displayed on the tangent plane of the target game character or on the terrain surface of the obstacle being hung on or climbed.
In one case, after the operation key position is displayed, the virtual joystick originally on the graphical user interface is still used for movement control of the target game character, i.e. the displayed operation key position serves only as a display. The target game character can then be controlled to climb up, down, left and right through the virtual joystick. For example, when the player pushes the virtual joystick upward, the target game character performs an upward climbing movement on the terrain surface of the obstacle currently being climbed, i.e. the target game character moves from position A to position B.
In a preferred embodiment, when the virtual joystick is pushed upward, the up key of the displayed operation key position can be marked, e.g. lit up, informing the player that the upward operation of the virtual joystick is now expressed as an upward movement along the terrain surface of the obstacle currently being climbed. For the other directions of the virtual joystick, the corresponding keys of the displayed operation key position can be marked in the same way; the embodiments of the invention are not limited in this respect.
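A small sketch of this joystick-driven case, with assumed method names, showing the climb movement together with the key-highlight feedback:

```python
# Illustrative sketch: when the player pushes the virtual joystick while the
# character is climbing, move the character along the climbed surface and
# highlight (e.g. light up) the matching key of the displayed prompt.
# API names are assumptions.

def on_joystick_input(direction, character, key_prompt):
    if character.state == "climbing":
        character.climb(direction)       # e.g. "up" moves the character from position A to position B
        key_prompt.highlight(direction)  # visual feedback on the matching key of the prompt
    else:
        character.move(direction)        # ordinary movement outside the special scene
```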
It should be noted that in this example the graphical user interface may also omit the interactive key, i.e. the virtual joystick for controlling the game character, and the user may control the movement of the game character directly through a sliding operation in the lower left corner of the terminal screen. For the details of this operation, reference may be made to the corresponding control described above, which is not repeated here.
As another example, referring to FIG. 5, which is a schematic diagram of controlling the movement of a target game character in another embodiment of the present invention applied to a mobile terminal, the graphical user interface displayed by the mobile terminal may include a virtual game environment and the target game character.
Assume the target game character in the graphical user interface is currently in a hanging/climbing state, and an operation key position informing the player that the operation mode of the interactive key (i.e. the virtual joystick) has been switched is displayed on the tangent plane of the target game character or on the terrain surface of the obstacle being hung on or climbed.
In another case, after the operation key position is displayed, the displayed operation key position itself is used for movement control of the target game character, i.e. it serves not only as a display but also as a control. The target game character can then be controlled to climb up, down, left and right by touching the up, down, left and right keys of the operation key position. For example, when the player touches the left key of the displayed operation key position, the target game character performs a leftward climbing movement on the terrain surface of the obstacle currently being climbed, i.e. the target game character moves from position A to position C.
As yet another example, referring to FIG. 6, which is a schematic diagram of controlling the movement of a target game character in yet another embodiment of the present invention applied to the PC side, the graphical user interface displayed on the PC may include a virtual game environment and the target game character. The corresponding interactive key input for the displayed graphical user interface can be provided via the keyboard; specifically, input of the "W", "S", "A" and "D" keys of the keyboard controls the corresponding movement of the virtual game character.
The graphical user interface may further include an interactive key display for indicating the current keyboard input. As an example, when the player presses the "W" key, i.e. makes an upward interactive input, the up key in the displayed interactive key display can be marked, e.g. lit up.
Assume the target game character in the graphical user interface is currently in a hanging/climbing state, and an operation key position informing the player that the operation mode of the interactive keys has been switched is displayed on the tangent plane of the target game character or on the terrain surface of the obstacle being hung on or climbed. Climbing movement control of the virtual game character can then be realized based on the keyboard input, specifically input of the "W", "S", "A" and "D" keys. For example, when the player makes an interactive input with the "S" key of the keyboard, the target game character performs a downward climbing movement on the terrain surface of the obstacle currently being climbed, i.e. the target game character moves from position A to position D.
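A minimal sketch of the PC-side keyboard case described above, with assumed helper names:

```python
# Sketch of the PC-side case: keyboard "W"/"S"/"A"/"D" input drives climbing
# movement while the character is hanging/climbing. Helper names are
# illustrative, not prescribed by the patent.

KEY_TO_DIRECTION = {"W": "up", "S": "down", "A": "left", "D": "right"}

def on_keyboard_input(key, character):
    direction = KEY_TO_DIRECTION.get(key)
    if direction is None:
        return
    if character.state == "climbing":
        character.climb(direction)  # e.g. "S" moves the character from position A down to position D
    else:
        character.move(direction)   # ordinary movement outside the special scene
```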
In the embodiments of the invention, when the game character is in a motion state satisfying the preset motion condition, the displayed operation key position guides the user's operation in the special scene and the resulting behavior, and this visual feedback avoids the game experience problem caused by conflicting operation feedback when the same interactive input is used in different environments.
It should be noted that, for simplicity of description, the method embodiments are shown as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred embodiments, and that the acts are not necessarily required by the embodiments of the invention.
Referring to FIG. 7, there is shown a structural block diagram of an embodiment of a game character control device of the present invention. A graphical user interface including a target game character is provided through a terminal, and the device may include the following modules:
a first motion state transition instruction response module 701, configured to determine, in response to a first motion state transition instruction for a target game character, that the target game character is in a motion state satisfying a preset motion condition;
an operation key position display module 702, configured to display an operation key position according to the position of the target game character, wherein the operation key position informs the player that the operation mode for the target game character has been switched from a first operation mode to a second operation mode matching the preset motion condition;
a first movement instruction response module 703, configured to respond to a movement instruction for the target game character and control the target game character to move accordingly according to the movement instruction, wherein the movement instruction is generated from the interactive input corresponding to the second operation mode.
In one embodiment of the present invention, the graphical user interface includes a virtual game environment, and, before the first motion state transition instruction for the target game character is responded to, the apparatus may further include the following modules:
An interactable range identification module, configured to identify that the target game character has entered an interactable range of the virtual game environment; the interactable range is delimited by the part of the virtual game environment that can satisfy the preset motion condition;
A first motion state transition instruction generation module, configured to detect a preset interactive input and generate the first motion state transition instruction according to the preset interactive input.
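One plausible reading of these two modules, sketched below with assumed names and an assumed "F" key binding, is that the range check gates whether a preset interactive input may produce the first motion state transition instruction:

```python
def maybe_generate_first_transition(character_pos, interactable_regions,
                                    pressed_key, preset_key="F"):
    """Illustrative only: emit a first motion state transition instruction
    when the character stands inside an interactable range and the preset
    interactive input (assumed here to be the "F" key) is detected."""
    inside = any(region["min"] <= character_pos <= region["max"]
                 for region in interactable_regions)
    if not inside or pressed_key.upper() != preset_key:
        return None
    return {"type": "first_motion_state_transition"}

# Example: a 1-D range along a climbable wall, purely for illustration.
regions = [{"min": 10.0, "max": 12.0}]
assert maybe_generate_first_transition(11.0, regions, "f") is not None
assert maybe_generate_first_transition(5.0, regions, "f") is None
```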
In one embodiment of the present invention, the operation key position display module 702 may include the following sub-modules:
A first operation key position display sub-module, configured to display the operation key position within a preset range of the target game character;
A second operation key position display sub-module, configured to display the operation key position in the area where the game scene corresponding to the second operation mode is located.
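As the two sub-modules suggest, the hint can be anchored near the character and/or shown over the scene area tied to the second operation mode; a small sketch with an assumed UI interface:

```python
class HintUI:
    """Assumed minimal UI interface; not from the patent."""
    def show_key_hint(self, anchor):
        print(f"key-position hint anchored at {anchor}")
    def highlight_area(self, area):
        print(f"highlight scene area {area}")

def display_operation_key_position(ui, character_pos, scene_area=None,
                                   offset=(0.0, 1.5)):
    # First sub-module: place the hint within a preset range of the character.
    hint_pos = (character_pos[0] + offset[0], character_pos[1] + offset[1])
    ui.show_key_hint(anchor=hint_pos)
    # Second sub-module: optionally mark the area of the game scene that
    # corresponds to the second operation mode (e.g. the climbable surface).
    if scene_area is not None:
        ui.highlight_area(scene_area)

display_operation_key_position(HintUI(), (3.0, 0.0), scene_area="wall_01")
```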
In one embodiment of the present invention, the first movement instruction response module 703 may include the following sub-modules:
A movement instruction response sub-module, configured to control the game character to perform a second movement operation according to the interactive input in the second operation mode.
In one embodiment of the present invention, after the operation key position is displayed, the apparatus may further include:
A second motion state transition instruction generation module, configured to detect an interactive input for controlling the target game character to exit the motion state that satisfies the preset motion condition, and to generate a second motion state transition instruction;
A second motion state transition instruction response module, configured to cancel the display of the operation key position in response to the second motion state transition instruction for the target game character; the cancellation of the display of the operation key position is used to indicate that the operation mode of the target game character is switched from the second operation mode that conforms to the preset motion condition back to the first operation mode.
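The exit path mirrors the entry path: an input that leaves the qualifying motion state yields a second transition instruction, the hint is hidden, and the first operation mode is restored. A hedged sketch intended to pair with the controller sketch above (all names assumed):

```python
def on_second_motion_state_transition(controller):
    """Illustrative only: leaving the qualifying motion state (e.g. the
    player drops off the wall) cancels the key-position hint, which signals
    the switch from the second operation mode back to the first."""
    controller.ui.hide_key_hint()
    controller.mode = "first"
    controller.character.motion_state = "ground"
```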
In one embodiment of the present invention, the apparatus may further include the following modules:
A second movement instruction response module, configured to, when the operation mode of the target game character has been switched from the second operation mode to the first operation mode, respond to a movement instruction generated based on the interactive input corresponding to the first operation mode and control the game character to perform a first movement operation according to the interactive input in the first operation mode.
In one embodiment of the present invention, based on the same interactive input, the movement operations controlled for the target game character differ between the first operation mode and the second operation mode.
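This last embodiment amounts to one physical input resolving to different movement operations per mode; a compact illustration with assumed key semantics:

```python
# Assumed semantics only: the same key resolves to different movement
# operations depending on the active operation mode.
MOVEMENT_BY_MODE = {
    "first":  {"W": "walk_forward", "S": "walk_backward"},
    "second": {"W": "climb_up",     "S": "climb_down"},
}

def resolve_movement(mode: str, key: str) -> str:
    return MOVEMENT_BY_MODE[mode].get(key.upper(), "idle")

assert resolve_movement("first", "W") != resolve_movement("second", "W")
```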
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points.
The embodiment of the invention also provides an electronic device, comprising:
a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements each process of the above game character control method embodiment and can achieve the same technical effects, which are not repeated here.
The embodiment of the invention also provides a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements each process of the above game character control method embodiment and can achieve the same technical effects, which are not repeated here.
In this specification, the embodiments are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and identical or similar parts among the embodiments may be referred to one another.
It will be apparent to those skilled in the art that embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the invention may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it is further noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or terminal device that comprises the element.
The game character control method and game character control apparatus provided by the present invention have been described in detail above, and specific examples have been used to illustrate the principles and embodiments of the present invention; the above description of the embodiments is only intended to help understand the method and its core ideas. Meanwhile, those skilled in the art may make changes to the specific embodiments and the application scope in accordance with the ideas of the present invention; in summary, the contents of this specification should not be construed as limiting the present invention.

Claims (9)

1. A method for controlling a game character, wherein a graphical user interface is provided through a terminal, the graphical user interface including a target game character, the method comprising:
responding to a first motion state transition instruction for the target game character, and determining that the target game character is in a motion state that satisfies a preset motion condition; wherein the first motion state transition instruction is generated based on a preset interactive input detected after the target game character is identified as having entered an interactable range of a virtual game environment, the interactable range being delimited by the virtual game environment that satisfies the preset motion condition;
displaying an operation key position according to the position of the target game character; the operation key position is used to indicate that the operation mode of the target game character is switched from a first operation mode to a second operation mode that conforms to the preset motion condition;
responding to a movement instruction for the target game character, and controlling the target game character to move correspondingly according to the movement instruction; the movement instruction is generated based on the interactive input corresponding to the second operation mode.
2. The method of claim 1, wherein the displaying an operation key position according to the position of the target game character comprises:
displaying the operation key position within a preset range of the target game character, and/or displaying the operation key position in an area where the game scene corresponding to the second operation mode is located.
3. The method of claim 1, wherein controlling the target game character to move correspondingly according to the movement instruction comprises:
controlling the game character to perform a second movement operation according to the interactive input in the second operation mode.
4. The method of claim 1, wherein, after the displaying of the operation key position, the method further comprises:
detecting an interactive input for controlling the target game character to exit the motion state that satisfies the preset motion condition, and generating a second motion state transition instruction;
canceling the display of the operation key position in response to the second motion state transition instruction for the target game character; the cancellation of the display of the operation key position is used to indicate that the operation mode of the target game character is switched from the second operation mode that conforms to the preset motion condition back to the first operation mode.
5. The method according to claim 3, further comprising:
when the operation mode of the target game character has been switched from the second operation mode to the first operation mode, responding to a movement instruction generated based on the interactive input corresponding to the first operation mode, and controlling the game character to perform a first movement operation according to the interactive input in the first operation mode.
6. The method of claim 5, wherein, based on the same interactive input, the movement operations controlled for the target game character differ between the first operation mode and the second operation mode.
7. A game character control apparatus, wherein a graphical user interface is provided through a terminal, the graphical user interface including a target game character, the apparatus comprising:
a first motion state transition instruction response module, configured to determine, in response to a first motion state transition instruction for the target game character, that the target game character is in a motion state that satisfies a preset motion condition; wherein the first motion state transition instruction is generated based on a preset interactive input detected after the target game character is identified as having entered an interactable range of a virtual game environment, the interactable range being delimited by the virtual game environment that satisfies the preset motion condition;
an operation key position display module, configured to display an operation key position according to the position of the target game character; the operation key position is used to indicate that the operation mode of the target game character is switched from a first operation mode to a second operation mode that conforms to the preset motion condition;
a first movement instruction response module, configured to control, in response to a movement instruction for the target game character, the target game character to move correspondingly according to the movement instruction; the movement instruction is generated based on the interactive input corresponding to the second operation mode.
8. An electronic device, comprising: a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the game character control method according to any one of claims 1-6.
9. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the game character control method according to any one of claims 1-6.
CN202110866632.6A 2021-07-29 2021-07-29 Game role control method and device Active CN113577763B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110866632.6A CN113577763B (en) 2021-07-29 2021-07-29 Game role control method and device

Publications (2)

Publication Number Publication Date
CN113577763A CN113577763A (en) 2021-11-02
CN113577763B true CN113577763B (en) 2024-05-28

Family

ID=78252105

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110866632.6A Active CN113577763B (en) 2021-07-29 2021-07-29 Game role control method and device

Country Status (1)

Country Link
CN (1) CN113577763B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113559516B (en) * 2021-07-30 2023-07-14 腾讯科技(深圳)有限公司 Virtual character control method and device, storage medium and electronic equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109589605A (en) * 2018-12-14 2019-04-09 网易(杭州)网络有限公司 The display control method and device of game
CN110215691A (en) * 2019-07-17 2019-09-10 网易(杭州)网络有限公司 The control method for movement and device of virtual role in a kind of game
CN110270086A (en) * 2019-07-17 2019-09-24 网易(杭州)网络有限公司 The control method for movement and device of virtual role in a kind of game
CN112402960A (en) * 2020-11-19 2021-02-26 腾讯科技(深圳)有限公司 State switching method, device, equipment and storage medium in virtual scene
CN112619167A (en) * 2020-12-21 2021-04-09 网易(杭州)网络有限公司 Information processing method and device, computer equipment and medium
CN113082712A (en) * 2021-03-30 2021-07-09 网易(杭州)网络有限公司 Control method and device of virtual role, computer equipment and storage medium
CN113082688A (en) * 2021-03-31 2021-07-09 网易(杭州)网络有限公司 Method and device for controlling virtual character in game, storage medium and equipment
CN113134233A (en) * 2021-05-14 2021-07-20 腾讯科技(深圳)有限公司 Control display method and device, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8128496B2 (en) * 2006-07-31 2012-03-06 Camelot Co., Ltd. Game device, object display method in game device, and display program

Also Published As

Publication number Publication date
CN113577763A (en) 2021-11-02

Similar Documents

Publication Publication Date Title
CN107648847B (en) Information processing method and device, storage medium and electronic equipment
US12026847B2 (en) Method and apparatus for controlling placement of virtual character and storage medium
CN108905212B (en) Game screen display control method and device, storage medium and electronic equipment
CN107132988B (en) Virtual objects condition control method, device, electronic equipment and storage medium
KR102701092B1 (en) Method and device for controlling virtual objects, terminals, and storage media
US10421013B2 (en) Gesture-based user interface
CN112933591A (en) Method and device for controlling game virtual character, storage medium and electronic equipment
CN107185232B (en) Virtual object motion control method and device, electronic equipment and storage medium
CN113181651B (en) Method, device, electronic equipment and storage medium for controlling virtual object movement in game
JP7447299B2 (en) Adaptive display method and device for virtual scenes, electronic equipment, and computer program
CN112416196B (en) Virtual object control method, device, equipment and computer readable storage medium
CN107213643A (en) Display control method and device, storage medium, the electronic equipment of game picture
JP7391448B2 (en) Virtual object control method, device, equipment, storage medium and computer program product
CN113577763B (en) Game role control method and device
CN111389003A (en) Game role control method, device, equipment and computer readable storage medium
CN106598228A (en) Object vision locating and control technology in VR environment
WO2023065949A1 (en) Object control method and apparatus in virtual scene, terminal device, computer-readable storage medium, and computer program product
CN111494948A (en) Game lens editing method, electronic equipment and storage medium
CN116251349A (en) Method and device for prompting target position in game and electronic equipment
CN107704165B (en) Virtual lens control method and device, storage medium and electronic equipment
CN107102725B (en) Control method and system for virtual reality movement based on somatosensory handle
CN113680047B (en) Terminal operation method, device, electronic equipment and storage medium
CN107728811A (en) Interface control method, apparatus and system
CN113521734A (en) Flight control method and device in game
CN113663326A (en) Game skill aiming method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant