CN112402976B - Game character control method, terminal, readable storage medium and electronic device - Google Patents

Info

Publication number
CN112402976B
Authority
CN
China
Prior art keywords
control
selectable target
virtual character
selectable
target object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011331423.3A
Other languages
Chinese (zh)
Other versions
CN112402976A (en)
Inventor
王泽�
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202011331423.3A priority Critical patent/CN112402976B/en
Publication of CN112402976A publication Critical patent/CN112402976A/en
Application granted granted Critical
Publication of CN112402976B publication Critical patent/CN112402976B/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/573 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad

Abstract

The application provides a game character control method, a terminal, a readable storage medium and an electronic device. The control method comprises the following steps: displaying a direction indication control on the upper part of the graphical user interface; in response to a first touch operation acting on the orientation adjustment area, adjusting the orientation of the virtual character in the game scene and synchronously adjusting the orientation indicated in the direction indication control; in response to a second touch operation acting on the direction indication control, displaying at least one selectable target control; and in response to a third touch operation, determining a selected target control from the selectable target controls and controlling the virtual character to face the selected target object associated with the selected target control. In this way, the selectable target controls can be called out by operating on the direction indication control, and the player uses a selectable target control to turn the virtual character toward the selected target object associated with it, which saves direction adjustment time and improves the accuracy and efficiency of direction adjustment.

Description

Game character control method, terminal, readable storage medium and electronic device
Technical Field
The present invention relates to the field of game technologies, and in particular, to a method for controlling a game character, a terminal, a readable storage medium, and an electronic device.
Background
With the continuous development of science and technology, 3D games, which use spatial stereoscopic computing technology to realize their operations, have advanced considerably thanks to their strong visual effects and excellent sense of immersion. For example, battle-royale shooting games (colloquially known as "chicken-eating" games) are widely favored and accepted by players because they are accessible to players of all ages and offer high creativity, openness, derivative content, playability, interactivity, and watchability.
In order to complete a game task or travel through a game scene, it is often necessary to control the virtual character to move toward a certain teammate or a specific game area. Before controlling that movement, the target direction in which the virtual character should face the target object or target area must be found, and the virtual character must be adjusted to that target direction. At present, referring to fig. 1, which is a schematic diagram of sliding-based direction adjustment, the orientation of the virtual character is mostly adjusted to the target direction by sliding the screen multiple times. As a result, the adjustment accuracy of the virtual character's orientation is low, the process is time-consuming and laborious, and it easily leads to a poor user experience.
Disclosure of Invention
In view of the foregoing, an object of the present application is to provide a game character control method, a terminal, a readable storage medium, and an electronic device, so that at least one selectable target control can be displayed by operating on a direction indication control, and the player can use the selectable target controls to turn the virtual character toward the selectable target object associated with a selected control, thereby saving direction adjustment time and helping to improve the accuracy and efficiency of direction adjustment.
An embodiment of the present application provides a game character control method, applied to a terminal displaying a graphical user interface, wherein at least part of a game scene and a virtual character are displayed in the graphical user interface, the graphical user interface comprises a minimap and an orientation adjustment area, and the minimap is a thumbnail of the game scene. The control method comprises the following steps:
displaying a direction indication control on the upper part of the graphical user interface, wherein the direction indication control is used for indicating the direction of the virtual character in the game scene;
responding to a first touch operation acted on the orientation adjustment area, adjusting the orientation of the virtual character in the game scene, and synchronously adjusting the orientation indicated in the direction indication control;
Responding to a second touch operation acting on the direction indication control, and displaying at least one selectable target control, wherein the selectable target control corresponds to a selectable target object in the game scene;
and responding to a third touch operation, determining a selected target control from the selectable target controls, and controlling the virtual character to face a selected target object associated with the selected target control.
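Purely as an illustration of the flow above, the following Python sketch shows how the four steps might map onto touch-event handlers in a hypothetical game client. Every name, data structure and coordinate convention here (a dictionary-based character, a bearing measured clockwise from north) is an assumption of the sketch, not something defined by the present application.

# Illustrative sketch only; the application does not define any code or APIs.
# "character", "indicator", and "controls" are hypothetical structures.
import math

def on_first_touch(character, indicator, drag_delta_deg):
    """Step 2: a touch in the orientation adjustment area rotates the character
    and synchronously updates the heading shown by the direction indicator."""
    character["yaw"] = (character["yaw"] + drag_delta_deg) % 360.0
    indicator["shown_heading"] = character["yaw"]

def on_second_touch(selectable_targets):
    """Step 3: a touch on the direction indicator displays one selectable
    target control per selectable target object in the scene."""
    return [{"target": t, "rect": None} for t in selectable_targets]

def on_third_touch(character, controls, touch_pos, hit_test):
    """Step 4: the control under the third touch becomes the selected target
    control, and the character is turned to face its associated object."""
    selected = next((c for c in controls if hit_test(c, touch_pos)), None)
    if selected is not None:
        dx = selected["target"]["x"] - character["x"]
        dy = selected["target"]["y"] - character["y"]
        character["yaw"] = math.degrees(math.atan2(dx, dy)) % 360.0
    return selected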
In a possible implementation manner, the minimap contains an indication identifier corresponding to the virtual character, wherein the indication identifier is used for indicating the position and/or orientation of the virtual character in the game scene.
In one possible embodiment, the direction indication control is displayed with at least one direction control and at least one direction angle control.
In one possible implementation, before the responding to the second touch operation on the direction indication control, the control method includes:
screening a plurality of selectable target objects from a plurality of game target objects which are positioned in the game scene and correspond to each game event;
after displaying at least one selectable target control, wherein the selectable target control corresponds to a selectable target object in the game scene, each selectable target object is associated with each selectable target control based on the distance between each selectable target object and the position of the virtual character or the serial number identification corresponding to each selectable target object and the arrangement sequence of each selectable target control.
In one possible implementation, the selectable target object is determined by:
when the virtual character receives a teammate instruction, determining at least one teammate character belonging to the same team with the virtual character;
and determining the determined at least one teammate role as at least one selectable target object.
In one possible implementation, the selectable target object is determined by:
determining a plurality of action tracks of the virtual character in a game scene based on a plurality of preset game events;
determining a plurality of track positions on each action track;
for each track position, at least one selectable target object corresponding to the track position is determined based on the position information of the track position.
In one possible implementation manner, the first touch operation, the second touch operation, and the third touch operation include at least one of the following operations:
click operation, slide operation, long press operation.
In one possible implementation manner, when the second touch operation and the third touch operation are both sliding operations, a start position of a sliding track of the sliding operation is the direction indication control, and an end position of the sliding track of the sliding operation is the selected target control.
In one possible implementation, after the responding to the third touch operation, determining a selected target control from the selectable target controls, and controlling the virtual character to face the selected target object associated with the selected target control, the control method further includes:
hiding the at least one selectable target control.
An embodiment of the present application further provides a terminal. The terminal displays a graphical user interface in which at least part of a game scene and a virtual character are displayed; the graphical user interface comprises a minimap and an orientation adjustment area, and the minimap is a thumbnail of the game scene. The terminal comprises:
the direction control display module is used for displaying a direction indication control on the upper part of the graphical user interface, and the direction indication control is used for indicating the direction of the virtual character in the game scene;
the direction adjusting module is used for responding to a first touch operation acted on the direction adjusting area, adjusting the direction of the virtual character in the game scene and synchronously adjusting the direction indicated by the direction indication control;
the control display module is used for responding to a second touch operation acting on the direction indication control and displaying at least one selectable target control, and the selectable target control corresponds to a selectable target object in the game scene;
And the orientation control module is used for responding to a third touch operation, determining a selected target control from the selectable target controls and controlling the virtual character to be oriented to a selected target object associated with the selected target control.
In a possible implementation manner, the minimap contains an indication identifier corresponding to the virtual character, wherein the indication identifier is used for indicating the position and/or orientation of the virtual character in the game scene.
In one possible embodiment, the direction indication control is displayed with at least one direction control and at least one direction angle control.
In a possible implementation manner, the terminal further comprises a control association module, and the control association module is configured for:
screening a plurality of selectable target objects from a plurality of target objects which are positioned in the game scene and correspond to each game event;
after displaying at least one selectable target control, wherein the selectable target control corresponds to a selectable target object in the game scene, each selectable target object is associated with each selectable target control based on the distance between each selectable target object and the position of the virtual character or the serial number identification corresponding to each selectable target object and the arrangement sequence of each selectable target control.
In one possible implementation, the control association module determines the selectable target object by:
when the virtual character receives a teammate instruction, determining at least one teammate character belonging to the same team with the virtual character;
and determining the determined at least one teammate character as the at least one selectable target object.
In one possible implementation, the control association module determines the selectable target object by:
determining a plurality of action tracks of the virtual character in a game scene based on a plurality of preset game events;
determining a plurality of track positions on each action track;
for each track position, at least one selectable target object corresponding to the track position is determined based on the position information of the track position.
In one possible implementation manner, the first touch operation, the second touch operation, and the third touch operation include at least one of the following operations:
click operation, slide operation, long press operation.
In one possible implementation manner, when the second touch operation and the third touch operation are both sliding operations, a start position of a sliding track of the sliding operation is the direction indication control, and an end position of the sliding track of the sliding operation is the selected target control.
In a possible implementation manner, the terminal further comprises a control hiding module, wherein the control hiding module is used for:
hiding the at least one selectable target control.
An embodiment of the present application further provides an electronic device, comprising: a processor, a memory and a bus, wherein the memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate through the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the game character control method described above.
Embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the control method of a game character as described above.
According to the game character control method, the terminal, the readable storage medium and the electronic device, a direction indication control is displayed on the upper portion of a graphical user interface, and the direction indication control is used for indicating the direction of the virtual character in the game scene; responding to a first touch operation acted on the orientation adjustment area, adjusting the orientation of the virtual character in the game scene, and synchronously adjusting the orientation indicated in the direction indication control; responding to a second touch operation acting on the direction indication control, and displaying at least one selectable target control, wherein the selectable target control corresponds to a selectable target object in the game scene; and responding to a third touch operation, determining a selected target control from the selectable target controls, and controlling the virtual character to face a selected target object associated with the selected target control.
In this way, the direction indication control is displayed on the upper part of the graphical user interface, the direction of the virtual character in the game scene is adjusted in response to the first touch operation in the direction adjustment area, the direction indicated in the direction indication control is synchronously adjusted, at least one selectable target control is displayed in response to the second touch operation acting on the direction indication control, the selected target control is determined from the displayed at least one selectable target control in response to the third touch operation, and the virtual character is controlled to face the selected target object associated with the selected target control, so that the operation process is effectively simplified, the operation time is saved, the accuracy of direction adjustment of the virtual character is improved, the direction adjustment time is saved, and the efficiency of direction adjustment is improved.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered limiting the scope, and that other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic illustration of a slip direction adjustment;
FIG. 2 is a flow chart of a method for controlling a game character according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a directional control display;
FIG. 4 is a schematic illustration of an alternative target control display;
FIG. 5 is a schematic diagram of a selected target control determination process;
FIG. 6 is a flow chart of a method for controlling a game character according to another embodiment of the present application;
FIG. 7 is a schematic diagram of a game scenario after virtual character steering;
fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present application;
FIG. 9 is a second schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. Based on the embodiments of the present application, every other embodiment that a person skilled in the art would obtain without making any inventive effort is within the scope of protection of the present application.
First, an application scenario to which the present application is applicable will be described. The present application can be applied to the field of game technology, in particular to 3D games (such as battle-royale shooting games, colloquially known as "chicken-eating" games). In order to perform certain game tasks or to flee during the game, the orientation of the virtual character needs to be adjusted in real time so that the virtual character faces the direction of the destination to be reached and then moves toward that destination.
According to research, at present, adjusting the orientation of the virtual character in a game mostly requires sliding the screen so that the center direction of the screen coincides with the target direction, thereby turning the character toward the target direction. During this adjustment, the orientation of the virtual character usually has to be adjusted to the target direction through repeated operations, so the adjustment accuracy is low, the process is time-consuming and laborious, and it easily causes a poor user experience.
Based on the above, an embodiment of the present application provides a game character control method, in which selectable target controls can be displayed by operating on a direction indication control, and the player uses a selectable target control to adjust the orientation of the virtual character toward the selected target object associated with it, thereby saving direction adjustment time and improving the accuracy and efficiency of direction adjustment.
Referring to fig. 2, fig. 2 is a flowchart of a method for controlling a game character according to an embodiment of the present application. As shown in fig. 2, the method for controlling a game character provided in the embodiment of the present application includes:
and S201, displaying a direction indication control on the upper part of the graphical user interface, wherein the direction indication control is used for indicating the direction of the virtual character in the game scene.
In the step, a direction indication control for prompting the direction is rendered and displayed in a graphical user interface of the terminal.
Here, the direction indication control displays at least one direction and at least one direction angle. The center of the direction indication control serves as the reference point of the virtual character, the angle mark displayed in the direction indication control indicates the current orientation of the virtual character, the direction angles in the direction indication control are arranged at equal angular intervals, and the number of direction angles actually displayed in the direction indication control can be set as required.
For example, if the virtual character currently faces an angle between the south and southeast directions, the two directions "south" and "southeast" may be displayed in the direction indication control; and, based on these two directions, a plurality of direction angles are arranged on both sides of them at a preset interval of 15 degrees.
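As a minimal sketch of how the direction labels and equally spaced direction angles described above could be generated (the 15° step and the "south"/"southeast" labels follow the example; the function name, the 60° half-span and the compass table are assumptions of the sketch):

# Illustrative only: generate the compass labels and 15-degree tick angles that a
# direction indication control might show around the character's current heading.
COMPASS_POINTS = {0: "north", 45: "northeast", 90: "east", 135: "southeast",
                  180: "south", 225: "southwest", 270: "west", 315: "northwest"}

def indicator_ticks(current_heading_deg, half_span_deg=60, step_deg=15):
    """Return (angle, label) pairs centred on the character's current heading."""
    ticks = []
    start = int(round(current_heading_deg / step_deg)) * step_deg - half_span_deg
    for angle in range(start, start + 2 * half_span_deg + 1, step_deg):
        norm = angle % 360
        ticks.append((norm, COMPASS_POINTS.get(norm, f"{norm}°")))
    return ticks

# Example: heading roughly between south and southeast, as in the paragraph above.
print(indicator_ticks(160))

Calling indicator_ticks(160) in this sketch yields ticks from 105° to 225°, with "southeast" and "south" appearing among the labels, similar to the example above.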
Here, the position of the teammate character closest to the virtual character and belonging to the same team as the virtual character may be displayed in the direction indication control, and the teammate character is marked with a mark symbol in the corresponding direction angle in the direction indication control, and the distance between the teammate character and the virtual character is displayed beside the mark symbol.
Here, the direction indication control is disposed at the upper portion of the graphical user interface, and may be disposed directly at the top of the graphical user interface, so as to facilitate the user adjusting the orientation of the virtual character according to the direction indication control.
Here, a minimap is also displayed in the graphical user interface rendered by the terminal. The minimap is a thumbnail of the game scene and contains an indication identifier corresponding to the virtual character, wherein the indication identifier is used for indicating the position and/or orientation of the virtual character in the game scene and changes as the virtual character moves and rotates.
Referring to fig. 3, fig. 3 is a schematic diagram of a direction indication control display. As shown in fig. 3, a direction indication control 301 is displayed at the top of the graphical user interface 300, and a direction control 3011 and a direction angle control 3012 are displayed on the direction indication control 301. As shown in fig. 3, the direction control 3011 includes the direction identifiers "southeast" and "south", and the direction angle control 3012 includes a plurality of direction angle identifiers from 105° to 210° divided every 15°. A directional arrow 302 is also displayed in the direction indication control 301 to identify the operation direction of the direction indication control 301; when the control is operated in the direction indicated by the arrow 302, a plurality of selectable target controls will be displayed on the graphical user interface 300. As shown in fig. 3, trees 304 and houses 305 are displayed at the roadside in the direction the current virtual character 303 is facing, and the graphical user interface 300 includes a minimap 306, which contains an indication identifier 3061 corresponding to the virtual character 303, the indication identifier 3061 indicating the position and/or current orientation of the virtual character 303 in the game scene.
S202, responding to a first touch operation acted on the orientation adjustment area, adjusting the orientation of the virtual character in the game scene, and synchronously adjusting the orientation indicated in the direction indication control.
In the step, the direction of the virtual character in the game scene is adjusted in response to a first touch operation acting on a direction adjustment area of the graphical user interface, and meanwhile, the direction indicated in the direction indication control is synchronously adjusted so that the direction indicated in the direction indication control is consistent with the current direction of the virtual character.
Here, when the target object toward which the virtual character should face has not yet been determined, the first touch operation on the orientation adjustment area is used to adjust the orientation of the virtual character. For example, when the virtual character faces southwest and the target it wants to face is to the northeast, the adjustment may be performed in the orientation adjustment area so that the virtual character faces northeast.
And S203, responding to a second touch operation acting on the direction indication control, and displaying at least one selectable target control, wherein the selectable target control corresponds to a selectable target object in the game scene.
In this step, at least one selectable target control is displayed on the graphical user interface in response to a second touch operation acting on the direction indication control.
Wherein the selectable target control corresponds to a selectable target object in the game scene.
Here, the selectable target object may be at least one teammate character included in the game scene in the same team as the virtual character; at least one enemy character of another team in a antagonistic relationship with the virtual character; but also special scenarios in game tasks such as novice villages, safe areas, etc.
Here, the second touch operation applied to the direction indication control may be at least one of a click operation, a slide operation, and a long press operation, and the operation at the time of calling out the selectable target control is different from the direction adjustment operation on the direction indication control in the related art.
For example, if the existing touch operation on the direction indication control for direction adjustment is a left-right sliding operation, then the sliding operation for calling out the selectable target controls in the present application may be a top-down sliding operation.
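One way to separate the two gestures, sketched below with assumed pixel thresholds and function names (the application only specifies that the call-out gesture differs from the existing left-right adjustment gesture, for example a top-down slide), is to compare the dominant axis of the drag:

# Illustrative sketch: distinguish a left/right drag on the direction indication
# control (ordinary heading adjustment) from a top-down drag (call out the
# selectable target controls). Thresholds and names are assumptions.

def classify_indicator_gesture(start, end, min_travel_px=20.0):
    """Return 'adjust_heading', 'call_out_targets', or 'none' for a drag that
    starts on the direction indication control."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]          # screen y grows downward
    if max(abs(dx), abs(dy)) < min_travel_px:
        return "none"               # too small to count as a drag
    if abs(dx) >= abs(dy):
        return "adjust_heading"     # horizontal slide, as in the existing scheme
    return "call_out_targets" if dy > 0 else "none"  # top-down slide calls out controls

# Example: a mostly vertical downward drag calls out the selectable target controls.
print(classify_indicator_gesture((400, 40), (405, 120)))  # -> 'call_out_targets'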
Here, after responding to the second touch operation on the direction indication control, the selectable target controls may be displayed directly on the graphical user interface once it is determined that the second touch operation is received; alternatively, once it is determined that the second touch operation is received, the selectable target controls may slide out from the upper portion of the graphical user interface for display.
For the display mode in which the selectable target controls slide out from the upper portion of the graphical user interface after the second touch operation is determined to be received, the second touch operation must be uninterrupted; if the second touch operation is interrupted, the selectable target controls rebound and are hidden.
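A minimal sketch of this slide-out-with-rebound behaviour is given below; the animation duration, the progress value and the class name are assumptions made only for illustration:

# Illustrative only: selectable target controls slide out from the top of the GUI
# while the second touch is held, and rebound (hide again) if the touch is
# interrupted before they are fully shown. Names and timings are assumptions.

class SlideOutPanel:
    def __init__(self, slide_duration_s=0.2):
        self.slide_duration_s = slide_duration_s
        self.progress = 0.0          # 0.0 = hidden, 1.0 = fully shown
        self.touch_held = False

    def on_second_touch_down(self):
        self.touch_held = True

    def on_touch_interrupted(self):
        self.touch_held = False      # panel will rebound toward hidden

    def update(self, dt_s):
        step = dt_s / self.slide_duration_s
        # Slide out while the touch is held, rebound otherwise.
        self.progress += step if self.touch_held else -step
        self.progress = min(1.0, max(0.0, self.progress))
        return self.progress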
Here, the position where the selectable target control is displayed is at a position adjacent to the direction indication control, where a plurality of selectable target controls may be displayed immediately below the direction indication control, or may be displayed in an original orientation adjustment area (the area includes the direction indication control and may be a rectangular area).
Here, when the selectable target control is displayed, the identification information corresponding to each selectable target object is displayed on the selectable target control.
For example, if the selectable target objects are a plurality of teammate characters in the same team as the virtual character, the number identifier of each teammate may be displayed on its associated selectable target control.
Here, in associating the selectable target object with the selectable target control, in addition to associating identification information of the selectable target object with the selectable target control, position information of the selectable target object needs to be associated with the selectable target control.
The position information includes the direction of the selectable target object relative to the virtual character and the distance between the selectable target object and the virtual character, and the distance between each selectable target object and the virtual character can be displayed at a preset position of the corresponding selectable target control, for example, below the selectable target control, and the like.
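As an illustrative sketch of the position information described above (distance to the virtual character and direction relative to it), assuming a flat 2D coordinate system and hypothetical function names:

# Illustrative sketch: the position information associated with each selectable
# target control (distance to the character and direction relative to it).
# Coordinate conventions and names are assumptions, not part of the application.
import math

def relative_position_info(character_xy, target_xy):
    dx = target_xy[0] - character_xy[0]
    dy = target_xy[1] - character_xy[1]
    distance = math.hypot(dx, dy)
    bearing_deg = math.degrees(math.atan2(dx, dy)) % 360  # clockwise from "north"
    return {"distance": distance, "bearing_deg": bearing_deg}

# Example: a teammate roughly 300 m east of the character.
print(relative_position_info((0.0, 0.0), (300.0, 10.0)))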
Referring to fig. 4, fig. 4 is a schematic diagram of a selectable target control display. As shown in fig. 4, after it is determined that the second touch operation is received, a plurality of selectable target controls are displayed below the direction indication control 301. These selectable target controls are, respectively, controls associated with the teammate characters in the same team as the virtual character and a control associated with a safe zone in the game scene, and below each selectable target control the distance between its associated selectable target object and the current position of the virtual character 303 is also marked. In order of teammate number from left to right, they are: the No. 1 selectable target control 307 associated with teammate character 1, the No. 2 selectable target control 308 associated with teammate character 2, the safe-zone selectable target control 309 associated with the safe zone, the No. 3 selectable target control 310 associated with teammate character 3, and the selectable target control 311 associated with teammate character 4, and these selectable target controls are distributed from left to right on the graphical user interface.
S204, responding to a third touch operation, determining a selected target control from the selectable target controls, and controlling the virtual character to face a selected target object associated with the selected target control.
In this step, in response to a third touch operation acting on the plurality of selectable target controls, the selected target object associated with the selected target control is determined, and the virtual character is then controlled to face the selected target object associated with the selected target control.
Here, the third touch operation applied to the selectable target control may be at least one of a click operation, a slide operation, and a long press operation.
Here, the manner of determining the selected target control from the plurality of selectable target controls may be: and determining the selected target control according to the termination position or the operation position of the third touch operation.
For example, regarding the end position, taking the third touch operation as a sliding operation: the end position of the sliding track, that is, the position at which the finger is lifted during the sliding operation, is determined; if this end position is detected to coincide with the area in which a selectable target control is located, that selectable target control is determined as the selected target control and is deemed to have received the third touch operation. Regarding the operation position, taking the third touch operation as a click operation: for each selectable target control, the control on which the corresponding click position acts is determined as the selected target control and is deemed to have received the third touch operation.
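A minimal hit-testing sketch for this determination is shown below; the rectangle representation and the control names are assumptions, not part of the application:

# Illustrative sketch of determining the selected target control from the end
# position of a slide (or the position of a click). Rectangles and names are
# assumptions made for the example.

def find_selected_control(controls, touch_pos):
    """controls: list of dicts with a 'rect' = (x, y, width, height); returns the
    first control whose area contains the lift/click position, or None."""
    px, py = touch_pos
    for control in controls:
        x, y, w, h = control["rect"]
        if x <= px <= x + w and y <= py <= y + h:
            return control
    return None

controls = [{"name": "teammate 1", "rect": (100, 80, 80, 40)},
            {"name": "safe zone", "rect": (200, 80, 80, 40)}]
print(find_selected_control(controls, (230, 100)))  # finger lifted over "safe zone"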
Here, controlling the virtual character to turn toward the selected target object associated with the selected target control may mean rotating the virtual character, that is, changing the orientation of the virtual character itself, and then controlling the virtual character to move toward the selected target object; alternatively, only the viewing angle of the virtual character may be adjusted, without changing the orientation of the virtual character itself, so that the viewing angle points in the direction of the selected target object. For the operation of adjusting the viewing angle, after the third touch operation ends (the user lifts the finger), the viewing angle automatically returns to the viewing angle before the adjustment.
Referring to fig. 5, fig. 5 is a schematic diagram of the selected target control determining process. As shown in fig. 5, when a distress signal from teammate character No. 2 is received, the finger 312 slides over the plurality of selectable target controls; when it is determined that the finger 312 stays on the No. 2 selectable target control 308 associated with teammate character No. 2, that control is determined as the selected target control, and the virtual character 303 turns toward the position of teammate character No. 2.
Here, when the second touch operation and the third touch operation are both sliding operations, they are the same sliding operation, meaning that the finger is lifted only once for both: during the operation, the start position of the sliding track is the direction indication control and the end position of the sliding track is the selected target control. After performing the second touch operation on the direction indication control, the finger is not lifted; the selected target control is selected directly from the plurality of selectable target controls, and only after the selection is completed is the finger lifted, ending the sliding operation.
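The combined gesture can be sketched as a small state holder, shown below with assumed callback names; the only behaviour taken from the application is that the slide starts on the direction indication control and the lift position selects the target control:

# Illustrative sketch of the combined gesture: one uninterrupted slide that starts
# on the direction indication control, calls out the selectable target controls,
# and ends (finger lift) on the selected target control. All names are assumptions.

class CombinedSlideGesture:
    def __init__(self, indicator_rect, show_controls, hit_test, face_target):
        self.indicator_rect = indicator_rect  # (x, y, w, h) of the direction indicator
        self.show_controls = show_controls    # callback: display selectable target controls
        self.hit_test = hit_test              # callback: position -> control or None
        self.face_target = face_target        # callback: turn character toward a control's object
        self.active = False

    def _on_indicator(self, pos):
        x, y, w, h = self.indicator_rect
        return x <= pos[0] <= x + w and y <= pos[1] <= y + h

    def on_touch_down(self, pos):
        # Second touch operation: the slide must start on the direction indicator.
        if self._on_indicator(pos):
            self.active = True
            self.show_controls()

    def on_touch_up(self, pos):
        # Third touch operation: the lift position decides the selected target control.
        if self.active:
            selected = self.hit_test(pos)
            if selected is not None:
                self.face_target(selected)
            self.active = False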
According to the game character control method, the direction indication control is displayed on the upper portion of the graphical user interface and used for indicating the direction of the virtual character in the game scene; responding to a first touch operation acted on the orientation adjustment area, adjusting the orientation of the virtual character in the game scene, and synchronously adjusting the orientation indicated in the direction indication control; responding to a second touch operation acting on the direction indication control, and displaying at least one selectable target control, wherein the selectable target control corresponds to a selectable target object in the game scene; and responding to a third touch operation, determining a selected target control from the selectable target controls, and controlling the virtual character to face a selected target object associated with the selected target control.
In this way, the direction indication control is displayed on the upper part of the graphical user interface, the direction of the virtual character in the game scene is adjusted in response to the first touch operation in the direction adjustment area, the direction indicated in the direction indication control is synchronously adjusted, at least one selectable target control is displayed in response to the second touch operation acting on the direction indication control, the selected target control is determined from the displayed at least one selectable target control in response to the third touch operation, and the virtual character is controlled to face the selected target object associated with the selected target control, so that the operation process is effectively simplified, the operation time is saved, the accuracy of direction adjustment of the virtual character is improved, the direction adjustment time is saved, and the efficiency of direction adjustment is improved.
Referring to fig. 6, fig. 6 is a flowchart of a method for controlling a game character according to another embodiment of the present application. As shown in fig. 6, the method for controlling a game character provided in the embodiment of the present application includes:
s601, displaying a direction indication control on the upper part of the graphical user interface, wherein the direction indication control is used for indicating the direction of the virtual character in the game scene.
And S602, responding to a first touch operation acted on the orientation adjustment area, adjusting the orientation of the virtual character in the game scene, and synchronously adjusting the orientation indicated in the direction indication control.
S603, responding to a second touch operation acting on the direction indication control, and displaying at least one selectable target control, wherein the selectable target control corresponds to a selectable target object in the game scene.
S604, responding to a third touch operation, determining a selected target control from the selectable target controls, and controlling the virtual character to face a selected target object associated with the selected target control.
S605, hiding the at least one selectable target control.
In this step, at least one selectable target control displayed is hidden after the avatar is directed towards the selected target object associated with the selected target control.
Here, after determining that the virtual character is turned toward the selected target object associated with the selected target control, at least one of the displayed selectable target controls needs to be hidden.
The hiding mode of the at least one selectable target control corresponds to the display mode of the at least one selectable target control, namely, if the display mode of the at least one selectable target control is a display mode sliding out from the upper part of the graphical user interface, the hiding mode sliding upwards from the current display position is also adopted when the at least one selectable target control is retracted; if the at least one selectable target control is displayed in a manner that the at least one selectable target control is directly displayed on the graphical user interface, then the at least one selectable target control is also directly hidden when the at least one selectable target control is retracted.
Here, after hiding the at least one selectable target control, a location identifier of the selected target object toward which the virtual character has turned may be displayed at a position adjacent to the direction indication control, to indicate the target location the virtual character is facing.
For example, the steering position corresponding to the target control selected through the third touch operation is the position where the teammate No. 2 of the same team as the virtual character is located, and then the identifier "2" of the teammate No. 2 may be displayed at the adjacent position of the direction indication control.
Referring to fig. 7, fig. 7 is a schematic view of the game scene after the virtual character 303 has turned. As shown in fig. 7, after the virtual character 303 turns, the selectable target controls are hidden and only the No. 2 selectable target control 308 associated with teammate character 2 is displayed adjacent to the direction indication control 301, to indicate that the virtual character 303 is facing the position of teammate character 2. A plurality of trees 304 are displayed at the roadside in the direction the virtual character 303 faces, and the direction control 3011 and the direction angle control 3012 displayed on the direction indication control 301 have changed: as shown in fig. 7, the direction indication control 301 now indicates the direction of teammate character 2 relative to the virtual character, the direction control 3011 includes the direction identifiers "northeast" and "north", and the direction angle control 3012 includes degrees from 75° to 330°, with 15° spacing between adjacent degrees.
The descriptions of S601 to S604 may refer to the descriptions of S201 to S204, and the same technical effects can be achieved, which will not be described in detail again.
In one possible implementation, before the responding to the second touch operation on the direction indication control, the control method includes: screening a plurality of selectable target objects from a plurality of game target objects which are positioned in the game scene and correspond to each game event; after displaying at least one selectable target control, wherein the selectable target control corresponds to a selectable target object in the game scene, each selectable target object is associated with each selectable target control based on the distance between each selectable target object and the position of the virtual character or the serial number identification corresponding to each selectable target object and the arrangement sequence of each selectable target control.
Screening a plurality of selectable target objects from a plurality of game target objects corresponding to each game event in a game scene where the virtual character is located, and associating each selectable target object with each selectable target control according to the distance between each screened selectable target object and the virtual character or the serial number identification corresponding to each selectable target object and the arrangement sequence of each selectable target control after displaying at least one selectable target control.
Here, for one game scene, according to different game events, a plurality of game target objects may be included, and according to game tasks to be executed by the virtual characters or teams where the virtual characters are located, target objects to which the plurality of virtual characters may move are screened from a plurality of positions, and are determined as selectable target objects.
Here, when a plurality of selectable target controls are displayed, each selectable target control needs to be associated with each selectable target object, so that when the selected target control is determined, the selected target object can be determined quickly, and the rotation direction of the virtual character can be controlled.
When the selectable target object is associated with the selectable target control, identification information of the selectable target object, the distance between the steering position and the virtual character and the direction angle of the selectable target object relative to the virtual character are required to be associated with the corresponding selectable target control.
Here, each selectable target object may be associated with a selectable target control in order of distance between the selectable target object and the virtual character, from near to far, so that the player can judge from the position of each selectable target control how far its selectable target object is from the virtual character, determine the nearest (or farthest) selectable target object, and thereby decide the turning direction; alternatively, the selectable target objects may be associated with the selectable target controls in ascending order of the serial numbers indicated by their serial number identifications. For example, if the selectable target objects are teammate characters in the same team as the virtual character, teammate characters No. 1 to No. 4 can be associated, in that order, with the selectable target controls.
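An illustrative sketch of these two association orders, with assumed field names and a flat 2D distance metric:

# Illustrative sketch of the two association orders described above: by distance
# from the virtual character (near to far), or by each selectable object's serial
# number (small to large). Field names are assumptions.
import math

def associate_controls(selectable_objects, controls, character_xy, by="distance"):
    """Pair each selectable target object with a selectable target control,
    in the arrangement order of the controls (e.g. left to right)."""
    if by == "distance":
        key = lambda o: math.hypot(o["x"] - character_xy[0], o["y"] - character_xy[1])
    else:  # by serial number identification
        key = lambda o: o["serial"]
    ordered = sorted(selectable_objects, key=key)
    return list(zip(controls, ordered))

objects = [{"name": "teammate 2", "serial": 2, "x": 50, "y": 400},
           {"name": "teammate 1", "serial": 1, "x": 300, "y": 20},
           {"name": "safe zone", "serial": 0, "x": 10, "y": 30}]
controls = ["control A", "control B", "control C"]  # arranged left to right
print(associate_controls(objects, controls, (0, 0), by="distance"))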
In one possible implementation, the selectable target object is determined by: when the virtual character receives a teammate instruction, determining at least one teammate character belonging to the same team as the virtual character; and determining the determined at least one teammate character as the at least one selectable target object.
In this step, when the virtual character receives a teammate instruction, it needs to determine the direction in which to move toward a teammate. At this time, at least one teammate character belonging to the same team as the virtual character in the game map must be determined, and each teammate character is taken as a selectable target object.
Here, the teammate instruction received by the virtual character may be an assembly instruction, a help-seeking instruction, or the like. When an assembly instruction is received, in addition to taking each teammate character as a selectable target object, the gathering place carried by the assembly instruction may also be determined as a selectable target object and displayed, and the virtual character may choose to move toward a teammate first or move directly to the gathering position. When a help-seeking instruction is received, the help position can be obtained directly from the instruction and determined as a selectable target object.
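The following sketch illustrates how the selectable target objects might be assembled from such an instruction; the instruction format and field names are assumptions for illustration only:

# Illustrative sketch: building the selectable target objects when a teammate
# instruction is received. Message formats and field names are assumptions.

def selectable_targets_from_instruction(instruction, all_characters, my_team_id):
    targets = []
    if instruction["type"] == "assemble":
        # Every teammate character in the same team becomes a selectable target,
        # plus the gathering place carried by the assembly instruction.
        targets += [c for c in all_characters
                    if c["team"] == my_team_id and not c.get("is_self")]
        targets.append({"name": "gathering place", "x": instruction["x"], "y": instruction["y"]})
    elif instruction["type"] == "help":
        # A help-seeking instruction carries the help position directly.
        targets.append({"name": "help position", "x": instruction["x"], "y": instruction["y"]})
    return targets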
In one possible implementation, the selectable target object is determined by: determining a plurality of action tracks of the virtual character in a game scene based on a plurality of preset game events; determining a plurality of track positions on each action track; for each track position, at least one selectable target object corresponding to the track position is determined based on the position information of the track position.
In this step, a plurality of action tracks along which the virtual character advances in the game scene are determined according to a plurality of preset game events; a plurality of track positions on each action track are determined; and, for each track position, at least one selectable target object corresponding to that track position is determined according to the position information of the track position.
Here, the position information for each track position may include a distance between the track position and the virtual character, a direction angle of the track position with respect to the virtual character, and the like.
The track positions and the selectable target objects may be in a one-to-one or one-to-many relationship. For example, a plurality of action tracks may pass through one track position, in which case the distance or angle of that track position relative to the virtual character may differ, so that a plurality of selectable target objects may be determined according to the different distances and angles.
Here, the game event may be an event such as a task that the virtual character needs to execute, and when executing each game event, the virtual character may correspond to one or more action tracks, and a representative track position needs to be selected from each track included in the tracks, so as to determine the track position as a selectable target object.
For example, the trajectory position may be an end position (such as a safe zone) of the action trajectory, or a position (such as a position where a game prop is present) of the trajectory that is helpful for the virtual character to execute a task.
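As an illustrative sketch of deriving selectable target objects from preset game events and their action tracks (the event and track structures, and the choice of end positions and prop positions as representative track positions, are assumptions based on the example above):

# Illustrative sketch: deriving selectable target objects from the action tracks
# associated with preset game events. The event/track structures are assumptions.

def selectable_targets_from_events(game_events):
    """For each preset game event, take representative positions on each of its
    action tracks (here: the end position and any marked prop positions)."""
    targets = []
    for event in game_events:
        for track in event["tracks"]:
            points = track["points"]                    # ordered (x, y) positions
            targets.append({"event": event["name"], "kind": "end", "pos": points[-1]})
            for prop_pos in track.get("prop_positions", []):
                targets.append({"event": event["name"], "kind": "prop", "pos": prop_pos})
    return targets

events = [{"name": "reach safe zone",
           "tracks": [{"points": [(0, 0), (120, 40), (300, 90)],
                       "prop_positions": [(120, 40)]}]}]
print(selectable_targets_from_events(events))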
Here, in addition to determining the possible positions of the virtual characters as selectable target objects, positions (dangerous areas) where the virtual characters arrive with a small probability in the game scene may also be marked to prompt the virtual characters to move away from the areas.
According to the game character control method, the direction indication control is displayed on the upper portion of the graphical user interface and used for indicating the direction of the virtual character in the game scene; responding to a first touch operation acted on the orientation adjustment area, adjusting the orientation of the virtual character in the game scene, and synchronously adjusting the orientation indicated in the direction indication control; responding to a second touch operation acting on the direction indication control, and displaying at least one selectable target control, wherein the selectable target control corresponds to a selectable target object in the game scene; responding to a third touch operation, determining a selected target control from the selectable target controls, and controlling the virtual character to face a selected target object associated with the selected target control; hiding the at least one selectable target control.
In this way, the direction indication control is displayed on the upper portion of the graphical user interface; the orientation of the virtual character in the game scene is adjusted in response to the first touch operation in the orientation adjustment area, and the orientation indicated in the direction indication control is adjusted synchronously; at least one selectable target control is displayed in response to the second touch operation acting on the direction indication control; the selected target control is determined from the displayed at least one selectable target control in response to the third touch operation, and the virtual character is controlled to face the selected target object associated with the selected target control; and, after it is determined that the virtual character faces the selected target object, the at least one selectable target control is hidden. This effectively simplifies the operation process, saves operation time, improves the accuracy of the orientation adjustment of the virtual character, saves direction adjustment time, and improves the efficiency of direction adjustment.
Referring to fig. 8 and 9, fig. 8 is a first schematic structural diagram of a terminal according to an embodiment of the present application, and fig. 9 is a second schematic structural diagram of a terminal according to an embodiment of the present application. As shown in fig. 8, the terminal 800 includes:
A direction control display module 810, configured to display a direction indication control on an upper portion of a graphical user interface, where the direction indication control is configured to indicate an orientation of the virtual character in the game scene;
a direction adjustment module 820, configured to adjust a direction of the virtual character in the game scene in response to a first touch operation acting on the direction adjustment area, and synchronously adjust the direction indicated in the direction indication control;
the control display module 830 is configured to display at least one selectable target control in response to a second touch operation acting on the direction indication control, where the selectable target control corresponds to a selectable target object in the game scene;
and the orientation control module 840 is configured to determine a selected target control from the selectable target controls in response to a third touch operation, and control the virtual character to be oriented towards the selected target object associated with the selected target control.
In one possible implementation, as shown in fig. 9, the terminal 800 further includes a control association module 850, where the control association module 850 is configured for:
screening a plurality of selectable target objects from a plurality of target objects which are positioned in the game scene and correspond to each game event;
After displaying at least one selectable target control, wherein the selectable target control corresponds to a selectable target object in the game scene, each selectable target object is associated with each selectable target control based on the distance between each selectable target object and the position of the virtual character or the serial number identification corresponding to each selectable target object and the arrangement sequence of each selectable target control.
In one possible implementation, as shown in fig. 9, the terminal 800 further includes a control hiding module 860, where the control hiding module 860 is configured to:
hide the at least one selectable target control.
In a possible implementation manner, the small map contains an indication identifier corresponding to the virtual character, wherein the indication identifier is used for indicating the position and/or orientation of the virtual character in the game scene.
In one possible implementation, the direction indication control displays at least one direction control and at least one direction angle control.
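Purely as an illustrative assumption of how such a control might derive what it displays, the short sketch below computes a direction label (for the direction control) and a direction angle readout (for the direction angle control) from the character's orientation; the 0° = north, clockwise convention is an assumption made for this example only.

```python
# Assumed convention: 0 degrees = north, angles increase clockwise.
def compass_readout(orientation_deg: float):
    """Return (direction label, direction angle) for a direction indication control."""
    labels = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
    idx = round((orientation_deg % 360) / 45) % 8
    return labels[idx], round(orientation_deg % 360)

print(compass_readout(95))   # ('E', 95)
print(compass_readout(350))  # ('N', 350)
```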
In one possible implementation, the control association module 850 determines the selectable target object by:
when the virtual character receives a teammate instruction, determining at least one teammate character belonging to the same team as the virtual character;
and determining the determined at least one teammate character as at least one selectable target object.
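A minimal, hypothetical sketch of this teammate-based determination is given below; the character records and the "team"/"id" field names are assumptions made for illustration only.

```python
# Hedged sketch: teammate characters on the same team (excluding the virtual character
# itself) become the selectable target objects. Field names are assumptions.
def selectable_targets_from_teammates(virtual_character, all_characters):
    return [c for c in all_characters
            if c["team"] == virtual_character["team"] and c["id"] != virtual_character["id"]]

me = {"id": 1, "team": "blue"}
others = [{"id": 2, "team": "blue"}, {"id": 3, "team": "red"}, {"id": 4, "team": "blue"}]
print(selectable_targets_from_teammates(me, others))  # characters 2 and 4
```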
In one possible implementation, the control association module 850 determines the selectable target object by:
determining a plurality of action tracks of the virtual character in a game scene based on a plurality of preset game events;
determining a plurality of track positions on each action track;
for each track position, at least one selectable target object corresponding to the track position is determined based on the position information of the track position.
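The following Python sketch illustrates one possible reading of this trajectory-based determination: track positions are sampled from an action track at a fixed step, and the selectable target objects for each track position are the scene objects within an assumed radius of that position. The sampling step, the radius, and all names are illustrative assumptions, not the disclosed implementation.

```python
# Hedged sketch of the trajectory-based determination. The sampling step (2) and
# the 5-unit radius are assumptions made only for this example.
import math

def sample_track_positions(track, step=2):
    """Take every `step`-th waypoint of an action track as a track position."""
    return track[::step]

def targets_near(position, scene_objects, radius=5.0):
    """Selectable target objects for one track position: objects within `radius` of it."""
    return [o for o in scene_objects if math.dist(position, o["pos"]) <= radius]

track = [(0, 0), (5, 0), (10, 0), (15, 0), (20, 0)]        # one action track for a game event
scene_objects = [{"name": "supply crate", "pos": (11, 3)},
                 {"name": "gather point", "pos": (40, 40)}]
for pos in sample_track_positions(track):
    print(pos, targets_near(pos, scene_objects))           # only (10, 0) is near the crate
```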
In one possible implementation manner, the first touch operation, the second touch operation, and the third touch operation include at least one of the following operations:
click operation, slide operation, long press operation.
In one possible implementation manner, when the second touch operation and the third touch operation are both sliding operations, a start position of a sliding track of the sliding operation is the direction indication control, and an end position of the sliding track of the sliding operation is the selected target control.
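By way of illustration only, the sketch below resolves such a slide operation: the gesture is accepted only if the start position of its sliding track falls on the direction indication control, and the selected target control is whichever selectable target control contains the end position. The rectangular hit areas and the control layout are assumptions made for this example.

```python
# Hedged sketch of slide-gesture resolution; hit areas and layout are assumptions.
def hit(rect, point):
    """True if `point` (x, y) lies inside `rect` (x, y, width, height)."""
    (x, y, w, h), (px, py) = rect, point
    return x <= px <= x + w and y <= py <= y + h

def resolve_slide(start, end, indicator_rect, target_rects):
    """Return the index of the selected target control, or None if the gesture is invalid."""
    if not hit(indicator_rect, start):
        return None  # the sliding track must start on the direction indication control
    for i, rect in enumerate(target_rects):
        if hit(rect, end):
            return i  # the end position falls on this selectable target control
    return None

indicator = (100, 0, 60, 30)                      # direction indication control area
targets = [(40, 40, 50, 30), (110, 40, 50, 30)]   # two selectable target controls
print(resolve_slide((120, 10), (130, 55), indicator, targets))  # selects control index 1
```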
The terminal provided by the embodiment of the application displays a direction indication control on the upper part of a graphical user interface, wherein the direction indication control is used for indicating the direction of the virtual character in the game scene; responding to a first touch operation acted on the orientation adjustment area, adjusting the orientation of the virtual character in the game scene, and synchronously adjusting the orientation indicated in the direction indication control; responding to a second touch operation acting on the direction indication control, and displaying at least one selectable target control, wherein the selectable target control corresponds to a selectable target object in the game scene; and responding to a third touch operation, determining a selected target control from the selectable target controls, and controlling the virtual character to face a selected target object associated with the selected target control.
In this way, the direction indication control is displayed on the upper part of the graphical user interface, the direction of the virtual character in the game scene is adjusted in response to the first touch operation in the direction adjustment area, the direction indicated in the direction indication control is synchronously adjusted, at least one selectable target control is displayed in response to the second touch operation acting on the direction indication control, the selected target control is determined from the displayed at least one selectable target control in response to the third touch operation, and the virtual character is controlled to face the selected target object associated with the selected target control, so that the operation process is effectively simplified, the operation time is saved, the accuracy of direction adjustment of the virtual character is improved, the direction adjustment time is saved, and the efficiency of direction adjustment is improved.
Referring to fig. 10, fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 10, the electronic device 1000 includes a processor 1010, a memory 1020, and a bus 1030.
The memory 1020 stores machine-readable instructions executable by the processor 1010. When the electronic device 1000 is running, the processor 1010 communicates with the memory 1020 through the bus 1030, and when the machine-readable instructions are executed by the processor 1010, the steps of the game character control method in the method embodiments shown in fig. 2 and fig. 6 can be performed; for the specific implementation, reference is made to the method embodiments, which are not repeated herein.
The embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored. When the computer program is run by a processor, the steps of the game character control method in the method embodiments shown in fig. 2 and fig. 6 can be performed; for the specific implementation, reference is made to the method embodiments, which are not repeated herein.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. The apparatus embodiments described above are merely illustrative; for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings or direct couplings or communication connections shown or discussed herein may be indirect couplings or communication connections through some communication interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
If the functions are implemented in the form of software functional units and sold or used as an independent product, they may be stored in a processor-executable non-volatile computer-readable storage medium. Based on such understanding, the technical solutions of the present application, in essence, or the part contributing to the prior art, or a part of the technical solutions, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Finally, it should be noted that the foregoing embodiments are merely specific implementations of the present application, which are intended to illustrate the technical solutions of the present application rather than to limit them, and the protection scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will appreciate that any person skilled in the art may still modify the technical solutions described in the foregoing embodiments, or easily conceive of changes thereto, or make equivalent substitutions for some of the technical features thereof, within the technical scope disclosed in the present application; such modifications, changes, or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application, and shall all fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A control method for a game character, which is applied to a terminal displaying a graphical user interface, wherein at least part of a game scene and a virtual character are displayed in the graphical user interface, the graphical user interface comprises a small map and a direction adjustment area, the small map is a thumbnail of the game scene, and the control method comprises:
Displaying a direction indication control on the upper part of the graphical user interface, wherein the direction indication control is used for indicating the direction of the virtual character in the game scene;
responding to a first touch operation acted on the orientation adjustment area, adjusting the orientation of the virtual character in the game scene, and synchronously adjusting the orientation indicated in the direction indication control;
responding to a second touch operation acting on the direction indication control, and displaying at least one selectable target control, wherein the selectable target control corresponds to a selectable target object in the game scene;
responding to a third touch operation, determining a selected target control from the selectable target controls, and controlling the virtual character to face a selected target object associated with the selected target control;
before the response acts on the second touch operation of the direction indication control, the control method comprises the following steps:
screening a plurality of selectable target objects from a plurality of game target objects which are positioned in the game scene and correspond to each game event;
after displaying at least one selectable target control, wherein the selectable target control corresponds to a selectable target object in the game scene, each selectable target object is associated with each selectable target control based on the distance between each selectable target object and the position of the virtual character or the serial number identifier corresponding to each selectable target object and the arrangement sequence of each selectable target control;
Determining the selectable target object by:
determining a plurality of action tracks of the virtual character in a game scene based on a plurality of preset game events;
determining a plurality of track positions on each action track;
for each track position, at least one selectable target object corresponding to the track position is determined based on the position information of the track position.
2. The control method according to claim 1, wherein the small map contains indication marks corresponding to the virtual characters, and the indication marks are used for indicating positions and/or orientations of the virtual characters in the game scene.
3. The control method of claim 1, wherein the direction indication control displays at least one direction control and at least one direction angle control.
4. The control method according to claim 1, characterized in that the selectable target object is determined by:
when the virtual character receives a teammate instruction, determining at least one teammate character belonging to the same team with the virtual character;
and determining the determined at least one teammate role as at least one selectable target object.
5. The control method according to claim 1, wherein the first touch operation, the second touch operation, and the third touch operation include at least one of:
click operation, slide operation, long press operation.
6. The control method according to claim 5, wherein when the second touch operation and the third touch operation are both slide operations, a start position of a slide track of the slide operation is the direction indication control, and an end position of the slide track of the slide operation is the selected target control.
7. The control method of claim 1, wherein after determining a selected target control from the selectable target controls and controlling the virtual character toward a selected target object associated with the selected target control in response to a third touch operation, the control method further comprises:
hiding the at least one selectable target control.
8. A terminal, wherein the terminal displays a graphical user interface, at least part of a game scene and a virtual character are displayed in the graphical user interface, the graphical user interface includes a small map and an orientation adjustment area, the small map is a thumbnail of the game scene, and the terminal comprises:
The direction control display module is used for displaying a direction indication control on the upper part of the graphical user interface, and the direction indication control is used for indicating the direction of the virtual character in the game scene;
the direction adjusting module is used for responding to a first touch operation acted on the direction adjusting area, adjusting the direction of the virtual character in the game scene and synchronously adjusting the direction indicated by the direction indication control;
the control display module is used for responding to a second touch operation acting on the direction indication control and displaying at least one selectable target control, and the selectable target control corresponds to a selectable target object in the game scene;
the orientation control module is used for responding to a third touch operation, determining a selected target control from the selectable target controls and controlling the virtual character to be oriented to a selected target object associated with the selected target control;
the terminal also comprises a control association module, wherein the control association module is used for:
screening a plurality of selectable target objects from a plurality of target objects which are positioned in the game scene and correspond to each game event;
after displaying at least one selectable target control, wherein the selectable target control corresponds to a selectable target object in the game scene, each selectable target object is associated with each selectable target control based on the distance between each selectable target object and the position of the virtual character or the serial number identifier corresponding to each selectable target object and the arrangement sequence of each selectable target control;
The control association module is used for determining selectable target objects by the following steps:
determining a plurality of action tracks of the virtual character in a game scene based on a plurality of preset game events;
determining a plurality of track positions on each action track;
for each track position, at least one selectable target object corresponding to the track position is determined based on the position information of the track position.
9. An electronic device, comprising: a processor, a memory and a bus, said memory storing machine readable instructions executable by said processor, said processor and said memory communicating via said bus when the electronic device is running, said machine readable instructions when executed by said processor performing the steps of the method of controlling a game character according to any of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of the control method of a game character according to any one of claims 1 to 7.
CN202011331423.3A 2020-11-24 2020-11-24 Game character control method, terminal, readable storage medium and electronic device Active CN112402976B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011331423.3A CN112402976B (en) 2020-11-24 2020-11-24 Game character control method, terminal, readable storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN112402976A CN112402976A (en) 2021-02-26
CN112402976B true CN112402976B (en) 2023-12-29

Family

ID=74778589

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011331423.3A Active CN112402976B (en) 2020-11-24 2020-11-24 Game character control method, terminal, readable storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN112402976B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113350793B (en) * 2021-06-16 2024-04-30 网易(杭州)网络有限公司 Interface element setting method and device, electronic equipment and storage medium
CN113440841B (en) * 2021-07-14 2023-11-17 网易(杭州)网络有限公司 Virtual character control method and device, electronic equipment and readable storage medium
CN113546419B (en) * 2021-07-30 2024-04-30 网易(杭州)网络有限公司 Game map display method, game map display device, terminal and storage medium
CN113633969A (en) * 2021-08-13 2021-11-12 网易(杭州)网络有限公司 Data processing method, device, equipment and storage medium
CN113750529A (en) * 2021-09-13 2021-12-07 网易(杭州)网络有限公司 Direction indicating method and device in game, electronic equipment and readable storage medium
CN114053704B (en) * 2021-10-28 2023-06-09 腾讯科技(深圳)有限公司 Information display method, device, terminal and storage medium
CN114332311B (en) * 2021-12-05 2023-08-04 北京字跳网络技术有限公司 Image generation method, device, computer equipment and storage medium

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2497543A2 (en) * 2011-03-08 2012-09-12 Nintendo Co., Ltd. Information processing program, information processing system, and information processing method
CN107754305A (en) * 2017-10-13 2018-03-06 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN107789837A (en) * 2017-09-12 2018-03-13 网易(杭州)网络有限公司 Information processing method, device and computer-readable recording medium
CN107812384A (en) * 2017-09-12 2018-03-20 网易(杭州)网络有限公司 Information processing method, device and computer-readable recording medium
CN107890673A (en) * 2017-09-30 2018-04-10 网易(杭州)网络有限公司 Visual display method and device, storage medium, the equipment of compensating sound information
CN107899235A (en) * 2017-10-13 2018-04-13 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN108211350A (en) * 2017-12-07 2018-06-29 网易(杭州)网络有限公司 Information processing method, electronic equipment and storage medium
JP6447853B1 (en) * 2018-04-25 2019-01-09 株式会社コナミデジタルエンタテインメント GAME CONTROL DEVICE, GAME SYSTEM, AND PROGRAM
CN109876442A (en) * 2019-04-15 2019-06-14 网易(杭州)网络有限公司 Route indicating means, equipment and storage medium in game based on map
JP2019193742A (en) * 2018-05-02 2019-11-07 任天堂株式会社 Information processing program, information processor, information processing system, and method of processing information
CN110917616A (en) * 2019-11-28 2020-03-27 腾讯科技(深圳)有限公司 Orientation prompting method, device, equipment and storage medium in virtual scene
CN111185004A (en) * 2019-12-30 2020-05-22 网易(杭州)网络有限公司 Game control display method, electronic device, and storage medium

Also Published As

Publication number Publication date
CN112402976A (en) 2021-02-26

Similar Documents

Publication Publication Date Title
CN112402976B (en) Game character control method, terminal, readable storage medium and electronic device
US11439906B2 (en) Information prompting method and apparatus, storage medium, and electronic device
CN111124226B (en) Game screen display control method and device, electronic equipment and storage medium
CN107789837B (en) Information processing method, apparatus and computer readable storage medium
CN107398071B (en) Game target selection method and device
CN107812384B (en) Information processing method, device and computer readable storage medium
CN107648847A (en) Information processing method and device, storage medium, electronic equipment
JP2019051311A (en) Information processing method, device, computer program, and computer readable storage medium
US11446565B2 (en) In-game display control method and apparatus, storage medium processor, and terminal
CN111760268B (en) Path finding control method and device in game
CN111905370B (en) Method and device for controlling virtual character in game, electronic equipment and storage medium
JP6731461B2 (en) Information processing method and apparatus, storage medium, electronic device
KR20140133776A (en) Recording medium and game device
CN112891929B (en) Game signal processing method and device
CN110665228B (en) Method and device for controlling character cards in game
JP2015516864A (en) Apparatus and method for providing online shooting game
TWI793838B (en) Method, device, apparatus, medium and product for selecting interactive mode for virtual object
CN112791410A (en) Game control method and device, electronic equipment and storage medium
CN107832000B (en) Information processing method, information processing device, electronic equipment and storage medium
CN113209624B (en) Target selection method, terminal, electronic equipment and storage medium
WO2022083451A1 (en) Skill selection method and apparatus for virtual object, and device, medium and program product
JP2020089492A (en) Game program, game processing method and game terminal
CN112717411B (en) Track recording method, device, equipment and storage medium of virtual vehicle
CN113663326B (en) Aiming method and device for game skills
JP7393916B2 (en) Program, information processing method, and information processing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant