CN112263833A - Game control method and device - Google Patents

Game control method and device

Info

Publication number
CN112263833A
CN112263833A CN202011304002.1A
Authority
CN
China
Prior art keywords
control
attack
user interface
graphical user
game
Prior art date
Legal status
Pending
Application number
CN202011304002.1A
Other languages
Chinese (zh)
Inventor
陈婉蓉 (Chen Wanrong)
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202011304002.1A priority Critical patent/CN112263833A/en
Publication of CN112263833A publication Critical patent/CN112263833A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application provide a game control method and device, where the method includes: providing a first attack control on a graphical user interface; in response to a first operation acting on the first attack control, controlling the game interaction object to perform an attack operation; in response to a second operation acting on a first area in the graphical user interface, controlling the game interaction object to move, and displaying a second attack control on the graphical user interface according to the current position on the graphical user interface of the touch point corresponding to the second operation, where the second attack control is set to move with the touch point corresponding to the second operation; and in response to a third operation of sliding from the first area to the second attack control, controlling the game interaction object to perform the attack operation. Since the third operation is a continuation of the second operation, the user can switch from controlling the movement of the game interaction object to performing the attack operation without the finger leaving the screen, which avoids accidental touches of surrounding buttons.

Description

Game control method and device
Technical Field
The embodiment of the application relates to the technical field of internet, in particular to a game control method and device.
Background
In some games, it is usually necessary to control the movement of a game interaction object while also controlling it to perform certain preset operations, for example, shooting operations in shooting games.
In the current scheme, the left-and-right movement of the game interaction object is controlled with the left hand, and its up-and-down movement and the preset operation are controlled with the right hand. Taking the shooting operation as an example of a preset operation, the user performs the shooting operation by tapping the corresponding shooting button, and controls the up-and-down movement of the game interaction object by long-pressing the up and down buttons.
However, in the above scheme, when the user taps the shooting button while the game interaction object is moving, the finger usually has to leave the screen, stopping the long press on the up or down button before tapping the shooting button, which easily causes accidental touches of surrounding buttons.
Disclosure of Invention
The embodiments of the present application provide a game control method and device, aiming to solve the problem that surrounding buttons are easily touched by mistake when control of the game interaction object is switched from movement to performing an attack operation.
In a first aspect, an embodiment of the present application provides a game control method, where a terminal device provides a graphical user interface, the graphical user interface includes a game screen, and the game screen includes a game interaction object; the method includes:
providing a first attack control at the graphical user interface;
responding to a first operation acted on the first attack control, and controlling the game interaction object to execute an attack operation;
responding to a second operation acting on a first area in the graphical user interface, controlling the game interaction object to move, and displaying a second attack control on the graphical user interface according to the current position of a touch point corresponding to the second operation on the graphical user interface, wherein the second attack control is set to move along with the touch point corresponding to the second operation;
and controlling the game interaction object to execute an attack operation in response to a third operation of sliding from the first area to the second attack control, wherein the third operation is a continuous operation of the second operation.
In one possible embodiment, the method further comprises:
and responding to a fourth operation acting on the second attack control, and controlling the game interaction object to move and execute the attack operation at the same time, wherein the fourth operation is a continuous operation of the third operation.
In a possible embodiment, the second operation is a sliding operation satisfying a first preset sliding direction, the fourth operation is a sliding operation satisfying the first preset sliding direction, and the sliding direction of the third operation is different from the first preset sliding direction.
In one possible implementation, the first area includes a movement-assisted identifier and a first movement control configured to move along a direction range determined by the movement-assisted identifier according to the second operation.
In one possible implementation, the graphical user interface provides a second movement control, the first movement control and the second movement control are located in different areas of the graphical user interface, and the first movement control is configured to control the height of the game interaction object from the ground in response to the second operation so as to control the game interaction object to move away from or close to the ground; the second movement control is configured to respond to touch operation to control the game interaction object to move within a horizontal plane range corresponding to the current height.
In a possible implementation manner, the movement auxiliary identifier is a bar-shaped identifier, and the extending direction of the bar-shaped identifier is perpendicular or parallel to the long edge boundary of the game screen.
In a possible implementation manner, displaying, on the graphical user interface, a second attack control according to a current position of a touch point corresponding to the second operation on the graphical user interface includes:
determining a first position in a second area, different from the first area, in the graphical user interface according to the current position of the touch point corresponding to the second operation on the graphical user interface;
displaying the second attack control at the first location.
In a possible implementation, the second operation is a longitudinal sliding operation, the second attack control is located on the left side or the right side of the first area, and the first position has the same vertical coordinate as the current position; alternatively,
the second operation is a transverse sliding operation, the second attack control is located on the upper side or the lower side of the first area, and the horizontal coordinate of the first position is the same as that of the current position.
In one possible implementation, responding to a third operation of sliding from the first region to the second attack control includes:
responding to a third operation of sliding from the first area to the second attack control in a first sliding direction, wherein the first sliding direction is perpendicular to the determined direction of the mobile auxiliary identifier.
In one possible embodiment, the method further comprises:
in response to the second operation, displaying a first indication icon in the graphical user interface, wherein the first indication icon is used for indicating sliding towards the second attack control and/or sliding along the direction determined by the movement auxiliary mark.
In a second aspect, an embodiment of the present application provides a game control apparatus, including:
the terminal equipment comprises a display module, a first attack control module and a second attack control module, wherein the display module is used for providing a first attack control on a graphical user interface provided by the terminal equipment, the graphical user interface comprises a game picture, and the game picture comprises a game interaction object;
the first control module is used for responding to a first operation acted on the first attack control and controlling the game interaction object to execute an attack operation;
the second control module is used for responding to a second operation acting on a first area in the graphical user interface, controlling the game interaction object to move, and displaying a second attack control on the graphical user interface according to the current position of a touch point corresponding to the second operation on the graphical user interface, wherein the second attack control is set to move along with the touch point corresponding to the second operation;
and the third control module is used for responding to a third operation of sliding from the first area to the second attack control and controlling the game interaction object to execute an attack operation, wherein the third operation is a continuous operation of the second operation.
In one possible embodiment, the third control module is further configured to:
and responding to a fourth operation acting on the second attack control, and controlling the game interaction object to move and execute the attack operation at the same time, wherein the fourth operation is a continuous operation of the third operation.
In a possible embodiment, the second operation is a sliding operation satisfying a first preset sliding direction, the fourth operation is a sliding operation satisfying the first preset sliding direction, and the sliding direction of the third operation is different from the first preset sliding direction.
In one possible implementation, the first area includes a movement-assisted identifier and a first movement control configured to move along a direction range determined by the movement-assisted identifier according to the second operation.
In one possible implementation, the graphical user interface provides a second movement control, the first movement control and the second movement control are located in different areas of the graphical user interface, and the first movement control is configured to control the height of the game interaction object from the ground in response to the second operation so as to control the game interaction object to move away from or close to the ground; the second movement control is configured to respond to touch operation to control the game interaction object to move within a horizontal plane range corresponding to the current height.
In a possible implementation manner, the movement auxiliary identifier is a bar-shaped identifier, and the extending direction of the bar-shaped identifier is perpendicular or parallel to the long edge boundary of the game screen.
In a possible implementation, the second control module is specifically configured to:
determining a first position in a second area, different from the first area, in the graphical user interface according to the current position of the touch point corresponding to the second operation on the graphical user interface;
displaying the second attack control at the first location.
In a possible implementation, the second operation is a longitudinal sliding operation, the second attack control is located on the left side or the right side of the first area, and the first position has the same vertical coordinate as the current position; alternatively,
the second operation is a transverse sliding operation, the second attack control is located on the upper side or the lower side of the first area, and the horizontal coordinate of the first position is the same as that of the current position.
In a possible implementation, the third control module is specifically configured to:
responding to a third operation of sliding from the first area to the second attack control in a first sliding direction, wherein the first sliding direction is perpendicular to the determined direction of the mobile auxiliary identifier.
In one possible embodiment, the second control module is further configured to:
in response to the second operation, displaying a first indication icon in the graphical user interface, wherein the first indication icon is used for indicating sliding towards the second attack control and/or sliding along the direction determined by the movement auxiliary mark.
In a third aspect, an embodiment of the present application provides a terminal device, including:
a memory for storing a program;
a processor for executing the program stored in the memory, the processor being configured to perform the game control method according to any one of the first aspect when the program is executed.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, which includes instructions that, when executed on a computer, cause the computer to perform the game control method according to any one of the first aspect.
According to the game control method and device provided by the embodiments of the present application, the terminal device provides a graphical user interface, the graphical user interface includes a game screen, the game screen includes a game interaction object, and the graphical user interface provides a first attack control. The terminal device can control the game interaction object to perform an attack operation in response to a first operation acting on the first attack control, and can control the game interaction object to move in response to a second operation acting on a first area in the graphical user interface. When the user performs the second operation, a second attack control is displayed on the graphical user interface according to the current position of the touch point corresponding to the second operation, and the terminal device can control the game interaction object to perform the attack operation in response to a third operation of sliding from the first area to the second attack control. Because the third operation is a continuation of the second operation, the user can switch directly from controlling the movement of the game interaction object to controlling it to perform the attack operation without the finger leaving the screen of the terminal device; the operation is simple, and accidental touches of surrounding buttons are avoided.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a schematic view of a game control provided in an embodiment of the present application;
FIG. 2 is a schematic flow chart illustrating a game control method according to an embodiment of the present disclosure;
FIG. 3 is a first schematic view of a game interface provided in an embodiment of the present application;
FIG. 4 is a second schematic view of a game interface provided in an embodiment of the present application;
FIG. 5 is a third schematic view of a game interface provided in an embodiment of the present application;
FIG. 6 is a fourth schematic view of a game interface provided in an embodiment of the present application;
FIG. 7 is a fifth schematic view of a game interface provided in an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a game control apparatus according to an embodiment of the present disclosure;
fig. 9 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1 is a schematic view of a game control provided in an embodiment of the present application. As shown in Fig. 1, page 10 is a game interface on a terminal device. Page 10 illustrates an interface of a shooting game; the center of page 10 shows where the game interaction object is aiming.
Page 10 includes a left rocker 11. The user can move the left rocker 11 up, down, left, and right, so that the game interaction object moves on a horizontal plane according to the movement of the left rocker 11, and the position the game interaction object aims at moves correspondingly, where the horizontal plane refers to a plane parallel to the ground.
Page 10 further includes a shooting button 12, an up button 13, and a down button 14. When the user taps the shooting button 12, the game interaction object performs a shooting operation. The up button 13 and the down button 14 are used to control the movement of the game interaction object in the vertical direction, that is, movement away from or toward the ground: one of the two buttons controls movement away from the ground, and the other controls movement toward the ground. For example, when the user long-presses the up button 13, the game interaction object moves vertically away from the ground and stops moving up once the user lifts the finger from the up button 13. Similarly, when the user long-presses the down button 14, the game interaction object moves vertically toward the ground and stops moving down once the user lifts the finger from the down button 14.
The game control scheme illustrated in Fig. 1 has two main disadvantages. First, when the user operates the up button 13 or the down button 14, it is difficult to operate the shooting button 12 at the same time, and when the user operates the shooting button 12, it is likewise difficult to operate the up button 13 or the down button 14; moreover, there are too many buttons, so accidental touches are likely. Second, when the user moves the game interaction object vertically by long-pressing the up button 13 or the down button 14, the distance moved in the vertical direction is difficult to control, and the operation lacks immersion.
In order to solve the above problems, the present application provides a game control scheme. The solution of the present application will be described below with reference to the accompanying drawings.
Fig. 2 is a schematic flowchart of a game control method provided in an embodiment of the present application, where a terminal device provides a graphical user interface, the graphical user interface includes a game screen, and the game screen includes a game interaction object. As shown in Fig. 2, the method may include:
s21, providing a first attack control on the graphical user interface.
And S22, responding to the first operation of the first attack control, and controlling the game interaction object to execute the attack operation.
The graphical user interface includes a first attack control. The user can perform a first operation on the first attack control, and the terminal device responds by controlling the game interaction object to perform an attack operation. For example, in a shooting game, the attack operation may be a shooting operation, and the first operation may be a tap or a long press; the user may tap the first attack control to perform the shooting operation.
The first operation controls only the game interaction object to perform the attack operation, and does not control the movement of the game interaction object.
And S23, responding to a second operation acting on the first area in the graphical user interface, controlling the game interaction object to move, and displaying a second attack control on the graphical user interface according to the current position of the touch point corresponding to the second operation on the graphical user interface, wherein the second attack control is set to move along with the touch point corresponding to the second operation.
The second operation is an operation for controlling the movement of the game interaction object. In the embodiments of the present application, the effective operation area of the second operation is a first area in the graphical user interface, which may be an area for controlling the game interaction object to move in the horizontal direction or an area for controlling it to move in the vertical direction.
When the user executes a second operation in the first area on the graphical user interface, the game interaction object correspondingly moves according to the second operation, at this time, a second attack control is displayed on the graphical user interface, and the position of the second attack control on the graphical user interface is related to the current position of a touch point corresponding to the second operation on the graphical user interface.
And S24, responding to a third operation of sliding from the first area to the second attack control, and controlling the game interaction object to execute the attack operation, wherein the third operation is a continuous operation of the second operation.
When the second attack control appears on the graphical user interface, the user can perform a third operation of sliding from the first area to the second attack control; when the touch point reaches the second attack control, the game interaction object performs the attack operation according to the third operation.
In the embodiments of the present application, the third operation being a continuation of the second operation means that after performing the second operation, the user can directly continue into the third operation without the finger leaving the screen of the terminal device. At any time while performing the second operation in the first area, the user can perform the third operation of sliding from the first area to the second attack control, switching from controlling the movement of the game interaction object to controlling it to perform the attack operation.
According to the game control method provided by the embodiments of the present application, the terminal device provides a graphical user interface, the graphical user interface includes a game screen, the game screen includes a game interaction object, and the graphical user interface provides a first attack control. The terminal device can control the game interaction object to perform an attack operation in response to a first operation acting on the first attack control, and can control the game interaction object to move in response to a second operation acting on a first area in the graphical user interface. When the user performs the second operation, a second attack control is displayed on the graphical user interface according to the current position of the touch point corresponding to the second operation, and the terminal device can control the game interaction object to perform the attack operation in response to a third operation of sliding from the first area to the second attack control. Because the third operation is a continuation of the second operation, the user can switch directly from controlling the movement of the game interaction object to controlling it to perform the attack operation without the finger leaving the screen of the terminal device; the operation is simple, and accidental touches of surrounding buttons are avoided.
The following describes the embodiments of the present application in detail with reference to the accompanying drawings. In the following embodiments, the graphical user interface is described by taking the interface of a battle royale mobile game as an example.
Fig. 3 is a first schematic view of a game interface provided in an embodiment of the present application. As shown in Fig. 3, the interface 30 of a battle royale mobile game is a graphical user interface provided by the terminal device.
The graphical user interface includes a second movement control, which is configured to control, in response to a touch operation, the game interaction object to move within the horizontal plane range corresponding to the current height.
In this mobile game, the game interaction object is a drone, which can both move and perform attack operations (i.e., shooting operations). The interface 30 includes a left rocker 31, i.e., the second movement control on the graphical user interface. The left rocker 31 is used to control the movement of the game interaction object on the horizontal plane: the user can slide the left rocker 31 up, down, left, and right with the left hand to control the corresponding displacement of the game interaction object on the horizontal plane. For example, four arrows (up, down, left, right) may be displayed on the left rocker 31 to indicate the directions in which the user can move it.
On the graphical user interface, a first attack control is further provided. For example, the interface 30 in Fig. 3 further includes a first attack control 301, which can control the game interaction object to perform an attack operation. When the user performs a first operation on the first attack control 301, such as tapping it, the game interaction object performs the attack operation. In the mobile game illustrated in Fig. 3, the attack operation is a shooting operation.
Optionally, the graphical user interface further includes a first area; when the user performs a second operation on the first area, the movement of the game interaction object can be controlled.
For example, in Fig. 3, the first area is the area near the drone's lifting rocker 32, and the second operation may be a touch operation on the lifting rocker 32. By operating the lifting rocker 32, the movement of the drone in the vertical direction can be controlled.
Optionally, the first area includes a movement auxiliary identifier and a first movement control, and the first movement control is configured to move, according to the second operation, along the direction range determined by the movement auxiliary identifier. The movement auxiliary identifier is a bar-shaped identifier whose extending direction is perpendicular or parallel to the long edge boundary of the game screen.
Fig. 4 is a second schematic view of a game interface provided in an embodiment of the present application. The interface 40 illustrated in Fig. 4 is the interface 30 of Fig. 3 responding to a second operation, illustrated here as a touch operation on the lifting rocker 32.
The first area includes the movement auxiliary identifier 41 and the first movement control 42, where the movement auxiliary identifier 41 is a bar-shaped identifier whose extending direction is perpendicular to the long boundary of the interface 30. When performing the second operation, the user can drag the first movement control 42, whose moving direction is the direction determined by the movement auxiliary identifier 41. The first movement control 42 is configured to control, in response to the second operation, the height of the game interaction object from the ground, so as to control the game interaction object to move away from or toward the ground. For example, in Fig. 4, the user may drag the first movement control 42 up and down along the direction determined by the movement auxiliary identifier 41; dragging it up and down controls the game interaction object to move in the vertical direction.
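One plausible way to realize the first movement control is to map its position along the bar-shaped identifier linearly to the drone's height from the ground. The function below is an illustrative sketch under assumed names and an assumed linear mapping; the patent does not specify the mapping.

```typescript
// Illustrative sketch: map the first movement control's position along the
// bar-shaped movement auxiliary identifier to the drone's height. The linear
// mapping, names, and height range are assumptions, not from the patent.

function heightFromControl(
  controlY: number,   // control position along the bar (pixels, top = 0)
  barTop: number,     // y coordinate of the top of the bar
  barBottom: number,  // y coordinate of the bottom of the bar
  minHeight: number,  // assumed height range in game units
  maxHeight: number
): number {
  // Clamp the control to the bar, then map linearly: top of bar = max height.
  const clamped = Math.min(Math.max(controlY, barTop), barBottom);
  const t = (barBottom - clamped) / (barBottom - barTop);
  return minHeight + t * (maxHeight - minHeight);
}
```

Compared with the long-press buttons of Fig. 1, a direct position-to-height mapping like this makes the vertical travel distance easy to judge, which is the usability gap the Background section points out.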
When the user executes the second operation, the second attack control is displayed on the graphical user interface according to the current position of the touch point corresponding to the second operation on the graphical user interface. Specifically, according to the current position of the touch point corresponding to the second operation on the graphical user interface, a first position in a second area, different from the first area, of the graphical user interface may be determined, and then the second attack control is displayed at the first position.
The second operation may be a longitudinal (vertical) sliding operation or a lateral (horizontal) sliding operation. When the second operation is a longitudinal sliding operation, the second attack control is located on the left or right side of the first area, and the first position has the same vertical coordinate as the current position. When the second operation is a lateral sliding operation, the second attack control is located above or below the first area, and the first position has the same horizontal coordinate as the current position.
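The placement rule above — share one coordinate with the touch point and offset along the other axis — can be sketched as follows. The function name and the fixed offset are illustrative assumptions, not part of the patent text.

```python
def second_control_position(touch, slide_axis, offset=120):
    """Place the second attack control relative to the touch point.

    For a longitudinal slide the control sits beside the first area and
    shares the touch point's vertical coordinate; for a lateral slide it
    sits above/below and shares the horizontal coordinate. `offset` is an
    assumed fixed displacement in screen pixels."""
    x, y = touch
    if slide_axis == "longitudinal":
        return (x + offset, y)   # same vertical coordinate, shifted sideways
    if slide_axis == "lateral":
        return (x, y + offset)   # same horizontal coordinate, shifted down
    raise ValueError(slide_axis)
```

Recomputing this position on every touch-move event makes the second attack control follow the touch point, as described in the embodiments.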
It should be noted that, in the embodiments of the present application, a longitudinal sliding operation does not mean the second operation slides only in the longitudinal direction with no lateral component: the slide may deviate from the longitudinal axis by an offset angle, i.e. it may include a slight lateral movement. Similarly, a lateral sliding operation does not mean the second operation slides only in the lateral direction with no longitudinal component: the slide may deviate from the lateral axis by an offset angle, i.e. it may include a slight longitudinal movement.
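The offset-angle tolerance described above can be modeled by classifying a slide from its displacement vector. This is a sketch under assumptions: the patent does not specify a tolerance value, so the 30° threshold and the function name are illustrative.

```python
import math

def classify_slide(dx, dy, tolerance_deg=30.0):
    """Classify a slide as 'longitudinal' or 'lateral' with an angular
    tolerance, so a mostly-vertical slide with a slight lateral component
    still counts as longitudinal (and vice versa). The threshold is an
    illustrative assumption."""
    angle = math.degrees(math.atan2(abs(dx), abs(dy)))  # 0 deg = pure vertical
    if angle <= tolerance_deg:
        return "longitudinal"
    if angle >= 90.0 - tolerance_deg:
        return "lateral"
    return "diagonal"
```

A slide of (dx=10, dy=100) deviates only a few degrees from the vertical axis and is still treated as longitudinal, which is the behavior the paragraph above describes.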
The following description takes the longitudinal sliding operation as an example of the second operation.
Fig. 5 is a third schematic view of a game interface provided in an embodiment of the present application. The interface 50 illustrated in fig. 5 shows the second attack control displayed on the basis of the interface 40 of fig. 4, the second operation being a longitudinal sliding operation.
The interface 50 includes a second attack control 51, whose first position is related to the current position, on the graphical user interface, of the touch point corresponding to the second operation. For example, in fig. 5, the first position has the same vertical coordinate as the current position: when the touch point of the second operation moves upward, the first position of the second attack control 51 also moves upward; when the touch point moves downward, the first position of the second attack control 51 also moves downward.
While the second operation is being performed, the terminal device also displays, in response to the second operation, a first indication icon in the graphical user interface, the first indication icon indicating a slide toward the second attack control and/or a slide along the direction determined by the movement auxiliary identifier.
For example, in fig. 5, a first indication icon 52 is displayed on the interface 50 in response to the second operation, indicating a slide toward the second attack control 51.
After the second attack control is displayed on the graphical user interface, the user may perform a third operation of sliding from the first area to the second attack control. The terminal device responds to a third operation of sliding from the first area to the second attack control in a first sliding direction, where the first sliding direction is perpendicular to the direction determined by the movement auxiliary identifier.
Fig. 6 is a fourth schematic view of a game interface provided in an embodiment of the present application, where an interface 60 illustrated in fig. 6 is an interface after a third operation is performed on the basis of the interface 50 illustrated in fig. 5.
As shown in fig. 6, the first sliding direction is the horizontal direction, indicated by an arrow on the interface 60, and the third operation is a slide from the first area to the second attack control 51 in that horizontal direction; the first sliding direction is perpendicular to the direction determined by the movement auxiliary identifier 41.
After the third operation is performed, the game interaction object performs an attack operation; in fig. 6, the drone performs a shooting operation. Note that the third operation is a continuous operation of the second operation, i.e. it is performed without the finger leaving the screen.
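Detecting the third operation amounts to checking that the continuation slide is roughly perpendicular to the direction fixed by the movement auxiliary identifier (e.g. mostly horizontal when the identifier is vertical). A minimal sketch, with the distance threshold and names as assumptions:

```python
def is_third_operation(start, end, identifier_axis, min_dist=40):
    """Return True when the slide from `start` to `end` qualifies as the
    third operation: a displacement perpendicular to the movement auxiliary
    identifier's axis that dominates the parallel component and exceeds a
    minimum distance. `min_dist` (pixels) is an illustrative assumption."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if identifier_axis == "vertical":      # slide must be mostly horizontal
        return abs(dx) >= min_dist and abs(dx) > abs(dy)
    else:                                  # identifier horizontal -> slide mostly vertical
        return abs(dy) >= min_dist and abs(dy) > abs(dx)
```

Requiring the perpendicular component to dominate keeps ordinary movement drags (along the identifier) from being misread as attack triggers.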
After the third operation, the user may further perform a fourth operation, which acts on the second attack control and is a continuous operation of the third operation. In response to the fourth operation acting on the second attack control, the terminal device controls the game interaction object to move and perform the attack operation at the same time.
Like the second operation, the fourth operation is a sliding operation satisfying a first preset sliding direction, while the sliding direction of the third operation differs from the first preset sliding direction.
Fig. 7 is a fifth schematic view of a game interface provided in an embodiment of the present application, where an interface 70 illustrated in fig. 7 is an interface after a fourth operation is performed on the basis of the interface 60 illustrated in fig. 6.
In fig. 7, the first preset sliding direction is the up-down direction, and the sliding direction of the third operation is essentially the left-right direction.
The fourth operation slides the second attack control 51 up and down, thereby controlling the game interaction object to perform the attack operation while it moves. Optionally, the fourth operation controls the movement of the game interaction object in the vertical direction, that is, it controls the height of the game interaction object above the ground, moving the object away from or toward the ground.
That the fourth operation is a continuous operation of the third operation means it can be performed directly after the third operation, without the user's finger leaving the screen of the terminal device. At any moment during the third operation, the user may begin the fourth operation on the second attack control, switching from controlling the game interaction object to attack to controlling it to move and attack at the same time.
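The chain of continuous operations described above — second (move), third (attack), fourth (move and attack), all within one uninterrupted touch — can be sketched as a small state machine. The state names and method names are illustrative assumptions, not the patent's terminology.

```python
class AttackGestureStateMachine:
    """Minimal sketch of the second -> third -> fourth operation chain: as
    long as the finger stays on the screen, one continuous gesture moves the
    object, then triggers an attack via a perpendicular slide, then moves
    and attacks simultaneously by sliding on the second attack control."""

    def __init__(self):
        self.state = "idle"

    def on_move_in_first_area(self):       # second operation
        if self.state in ("idle", "moving"):
            self.state = "moving"

    def on_slide_to_second_control(self):  # third operation (continuation)
        if self.state == "moving":
            self.state = "attacking"

    def on_slide_on_second_control(self):  # fourth operation (continuation)
        if self.state == "attacking":
            self.state = "moving_and_attacking"

    def on_finger_up(self):                # lifting the finger ends the chain
        self.state = "idle"
```

Guarding each transition on the previous state enforces the "continuous operation" requirement: the third operation is only recognized out of the moving state, and the fourth only out of the attacking state.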
According to the game control method provided by the embodiments of the present application, the terminal device provides a graphical user interface including a game picture that contains a game interaction object, and the graphical user interface provides a first attack control. The terminal device controls the game interaction object to perform an attack operation in response to a first operation acting on the first attack control, and controls the game interaction object to move in response to a second operation acting on a first area of the graphical user interface. While the user performs the second operation, a second attack control is displayed on the graphical user interface according to the current position of the corresponding touch point, and the terminal device controls the game interaction object to perform an attack operation in response to a third operation of sliding from the first area to the second attack control. Because the third operation is a continuous operation of the second operation, the user can switch directly from moving the game interaction object to attacking with it without lifting a finger from the screen, avoiding accidental touches on surrounding buttons. Moreover, since the second operation can be a sliding operation, the user receives a sense of displacement distance as feedback while moving the game interaction object, which improves immersion. Through the fourth operation, the game interaction object can be controlled to move and attack simultaneously, with a simple gesture and a short finger displacement path.
Fig. 8 is a schematic structural diagram of a game control device according to an embodiment of the present application, and as shown in fig. 8, the game control device includes:
the display module 81 is configured to provide a first attack control on a graphical user interface provided by a terminal device, where the graphical user interface includes a game screen, and the game screen includes a game interaction object;
a first control module 82, configured to, in response to a first operation acting on the first attack control, control the game interaction object to perform an attack operation;
the second control module 83 is configured to control the game interaction object to move in response to a second operation performed on a first area in the graphical user interface, and display a second attack control on the graphical user interface according to a current position of a touch point corresponding to the second operation on the graphical user interface, where the second attack control is configured to move along with the touch point corresponding to the second operation;
a third control module 84, configured to control the game interaction object to perform an attack operation in response to a third operation of sliding from the first area to the second attack control, where the third operation is a continuous operation of the second operation.
In one possible implementation, the third control module 84 is further configured to:
and responding to a fourth operation acting on the second attack control, and controlling the game interaction object to move and execute the attack operation at the same time, wherein the fourth operation is a continuous operation of the third operation.
In a possible embodiment, the second operation is a sliding operation satisfying a first preset sliding direction, the fourth operation is a sliding operation satisfying the first preset sliding direction, and the sliding direction of the third operation is different from the first preset sliding direction.
In one possible implementation, the first area includes a movement-assisted identifier and a first movement control configured to move along a direction range determined by the movement-assisted identifier according to the second operation.
In one possible implementation, the graphical user interface provides a second movement control, the first movement control and the second movement control are located in different areas of the graphical user interface, and the first movement control is configured to control the height of the game interaction object from the ground in response to the second operation so as to control the game interaction object to move away from or close to the ground; the second movement control is configured to respond to touch operation to control the game interaction object to move within a horizontal plane range corresponding to the current height.
In a possible implementation, the movement auxiliary identifier is a bar-shaped identifier, and its extension direction is perpendicular or parallel to the long-edge boundary of the game picture.
In a possible implementation, the second control module 83 is specifically configured to:
determining a first position in a second area, different from the first area, in the graphical user interface according to the current position of the touch point corresponding to the second operation on the graphical user interface;
displaying the second attack control at the first location.
In a possible implementation, the second operation is a longitudinal sliding operation, the second attack control is located on the left side or the right side of the first area, and the vertical coordinate of the first position is the same as that of the current position; or,
the second operation is a transverse sliding operation, the second attack control is located on the upper side or the lower side of the first area, and the horizontal coordinate of the first position is the same as that of the current position.
In a possible implementation, the third control module 84 is specifically configured to:
responding to a third operation of sliding from the first area to the second attack control in a first sliding direction, where the first sliding direction is perpendicular to the direction determined by the movement auxiliary identifier.
In a possible embodiment, the second control module 83 is further configured to:
in response to the second operation, displaying a first indication icon in the graphical user interface, wherein the first indication icon is used for indicating sliding towards the second attack control and/or sliding along the direction determined by the movement auxiliary mark.
The game control device provided by this embodiment of the present application is used to execute the above method embodiments; its implementation principles and technical effects are similar and are not described here again.
Fig. 9 is a schematic diagram of a hardware structure of a terminal device provided in an embodiment of the present application, and as shown in fig. 9, the terminal device includes: at least one processor 91 and a memory 92. The processor 91 and the memory 92 are connected by a bus 93.
Optionally, the terminal device further comprises a communication component. For example, the communication component may include a receiver and/or a transmitter.
In a specific implementation, the at least one processor 91 executes computer-executable instructions stored by the memory 92, such that the at least one processor 91 performs the game control method as described above.
For a specific implementation process of the processor 91, reference may be made to the above method embodiments, which implement similar principles and technical effects, and this embodiment is not described herein again.
In the embodiment shown in fig. 9, it should be understood that the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like. A general-purpose processor may be a microprocessor, or any conventional processor. The steps of the methods disclosed in connection with the present application may be implemented directly by a hardware processor, or by a combination of hardware and software modules in the processor.
The memory may comprise high-speed RAM, and may also include non-volatile memory (NVM), such as at least one magnetic disk memory.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, the bus in the figures of the present application is not limited to only one bus or one type of bus.
The present application also provides a computer-readable storage medium, in which computer-executable instructions are stored, and when a processor executes the computer-executable instructions, the game control method as described above is implemented.
The computer-readable storage medium may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk. Readable storage media can be any available media that can be accessed by a general purpose or special purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Of course, the readable storage medium may also be an integral part of the processor. The processor and the readable storage medium may reside in an application-specific integrated circuit (ASIC); alternatively, they may reside as discrete components in the apparatus.
The division of the units is only a logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

1. A game control method is characterized in that a terminal device provides a graphical user interface, the graphical user interface comprises a game picture, the game picture comprises a game interaction object, and the method comprises the following steps:
providing a first attack control at the graphical user interface;
responding to a first operation acted on the first attack control, and controlling the game interaction object to execute an attack operation;
responding to a second operation acting on a first area in the graphical user interface, controlling the game interaction object to move, and displaying a second attack control on the graphical user interface according to the current position of a touch point corresponding to the second operation on the graphical user interface, wherein the second attack control is set to move along with the touch point corresponding to the second operation;
and controlling the game interaction object to execute an attack operation in response to a third operation of sliding from the first area to the second attack control, wherein the third operation is a continuous operation of the second operation.
2. The method of claim 1, further comprising:
and responding to a fourth operation acting on the second attack control, and controlling the game interaction object to move and execute the attack operation at the same time, wherein the fourth operation is a continuous operation of the third operation.
3. The method according to claim 2, wherein the second operation is a sliding operation satisfying a first preset sliding direction, the fourth operation is a sliding operation satisfying the first preset sliding direction, and the sliding direction of the third operation is different from the first preset sliding direction.
4. The method of claim 3, wherein the first region comprises a movement-assisted indicator and a first movement control configured to move along a range of directions determined by the movement-assisted indicator according to the second operation.
5. The method of claim 4, wherein the graphical user interface provides a second movement control, wherein the first movement control and the second movement control are located in different regions of the graphical user interface, and wherein the first movement control is configured to control a height of the game interaction object from a floor in response to the second operation to control the game interaction object to move away from or close to the floor; the second movement control is configured to respond to touch operation to control the game interaction object to move within a horizontal plane range corresponding to the current height.
6. The method according to claim 4 or 5, wherein the movement-assisted indicator is a bar-shaped indicator, and the extension direction of the bar-shaped indicator is perpendicular or parallel to the long-edge boundary of the game picture.
7. The method of claim 1, wherein displaying a second attack control on the graphical user interface according to a current position of a touch point corresponding to the second operation on the graphical user interface comprises:
determining a first position in a second area, different from the first area, in the graphical user interface according to the current position of the touch point corresponding to the second operation on the graphical user interface;
displaying the second attack control at the first location.
8. The method of claim 7,
the second operation is a longitudinal sliding operation, the second attack control is located on the left side or the right side of the first area, and the vertical coordinate of the first position is the same as that of the current position; or,
the second operation is a transverse sliding operation, the second attack control is located on the upper side or the lower side of the first area, and the horizontal coordinate of the first position is the same as that of the current position.
9. The method of claim 4 or 5, wherein responding to a third operation of sliding from the first region to the second attack control comprises:
responding to a third operation of sliding from the first area to the second attack control in a first sliding direction, wherein the first sliding direction is perpendicular to the direction determined by the movement-assisted indicator.
10. The method according to claim 4 or 5, characterized in that the method further comprises:
in response to the second operation, displaying a first indication icon in the graphical user interface, wherein the first indication icon is used for indicating sliding towards the second attack control and/or sliding along the direction determined by the movement auxiliary mark.
11. A game control apparatus, comprising:
the terminal equipment comprises a display module, a first attack control module and a second attack control module, wherein the display module is used for providing a first attack control on a graphical user interface provided by the terminal equipment, the graphical user interface comprises a game picture, and the game picture comprises a game interaction object;
the first control module is used for responding to a first operation acted on the first attack control and controlling the game interaction object to execute an attack operation;
the second control module is used for responding to a second operation acting on a first area in the graphical user interface, controlling the game interaction object to move, and displaying a second attack control on the graphical user interface according to the current position of a touch point corresponding to the second operation on the graphical user interface, wherein the second attack control is set to move along with the touch point corresponding to the second operation;
and the third control module is used for responding to a third operation of sliding from the first area to the second attack control and controlling the game interaction object to execute an attack operation, wherein the third operation is a continuous operation of the second operation.
12. A terminal device, comprising:
a memory for storing a program;
a processor for executing the program stored in the memory, the processor being configured to perform the game control method according to any one of claims 1 to 10 when the program is executed.
13. A computer-readable storage medium characterized by comprising instructions which, when run on a computer, cause the computer to perform the game control method according to any one of claims 1 to 10.
CN202011304002.1A 2020-11-19 2020-11-19 Game control method and device Pending CN112263833A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011304002.1A CN112263833A (en) 2020-11-19 2020-11-19 Game control method and device


Publications (1)

Publication Number Publication Date
CN112263833A true CN112263833A (en) 2021-01-26

Family

ID=74340315

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011304002.1A Pending CN112263833A (en) 2020-11-19 2020-11-19 Game control method and device

Country Status (1)

Country Link
CN (1) CN112263833A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113975798A (en) * 2021-11-09 2022-01-28 北京字跳网络技术有限公司 Interaction control method and device and computer storage medium
WO2024045528A1 (en) * 2022-08-30 2024-03-07 网易(杭州)网络有限公司 Game control method and apparatus, and computer device and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108771858A (en) * 2018-05-11 2018-11-09 网易(杭州)网络有限公司 Technical ability control switching method, device, terminal and storage medium
CN109224438A (en) * 2018-10-26 2019-01-18 网易(杭州)网络有限公司 The control method and device of virtual role in game
US20190126148A1 (en) * 2017-10-24 2019-05-02 Netease (Hangzhou) Network Co.,Ltd. Virtual Character Controlling Method and Apparatus, Electronic Device, and Storage Medium
CN110215691A (en) * 2019-07-17 2019-09-10 网易(杭州)网络有限公司 The control method for movement and device of virtual role in a kind of game
CN111111190A (en) * 2019-12-17 2020-05-08 网易(杭州)网络有限公司 Interaction method and device for virtual characters in game and touch terminal
CN111111218A (en) * 2019-12-19 2020-05-08 腾讯科技(深圳)有限公司 Control method and device of virtual unmanned aerial vehicle, storage medium and electronic device


Similar Documents

Publication Publication Date Title
US10716997B2 (en) Information processing method and apparatus, electronic device, and storage medium
CN110270086B (en) Method and device for controlling movement of virtual character in game
US10709982B2 (en) Information processing method, apparatus and non-transitory storage medium
US10702774B2 (en) Information processing method, apparatus, electronic device and storage medium
CN109550247B (en) Method and device for adjusting virtual scene in game, electronic equipment and storage medium
CN110448904B (en) Game view angle control method and device, storage medium and electronic device
CN112263833A (en) Game control method and device
CN109908581B (en) Game operation method, device and equipment
CN110575671A (en) Method and device for controlling view angle in game and electronic equipment
CN111840988B (en) Game skill triggering method, game skill triggering device, game client and medium
CN111840987B (en) Information processing method and device in game and electronic equipment
CN111701226A (en) Control method, device and equipment for control in graphical user interface and storage medium
CN111870942B (en) Attack control method and device for virtual units and electronic equipment
CN112619124A (en) Control method and device for game object movement and electronic equipment
CN113244610A (en) Method, device, equipment and storage medium for controlling virtual moving object in game
CN113900570B (en) Game control method, device, equipment and storage medium
CN107391000B (en) Information processing method and device, storage medium and electronic equipment
CN111773671A (en) Method and device for controlling movement of virtual object and terminal equipment
CN112619147A (en) Game equipment replacing method and device and terminal device
CN111973987A (en) Method, device and equipment for processing sliding shovel action in game and storage medium
CN109002293B (en) UI element display method and device, electronic equipment and storage medium
CN113721820B (en) Man-machine interaction method and device and electronic equipment
WO2022247196A1 (en) Game positioning method and apparatus, and mobile terminal
CN113413588B (en) Game task processing method and device, electronic equipment and storage medium
CN115738230A (en) Game operation control method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination