CN114392554A - Method, apparatus, device, medium and program product for game control - Google Patents

Method, apparatus, device, medium and program product for game control

Info

Publication number
CN114392554A
Authority
CN
China
Prior art keywords
control
game
dragging
determining
game character
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111674172.3A
Other languages
Chinese (zh)
Inventor
叶翰尧 (Ye Hanyao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202111674172.3A
Publication of CN114392554A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/822 Strategy games; Role-playing games
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides a method, apparatus, device, medium and program product for game control. The method includes: establishing a binding relationship between a virtual camera and a game character; in response to a first operation instruction acting on a first position in a game scene, controlling the game character to move to the first position; in response to a second operation instruction acting on a second position in the game scene, displaying an interactive control; in response to a control operation on the interactive control, determining, according to the control operation, a target speed at which the game character is to move; and controlling the game character to move to the second position at the target speed. With this technical solution, the second operation instruction is input in the game scene under a locked view angle to call out the interactive control, so that the player can input a corresponding control operation on the interactive control, flexibly adjust the target moving speed of the game character, and improve the player's control over the game character.

Description

Method, apparatus, device, medium and program product for game control
Technical Field
The present application relates to the field of gaming, and more particularly, to a method, apparatus, device, medium, and program product for game control.
Background
In a virtual game world, a player usually needs to operate one or more virtual game characters to move from one target point to another, and this movement requires a moving speed and a moving direction to be set for the game characters.
In existing client (PC) games with a locked view angle, when controlling the movement of a game character the player only needs to specify the movement direction; the movement speed is usually limited to fixed gears, for example a running gear and a jogging gear, and the movement speed of the game character changes accordingly when the gear changes.
However, this prior-art movement control method for game characters cannot flexibly adapt to the various game scenes in a game, and the control effect is poor.
Disclosure of Invention
The application provides a game control method, apparatus, device, medium and program product, which are used to solve the problems that the moving speed of an existing game character cannot adapt to the various scenes in a game and that the control effect is poor.
In a first aspect, an embodiment of the present application provides a game control method, where a terminal device provides a graphical user interface, the content displayed by the graphical user interface includes a game scene captured by a virtual camera, the game scene includes a game character controlled by a mouse, and the mouse is connected to the terminal device, where the method includes:
establishing a binding relationship between the virtual camera and the game character so as to control the movement of a virtual lens according to the position change of the game character in the game scene;
responding to a first operation instruction acting on a first position in the game scene, and controlling the game character to move to the first position;
responding to a second operation instruction acting on a second position in the game scene, and displaying an interactive control;
responding to the control operation on the interactive control, and determining, according to the control operation, the target speed at which the game character is to move;
controlling the game character to move to the second position at the target speed;
wherein the first operation instruction and the second operation instruction both come from the mouse.
In one possible design of the first aspect, before the controlling the game character to move to the second position at the target speed, the method further includes:
determining that the control operation is ended.
In another possible design of the first aspect, the control operation is a drag operation, and the method further includes:
responding to the control operation on the interactive control, and determining a dragging direction and a dragging distance according to the control operation;
determining a direction control to be displayed according to the dragging direction and the dragging distance;
and displaying the direction control on the graphical user interface.
In yet another possible design of the first aspect, the determining, according to the dragging direction and the dragging distance, a direction control to be displayed includes:
determining the arrow direction of a direction control to be displayed according to the dragging direction;
determining a distance identification part of the direction control to be displayed according to the dragging distance, wherein the longer the dragging distance is, the larger the area of the distance identification part is;
and determining a direction control to be displayed according to the arrow direction and the distance identification part.
In yet another possible design of the first aspect, the control operation is a drag operation, and the determining, according to the control operation, the target speed at which the game character is to move includes:
determining a target direction in which the game character is to move according to the dragging direction of the dragging operation;
and determining the target speed at which the game character is to move according to the dragging distance of the dragging operation.
In yet another possible design of the first aspect, the first operation instruction is an instruction generated by a click operation on a left button of the mouse.
In yet another possible design of the first aspect, the method further includes:
and controlling the game character to stop moving in response to a third operation for the game character.
In a second aspect, an embodiment of the present application provides a game control apparatus, including:
the relationship establishing module is used for establishing a binding relationship between the virtual camera and the game character so as to control the movement of the virtual lens according to the position change of the game character in the game scene;
the first control module is used for responding to a first operation instruction acting on a first position in the game scene and controlling the game character to move to the first position;
the control display module is used for responding to a second operation instruction acting on a second position in the game scene and displaying an interactive control;
the speed determining module is used for responding to the control operation on the interactive control and determining, according to the control operation, the target speed at which the game character is to move;
the second control module is used for controlling the game character to move to the second position at the target speed; wherein the first operation instruction and the second operation instruction both come from the mouse.
In a third aspect, an embodiment of the present application provides a terminal device, including: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes the computer-executable instructions stored by the memory to implement the methods described above.
In a fourth aspect, the present application provides a readable storage medium, in which computer instructions are stored, and when executed by a processor, the computer instructions are used to implement the method as described above.
In a fifth aspect, the present application provides a program product including computer instructions, which when executed by a processor implement the method described above.
According to the game control method, apparatus, device, medium and program product provided by the present application, a second operation instruction is input in the game scene under a locked view angle to call out an interactive control, so that a corresponding control operation can be input on the interactive control, the target moving speed of the game character can be flexibly adjusted, and the player's control over the game character is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application;
fig. 1 is a schematic view of a scene of a method for controlling a game according to an embodiment of the present application;
FIG. 2 is a flow chart illustrating a method of game control according to an embodiment of the present disclosure;
FIG. 3 is a diagram illustrating a first embodiment of a control operation provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a control process for controlling operations according to an embodiment of the present application;
fig. 5 is a schematic diagram of a second control operation embodiment provided in the present application;
FIG. 6 is a schematic diagram of a graphical user interface provided by an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a game control apparatus according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1 is a schematic view of a scene of a method for controlling movement of a game character according to an embodiment of the present application, and as shown in fig. 1, the method may be applied to a terminal device 10. Illustratively, the terminal device may be a computer or a mobile terminal, etc. Taking the terminal device 10 as an example, when a target game is executed, a game screen is displayed on the graphical user interface 11, and the game screen usually includes a player-controlled game character 101 and controls for controlling the game character.
When the terminal device 10 is a computer, the target game may be a locked-view client (PC) game, in which the view angle is usually locked, that is, the game screen usually only displays the game character and the part of the game scene near the position of the game character. The controls in the game picture include at least a mouse. The player can control the movement of the game character in the game world by dragging or clicking the mouse, for example to control the game character to enter a virtual bazaar, a monster nest, etc. The player can also control the moving direction of the game character through keyboard input, for example controlling the game character to move forward with the W key on the keyboard.
In the prior art, locked-view client games usually have two fixed movement modes, specifically a jogging mode and a running mode, and the player can switch the game character back and forth between the different movement modes through key input so as to adapt to different game scenes. For example, when traveling to a destination, the player can switch the game character to the running mode by pressing the keyboard key X, which increases the moving speed of the game character and reduces the time spent traveling. If the game character is strolling through a virtual city, the player can switch the game character to the jogging mode through the keyboard key X, which slows down the moving speed of the game character and makes it easier for the player to operate the game character while strolling through the city. However, this movement control method is not flexible enough and cannot adapt to various game scenes.
In addition, some games have no option for setting a movement mode at all, and the movement speed of the game character is determined by its virtual attributes; for example, the movement speed increases when the game character wears equipment that boosts its movement attribute. In such games the player can only control the moving direction of the game character through keyboard keys or mouse input and cannot switch its movement mode. This way of controlling the movement of the game character is even less effective than the aforementioned one.
In view of the above problems, with the movement control method for a game character provided in the embodiments of the present application, the player only needs to input a control operation on the terminal device, and the game character responds to the control operation and is given a corresponding movement direction and movement speed. The movement speed changes linearly with the input amount of the control operation, so that linear adjustment of the movement speed can be achieved and the flexibility of the game character is improved. At the same time, a single control operation completes both the movement-direction control and the movement-speed control, so the player no longer needs to control the direction with the mouse and switch the movement speed with a key separately, which reduces the player's amount of operation.
The technical solution of the present application will be described in detail below with reference to specific examples. It should be noted that the following specific embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments.
Fig. 2 is a flowchart illustrating a method of game control according to an embodiment of the present application; the method may be executed by a terminal device. The terminal device needs a display interface to present a graphical user interface, processing capability to run the target game and display the game scene on the graphical user interface, and interaction capability so that the player can operate and control game characters in the game through a mouse connected to the terminal device. In this embodiment, a game scene captured by a virtual camera in the game is displayed on the graphical user interface, and the game scene includes a game character controlled by the mouse. As shown in fig. 2, the method may specifically include the following steps:
S201, establishing a binding relationship between the virtual camera and the game character so as to control the movement of the virtual lens according to the position change of the game character in the game scene.
In this embodiment, the target game refers to a locked-view game, such as a client (PC) game. By establishing the binding relationship between the virtual camera and the game character, the effect of locking the view angle is achieved, that is, only the content captured by the virtual camera is displayed on the graphical user interface, and this content at least includes the game character.
Illustratively, the game character moves under player control, and when the game character moves, the virtual lens moves along with it to capture game information around the game character and display that information on the graphical user interface. For example, when the game character moves to a first position, if there are a Non-Player Character (NPC), a house and the like near the first position, the NPC, the house and the game character are captured by the virtual camera and displayed on the graphical user interface.
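As a non-authoritative illustration of step S201, the following sketch shows one way a camera-to-character binding could keep the view locked on the character; the class names, the 2D position model and the fixed offset are assumptions for illustration and are not taken from the application.

```python
# Minimal sketch of binding a virtual camera to a game character (step S201):
# the camera copies the character's position each frame, so the GUI always shows
# the scene captured around the character. Names and the 2D model are illustrative.
from dataclasses import dataclass

@dataclass
class GameCharacter:
    x: float = 0.0
    y: float = 0.0

@dataclass
class VirtualCamera:
    x: float = 0.0
    y: float = 0.0

def bind_camera(camera: VirtualCamera, character: GameCharacter, offset=(0.0, 0.0)):
    """Return an update function that keeps the camera locked onto the character."""
    def update():
        camera.x = character.x + offset[0]
        camera.y = character.y + offset[1]
    return update

# Usage: call the returned update() once per frame after the character moves.
hero = GameCharacter()
cam = VirtualCamera()
follow = bind_camera(cam, hero)
hero.x, hero.y = 10.0, 5.0
follow()   # cam now tracks (10.0, 5.0)
```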
S202, responding to a first operation instruction acting on a first position in the game scene, and controlling the game character to move to the first position.
In this embodiment, the movement of the game character can be controlled by keyboard input; for example, the player can press the W key on the keyboard to control the game character to move forward. The player may also control the movement of the game character through mouse input; for example, when the player clicks the right mouse button, the game character moves toward the position clicked by the right mouse button in the game scene.
For example, on some terminal devices with a touch function, the player can directly move the game character by touching the display interface of the terminal device. The touched position is the target movement position of the game character (for example, the first position).
For example, the first operation instruction may be generated by a click operation for a left button of a mouse.
S203, responding to a second operation instruction acting on a second position in the game scene, and displaying the interactive control.
In this embodiment, the second operation instruction is generated in a different manner from the first operation instruction, and the second operation instruction may be generated by a long-press operation on a right button of the mouse. For example, the first operation instruction may be generated by the player by clicking a left button of the mouse, and the second operation instruction may be generated by the player by long-pressing a right button of the mouse (for example, long-pressing for 0.33 seconds).
The interactive control may be an icon, and the player can operate the interactive control to control the game character, for example to control its moving direction, moving speed and the like. For example, the interactive control may be a circular icon; when the player drags the circular icon, the moving speed and moving direction of the game character are controlled by the drag distance and the drag direction. For example, if the player drags the circular icon upward by 1 cm, a movement speed of 50 is determined for the game character according to the preset correspondence between drag distance and movement speed, and after the player finishes dragging, the game character moves upward at the speed of 50.
For example, the interactive control may be displayed directly at or near the second position to facilitate subsequent control operations by the player.
S204, responding to the control operation on the interactive control, and determining, according to the control operation, the target speed at which the game character is to move.
In this embodiment, the control operation may be a drag operation generated by the player with the mouse; for example, the player triggers the display of the interactive control by long-pressing the right button of the mouse and then drags the interactive control. For example, an intermediate value may be set: if the player drags the interactive control by a distance exceeding the intermediate value, the target moving speed of the game character increases; if the distance is below the intermediate value, the target moving speed decreases; and if the distance equals the intermediate value, the moving speed of the game character does not change.
For example, the drag distance of the interactive control can be converted into an input amount A. If the base movement speed of the game character is denoted U1 and the target speed is denoted U2, then U2 = U1 × A + U1, where A takes a value in [-1, 1].
For example, the player may drag the interactive control in different directions; the drag direction need not be restricted, and if the drag distances in different drag directions are the same, the resulting target speeds are also the same.
For example, the control operation may also be performed by touch on a terminal device with a touch-enabled display: the player touches the interactive control with a finger and slides it, thereby generating a drag operation, where the sliding distance of the finger determines the moving speed of the game character (the larger the sliding distance, the higher the moving speed), and the sliding distance can be converted into the input amount A. An upper limit is set on the moving speed of the game character, and the moving speed does not increase further once the upper limit is reached.
In this embodiment, the control operation may, for example, also be a long-press operation instead of a drag operation. For example, after the player calls out the interactive control through the first operation instruction and the second operation instruction, the player may click the interactive control and hold it: the longer the hold, the higher the target speed, and the shorter the hold, the lower the target speed. After the long press ends, the game character moves toward the second position at the target speed corresponding to the hold duration.
For example, fig. 3 is a schematic diagram of a first control-operation embodiment provided by an embodiment of the present application. As shown in fig. 3, in the graphical user interface 11 provided by the terminal device 10, the player may first press the right button of the mouse for more than a preset duration (e.g., 0.33 seconds), then drag the circular icon (i.e., the interactive control) at the head of the arrow along the direction of the arrow, and release the right button after sliding some distance; at this point the game character determines the target speed according to the distance the interactive control was dragged and then starts moving toward the second position. The farther the sliding distance, the larger the value of the input amount A and the higher the corresponding target speed.
Here, the second position may be the position where the right mouse button was initially long-pressed; for example, it may coincide with the position of the circular icon.
Fig. 4 is a schematic diagram of the control process of a control operation according to an embodiment of the present application. As shown in fig. 4, in the graphical user interface 11 provided by the terminal device 10, the player first long-presses the right mouse button; a circular icon (i.e., the interactive control) is then displayed near the game character to prompt the player to start dragging with the mouse. While the player drags with the mouse, a direction arrow is displayed in the graphical user interface 11 for the player to view, and the direction arrow indicates the player's drag direction. During the drag, the sliding distance can be shown to the player through a prompt; for example, the filled area inside the arrow in fig. 4 represents the current sliding distance. In other embodiments, the current sliding distance may be indicated by a fill color inside the arrow.
A maximum sliding distance may be set, for example 100 (i.e., the sliding distance lies in [0, 100]); the sliding distance [0, 100] then corresponds to the input amount A in [-1, 1]. When the sliding distance is 0, the corresponding input amount A is -1; when the sliding distance is 50, the corresponding input amount is 0; and when the sliding distance is 100, the corresponding input amount is 1. That is, the controllable speed of the game character lies in [0, 2 × U1].
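The mapping just described (sliding distance in [0, 100] converted linearly to the input amount A in [-1, 1], which then gives U2 = U1 × A + U1) can be sketched as follows. This is a minimal, non-authoritative illustration; the function names and the clamping at the maximum distance are assumptions, not part of the application.

```python
# Sketch of the drag-distance-to-speed mapping described above:
# sliding distance in [0, 100] -> input amount A in [-1, 1] -> target speed
# U2 = U1 * A + U1, giving a controllable speed range of [0, 2 * U1].
MAX_SLIDE = 100.0

def slide_to_input(slide_distance: float) -> float:
    """Map a sliding distance in [0, MAX_SLIDE] linearly onto A in [-1, 1]."""
    d = max(0.0, min(slide_distance, MAX_SLIDE))  # clamping at the limit is assumed
    return 2.0 * d / MAX_SLIDE - 1.0

def target_speed(slide_distance: float, base_speed: float) -> float:
    """Apply U2 = U1 * A + U1, with U1 the character's base movement speed."""
    a = slide_to_input(slide_distance)
    return base_speed * a + base_speed

# Values matching the text: distance 0 -> speed 0, distance 50 -> U1,
# distance 100 -> 2 * U1 (here U1 = 5.0).
print(target_speed(0, 5.0), target_speed(50, 5.0), target_speed(100, 5.0))
```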
For example, the game character can also re-plan its moving direction and target speed while moving. For example, while the game character moves in the current moving direction at the current moving speed, the player may input the next second operation instruction again to call out the interactive control, and then input the next control operation on the interactive control so as to control the game character to move in the next moving direction at the next target speed.
And S205, controlling the game character to move to the second position at the target speed.
And the first operation instruction and the second operation instruction are both from a mouse.
In the present application, an operation instruction calls out the interactive control on the graphical user interface, and the player can input a corresponding control operation on the interactive control to flexibly adjust the moving speed of the game character. The control process is simple and fast, and the moving direction and the target speed of the game character can be determined at the same time, so the game character can flexibly change its moving speed and moving direction in different scenes, and the control effect on the game character is improved.
In some embodiments, the above method further comprises: it is determined that the control operation is ended.
In the present embodiment, it is necessary to wait for the control operation to end before controlling the game character to move to the second position at the target speed. For example, when the control operation is a drag operation, it is necessary to wait for the drag operation to end: after the player calls out the interactive control with a long press, the player continues to drag the interactive control, and only when the player finishes dragging can the target speed be determined and the game character be controlled to move toward the second position at the target speed.
In some embodiments, taking the control operation as the dragging operation as an example, the method may further include the following steps:
responding to the control operation on the interactive control, and determining the dragging direction and the dragging distance according to the control operation;
determining a direction control to be displayed according to the dragging direction and the dragging distance;
a directional control is displayed on the graphical user interface.
In this embodiment, the direction control may be an icon used to prompt the player about the drag direction and drag distance. For example, the direction control may be the arrow in fig. 4: the direction indicated by the arrow is the drag direction, and the size of the filled area within the arrow indicates the drag distance. When the area within the arrow is completely filled, the maximum drag distance is reached; the maximum drag distance corresponds to the upper limit of the moving speed of the game character.
For example, if the direction control is an arrow, the drag distance may also be indicated by the length of the arrow; for example, the longer the drag distance, the longer the arrow displayed on the graphical user interface.
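As a rough, non-authoritative sketch, the parameters of such a direction control could be derived from the drag as follows; the maximum drag distance, the dictionary layout and the function name are illustrative assumptions only.

```python
# Sketch: derive display parameters for the direction control (arrow) from a drag.
# The arrow points along the drag direction, and its filled area (the distance
# identification part) grows with the drag distance, up to an assumed maximum.
import math

MAX_DRAG = 100.0

def direction_control_params(drag_dx: float, drag_dy: float) -> dict:
    distance = math.hypot(drag_dx, drag_dy)
    angle_deg = math.degrees(math.atan2(drag_dy, drag_dx))   # arrow direction
    fill_ratio = min(distance, MAX_DRAG) / MAX_DRAG          # filled-area proportion
    return {"angle_deg": angle_deg, "fill_ratio": fill_ratio}

# A drag of about 70 units toward the upper right fills roughly 70% of the arrow.
print(direction_control_params(49.5, 49.5))
```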
By displaying the direction control on the graphical user interface, the embodiment of the present application can prompt the player with the current drag direction and drag distance, so that the player knows the target speed of the game character in real time, which makes it easier for the player to control the game character and improves the control effect.
Further, in some embodiments, the step of "determining the direction control to be displayed according to the dragging direction and the dragging distance" may be specifically implemented by the following steps:
determining the arrow direction of a direction control to be displayed according to the dragging direction;
determining a distance identification part of the direction control to be displayed according to the dragging distance;
and determining a direction control to be displayed according to the arrow direction and the distance identification part.
Wherein, the longer the dragging distance, the larger the area of the distance identification part.
In this embodiment, the distance identification part may be a filling area in fig. 4, and the larger the dragging distance, the larger the area of the filling area.
In the embodiment of the present application, the arrow direction indicates the direction in which the player drags the interactive control, and the area of the distance identification part prompts the player with the drag distance, so that the player can conveniently infer the target speed of the game character from that area, and the control effect is improved.
Further, on the basis of the above embodiments, in some embodiments, the step S204 of determining the target speed at which the game character is to move according to the control operation may specifically be implemented as follows:
determining the target direction in which the game character is to move according to the dragging direction of the dragging operation;
and determining the target speed at which the game character is to move according to the dragging distance of the dragging operation.
In this embodiment, the moving direction of the game character may or may not change while it moves to the second position. In the unchanged case, the straight line from the first position to the second position is taken directly as the moving direction. In an actual game, there may be obstacles, monsters and the like on the straight line between the first position and the second position, so the moving direction of the game character needs to change; for example, the player may actively adjust the target direction of the game character according to the distribution of obstacles and monsters.
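For the simplest case described above (the character heads straight toward the second position at the target speed determined from the drag), a minimal sketch of the resulting velocity is given below; obstacle avoidance and path re-planning, which the text notes may change the direction, are out of scope, and the function name is an assumption.

```python
# Sketch of S204/S205 in the straight-line case: the target speed comes from the
# drag, and the moving direction is the unit vector from the character's current
# position toward the second position.
import math

def velocity_toward(current, second_position, target_speed):
    """Velocity vector of magnitude target_speed pointing at the second position."""
    dx = second_position[0] - current[0]
    dy = second_position[1] - current[1]
    length = math.hypot(dx, dy) or 1.0      # avoid division by zero when already there
    return (target_speed * dx / length, target_speed * dy / length)

# Character at (0, 0), second position at (3, 4), target speed 10 -> (6.0, 8.0).
print(velocity_toward((0.0, 0.0), (3.0, 4.0), 10.0))
```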
The embodiment of the present application determines the target direction and the target speed of the game character from the drag direction and the drag distance of the interactive control. Unlike the existing approach, there is no need to control the moving direction with a right mouse click and then switch the moving speed with the key X; the amount of operation is reduced, the player can quickly and conveniently adjust the direction and speed to adapt to different game scenes, and the player's control over the game character is improved.
Further, in some embodiments, the method may include the steps of:
responding to the click of the target control on the graphical user interface, and acquiring the click duration of the target control;
determining whether the click duration exceeds a preset duration;
if so, acquiring the displacement distance and the displacement direction of the target control on the graphical user interface after the clicking is finished;
taking the displacement distance as an operation distance;
the displacement direction is taken as the operation direction.
For example, the preset duration may be 0.33 seconds or 1 second; a longer duration is set mainly to avoid conflicts with other game interaction commands.
In this embodiment, the target control is the mouse, and the click duration is the press duration of the right mouse button. The click action and the displacement action of the mouse occur continuously: after the right mouse button has been pressed for longer than the preset duration, the player can continue to drag the mouse on the graphical user interface without releasing it, and the player releases the mouse once the displacement is complete, at which point the whole control operation is finished.
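A minimal sketch of this trigger logic is shown below, assuming generic button-down and button-up callbacks; the class name, the event-handler names and the 0.33-second threshold (taken from the example above) are illustrative, not the application's actual implementation.

```python
# Sketch: a right-button press longer than a preset duration starts the control
# operation; the displacement between the press position and the release position
# yields the operation distance and operation direction described above.
import math
import time

PRESET_DURATION = 0.33  # seconds, following the example in the text

class DragTracker:
    def __init__(self):
        self.press_time = None
        self.press_pos = None

    def on_right_button_down(self, x: float, y: float) -> None:
        self.press_time = time.monotonic()
        self.press_pos = (x, y)

    def on_right_button_up(self, x: float, y: float):
        if self.press_time is None:
            return None
        held = time.monotonic() - self.press_time
        start_x, start_y = self.press_pos
        self.press_time = self.press_pos = None
        if held < PRESET_DURATION:
            return None                      # too short: treat as an ordinary click
        dx, dy = x - start_x, y - start_y
        return {"operation_distance": math.hypot(dx, dy),
                "operation_direction_deg": math.degrees(math.atan2(dy, dx))}
```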
In the embodiment of the present application, a preset duration is set so that the control operation is triggered only when the mouse press duration exceeds the preset duration, and the displacement direction and displacement distance of the mouse are obtained after the control operation is triggered in order to determine the operation direction and operation distance. This prevents accidental touches by the player during the game and improves the movement control effect on the game character.
Illustratively, in some embodiments, the method may further include the steps of:
acquiring an operation position and an operation duration of control operation;
determining the moving direction of the game character according to the current position of the game character and the operation position;
taking the operation duration as the input quantity of the control operation;
and determining the target speed according to the preset linear relation and the operation duration.
For example, fig. 5 is a schematic diagram of a second control-operation embodiment provided by an embodiment of the present application. As shown in fig. 5, when the game character 51 needs to be controlled to move to the position of the game character 52, the player may, on the graphical user interface 11 provided by the terminal device 10, click with the mouse on the position of the game character 52 (as the second position). If the mouse press exceeds the preset duration, the control operation is triggered: the clicked position (i.e., the position of the game character 52) is taken as the operation position, the mouse press duration is taken as the operation duration, the current moving direction of the game character is determined from its current position and the operation position, and the current moving speed is determined from the operation duration.
Specifically, an optimal moving path may be determined according to the operation position (i.e., the position of the game character 52) and the position of the game character 51, and the current moving direction is then determined from the optimal moving path. Referring to fig. 5, the dotted arrow indicates the current moving direction. It should be noted that there may be obstacles between the game character 51 and the game character 52, so the moving direction of the game character 51 may change while it moves toward the game character 52.
The current moving speed is determined from the operation duration mainly by relying on a preset second linear relationship. Specifically, the preset second linear relationship may be U2 = b × T × U1, where b is a preset coefficient taking a value in (0, 1), T is the operation duration, U2 is the current moving speed, and U1 is the base movement speed of the game character 51, which may be a fixed value.
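A minimal sketch of this second linear relationship is given below; the particular coefficient value and the cap at a maximum operation duration (mentioned in the next paragraph) are illustrative assumptions.

```python
# Sketch of the preset second linear relationship U2 = b * T * U1, where b is a
# preset coefficient in (0, 1), T is the press (operation) duration and U1 is the
# base movement speed of the game character.
def speed_from_hold(press_duration: float, base_speed: float,
                    b: float = 0.5, max_duration: float = 2.0) -> float:
    t = min(press_duration, max_duration)   # assumed cap at the maximum operation duration
    return b * t * base_speed

# Holding for 1 s with b = 0.5 and U1 = 6 gives a current moving speed of 3.0.
print(speed_from_hold(1.0, 6.0))
```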
Illustratively, in this embodiment, with continued reference to fig. 5, when the player presses the mouse on the position of the game character 52 for longer than the preset duration, a pointing icon (e.g., a circle) begins to be outlined; once the entire pointing icon has been outlined, the player is informed that the maximum operation duration of the mouse press has been reached. The player may then release the mouse, and the game character 51 starts to move according to the current moving speed and the current moving direction.
In the embodiment of the present application, the operation position and the operation duration of the control operation are obtained to determine the current moving direction and the current moving speed of the game character, so the player does not need the additional mouse-drag step, the amount of operation is reduced, and the efficiency of the player's movement control over the game character is improved.
Fig. 6 is a schematic diagram of a graphical user interface provided in an embodiment of the present application. As shown in fig. 6, the direction control may be an arrow 111, and the filled area within the arrow 111 may be used to indicate the target speed of the game character: the larger the filled area within the arrow 111, the higher the target speed, and the smaller the filled area, the lower the target speed. The direction pointed to by the arrow 111 indicates the target direction of the game character.
By triggering the direction control on the graphical user interface, the embodiment of the present application can prompt the player with the target speed and the target direction of the game character, improving the player's movement control over the game character.
In some embodiments, the method for controlling the movement of the game character may further include the steps of:
in response to the end of the control operation, the game character is controlled to start moving according to the target direction and the target speed.
In this embodiment, the game character may start moving after the control operation is ended. For example, during the movement of the game character, the player can input a new control operation to change the current movement speed and movement direction of the game character.
In some embodiments, the method for controlling the movement of the game character may further include the steps of:
and controlling the game character to stop moving in response to a third operation for the game character.
In the present embodiment, when the game character is moving in the target direction at the target speed, its movement is terminated if the player inputs a new third operation. Illustratively, the third operation may be a mouse operation and/or a keyboard command. When the third operation is a mouse operation (e.g., a mouse click), for example the player clicking an NPC in the game scene with the mouse, the game character stops moving. Further, a dialog of the NPC may be displayed on the graphical user interface in response to the mouse operation.
For example, if the player inputs a new first operation instruction, a new second operation instruction and a new control operation while the game character is moving in the target direction at the target speed, the game character does not stop moving; instead, a new target speed and a new target direction are determined according to the new control operation, and the game character then continues moving at the new target speed in the new target direction.
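The two behaviors just described (a third operation stops the character, whereas a new control operation re-targets it without stopping) might be organized as in the following sketch; the state layout and method names are assumptions for illustration.

```python
# Sketch of the movement state: a third operation (e.g. clicking an NPC) stops
# the character, while a new control operation replaces the current target speed
# and direction so that movement continues without interruption.
class MovementState:
    def __init__(self):
        self.moving = False
        self.velocity = (0.0, 0.0)

    def on_control_operation(self, new_velocity):
        # New drag (or long press) while moving: adopt the new speed/direction.
        self.velocity = new_velocity
        self.moving = True

    def on_third_operation(self):
        # e.g. the player clicks an NPC: stop moving (an NPC dialog may then be shown).
        self.moving = False
        self.velocity = (0.0, 0.0)
```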
In the embodiment of the present application, by inputting the third operation the game character can quickly break away from the current target speed and target direction, improving the player's movement control over the game character.
In some embodiments, the method for controlling the movement of the game character may further include the steps of:
obtaining game information in the game scene within a preset range around the game character;
and determining the content displayed by the graphical user interface according to the game information.
In this embodiment, various kinds of game information, such as NPCs, other game characters, items, monsters and equipment, may be present near the game character in the game scene. The preset range may be the range covered by the virtual lens. For example, when content is displayed on the graphical user interface, the game character may be placed at the center of the interface and the game information displayed around it.
When the game character moves, the game information within the preset range changes, so the content displayed on the graphical user interface changes accordingly, but it always includes the game character.
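A minimal, non-authoritative sketch of composing the displayed content is given below, modeling the preset range as a radius around the character; the entity representation, the radius value and the function name are illustrative assumptions.

```python
# Sketch: collect the game information (NPCs, items, monsters, ...) lying within
# the preset range around the character, which is then displayed with the
# character at the center of the graphical user interface.
import math

def visible_game_info(character_pos, entities, lens_radius=30.0):
    """Return the entities inside the range covered by the virtual lens."""
    cx, cy = character_pos
    return [e for e in entities
            if math.hypot(e["x"] - cx, e["y"] - cy) <= lens_radius]

# Only the NPC within 30 units of the character is kept for display.
entities = [{"name": "npc", "x": 10, "y": 0}, {"name": "monster", "x": 100, "y": 0}]
print(visible_game_info((0, 0), entities))
```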
In the embodiment of the present application, the game information within the preset range around the game character is obtained to form the content displayed on the graphical user interface, so that the view angle can be locked, the player can conveniently keep track of the game information around the game character in real time, and it becomes easier for the player to control the movement of the game character, improving the control effect.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Fig. 7 is a schematic structural diagram of a game control apparatus according to an embodiment of the present application, where the apparatus may be integrated on a terminal device, or may be independent of the terminal device and cooperate with the terminal device to implement the present solution. As shown in fig. 7, the game control apparatus 70 may specifically include a relationship establishing module 71, a first control module 72, a control display module 73, a speed determining module 74, and a second control module 75.
The relationship establishing module 71 is configured to establish a binding relationship between the virtual camera and the game character, so as to control the movement of the virtual lens according to the position change of the game character in the game scene. The first control module 72 is configured to control the game character to move to a first position in response to a first operation instruction acting on the first position in the game scene. The control display module 73 is configured to display the interactive control in response to a second operation instruction acting on a second position in the game scene. The speed determining module 74 is configured to, in response to the control operation on the interactive control, determine the target speed at which the game character is to move according to the control operation. The second control module 75 is configured to control the game character to move to the second position at the target speed; the first operation instruction and the second operation instruction both come from the mouse.
In some embodiments, the game control apparatus further includes a determination module configured to determine that the control operation is finished.
Optionally, in some embodiments, if the control operation is a drag operation, the game control apparatus further includes a direction display module, configured to:
responding to the control operation on the interactive control, and determining the dragging direction and the dragging distance according to the control operation;
determining a direction control to be displayed according to the dragging direction and the dragging distance;
a directional control is displayed on the graphical user interface.
Optionally, in some embodiments, the direction display module may be specifically configured to:
determining the arrow direction of a direction control to be displayed according to the dragging direction;
determining a distance identification part of the direction control to be displayed according to the dragging distance;
and determining a direction control to be displayed according to the arrow direction and the distance identification part.
Wherein, the longer the dragging distance, the larger the area of the distance identification part.
In some embodiments, if the control operation is a drag operation, the speed determination module may be specifically configured to:
determining the target direction in which the game character is to move according to the dragging direction of the dragging operation;
and determining the target speed at which the game character is to move according to the dragging distance of the dragging operation.
In some embodiments, the first operation instruction is an instruction generated by a click operation for a left button of a mouse.
In some embodiments, the second operation instruction is an instruction generated by a long-press operation on the right button of the mouse.
In some embodiments, the game control apparatus further includes a stop control module configured to control the game character to stop moving in response to a third operation on the game character.
The apparatus provided in the embodiment of the present application may be used to execute the method in the above embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
It should be noted that the division of the above apparatus into modules is only a logical division; in an actual implementation the modules may be wholly or partially integrated into one physical entity, or may be physically separate. These modules may all be implemented as software invoked by a processing element, or all in hardware, or some as software invoked by a processing element and some in hardware. For example, the determining module may be a separately arranged processing element, or may be integrated into a chip of the apparatus, or may be stored in the memory of the apparatus in the form of program code whose function is called and executed by a processing element of the apparatus. The other modules are implemented similarly. In addition, all or some of the modules may be integrated together or implemented independently.
For example, when some of the above modules are implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor that can call program code.
Fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 8, the terminal device 80 includes: at least one processor 81, memory 82, bus 83, and communication interface 84.
Wherein: the processor 81, the communication interface 84 and the memory 82 communicate with each other via a bus 83.
A communication interface 84 for communicating with other devices. The communication interface comprises a communication interface for data transmission, a display interface or an operation interface for man-machine interaction and the like.
The processor 81 is configured to execute computer-executable instructions, and may specifically execute relevant steps in the methods described in the foregoing embodiments.
Processor 81 may be a central processing unit. The one or more processors included in the terminal device may be processors of the same type, such as one or more CPUs, or may be processors of different types.
Memory 82 for storing computer-executable instructions. The memory 82 may comprise high-speed RAM memory, and may also include non-volatile memory, such as at least one disk memory.
The present embodiment also provides a readable storage medium, in which computer instructions are stored, and when at least one processor of the terminal device executes the computer instructions, the terminal device executes the movement control method for the game character provided in the foregoing various embodiments.
The present embodiments also provide a program product comprising computer instructions stored in a readable storage medium. The computer instructions can be read from a readable storage medium by at least one processor of the terminal device, and the computer instructions executed by the at least one processor cause the terminal device to implement the movement control method of the game character provided in the various embodiments described above.
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship; in the formula, the character "/" indicates that the preceding and following related objects are in a relationship of "division". "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
It is to be understood that the various numerical references referred to in the embodiments of the present application are merely for convenience of description and distinction and are not intended to limit the scope of the embodiments of the present application. In the embodiment of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiment of the present application.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (12)

1. A game control method is characterized in that a terminal device provides a graphical user interface, the content displayed by the graphical user interface comprises a game scene captured by a virtual camera, the game scene comprises a game character controlled by a mouse, and the mouse is connected with the terminal device, and the method comprises the following steps:
establishing a binding relationship between the virtual camera and the game character so as to control the movement of a virtual lens according to the position change of the game character in the game scene;
responding to a first operation instruction acting on a first position in the game scene, and controlling the game character to move to the first position;
responding to a second operation instruction acting on a second position in the game scene, and displaying an interactive control;
responding to the control operation on the interactive control, and determining, according to the control operation, the target speed at which the game character is to move;
controlling the game character to move to the second position at the target speed;
wherein the first operation instruction and the second operation instruction both come from the mouse.
2. The method of claim 1, wherein prior to said controlling said game character to move to said second position at said target speed, said method further comprises:
determining that the control operation is ended.
3. The method of claim 1, wherein the control operation is a drag operation, the method further comprising:
in response to the control operation directed to the interactive control, determining a dragging direction and a dragging distance according to the control operation;
determining a direction control to be displayed according to the dragging direction and the dragging distance;
and displaying the direction control on the graphical user interface.
4. The method according to claim 3, wherein determining the direction control to be displayed according to the dragging direction and the dragging distance comprises:
determining an arrow direction of the direction control to be displayed according to the dragging direction;
determining a distance identification part of the direction control to be displayed according to the dragging distance, wherein the longer the dragging distance, the larger the area of the distance identification part;
and determining the direction control to be displayed according to the arrow direction and the distance identification part.
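
Purely as an illustration of claims 3 and 4, the sketch below derives a direction control from a drag: the arrow direction follows the dragging direction, and the area of the distance identification part grows with the dragging distance. The DirectionControl structure and the max_distance/max_area limits are assumptions introduced here, not features taken from the application.

    # Illustrative sketch only; the data structure and the limits are hypothetical.
    import math
    from dataclasses import dataclass

    @dataclass
    class DirectionControl:
        arrow_angle_deg: float   # arrow direction, derived from the dragging direction
        indicator_area: float    # area of the distance identification part

    def build_direction_control(drag_dx, drag_dy, max_distance=100.0, max_area=400.0):
        # The arrow points along the dragging direction.
        angle = math.degrees(math.atan2(drag_dy, drag_dx))
        # The longer the dragging distance, the larger the identification area.
        distance = min(math.hypot(drag_dx, drag_dy), max_distance)
        area = max_area * distance / max_distance
        return DirectionControl(arrow_angle_deg=angle, indicator_area=area)
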
5. The method of claim 1, wherein the control operation is a drag operation, and determining a target speed at which the game character is to be moved according to the control operation comprises:
determining a target direction in which the game character is to move according to the dragging direction of the dragging operation;
and determining the target speed at which the game character is to be moved according to the dragging distance of the dragging operation.
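
As a non-limiting illustration of claim 5, the sketch below maps the drag vector of the dragging operation to a target direction (its unit vector) and a target speed (proportional to the dragging distance, capped at a maximum). The dead zone and the speed limits are assumed values added here.

    # Illustrative sketch only; dead_zone, max_distance and max_speed are assumed values.
    import math

    def target_from_drag(drag_dx, drag_dy, dead_zone=5.0, max_distance=100.0, max_speed=5.0):
        distance = math.hypot(drag_dx, drag_dy)
        if distance < dead_zone:
            return None, 0.0                       # drag too small: no movement
        # Target direction follows the dragging direction (unit vector).
        direction = (drag_dx / distance, drag_dy / distance)
        # Target speed scales with the dragging distance, capped at max_speed.
        speed = max_speed * min(distance, max_distance) / max_distance
        return direction, speed
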
6. The method according to claim 1, wherein the first operation instruction is an instruction generated by a click operation on a left button of the mouse.
7. The method according to claim 1, wherein the second operation instruction is an instruction generated by a long-press operation on a right button of the mouse.
8. The method of claim 1, further comprising:
and controlling the game character to stop moving in response to a third operation directed to the game character.
9. A game control apparatus, comprising:
a relationship establishing module, configured to establish a binding relationship between the virtual camera and the game character, so as to control the movement of the virtual lens according to the position change of the game character in the game scene;
a first control module, configured to control, in response to a first operation instruction acting on a first position in the game scene, the game character to move to the first position;
a control display module, configured to display an interactive control in response to a second operation instruction acting on a second position in the game scene;
a speed determining module, configured to determine, in response to a control operation directed to the interactive control, a target speed at which the game character is to be moved according to the control operation;
a second control module, configured to control the game character to move to the second position at the target speed; wherein the first operation instruction and the second operation instruction both come from the mouse.
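
For illustration only, the module split of claim 9 could be realized as the following Python classes. The module names mirror claim 9, while the method bodies are placeholders that assume character, camera and ui objects exposing move_to, bind_to and show_interaction_control (the first two as in the sketch after claim 1, the last a hypothetical UI call).

    # Illustrative sketch only; module names mirror claim 9, implementations are placeholders.
    class RelationshipEstablishingModule:
        def establish(self, camera, character):
            camera.bind_to(character)            # camera follows the character

    class FirstControlModule:
        def handle(self, character, first_position):
            character.move_to(first_position)    # move triggered by the first instruction

    class ControlDisplayModule:
        def handle(self, ui, second_position):
            ui.show_interaction_control(second_position)   # show the interactive control

    class SpeedDeterminingModule:
        def determine(self, drag_distance, max_distance=100.0, max_speed=5.0):
            # Target speed grows with the drag distance, capped at max_speed.
            return max_speed * min(drag_distance, max_distance) / max_distance

    class SecondControlModule:
        def handle(self, character, second_position, target_speed):
            character.move_to(second_position, speed=target_speed)
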
10. A terminal device, comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored by the memory to implement the method of any of claims 1-8.
11. A readable storage medium having computer instructions stored therein, which, when executed by a processor, are adapted to implement the method of any one of claims 1-8.
12. A program product comprising computer instructions, characterized in that the computer instructions, when executed by a processor, implement the method of any of claims 1-8.
CN202111674172.3A 2021-12-31 2021-12-31 Method, apparatus, device, medium and program product for game control Pending CN114392554A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111674172.3A CN114392554A (en) 2021-12-31 2021-12-31 Method, apparatus, device, medium and program product for game control

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111674172.3A CN114392554A (en) 2021-12-31 2021-12-31 Method, apparatus, device, medium and program product for game control

Publications (1)

Publication Number Publication Date
CN114392554A true CN114392554A (en) 2022-04-26

Family

ID=81228774

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111674172.3A Pending CN114392554A (en) 2021-12-31 2021-12-31 Method, apparatus, device, medium and program product for game control

Country Status (1)

Country Link
CN (1) CN114392554A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination