CN113769404A - Game movement control method and device and electronic equipment - Google Patents

Game movement control method and device and electronic equipment

Info

Publication number
CN113769404A
Authority
CN
China
Prior art keywords
target
control
area
movement control
control parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111081229.9A
Other languages
Chinese (zh)
Inventor
舒小雨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202111081229.9A priority Critical patent/CN113769404A/en
Publication of CN113769404A publication Critical patent/CN113769404A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/1075 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad, using a touch screen

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an in-game movement control method and apparatus and an electronic device, relates to the technical field of games, and solves the technical problem that a player cannot comprehensively know the current specific movement state of a virtual object when controlling the virtual object to perform complex movement. The method comprises the following steps: in response to a first touch operation on the movement control, determining at least one target movement control parameter corresponding to the target functional area corresponding to the first touch operation; and controlling the virtual object to move in the game scene based on the target movement control parameter, and displaying a target prompt identifier corresponding to the target movement control parameter in an area of the graphical user interface separate from the movement control; the target prompt identifier comprises a direction prompt identifier corresponding to a target direction control parameter in the target movement control parameters and/or a speed prompt identifier corresponding to a target speed control parameter.

Description

Game movement control method and device and electronic equipment
Technical Field
The present application relates to the field of game technologies, and in particular, to a method and an apparatus for controlling movement in a game, and an electronic device.
Background
In the game, the player can control the virtual object to move by operating the movement control on the game interface. For example, when a player controls a virtual object to move in a game scene, acceleration, deceleration, and steering control of the virtual object may be achieved by operating a movement control.
However, when the player operates the movement control to move the virtual object, the player cannot accurately know his or her specific control state with respect to the virtual object. In particular, when the player controls the virtual object to perform complex movement, the player cannot comprehensively know the virtual object's current specific movement state, which affects the player's game experience.
Disclosure of Invention
The application aims to provide a movement control method and device in a game and an electronic device, so as to solve the technical problem that a player cannot comprehensively know the current specific movement condition of a virtual object when controlling the virtual object to move in a complex manner.
In a first aspect, an embodiment of the present application provides a method for controlling movement in a game, where a terminal provides a graphical user interface, a game scene of the game includes a virtual object, the graphical user interface includes a movement control for the virtual object, the movement control includes a plurality of functional regions, each of the functional regions corresponds to at least one movement control parameter, and the movement control parameter includes a direction control parameter and/or a speed control parameter; the method comprises the following steps:
in response to a first touch operation on the movement control, determining at least one target movement control parameter corresponding to the target functional area corresponding to the first touch operation;
controlling the virtual object to move in the game scene based on the target movement control parameter, and displaying a target prompt identifier corresponding to the target movement control parameter in an area of the graphical user interface separate from the movement control; the target prompt identifier comprises a direction prompt identifier corresponding to a target direction control parameter in the target movement control parameters and/or a speed prompt identifier corresponding to a target speed control parameter.
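As an illustration only (not part of the claimed method), the two steps above can be sketched in Python. The circular control layout, the region arcs, and the parameter names and values below are all assumptions made for the example, not taken from this application:

```python
import math

# Hypothetical layout: a circular movement control whose ring is split into
# four 90-degree functional regions, each corresponding to at least one
# movement control parameter (direction and/or speed).
REGIONS = [
    {"name": "forward",  "arc": (45, 135),  "params": {"speed": +1.0}},
    {"name": "left",     "arc": (135, 225), "params": {"direction": -30.0}},
    {"name": "backward", "arc": (225, 315), "params": {"speed": -1.0}},
    {"name": "right",    "arc": (315, 405), "params": {"direction": +30.0}},
]

def target_region(touch, center):
    """Map a first touch operation to the functional region it falls in."""
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360
    for region in REGIONS:
        lo, hi = region["arc"]
        if lo <= angle < hi or lo <= angle + 360 < hi:
            return region
    return None

def target_parameters(touch, center):
    """Return the target movement control parameters for the touch."""
    region = target_region(touch, center)
    return region["params"] if region else {}
```

A caller would feed the touch point and control centre in screen coordinates, then drive both the virtual object's movement and the prompt identifier from the returned parameters.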
In one possible implementation, the direction prompt identifier is a bar identifier, different positions of the bar identifier are used for indicating different moving directions, and the speed prompt identifier follows the movement of the touch point of the first touch operation on the direction prompt identifier.
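One way such a bar identifier's position-to-direction mapping could work, as a sketch; the heading range and pixel coordinates are illustrative assumptions:

```python
def bar_direction(bar_x, bar_left, bar_width,
                  min_heading=-90.0, max_heading=90.0):
    """Different positions of the bar identifier indicate different moving
    directions: map a horizontal position on the bar linearly to a heading,
    clamped to the bar's extent."""
    frac = (bar_x - bar_left) / bar_width
    frac = min(max(frac, 0.0), 1.0)
    return min_heading + frac * (max_heading - min_heading)

def speed_marker_x(touch_x, bar_left, bar_width):
    """The speed prompt identifier follows the touch point along the bar."""
    return min(max(touch_x, bar_left), bar_left + bar_width)
```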
In one possible implementation, the target function area corresponding to the first touch operation only corresponds to the target speed control parameter;
the step of displaying the target prompt identifier corresponding to the target movement control parameter in the area separated from the movement control in the graphical user interface includes:
and displaying the speed prompt identification corresponding to the target speed control parameter in an area separated from the mobile control in the graphical user interface, and controlling to hide the strip-shaped identification.
In one possible implementation, different locations in the functional area correspond to different degrees of control of the movement control parameter; the method further comprises the following steps:
determining a target control degree of a target movement control parameter corresponding to the target position relation according to the target position relation between the touch point of the first touch operation and the target function area;
and displaying a degree prompt identifier corresponding to the target control degree in the graphical user interface.
In one possible implementation, the degree of control includes: at least one of a control angle of the rotation direction and a control magnitude of the acceleration.
In one possible implementation, the target degree of control includes a target control angle of the turning direction;
the step of determining the target control degree of the target movement control parameter corresponding to the target position relationship according to the target position relationship between the touch point of the first touch operation and the target function area includes:
determining, according to the target position relationship between the touch point of the first touch operation and the target functional area, a target effective angle corresponding to the target position relationship in the target functional area, the effective angles corresponding one-to-one to the control angles of the turning direction;
and determining the target control angle of the turning direction corresponding to the target effective angle.
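These two steps can be sketched as follows, again as an illustration only; the region arc, the maximum turn angle, and the linear one-to-one mapping are assumptions made for the example:

```python
import math

def effective_angle(touch, center, region_arc):
    """The 'target effective angle': the angle of the touch point inside the
    functional region, measured from the region's starting edge."""
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360
    lo, _hi = region_arc
    return (angle - lo) % 360

def control_angle(touch, center, region_arc, max_turn=45.0):
    """One-to-one map from effective angle to the control angle of the
    turning direction: the farther through the region's arc the touch sits,
    the larger the turn."""
    span = (region_arc[1] - region_arc[0]) % 360 or 360
    frac = effective_angle(touch, center, region_arc) / span
    return frac * max_turn
```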
In one possible implementation, the step of displaying the target prompt identifier corresponding to the target movement control parameter in an area of the graphical user interface separate from the movement control includes:
when the touch point of the first touch operation is located in the range corresponding to the mobile control, controlling a preset fixed area separated from the mobile control in the graphical user interface to display a target prompt identifier corresponding to the target mobile control parameter;
and when the touch point of the first touch operation is positioned outside the range corresponding to the mobile control, displaying a target prompt identifier corresponding to the target mobile control parameter according to the position of the touch point.
In one possible implementation, the preset fixed area is located a preset distance above the mobile control.
In a possible implementation, the step of displaying, when the touch point of the first touch operation is located outside the range corresponding to the mobile control, the target prompt identifier corresponding to the target mobile control parameter according to the position of the touch point includes:
and when the touch point of the first touch operation is located outside the range corresponding to the mobile control, displaying a target prompt identifier corresponding to the target mobile control parameter at a preset distance above the position of the touch point.
In one possible implementation, a plurality of the functional regions are arranged annularly on the mobile control around a central location of the mobile control; an invalid function area is arranged at the central position of the mobile control; the method further comprises the following steps:
responding to a second touch operation aiming at the invalid functional area, and carrying out invalid processing on the second touch operation.
In one possible implementation, the method further comprises:
and in response to the touch point moving to the invalid functional area from a position, except the invalid functional area, in the mobile control, canceling the movement control of the virtual object and hiding the prompt identifier.
In one possible implementation, a correction area is arranged in a preset range around the mobile control; the method further comprises the following steps:
responding to a third touch operation aiming at the correction area, and determining at least one first movement control parameter corresponding to a first functional area corresponding to the third touch operation;
controlling the virtual object to move in the game scene based on the first movement control parameter, and displaying a first prompt identifier corresponding to the first movement control parameter in an area, separated from the movement control, of the graphical user interface; the first prompt identifier includes a direction prompt identifier corresponding to a first direction control parameter in the first movement control parameter and/or a speed prompt identifier corresponding to a first speed control parameter.
In one possible implementation, after the step of determining, in response to the first touch operation on the mobile control, at least one target movement control parameter corresponding to a target functional area corresponding to the first touch operation, the method further includes:
responding to the movement of the touch point of the first touch operation from the mobile control to the correction area, and determining the relative position of the touch point moving to the correction area relative to the mobile control and the corresponding second function area of the relative position in the mobile control;
controlling the virtual object to move in the game scene based on a second movement control parameter corresponding to the second functional area, and displaying a second prompt identifier corresponding to the second movement control parameter in an area, separated from the movement control, of the graphical user interface; and the second prompt identifier comprises a direction prompt identifier corresponding to a second direction control parameter in the second mobile control parameters and/or a speed prompt identifier corresponding to a second speed control parameter.
In one possible implementation, the second functional area corresponding to the relative position in the movement control is the functional area containing the intersection of the straight line with the boundary of the movement control, where the straight line is the line connecting the touch point in the correction area with the central position of the movement control.
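A sketch of computing that intersection position, assuming (for illustration only) that the movement control's boundary is a circle around its centre:

```python
import math

def boundary_intersection(touch, center, radius):
    """Project a touch point in the correction area back onto the movement
    control: the intersection of the line through the touch point and the
    control's centre with the control's circular boundary."""
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return center  # degenerate: touch exactly at the centre
    scale = radius / dist
    return (center[0] + dx * scale, center[1] + dy * scale)
```

The second functional area is then whichever functional region contains the returned boundary point.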
In one possible implementation, the method further comprises:
and in response to a third touch operation aiming at the correction area, highlighting at least one of the intersection point position, the straight line and the functional area where the intersection point position is located.
In one possible implementation, the method further comprises:
and responding to the movement of the touch point of the first touch operation or the third touch operation to the position, except the correction area and the movement control, in the graphical user interface, stopping the movement control of the virtual object, and hiding the prompt identifier.
In a second aspect, a mobile control device in a game is provided, where a terminal provides a graphical user interface, a game scene of the game includes a virtual object, the graphical user interface includes a mobile control for the virtual object, the mobile control includes a plurality of functional regions, each functional region corresponds to at least one mobile control parameter, and the mobile control parameters include a direction control parameter and/or a speed control parameter; the device comprises:
the determining module is used for responding to a first touch operation aiming at the mobile control, and determining at least one target mobile control parameter corresponding to a target function area corresponding to the first touch operation;
the control module is used for controlling the virtual object to move in the game scene based on the target movement control parameter and displaying a target prompt identifier corresponding to the target movement control parameter in an area separated from the movement control in the graphical user interface; the target prompt mark comprises a direction prompt mark corresponding to a target direction control parameter and/or a speed prompt mark corresponding to a target speed control parameter in the target movement control parameters.
In a third aspect, an embodiment of the present application further provides an electronic terminal comprising a memory and a processor, the memory storing a computer program executable on the processor, wherein the processor implements the steps of the method of the first aspect when executing the computer program.
In a fourth aspect, embodiments of the present application further provide a computer-readable storage medium storing computer-executable instructions that, when invoked and executed by a processor, cause the processor to perform the method of the first aspect.
The embodiment of the application brings the following beneficial effects:
the method, the device, and the electronic device for controlling movement in a game provided in the embodiments of the present application can determine at least one target movement control parameter corresponding to a target function region corresponding to a first touch operation in response to the first touch operation for a movement control, then control a virtual object to move in a game scene based on the target movement control parameter, and display a target prompt identifier corresponding to the target movement control parameter in a region separated from the movement control in a graphical user interface, where the target prompt identifier may include a direction prompt corresponding to a target direction control parameter in the target movement control parameter and/or a speed prompt identifier corresponding to a target speed control parameter, in this scheme, by using a specific target movement control parameter in a direction aspect and/or a speed aspect corresponding to the target function region corresponding to the touch operation, when the virtual object is controlled to move in a game scene, target prompt identifiers such as direction and speed corresponding to target movement control parameters can be displayed, so that the comprehensive movement conditions such as specific direction and specific speed of the current virtual object can be accurately reflected, the specific control conditions of the current player on the movement direction and the movement speed are accurately and comprehensively prompted, and the technical problem that the player cannot comprehensively know the current specific movement conditions of the virtual object when the virtual object is controlled to move in a complex manner is solved.
Drawings
In order to more clearly illustrate the specific embodiments of the present application or the technical solutions in the prior art, the drawings needed for describing them are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of an electronic terminal provided in an embodiment of the present application;
fig. 3 is a schematic view of a usage scenario of an electronic terminal according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating a method for controlling movement in a game according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of an electronic terminal displaying a graphical user interface according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an electronic terminal displaying a graphical user interface according to an embodiment of the present application;
FIG. 7 is a schematic diagram of an electronic terminal displaying a graphical user interface according to an embodiment of the present application;
FIG. 8 is a schematic diagram of an electronic terminal displaying a graphical user interface according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an electronic terminal displaying a graphical user interface according to an embodiment of the present application;
FIG. 10 is a schematic diagram of an electronic terminal displaying a graphical user interface according to an embodiment of the present application;
FIG. 11 is a schematic diagram of an electronic terminal displaying a graphical user interface according to an embodiment of the present application;
FIG. 12 is a schematic diagram of an electronic terminal displaying a graphical user interface according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of a movement control device in a game according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the present application will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "comprising" and "having", and any variations thereof, as used in the embodiments of the present application, are intended to cover non-exclusive inclusion. For example, a process, method, system, article or apparatus that comprises a list of steps or elements is not limited to those steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article or apparatus.
At present, in many games the player controls a virtual object's movement by operating a movement control on the game interface. The drawback of this scheme is that, when operating the movement control, the player cannot accurately know his or her specific control state with respect to the virtual object; that is, the player can make the virtual object move but cannot know the specific control state, which affects the player's game experience.
For example, when a player controls a virtual object to move in a game scene, acceleration, deceleration and steering of the virtual object can be achieved by operating a movement control, but the player cannot know the degree of that acceleration, deceleration or steering. Likewise, the player may control the virtual object to perform complex movement in which speed and direction are closely combined, yet while operating the control the player cannot comprehensively know the current specifics of that movement, such as the exact speed and direction.
Based on this, the embodiments of the present application provide an in-game movement control method and apparatus and an electronic device, by which the technical problem that a player cannot comprehensively know the current specific movement state of a virtual object during complex movement can be solved.
The in-game movement control method in one embodiment of the application can be run on a local terminal device or a server. When the control method is operated on a server, the method can be implemented and executed based on a cloud interaction system, wherein the cloud interaction system comprises the server and a client device.
In an optional embodiment, various cloud applications may run on the cloud interaction system, for example cloud games. Taking a cloud game as an example, a cloud game is a game mode based on cloud computing. In the running mode of a cloud game, the body that runs the game program is separated from the body that presents the game picture: the storage and execution of the in-game movement control method are completed on a cloud game server, while the client device receives and sends data and presents the game picture. The client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer or a palmtop computer, while the information processing is performed by the cloud game server in the cloud. During play, the player operates the client device to send operation instructions to the cloud game server; the server runs the game according to those instructions, encodes and compresses data such as game pictures, and returns them to the client device over the network; finally the client device decodes the data and outputs the game picture.
In an optional implementation, taking a game as an example, the local terminal device stores the game program and presents the game picture. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed and run on an electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways: for example, it may be rendered on the terminal's display screen, or provided to the player through holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game picture, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen.
In a possible implementation manner, an embodiment of the present application provides a method for controlling movement in a game, where a graphical user interface is provided through a terminal device, where the terminal device may be the aforementioned local terminal device, and may also be the aforementioned client device in a cloud interaction system.
For example, as shown in fig. 1, fig. 1 is a schematic view of an application scenario provided in the embodiment of the present application. The application scenario may include a touch terminal (e.g., a cell phone 102) and a server 101, and the touch terminal may communicate with the server 101 through a wired network or a wireless network. The touch terminal is used for operating a virtual desktop, and can interact with the server 101 through the virtual desktop to control a virtual object in the server 101.
The touch terminal of this embodiment is described taking the mobile phone 102 as an example. The handset 102 includes Radio Frequency (RF) circuitry 210, memory 220, a touch screen 230, a processor 240, and the like. Those skilled in the art will appreciate that the handset configuration shown in fig. 2 is not limiting: it may include more or fewer components than shown, combine or split certain components, or arrange the components differently. Those skilled in the art will also appreciate that the touch screen 230 is part of a User Interface (UI), and that the handset 102 may include fewer user interface elements than illustrated, or exactly those shown.
The RF circuitry 210 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 220 may be used for storing software programs and modules, and the processor 240 executes various functional applications and data processing of the cellular phone 102 by operating the software programs and modules stored in the memory 220. The memory 220 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the stored data area may store data created from use of the handset 102, and the like. Further, the memory 220 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The touch screen 230 may be used to display a graphical user interface and to receive user operations on that interface. Specifically, the touch screen 230 may include a display panel and a touch panel. The display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may collect contact or non-contact operations of a user on or near it (for example, as shown in fig. 3, operations performed with any suitable object or accessory such as a finger 301 or a stylus) and generate preset operation instructions. The touch panel may comprise a touch detection device and a touch controller: the touch detection device detects the position and gesture of the user's touch, detects the signals produced by the touch operation, and transmits them to the touch controller; the touch controller converts the touch information into information the processor can handle, sends it to the processor 240, and receives and executes commands sent from the processor 240. The touch panel may be implemented with various technologies, such as resistive, capacitive, infrared or surface acoustic wave, or with any technology developed in the future. Further, the touch panel may cover the display panel: the user operates on or near the touch panel according to the graphical user interface shown by the display panel; the touch panel detects the operation and passes it to the processor 240 to determine the user input; and the processor 240 provides a corresponding visual output on the display panel in response.
In addition, the touch panel and the display panel can be realized as two independent components or can be integrated.
The processor 240 is the control center of the mobile phone 102. It connects various parts of the entire phone using various interfaces and lines, and performs various functions and data processing of the mobile phone 102 by running or executing the software programs and/or modules stored in the memory 220 and calling the data stored in the memory 220, thereby monitoring the phone as a whole.
Embodiments of the present application are further described below with reference to the accompanying drawings.
Fig. 4 is a flowchart illustrating a method for controlling movement in a game according to an embodiment of the present disclosure.
The method can be applied to a terminal (for example, the mobile phone 102 shown in fig. 2) capable of presenting a graphical user interface. The graphical user interface is provided by the terminal; the game scene of the game includes a virtual object; and the graphical user interface includes a movement control for the virtual object. The movement control includes a plurality of functional areas, each functional area corresponds to at least one movement control parameter, and the movement control parameters include a direction control parameter and/or a speed control parameter. As shown in fig. 4, the method includes:
Step S410, in response to a first touch operation on the movement control, determining at least one target movement control parameter corresponding to the target functional area corresponding to the first touch operation.
The virtual object in the embodiment of the present application may include, but is not limited to, a virtual character, a virtual vehicle, and the like, for example, a character, a sailing vessel, an army, or a naval fleet. The present embodiment is described taking a sailing vessel as the virtual object.
For example, as shown in fig. 5, a movement control 501 is provided on the graphical user interface, and a player can control a virtual object 502 (e.g., a sailboat) to move in a game scene by performing a first touch operation on the movement control 501.
Of course, the touch point of the first touch operation need not be on the movement control, and the target functional area corresponding to the first touch operation may also be determined in other manners. Illustratively, when the touch point is in an area outside the movement control (such as the correction area), the target functional area corresponding to the first touch operation is determined by the line connecting the touch point and the center of the movement control.
In an alternative embodiment, the movement control may occupy a relatively large or relatively small area on the graphical user interface. The movement control may be square, rectangular, ring-shaped, or of another shape (e.g., circular). The content presented by the graphical user interface may contain the entire movement control, or only part of it as a hot region. The movement control may be displayed at the upper left, the upper right, or another position of the graphical user interface; the present embodiment is not limited in this respect.
In an alternative embodiment, the functional area included in the movement control may be a plurality of areas, and may include, but is not limited to, an acceleration area, a deceleration area, a steering area, and the like. The functions that may be performed include, but are not limited to, acceleration, deceleration, steering, a combination of acceleration and steering, a combination of deceleration and steering, and the like.
For each functional area in the movement control: for example, the upper and lower regions of the movement control may be functional areas corresponding only to speed control parameters, that is, they can only control the speed of the virtual object; the left and right regions of the movement control may be functional areas corresponding only to direction control parameters, that is, they can only control the direction of the virtual object; and the remaining oblique regions of the movement control may be functional areas corresponding to both direction control parameters and speed control parameters, that is, they can control speed and direction in combination.
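The region-to-parameter mapping described above can be sketched as follows. The angular boundaries, coordinate convention, and parameter names are illustrative assumptions, not details of this application:

```python
import math

def region_for_touch(cx, cy, tx, ty):
    """Return (direction_param, speed_param) for a touch at (tx, ty)
    on a movement control centered at (cx, cy).
    Assumes 0 deg = right and 90 deg = up (screen y grows upward here)."""
    angle = math.degrees(math.atan2(ty - cy, tx - cx)) % 360
    if 67.5 <= angle < 112.5:           # upper region: speed only
        return (None, "accelerate")
    if 247.5 <= angle < 292.5:          # lower region: speed only
        return (None, "decelerate")
    if 157.5 <= angle < 202.5:          # left region: direction only
        return ("turn_left", None)
    if angle < 22.5 or angle >= 337.5:  # right region: direction only
        return ("turn_right", None)
    # oblique regions combine a direction parameter with a speed parameter
    direction = "turn_left" if 90 < angle < 270 else "turn_right"
    speed = "accelerate" if angle < 180 else "decelerate"
    return (direction, speed)
```

A real implementation would read the region layout from the control's configuration rather than hard-coding these angles.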
Step S420, controlling the virtual object to move in the game scene based on the target movement control parameter, and displaying a target prompt identifier corresponding to the target movement control parameter in an area separated from the movement control in the graphical user interface.
The target prompt mark comprises a direction prompt mark corresponding to a target direction control parameter in the target movement control parameters and/or a speed prompt mark corresponding to a target speed control parameter.
In practical applications, the area separated from the movement control may be above or below the movement control, or at any other position separated from the movement control, such as to its left or right.
In an optional implementation manner, the target movement control parameter corresponds to a target prompt identifier, and the identifier may take various forms. The target prompt identifier may be a single static icon, such as a directional arrow, a rocket, or a boat anchor; for example, a rocket reminds the player of acceleration, and a boat anchor reminds the player of deceleration. It may also be a dynamic display prompt such as a progress bar or an indicator; for example, a gradient effect is displayed on the indicator to prompt the user about the change of the moving state of the virtual object, where the gradient may be a gradual change of color or of brightness. The target prompt identifier corresponding to the target movement control parameter may also combine static and dynamic elements. Different movement control parameters correspond to different target prompt identifiers one to one.
In the embodiment of the present application, while the virtual object is controlled to move in the game scene, the specific target movement control parameters in direction and/or speed corresponding to the target functional area of the touch operation can be reflected by displaying the corresponding target prompt identifiers, such as those for direction and speed. These identifiers accurately reflect the current movement of the virtual object, such as its specific direction and speed, and thus accurately and comprehensively prompt the player's current control over the movement direction and movement speed. This solves the technical problem that, under complex movement control, the player cannot fully grasp the current movement state of the virtual object.
The above steps are described in detail below.
In some embodiments, the direction prompt identifier may be displayed simultaneously with the speed prompt identifier. As an example, the direction prompt identifier is a bar-shaped identifier whose different positions indicate different moving directions, and the speed prompt identifier follows the movement of the touch point of the first touch operation along the direction prompt identifier.
For example, the oblique regions of the movement control may be functional areas corresponding to both direction control parameters and speed control parameters, where speed and direction can be controlled in combination. In that case the speed prompt identifier and the bar-shaped direction prompt identifier are displayed at the same time, such as the prompt identifiers 703, 704, 705, and 706 in fig. 7. Displaying the speed identifier at different positions of the bar-shaped identifier makes the operating state, such as speed and direction, clearer, which solves the technical problem of an unclear operating state.
Conversely, only the direction prompt identifier may be displayed while the speed prompt identifier is hidden, or only the speed prompt identifier may be displayed while the direction prompt identifier is hidden. As an example, the target functional area corresponding to the first touch operation corresponds only to the target speed control parameter; the step S420 may specifically include the following step:
Step S422, displaying a speed prompt identifier corresponding to the target speed control parameter in an area separated from the movement control in the graphical user interface, and controlling the bar-shaped identifier to be hidden.
In practical applications, the upper and lower regions of the movement control may include only the functional areas corresponding to the speed control parameters; when such a region is operated, only the speed of the virtual object can be controlled, and accordingly only the speed prompt identifier is displayed while the bar-shaped direction prompt identifier is hidden, as with the prompt identifiers 701 and 702 in fig. 7. Similarly, the left and right regions of the movement control may include only the functional areas corresponding to the direction control parameters; when such a region is operated, only the direction of the virtual object can be controlled, and accordingly only the bar-shaped direction prompt identifier is displayed while the speed prompt identifier is hidden.
When only the speed of the virtual object is operated, displaying only the corresponding speed prompt identifier and hiding the bar-shaped direction prompt identifier makes the displayed prompt more targeted, prevents prompt identifiers not being operated from being confused with the one being operated, and makes it easier for the player to observe the current operation.
In some embodiments, the game can also prompt the control degree of the movement control parameter, so that the prompt content is more comprehensive. As an example, different positions in the functional area correspond to different degrees of control of the movement control parameter; the method may further comprise the steps of:
step a), determining a target control degree of a target movement control parameter corresponding to a target position relation according to the target position relation between a touch point of a first touch operation and a target function area;
and b), displaying a degree prompt identifier corresponding to the target control degree in the graphical user interface.
In practical applications, the movement control in the game includes a plurality of functional control areas, such as acceleration, deceleration, turning, ascending, and descending. As shown in fig. 6, taking the movement control 601 as an example, the functional control area above the movement control represents an acceleration functional area, which can be divided into two sub-areas, namely a first acceleration functional area 602 and a second acceleration functional area 603. The first acceleration functional area 602 and the second acceleration functional area 603 correspond to different degrees of the acceleration function. For example, when the touch point of the player's click on the graphical user interface is located in the first acceleration functional area, the acceleration of the virtual object is 10 m/s²; when it is located in the second acceleration functional area, the acceleration of the virtual object is 5 m/s². When the player clicks on a different area, the prompt displayed in the graphical user interface also differs. For example, when the player clicks the first acceleration functional area 602, since the acceleration is large, the corresponding degree prompt identifier is the large prompt identifier 604; when the player clicks the second acceleration functional area 603, the acceleration is small, so the corresponding degree prompt identifier is the small prompt identifier 605.
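As a minimal sketch, the two sub-areas could be distinguished by the touch point's distance from the control center. The radii, and which sub-area carries which magnitude, are assumptions for illustration:

```python
INNER_RADIUS = 40   # boundary between the two sub-areas, assumed (px)
OUTER_RADIUS = 80   # outer edge of the movement control, assumed (px)

def acceleration_for_distance(dist):
    """Return the acceleration degree (m/s^2) for a touch at `dist`
    pixels from the control center."""
    if dist <= INNER_RADIUS:
        return 5.0    # second acceleration area: smaller control degree
    if dist <= OUTER_RADIUS:
        return 10.0   # first acceleration area: larger control degree
    # outside the functional area: clamp to the maximum control degree,
    # matching the behavior described for out-of-area touch points
    return 10.0
```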
Regarding the target position relationship between the touch point and the target functional area, it should be noted that when the touch point is located outside the functional area, the control degree or other logic is determined according to the position of the joystick within the functional area; for example, when the touch point is located outside the functional area, the control degree is determined to be the maximum control degree of the current control parameter.
By displaying the degree prompt identification corresponding to the target control degree on the user interface, the player can visually see the control degree of the player on the virtual object, the player can conveniently correct the operation and perform the next operation, and the game experience of the player is improved.
Based on the above step a), the control degree can be of a plurality of types, so that the control degree can be prompted more flexibly. As an example, the target control degree may include at least one of a control angle of the turning direction and a control magnitude of the acceleration.
In practical applications, as shown in fig. 7, when the player only accelerates or decelerates, the presented prompt is an arrow icon, where the prompt identifier 701 corresponds to the acceleration function and the prompt identifier 702 corresponds to the deceleration function. When the player only turns, the presented prompt is a combination of an arrow icon and a crescent icon, where the prompt identifier 703 corresponds to the left-turn function and the prompt identifier 704 corresponds to the right-turn function. When the player accelerates or decelerates while turning, the presented prompt is still the combination of the arrow icon and the crescent icon, with the arrow icon sliding to different positions within the crescent icon according to the turning angle controlled by the player: a prompt such as the prompt identifier 705 appears when the player accelerates while turning left, and a prompt such as the prompt identifier 706 appears when the player decelerates while turning left.
In the present embodiment, when only the steering control is performed, the virtual object performs steering in a constant speed state.
Through the styles of the different prompt identifiers, the player's current degree of control over the target virtual object can be prompted, which facilitates the player's next operation and improves the player's gaming experience.
Based on the above step a), the control angle of the turning direction can be controlled through the target effective angle corresponding to the target position relationship between the touch point and the target functional area. Illustratively, the target control degree includes a target control angle of the turning direction; the step a) may include the following steps:
Step c), determining the target effective angle corresponding to the target position relationship within the target functional area according to the target position relationship between the touch point of the first touch operation and the target functional area, where the effective angles correspond one-to-one to the control angles of the turning direction;
Step d), determining the target control angle of the turning direction corresponding to the target effective angle.
In practical applications, as shown in fig. 8, the function corresponding to the functional control area 801 is an accelerate-and-turn-right function. The area between the first area boundary 802 and the second area boundary 803 is the accelerate-turn-right functional control area, and the controllable steering angle ranges from 0 to 90 degrees to the right. However, because the pure acceleration functional control area and the pure right-turn functional control area also exist, the central angle corresponding to the accelerate-turn-right functional control area cannot itself reach 90 degrees. The effective angle range of the accelerate-turn-right functional control area therefore needs to be calculated, which is realized through the sequence frames corresponding to the direction indication; that is, the effective angle of the finger operation in the interaction area and the actually controlled indicated direction are in a one-to-one correspondence.
As another example, the area between the first area boundary 804 and the second area boundary 805 on the cue flag is in a corresponding relationship with the acceleration right turn function control area described above. The effective angle of the finger operation in the interactive area is in one-to-one correspondence with the display position of the target prompt mark.
The above description takes the accelerate-turn-right functional control area as an example; the same principle applies to the accelerate-turn-left, decelerate-turn-right, and decelerate-turn-left functional control areas.
By determining the target effective angle within the target functional area, the target control angle of the turning direction corresponding to the target effective angle is determined, and the target prompt identifier is then displayed at the corresponding display position. The player can thus visually observe the turning direction of the virtual object and control it accurately, which improves the player's gaming experience.
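The one-to-one correspondence between effective angles and steering angles can be sketched as a linear mapping from the functional area's angular span onto the 0–90° turning range; the area boundaries below are assumed values for illustration:

```python
AREA_START = 10.0   # first area boundary of the functional area, degrees (assumed)
AREA_END   = 80.0   # second area boundary, degrees (assumed)

def steering_angle(touch_angle):
    """Linearly map a touch angle inside [AREA_START, AREA_END] to a
    right-turn steering angle in [0, 90] degrees, clamping touches
    at or beyond the area borders."""
    t = (touch_angle - AREA_START) / (AREA_END - AREA_START)
    t = min(max(t, 0.0), 1.0)
    return 90.0 * t
```

The same mapping can drive the display position of the prompt identifier on the bar, since the effective angle also corresponds one-to-one to that position.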
In some embodiments, the step S420 may include the following steps:
Step S424, when the touch point of the first touch operation is located within the range corresponding to the movement control, controlling a preset fixed area separated from the movement control in the graphical user interface to display the target prompt identifier corresponding to the target movement control parameter;
Step S426, when the touch point of the first touch operation is located outside the range corresponding to the movement control, displaying the target prompt identifier corresponding to the target movement control parameter according to the position of the touch point.
It should be noted that the range corresponding to the movement control may include not only the area of the movement control itself, but also part of a designated area outside it. For example, as shown in fig. 9, the range corresponding to the movement control may include a first range 904 below the dividing line (including part of the area of the movement control itself), and the range outside it may be a second range beyond the first range 904, such as the preset interface area 902.
In practical applications, the touch point of the first touch operation is not necessarily located within the range corresponding to the movement control. Using different ways of determining the prompt identifier's display position for touch points inside and outside that range allows the display of the prompt identifier to adapt to different operation modes, improving operational flexibility.
Based on the above steps S424 and S426, the display position of the prompt identifier may be a fixed position in the graphical user interface; and, since the player's hand may block the prompt, the prompt identifier may also be displayed following the touch point at a distance from it, so that it is not blocked. As an example, the prompt identifier is displayed in a preset fixed area in the graphical user interface; after the step S420, the method may further include the following step:
and e), in response to the touch point moving to the preset interface area, hiding the prompt at the preset fixed area and displaying the prompt at the preset distance above the touch point.
In practical applications, as the touch point slides from the movement control to the outside of the movement control, the target prompt identifier changes from being displayed at a fixed position to moving along with the finger, so the displayed dynamic effect is reflected more intuitively.
For example, as shown in fig. 9, when the player clicks the left-turn functional control area, the prompt is displayed in a preset fixed area 901 in the graphical user interface, and the preset fixed area 901 is located in a preset interface area 902 of the graphical user interface. The preset fixed area is located above the movement control, and the preset distance 903 between them can be adjusted according to the player's usage and operation habits. For example, if the player's fingers are large and easily block the prompt, the preset distance may be set to a larger value, leaving enough distance between the preset fixed area and the movement control so that the player's fingers do not block the prompt; if the player's fingers are small and unlikely to block the prompt, the preset distance may be set to a smaller value, so that the preset fixed area fits closely against the movement control. As another example, the distance can be adjusted according to the player's operation habits: a player who prefers fingertip touch may select a smaller preset distance, while a player who prefers finger-pad touch may select a larger one.
When the touch point moves into the preset interface area, that is, when the player's finger moves into the preset interface area 902, the original prompt located at the preset fixed area 901 is hidden and a new prompt is displayed above the finger's touch point, so that the prompt is not blocked by the finger. As shown in fig. 10, when the player's finger moves into the preset interface area 902, a new prompt 1002 is generated above the touch point 1001, and the distance between the prompt and the touch point is the preset distance 1003.
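The two display positions described above can be sketched as follows; the coordinate convention (screen y growing downward) and the preset distance are assumptions:

```python
PRESET_DISTANCE = 60  # px between prompt and control / touch point, assumed

def prompt_position(touch, control_top, in_control_range):
    """Return the (x, y) of the prompt identifier's bottom center.
    `touch` is the (x, y) of the touch point; `control_top` is the
    (x, y) of the center of the movement control's upper edge."""
    if in_control_range:
        # fixed display in the preset fixed area above the movement control
        return (control_top[0], control_top[1] - PRESET_DISTANCE)
    # follow the finger, offset upward so the finger does not occlude it
    return (touch[0], touch[1] - PRESET_DISTANCE)
```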
By changing the displayed position of the prompt according to the position of the touch point, the player's fingers are prevented from blocking the prompt during operation, the prompt is always displayed in front of the player, and the player's gaming experience is improved.
Based on the above step S424, the prompt display position may be located above the movement control, in an area at a certain distance from it, so as to prevent the player's fingers from blocking the prompt during movement. As an example, the preset fixed area is located at a preset distance above the movement control.
In practical applications, as shown in fig. 9, the preset fixed area is located above the movement control, and the distance between them is the preset distance 903. Illustratively, the preset fixed area is located in the upper part of the graphical user interface, that is, the second area above the dashed-dotted line in fig. 9, such as the preset interface area 902. The preset fixed area may include part of the movement control or be located entirely above it; the specific area boundary is preset after system debugging, so that blocking by the player's finger can be avoided as much as possible.
Locating the preset fixed area at the preset distance above the movement control in the graphical user interface achieves the effect of preventing the player's fingers from blocking the prompt during movement.
Based on the above step S426, when the player's hand is located outside the range corresponding to the movement control, the prompt identifier may be displayed at a distance above the finger, so as to prevent the finger from blocking it. As an example, the step S426 may specifically include the following step:
Step S4260, when the touch point of the first touch operation is located outside the range corresponding to the movement control, displaying the target prompt identifier corresponding to the target movement control parameter at a preset distance above the position of the touch point.
In a possible implementation, the distance between the prompt identifier and the movement control when the touch point of the first touch operation is within the range corresponding to the movement control may be the same as the distance between the prompt identifier and the finger when the touch point is outside that range. For example, the first preset distance 1003 in fig. 10 is the distance from the touch point to the bottom center of the prompt identifier, that is, the distance between the touch point and the target prompt identifier, and the preset distance 903 in fig. 9 is the distance from the upper edge of the movement control to the bottom center of the prompt identifier in the preset fixed area, that is, the distance between the preset fixed area and the movement control. When the touch point is outside the range corresponding to the movement control, for example when the player's finger is in the second area above the movement control such as the preset interface area 902, the first preset distance 1003 shown in fig. 10 is equal to the preset distance 903 shown in fig. 9. In other words, the distance between the prompt identifier and the movement control when the player's finger is not above the movement control is the same as the distance between the prompt identifier and the finger when it is.
Displaying the prompt identifier at a distance above the touch point when the touch point of the first touch operation is outside the range corresponding to the movement control prevents the player's fingers from blocking it. Because that distance equals the distance between the prompt identifier and the movement control when the finger is within the control's range, the player perceives the same prompt identifier as merely following the finger's position; the displayed position thus feels reasonable and consistent, which is convenient for the player to observe.
In some embodiments, the interaction areas near the inner ring of the movement control are dense and may overlap, so the inner ring of the joystick can be hollowed out to reduce the possibility of false touches. As an example, a plurality of functional areas are arranged annularly on the movement control around its central position, and an invalid functional area is arranged at the central position of the movement control; the method may further include the following step:
and f), responding to the second touch operation aiming at the invalid functional area, and performing invalid processing on the second touch operation.
In practical applications, as shown in fig. 11, a plurality of functional areas are arranged annularly on the movement control around its central position, and a circular invalid functional area 1102 (shaded portion) is arranged at the central position; the invalid functional area can also be understood as an anti-false-touch correction area. When the position where the player clicks the graphical user interface is located in the invalid functional area, the click does not trigger any movement control parameter, and the system invalidates the touch operation. Without the invalid functional area, clicking the central area of the movement control could easily cause misoperation because the interaction areas there are dense, which would affect the gaming experience. The radius of the circular invalid functional area can be adjusted according to the actual situation and the player's specific operation habits.
The invalid functional region is not necessarily circular, but may be square, triangular, or the like, and the circle is merely more suitable for the operation habit of most players.
Arranging the invalid functional area at the central position of the movement control reduces the possibility of false touches and misoperation by the player, improving the player's gaming experience.
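A minimal sketch of the dead-zone check, with the radius as an assumed value:

```python
import math

INVALID_RADIUS = 20  # radius of the circular invalid functional area, assumed (px)

def handle_touch(cx, cy, tx, ty):
    """Return False when a touch at (tx, ty) falls inside the invalid
    functional area of a control centered at (cx, cy) and should be
    invalidated; True when it should be dispatched to a functional area."""
    dist = math.hypot(tx - cx, ty - cy)
    if dist <= INVALID_RADIUS:
        return False  # touch in the dead zone triggers no movement parameter
    return True
```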
Based on the above step f), the invalid function region may implement a plurality of functions other than the false touch prevention function, for example, the movement control of the virtual object may be cancelled by the invalid function region. As an example, the method may further comprise the steps of:
and g), in response to the touch point moving to the invalid function area from the position except the invalid function area in the mobile control, canceling the mobile control of the virtual object and hiding the prompt identifier.
In practical applications, as shown in fig. 11, when a player clicks the right acceleration turning function control area 1101 to perform an acceleration turning operation on a virtual object, if the player wants to cancel the operation, the player can move a finger from the right acceleration turning function control area 1101 to the invalid function area 1102, and the system performs an invalidation process on the touch operation to cancel the acceleration turning operation on the virtual object.
Compared with lifting the finger off the screen to cancel the operation, this method is simpler and more convenient: the player does not need to lift the finger and click again, and can cancel the current operation merely by sliding the finger. This saves operation steps and, to a certain extent, avoids the false touches that clicking again might cause.
In some embodiments, because operating devices differ in size (for example, a tablet computer has a larger screen than a mobile phone), the touch range of the movement control may be small on a smaller device, which is inconvenient for the player. The response range of the movement control can therefore be expanded to improve the player's gaming experience. As an example, a correction area is arranged within a preset range around the movement control; the method may further include the following steps:
step h), responding to a third touch operation aiming at the correction area, and determining at least one first movement control parameter corresponding to a first functional area corresponding to the third touch operation;
and i), controlling the virtual object to move in the game scene based on the first movement control parameter corresponding to the first functional area, and displaying a first prompt identifier corresponding to the first movement control parameter in an area separated from the movement control in the graphical user interface.
For the step i), the first prompt identifier includes a direction prompt identifier corresponding to the first direction control parameter in the first movement control parameter and/or a speed prompt identifier corresponding to the first speed control parameter.
In practical applications, as shown in fig. 12, a correction area 1201 (the area between the dotted line and the movement control) is preset around the movement control. It should be noted that the shape of the correction area is not particularly limited: it may be a regular pattern such as a square or a circle, or an irregular pattern. Its area may be larger or smaller: it may exceed the area occupied by the movement control or fit closely against the control's outer edge. Its position is likewise not limited: it may or may not be centered on the movement control. In short, the correction area can be set according to the player's actual situation and operation habits; this embodiment takes the square correction area 1201 centered on the movement control as an example.
For example, when the touch point clicked by the player is located outside the range of the movement control but within the range of the correction area, the movement control parameters of the movement control are still triggered. As shown in fig. 12, the player's touch point 1202 (shaded small circle) is located in the correction area 1201. The system detects that the relative position of touch point 1202 with respect to the movement control is the first relative position, that is, the position of touch point 1203 (white circle). The player's current touch operation therefore triggers the function corresponding to the functional area where the first relative position is located, that is, the left turn acceleration function, and a left turn acceleration prompt is displayed on the graphical user interface, its position determined according to the method described above.
The larger the radius of a circle, the longer the arc corresponding to the same central angle. By providing the correction area, the region in the graphical user interface available for the player's touch operations is enlarged, so the player has more operating space instead of being restricted to the movement control itself. The player's operations thus have a higher fault-tolerance rate, the possibility of misoperation is reduced, and the player's game experience is improved.
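The expanded response range can be sketched as a containment test: a touch that misses the control but lands in the surrounding correction area still triggers the control. The square-around-circle geometry and all sizes below are illustrative assumptions:

```python
import math

# Assumed geometry per fig. 12: a circular movement control centered
# inside a larger square correction area; sizes are illustrative.
CENTER = (100.0, 100.0)
CONTROL_RADIUS = 50.0
CORRECTION_HALF_SIDE = 80.0   # half the side of the square correction area

def in_movement_control(x, y):
    return math.hypot(x - CENTER[0], y - CENTER[1]) <= CONTROL_RADIUS

def in_correction_area(x, y):
    """Inside the square correction area but outside the control itself."""
    inside_square = (abs(x - CENTER[0]) <= CORRECTION_HALF_SIDE
                     and abs(y - CENTER[1]) <= CORRECTION_HALF_SIDE)
    return inside_square and not in_movement_control(x, y)

def touch_triggers_control(x, y):
    # A touch in either region triggers the movement control parameters.
    return in_movement_control(x, y) or in_correction_area(x, y)
```

With these assumed sizes, the triggerable region grows from a circle of radius 50 to a 160-by-160 square, which is the fault-tolerance gain described above.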
Based on steps h) and i), even if the touch point slides from within the range of the movement control into the correction area outside it, the corresponding movement control parameter continues to be triggered without interruption, which facilitates the player's control operations. As an example, after step S410, the method may further include the following steps:
step j), responding to the movement of the touch point of the first touch operation from the mobile control to the correction area, and determining the relative position of the touch point moving to the correction area relative to the mobile control and a corresponding second function area of the relative position in the mobile control;
and k), controlling the virtual object to move in the game scene based on the second movement control parameter corresponding to the second functional area, and displaying a second prompt identifier corresponding to the second movement control parameter in an area separated from the movement control in the graphical user interface.
For the step k), the second prompt identifier includes a direction prompt identifier corresponding to the second direction control parameter in the second mobile control parameter and/or a speed prompt identifier corresponding to the second speed control parameter.
In practical application, when the touch point of the first click operation falls within the range of the movement control and the player then moves the finger out of the movement control and into the correction area, the movement control parameters of the movement control continue to be triggered. As shown in fig. 12, the player moves the finger from the range of the movement control into the correction area; the finger is now at the position of touch point 1202 (the shaded small circle). The system detects that the relative position of touch point 1202 with respect to the movement control is the position of touch point 1203 (the white circle), so the player's current touch operation continues to trigger the function corresponding to the functional area where touch point 1203 is located, that is, the left turn acceleration function, and the control operation is not interrupted while the finger moves. A prompt identifier for left turn acceleration is then displayed on the graphical user interface, its position determined according to the method described above.
When the player's finger slides from the range of the movement control into the correction area, the corresponding movement control parameters continue to be triggered, and control over the virtual object is not interrupted by the finger's movement. This reduces the possibility of misoperation and improves the player's game experience.
Based on step h) or step j), the functional area corresponding to the touch point's position in the correction area is determined, and a reasonable control position is determined from the relative position of that functional area with respect to the center of the movement control. As an example, the second functional area corresponding to the relative position in the movement control is the functional area where the intersection point between a straight line and the boundary of the movement control is located; the straight line is the line connecting the touch point in the correction area and the central position of the movement control.
In practical applications, as shown in fig. 12, the player's finger clicks the correction area or moves into it, generating touch point 1202. The line connecting touch point 1202 and the center of the movement control is straight line 1204 (drawn as a dot-dash line in fig. 12 for clarity). The intersection of straight line 1204 with the boundary of the movement control is touch point 1203 at the relative position, and the movement control parameter corresponding to the functional area where touch point 1203 (white circle) is located is the parameter triggered by touch point 1202, that is, the left turn acceleration function.
The touch point, the center of the movement control, and the relative touch point (relative position) lie on one line, so the system can accurately determine the operation the player intends to perform.
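The touch-point-center-relative-point collinearity above amounts to projecting the correction-area touch onto the control boundary and reading off the functional area at that angle. The sketch below assumes a circular control and an illustrative quadrant layout of functional areas; the real area layout and all names are assumptions, not taken from the patent:

```python
import math

# Assumed circular movement control; screen y grows downward.
CENTER = (100.0, 100.0)
CONTROL_RADIUS = 50.0

def relative_position(touch):
    """Intersection of the touch-to-center line with the control boundary."""
    dx, dy = touch[0] - CENTER[0], touch[1] - CENTER[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return None  # exactly at the center: no defined direction
    scale = CONTROL_RADIUS / dist
    return (CENTER[0] + dx * scale, CENTER[1] + dy * scale)

def functional_area(touch):
    """Map the relative position's angle to an (assumed) annular area."""
    rel = relative_position(touch)
    if rel is None:
        return "invalid"
    # Angle measured with up = +90 degrees in screen coordinates.
    angle = math.degrees(math.atan2(CENTER[1] - rel[1], rel[0] - CENTER[0]))
    # Illustrative quadrant split; the real layout is game-specific.
    if 45 <= angle < 135:
        return "accelerate"
    if -135 <= angle < -45:
        return "decelerate"
    if angle >= 135 or angle < -135:
        return "turn_left_accelerate"
    return "turn_right_accelerate"
```

A touch far to the left of the control projects onto the control's left boundary and triggers the same (here assumed) left-turn-acceleration area that a direct click on that boundary would.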
Based on the step h) or the step j), the functional area corresponding to the touch point of the correction area can be highlighted to prompt the player of the currently selected functional area. As an example, the method may further comprise the steps of:
and step l), responding to a third touch operation aiming at the correction area, and highlighting at least one of the intersection point position, the straight line and the functional area where the intersection point position is located.
In practical applications, highlighting includes, but is not limited to, any one or more of:
static display, dynamic display, highlighted display, and magnified display.
For example, a touch point 1203 (white circle) as shown in fig. 12 may be generated, and its dynamic display state may include, but is not limited to, effects such as swinging, vibrating, flickering, and enlarging, which prompt the player more efficiently as to which movement control parameter is currently triggered.
Based on steps h) and i), when the player's operation moves into the area outside both the correction area and the movement control, movement of the virtual object is stopped, allowing the player to end the movement control. As an example, the method may further comprise the following steps:
and m), in response to the movement of the touch point of the first touch operation or the third touch operation to a position except the correction area and the mobile control in the graphical user interface, stopping the movement control of the virtual object and hiding the prompt identifier.
In actual practice, as shown in fig. 12, when the player clicks region 1205 (the area excluding the correction area and the movement control) or moves a finger from the correction area or the movement control into region 1205, the system invalidates the player's operation, that is, it stops the player's control of the virtual object and hides the prompt identifier.
It should be noted that when the player clicks an area other than the correction area and the movement control, moves a finger from the correction area or the movement control into such an area, or lifts the finger, so that control of the virtual object is stopped, the system does not continue to maintain the movement control parameter corresponding to the touch point's last position; instead, it keeps the virtual object moving forward at a constant speed.
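The stop-and-revert behavior just described can be sketched as a single decision function. The state shape, parameter names, and the constant forward speed value are illustrative assumptions:

```python
# Sketch of the stop/reset rule: leaving both regions (or lifting the
# finger) drops the last touch parameters and reverts to constant
# forward motion. All names and values here are assumptions.
BASE_FORWARD_SPEED = 1.0  # assumed constant cruising speed

def update_movement(region, lifted, current_params):
    """Decide the movement parameters after a touch event.

    region  -- "control", "correction", or "outside" (hit-test result)
    lifted  -- True when the finger has left the screen
    """
    if lifted or region == "outside":
        # Control stops: the last touch position's parameters are NOT
        # kept; the virtual object reverts to constant forward motion
        # and the prompt identifier is hidden.
        return {"direction": "forward", "speed": BASE_FORWARD_SPEED,
                "prompt_visible": False}
    # Inside the control or the correction area: keep triggering.
    return {**current_params, "prompt_visible": True}
```

Dragging from the control into the correction area therefore preserves the active parameters, while dragging past the correction boundary (or lifting) resets them in one step.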
By setting a boundary for the controllable area, the movement control and the correction area are prevented from occupying too much of the screen, affecting the display of the game picture, or encroaching on the usable range of other functional controls.
FIG. 13 provides a schematic structural diagram of a movement control device in a game. The device can be applied to a terminal capable of running a game program, a graphical user interface is provided through the terminal, a game scene of the game comprises a virtual object, the graphical user interface comprises a movement control for the virtual object, the movement control comprises a plurality of functional areas, each functional area corresponds to at least one movement control parameter, and the movement control parameters comprise direction control parameters and/or speed control parameters. As shown in fig. 13, the in-game movement control device includes:
a determining module 1301, configured to determine, in response to a first touch operation for a mobile control, at least one target movement control parameter corresponding to a target function area corresponding to the first touch operation;
a control module 1302, configured to control a virtual object to move in a game scene based on a target movement control parameter, and display a target prompt identifier corresponding to the target movement control parameter in an area of the graphical user interface that is separate from a movement control; the target prompt mark comprises a direction prompt mark corresponding to a target direction control parameter and/or a speed prompt mark corresponding to a target speed control parameter in the target movement control parameters.
In some embodiments, the direction indicator is a bar indicator, different positions of the bar indicator are used for indicating different moving directions, and the speed indicator follows the movement of the touch point of the first touch operation on the direction indicator.
In some embodiments, the target function area corresponding to the first touch operation only corresponds to the target speed control parameter; the control module is specifically configured to:
and displaying a speed prompt identifier corresponding to the target speed control parameter in an area separated from the mobile control in the graphical user interface, and controlling the hidden strip-shaped identifier.
In some embodiments, different locations in the functional area correspond to different degrees of control of the movement control parameter; the device also includes:
the second determining module is used for determining the target control degree of the target movement control parameter corresponding to the target position relation according to the target position relation between the touch point of the first touch operation and the target function area;
and the first display module is used for displaying a degree prompt identifier corresponding to the target control degree in the graphical user interface.
In some embodiments, the degree of control includes: at least one of a control angle of the rotation direction and a control magnitude of the acceleration.
In some embodiments, the target degree of control includes a target control angle of the turning direction; the second determining module is specifically configured to:
determining a target effective angle corresponding to the target position relation in the target function area according to the target position relation between the touch point of the first touch operation and the target function area; the effective angle corresponds to the control angle of the rotation direction one by one;
and determining a target control angle of the rotating direction corresponding to the target effective angle.
In some embodiments, the control module is specifically configured to:
when the touch point of the first touch operation is located in the range corresponding to the mobile control, controlling a preset fixed area separated from the mobile control in a graphical user interface to display a target prompt identifier corresponding to a target mobile control parameter;
and when the touch point of the first touch operation is positioned outside the range corresponding to the mobile control, displaying a target prompt identifier corresponding to the target mobile control parameter according to the position of the touch point.
In some embodiments, the preset fixed area is located a preset distance above the movement control.
In some embodiments, the control module is specifically configured to:
and when the touch point of the first touch operation is located outside the range corresponding to the mobile control, displaying a target prompt identifier corresponding to the target mobile control parameter at a preset distance above the position of the touch point.
In some embodiments, the plurality of functional regions are arranged annularly on the mobility control around a central location of the mobility control; an invalid function area is arranged at the central position of the mobile control; the device also includes:
and the invalidation processing module is used for responding to the second touch operation aiming at the invalid functional area and carrying out invalidation processing on the second touch operation.
In some embodiments, the apparatus further comprises:
and the second control module is used for responding to the touch point moving to the invalid function area from the position except the invalid function area in the moving control, canceling the moving control of the virtual object and hiding the prompt identification.
In some embodiments, a correction area is arranged in a preset range around the mobile control; the device also includes:
the third determining module is used for responding to a third touch operation aiming at the correction area, and determining at least one first movement control parameter corresponding to the first functional area corresponding to the third touch operation;
the third control module controls the virtual object to move in the game scene based on the first movement control parameter, and displays a first prompt identifier corresponding to the first movement control parameter in an area separated from the movement control in the graphical user interface; the first prompt mark comprises a direction prompt mark corresponding to a first direction control parameter in the first movement control parameter and/or a speed prompt mark corresponding to a first speed control parameter.
In some embodiments, the apparatus further comprises:
the fourth determining module is used for responding to the movement of the touch point of the first touch operation from the mobile control to the correction area, determining the relative position of the touch point moved to the correction area relative to the mobile control, and determining the corresponding second function area of the relative position in the mobile control;
the fourth control module is used for controlling the virtual object to move in the game scene based on a second movement control parameter corresponding to the second functional area, and displaying a second prompt identifier corresponding to the second movement control parameter in an area separated from the movement control in the graphical user interface; the second prompt mark comprises a direction prompt mark corresponding to a second direction control parameter in the second mobile control parameters and/or a speed prompt mark corresponding to a second speed control parameter.
In some embodiments, the second functional region whose relative position corresponds in the mobile control is a functional region where an intersection position between the straight line and the boundary of the mobile control is located; and the straight line is a connecting line between the touch point in the correction area and the central position of the mobile control.
In some embodiments, the apparatus further comprises:
and the second display module is used for responding to a third touch operation aiming at the correction area and highlighting at least one of the intersection point position, the straight line and the functional area where the intersection point position is located.
In some embodiments, the apparatus further comprises:
and the fifth control module is used for responding to the movement of the touch point of the first touch operation or the third touch operation to the position except the correction area and the movement control in the graphical user interface, stopping the movement control of the virtual object and hiding the prompt identifier.
The movement control device in the game provided by the embodiment of the application has the same technical characteristics as the movement control method in the game provided by the embodiment, so that the same technical problems can be solved, and the same technical effects can be achieved.
Corresponding to the movement control method in the game, an embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium stores computer executable instructions, and when the computer executable instructions are called and executed by a processor, the computer executable instructions cause the processor to execute the steps of the movement control method in the game.
The movement control device in the game provided by the embodiments of the present application may be specific hardware on the device, or software or firmware installed on the device. The device provided by the embodiments of the present application has the same implementation principle and technical effects as the foregoing method embodiments; for brevity, where the device embodiments do not mention a detail, reference may be made to the corresponding content in the foregoing method embodiments. It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the systems, apparatuses and units described above may refer to the corresponding processes in the foregoing method embodiments and are not described here again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
For another example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments provided in the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method for controlling movement in a game according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus once an item is defined in one figure, it need not be further defined and explained in subsequent figures, and moreover, the terms "first", "second", "third", etc. are used merely to distinguish one description from another and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the above-mentioned embodiments are only specific embodiments of the present application, used to illustrate the technical solutions of the present application rather than to limit them, and the protection scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art can, within the technical scope disclosed in the present application, modify or readily conceive of changes to the technical solutions described in the foregoing embodiments, or make equivalent substitutions for some of their technical features; such modifications, changes or substitutions do not depart from the scope of the embodiments of the present application and are intended to be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (19)

1. A movement control method in a game is characterized in that a terminal provides a graphical user interface, a game scene of the game comprises a virtual object, the graphical user interface comprises a movement control aiming at the virtual object, the movement control comprises a plurality of functional areas, each functional area corresponds to at least one movement control parameter, and the movement control parameters comprise direction control parameters and/or speed control parameters; the method comprises the following steps:
responding to a first touch operation aiming at the mobile control, and determining at least one target mobile control parameter corresponding to a target function area corresponding to the first touch operation;
controlling the virtual object to move in the game scene based on the target movement control parameter, and displaying a target prompt identifier corresponding to the target movement control parameter in an area separated from the movement control in the graphical user interface; the target prompt mark comprises a direction prompt mark corresponding to a target direction control parameter and/or a speed prompt mark corresponding to a target speed control parameter in the target movement control parameters.
2. The method according to claim 1, wherein the direction indicator is a bar-shaped indicator, different positions of the bar-shaped indicator are used for indicating different moving directions, and the speed indicator follows movement of the touch point of the first touch operation on the direction indicator.
3. The method according to claim 2, wherein the target functional area corresponding to the first touch operation corresponds to the target speed control parameter only;
the step of displaying the target prompt identifier corresponding to the target movement control parameter in the area separated from the movement control in the graphical user interface includes:
and displaying the speed prompt identification corresponding to the target speed control parameter in an area separated from the mobile control in the graphical user interface, and controlling to hide the strip-shaped identification.
4. The method of claim 1, wherein different locations in the functional area correspond to different degrees of control of the movement control parameter; the method further comprises the following steps:
determining a target control degree of a target movement control parameter corresponding to the target position relation according to the target position relation between the touch point of the first touch operation and the target function area;
and displaying a degree prompt identifier corresponding to the target control degree in the graphical user interface.
5. The method of claim 4, wherein the degree of control comprises: at least one of a control angle of the rotation direction and a control magnitude of the acceleration.
6. The method of claim 4, wherein the target degree of control comprises a target control angle of a turning direction;
the step of determining the target control degree of the target movement control parameter corresponding to the target position relationship according to the target position relationship between the touch point of the first touch operation and the target function area includes:
determining a target effective angle corresponding to the target position relation in the target function area according to the target position relation between the touch point of the first touch operation and the target function area; the effective angle corresponds to the control angle of the rotating direction one by one;
and determining a target control angle of the rotating direction corresponding to the target effective angle.
7. The method of claim 1, wherein the step of displaying the target prompt identifier corresponding to the target movement control parameter in an area of the graphical user interface separate from the movement control comprises:
when the touch point of the first touch operation is located in the range corresponding to the mobile control, controlling a preset fixed area separated from the mobile control in the graphical user interface to display a target prompt identifier corresponding to the target mobile control parameter;
and when the touch point of the first touch operation is positioned outside the range corresponding to the mobile control, displaying a target prompt identifier corresponding to the target mobile control parameter according to the position of the touch point.
8. The method of claim 7, wherein the preset fixed area is located a preset distance above the mobile control.
9. The method according to claim 7, wherein the step of displaying the target prompt identifier corresponding to the target movement control parameter according to the position of the touch point when the touch point of the first touch operation is located outside the range corresponding to the movement control includes:
and when the touch point of the first touch operation is located outside the range corresponding to the mobile control, displaying a target prompt identifier corresponding to the target mobile control parameter at a preset distance above the position of the touch point.
10. The method of claim 1, wherein a plurality of the functional regions are arranged annularly around a central location of the mobile control on the mobile control; an invalid function area is arranged at the central position of the mobile control; the method further comprises the following steps:
responding to a second touch operation aiming at the invalid functional area, and carrying out invalid processing on the second touch operation.
11. The method of claim 10, further comprising:
and in response to the touch point moving to the invalid functional area from a position, except the invalid functional area, in the mobile control, canceling the movement control of the virtual object and hiding the prompt identifier.
12. The method according to claim 1, wherein a correction area is arranged in a preset range around the mobile control; the method further comprises the following steps:
responding to a third touch operation aiming at the correction area, and determining at least one first movement control parameter corresponding to a first functional area corresponding to the third touch operation;
controlling the virtual object to move in the game scene based on the first movement control parameter, and displaying a first prompt identifier corresponding to the first movement control parameter in an area, separated from the movement control, of the graphical user interface; the first prompt identifier includes a direction prompt identifier corresponding to a first direction control parameter in the first movement control parameter and/or a speed prompt identifier corresponding to a first speed control parameter.
13. The method of claim 12, wherein after the step of determining, in response to a first touch operation directed at the movement control, at least one target movement control parameter corresponding to a target functional area corresponding to the first touch operation, the method further comprises:
in response to the touch point of the first touch operation moving from the movement control into the correction area, determining the position of the touch point in the correction area relative to the movement control, and a second functional area in the movement control corresponding to that relative position;
controlling the virtual object to move in the game scene based on a second movement control parameter corresponding to the second functional area, and displaying a second prompt identifier corresponding to the second movement control parameter in an area of the graphical user interface separate from the movement control, wherein the second prompt identifier comprises a direction prompt identifier corresponding to a second direction control parameter in the second movement control parameters and/or a speed prompt identifier corresponding to a second speed control parameter.
14. The method of claim 13, wherein the second functional area corresponding to the relative position in the movement control is the functional area in which the intersection between a straight line and the boundary of the movement control is located, the straight line being the line connecting the touch point in the correction area and the central position of the movement control.
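Claim 14 resolves the second functional area via the intersection of the movement control's boundary with the line through the touch point and the control's center. Assuming a circular movement control (a hypothetical geometry the claim does not mandate), that intersection is simply the touch-point direction scaled to the boundary radius; the helper name is an assumption:

```python
import math

def boundary_intersection(touch, center, radius):
    """Project a touch point in the correction area onto the boundary of a
    circular movement control: the intersection of the control's edge with
    the line through the touch point and the control's center."""
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return None  # touch at the exact center: direction is undefined
    scale = radius / dist
    return (center[0] + dx * scale, center[1] + dy * scale)
```

The returned boundary point could then be fed to the same sector lookup used inside the control, so dragging into the correction area keeps selecting the functional area the finger points back toward.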
15. The method of claim 14, further comprising:
in response to a third touch operation directed at the correction area, highlighting at least one of the intersection position, the straight line, and the functional area in which the intersection position is located.
16. The method of claim 12, further comprising:
in response to the touch point of the first touch operation or the third touch operation moving to a position in the graphical user interface outside both the correction area and the movement control, stopping movement control of the virtual object and hiding the prompt identifier.
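Claims 12 and 16 together partition the interface into three regions: the movement control itself, the surrounding correction area, and everything else (where movement stops and prompts are hidden). Assuming concentric circular regions, purely as an illustrative choice, the classification might read:

```python
import math

def classify_touch(touch, center, control_radius, correction_radius):
    """Classify a touch point relative to the movement control: inside the
    control, in the surrounding correction ring, or outside both (in which
    case movement control stops and the prompt identifier is hidden)."""
    dist = math.hypot(touch[0] - center[0], touch[1] - center[1])
    if dist <= control_radius:
        return "control"
    if dist <= correction_radius:
        return "correction"
    return "outside"  # stop movement control, hide the prompt identifier
```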
17. A movement control apparatus in a game, wherein a terminal provides a graphical user interface, a game scene of the game comprises a virtual object, the graphical user interface comprises a movement control for the virtual object, the movement control comprises a plurality of functional areas, each functional area corresponds to at least one movement control parameter, and the movement control parameters comprise direction control parameters and/or speed control parameters; the apparatus comprises:
a determining module, configured to determine, in response to a first touch operation directed at the movement control, at least one target movement control parameter corresponding to a target functional area corresponding to the first touch operation; and
a control module, configured to control the virtual object to move in the game scene based on the target movement control parameter, and to display a target prompt identifier corresponding to the target movement control parameter in an area of the graphical user interface separate from the movement control, wherein the target prompt identifier comprises a direction prompt identifier corresponding to a target direction control parameter in the target movement control parameters and/or a speed prompt identifier corresponding to a target speed control parameter.
18. An electronic terminal comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 16.
19. A computer-readable storage medium having stored thereon computer-executable instructions which, when invoked and executed by a processor, cause the processor to perform the method of any one of claims 1 to 16.
CN202111081229.9A 2021-09-15 2021-09-15 Game movement control method and device and electronic equipment Pending CN113769404A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111081229.9A CN113769404A (en) 2021-09-15 2021-09-15 Game movement control method and device and electronic equipment


Publications (1)

Publication Number Publication Date
CN113769404A 2021-12-10

Family

ID=78844233

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111081229.9A Pending CN113769404A (en) 2021-09-15 2021-09-15 Game movement control method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN113769404A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107930119A (en) * 2017-11-21 2018-04-20 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN110548286A (en) * 2019-09-29 2019-12-10 网易(杭州)网络有限公司 Method and device for locking virtual object in game and electronic equipment
CN111803946A (en) * 2020-07-22 2020-10-23 网易(杭州)网络有限公司 Lens switching method and device in game and electronic equipment
CN112933591A (en) * 2021-03-15 2021-06-11 网易(杭州)网络有限公司 Method and device for controlling game virtual character, storage medium and electronic equipment
CN113244610A (en) * 2021-06-02 2021-08-13 网易(杭州)网络有限公司 Method, device, equipment and storage medium for controlling virtual moving object in game
CN113244603A (en) * 2021-05-13 2021-08-13 网易(杭州)网络有限公司 Information processing method and device and terminal equipment


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114860148A (en) * 2022-04-19 2022-08-05 北京字跳网络技术有限公司 Interaction method, interaction device, computer equipment and storage medium
CN114860148B (en) * 2022-04-19 2024-01-16 北京字跳网络技术有限公司 Interaction method, device, computer equipment and storage medium

Similar Documents

Publication Publication Date Title
US10716995B2 (en) Information processing method and apparatus, storage medium, and electronic device
US10507383B2 (en) Information processing method and apparatus, electronic device, and storage medium
US10500483B2 (en) Information processing method and apparatus, storage medium, and electronic device
CN111111190B (en) Interaction method and device for virtual characters in game and touch terminal
US10583355B2 (en) Information processing method and apparatus, electronic device, and storage medium
US10716997B2 (en) Information processing method and apparatus, electronic device, and storage medium
CN110955370B (en) Switching method and device of skill control in game and touch terminal
US10398977B2 (en) Information processing method, terminal, and computer storage medium
US10639549B2 (en) Information processing method, terminal, and computer storage medium
US11400375B2 (en) Object display method and apparatus, storage medium, and electronic device
US20190070493A1 (en) Information Processing Method and Apparatus, Electronic Device, and Storage Medium
US11198058B2 (en) Storage medium storing game program, information processing apparatus, information processing system, and game processing method
JP6921193B2 (en) Game programs, information processing devices, information processing systems, and game processing methods
CN110448904B (en) Game view angle control method and device, storage medium and electronic device
CN112346636A (en) In-game information processing method and device and terminal
CN113908534A (en) Game skill control method and device and electronic terminal
CN113769404A (en) Game movement control method and device and electronic equipment
CN113457157A (en) Method and device for switching virtual props in game and touch terminal
CN115089959A (en) Direction prompting method and device in game and electronic terminal
CN111905371A (en) Method and device for controlling target virtual character in game
CN113926186A (en) Method and device for selecting virtual object in game and touch terminal
CN113713386A (en) Information prompting method and device in game and touch terminal
CN113952728A (en) Method and device for controlling virtual character in game and electronic equipment
CN113663326B (en) Aiming method and device for game skills
CN115708953A (en) Control method and device for control in game and touch terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination