CN115155052A - Method and device for controlling cursor through handle, electronic equipment and storage medium - Google Patents

Method and device for controlling cursor through handle, electronic equipment and storage medium

Info

Publication number
CN115155052A
Authority
CN
China
Prior art keywords
cursor
target interface
instruction
controlling
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210714041.1A
Other languages
Chinese (zh)
Inventor
刘兴
刘勇成
胡志鹏
卢小军
刘星
程龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202210714041.1A
Publication of CN115155052A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/23: Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/24: Constructional details thereof, e.g. game controllers with detachable joystick handles
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Abstract

The application provides a method, an apparatus, an electronic device and a storage medium for controlling a cursor through a handle, comprising the following steps: displaying a target interface through the terminal device; receiving a first control instruction for the cursor through a key of the handle, and controlling the cursor to move on the target interface in a first movement mode; receiving a second control instruction for the cursor through a joystick of the handle, controlling the cursor to switch from the first movement mode to a second movement mode on the target interface, and controlling the distance the cursor moves on the target interface according to the second control instruction; and receiving a third control instruction for the cursor through a key of the handle, and controlling the cursor to switch from the second movement mode back to the first movement mode on the target interface. The application thus supports both the joystick and key control modes and provides convenient switching between the two, avoiding user confusion.

Description

Method and device for controlling cursor through handle, electronic equipment and storage medium
Technical Field
The present application relates to the field of game technologies, and in particular, to a method and an apparatus for controlling a cursor through a handle, an electronic device, and a storage medium.
Background
In recent years, game console platforms have usually adopted the handle (gamepad) as the input device to satisfy users' requirements for operation sensitivity and hand feel. In addition, when playing games on a handheld game terminal or a computer, a user may be limited by screen size or by the flexibility of keyboard-and-mouse control and will sometimes choose a handle as the input device.
When a handle is used as the input device, the following problem exists: the cross navigation key does not work while the joystick is being used to move the cursor on the UI interface. When many small UI icons or buttons are displayed on the UI interface, a user who controls the cursor with the joystick cannot quickly and accurately land on a particular icon or button, while a user who controls the cursor with the cross navigation key has to repeat the same operation many times (for example, pressing Dpad-Down repeatedly), which makes the operation cumbersome.
Disclosure of Invention
In view of this, embodiments of the present disclosure provide at least a method, an apparatus, an electronic device and a storage medium for controlling a cursor through a handle, which support two control modes, joystick and key, allow convenient switching between them, and are not likely to confuse the user.
In a first aspect, an exemplary embodiment of the present application provides a method for controlling a cursor through a handle, the handle including a key and a joystick, the method comprising: displaying a target interface through a terminal device; receiving a first control instruction for the cursor through the key of the handle, and controlling the cursor to move on the target interface in a first movement mode, wherein the first movement mode comprises: controlling the cursor to move on the target interface from the control it currently indicates to the next control in the direction indicated by the first control instruction; receiving a second control instruction for the cursor through the joystick of the handle, controlling the cursor to switch from the first movement mode to a second movement mode on the target interface, and controlling the distance the cursor moves on the target interface according to the second control instruction, wherein the second movement mode comprises: determining the distance the cursor moves on the target interface along the direction indicated by the second control instruction according to the movement sensitivity and the duration indicated by the second control instruction; and receiving a third control instruction for the cursor through the key of the handle, and controlling the cursor to switch from the second movement mode back to the first movement mode on the target interface.
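For illustration only, and not as the claimed implementation, the flow of the first aspect can be sketched roughly as follows; the class name, the focus_next_control and move_cursor helpers, and the pressure-based sensitivity are assumptions introduced here.

```python
from enum import Enum, auto

class MoveMode(Enum):
    KEY = auto()    # first movement mode: jump between controls
    STICK = auto()  # second movement mode: free, mouse-like movement

class CursorController:
    """Illustrative dispatcher for the two movement modes (a sketch, not the claimed method)."""

    def __init__(self, interface, sensitivity=1.0):
        self.interface = interface      # hypothetical target interface object holding the controls
        self.sensitivity = sensitivity  # movement sensitivity used in the second movement mode
        self.mode = MoveMode.KEY        # here the key is assumed to be the initial control module

    def on_key_instruction(self, direction):
        # A key press while the joystick is active acts as the third control instruction
        # and switches the cursor back to the first movement mode.
        if self.mode is MoveMode.STICK:
            self.mode = MoveMode.KEY
        # First movement mode: jump to the next control in the indicated direction.
        self.interface.focus_next_control(direction)

    def on_stick_instruction(self, direction, pressure, duration):
        # A joystick toggle while the key is active acts as the second control instruction
        # and switches the cursor to the second movement mode.
        if self.mode is MoveMode.KEY:
            self.mode = MoveMode.STICK
        # Second movement mode: the moving distance follows from sensitivity and duration.
        distance = self.sensitivity * pressure * duration
        self.interface.move_cursor(direction, distance)
```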
In a possible embodiment, the method may further include: when responding to a preset input instruction, inhibiting the response to a target input instruction, wherein the preset input instruction may be one of the first control instruction and the second control instruction, and the target input instruction may be the other of the two.
In one possible embodiment, the step of inhibiting the response to the target input instruction may include: receiving the target input instruction but not responding to it.
In a possible embodiment, the second control instruction may include a first switching sub-instruction and a first manipulating sub-instruction, where the first switching sub-instruction may be used to control the cursor to switch from the first movement mode to the second movement mode at the target interface, and the first manipulating sub-instruction may be used to control the cursor to move in the second movement mode at the target interface.
In a possible embodiment, the third control instruction may include a second switching sub-instruction and a second manipulating sub-instruction, where the second switching sub-instruction may be used to control the cursor to switch from the second movement manner to the first movement manner at the target interface, and the second manipulating sub-instruction may be used to control the cursor to move in the first movement manner at the target interface.
In one possible implementation, the target interface may be an interface displayed on the terminal device for presenting a plurality of controls in a static form.
In one possible implementation, the target interface may include an in-game configuration screen provided by the terminal device.
In a possible embodiment, the step of displaying, by the terminal device, the target interface may include: displaying a game scene picture corresponding to a virtual character in a game scene in a graphical user interface provided by the terminal equipment; in response to a manipulation to the virtual character, controlling the virtual character to perform an action corresponding to the manipulation in the virtual scene; and responding to a preset trigger event, and controlling the picture displayed in the graphical user interface to be switched from the game scene picture to the target interface.
In one possible embodiment, the target interface may include a plurality of controls, and the initial control module under the target interface may be determined as follows: determining the initial control module under the target interface, chosen from the key and the joystick, according to the display attributes of the plurality of controls on the target interface.
In one possible embodiment, the initial control module under the target interface may be determined as follows: determining, according to the display attributes, whether the plurality of controls meet a preset display condition, wherein the preset display condition represents the display complexity of the plurality of controls on the target interface; if the preset display condition is met, selecting the joystick as the initial control module under the target interface; and if it is not met, selecting the key as the initial control module under the target interface.
In one possible embodiment, the display attributes may include the display layout and/or the display number of the plurality of controls on the target interface, and the preset display condition may include at least one of the following: the number of controls displayed in any preset display area is not less than a first set value, the target interface being divided into a plurality of preset display areas; or the total number of controls shown on the target interface is not less than a second set value.
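A minimal sketch of this selection logic follows, for illustration only; the concrete threshold values and the region.contains helper are assumptions, since the application does not fix the first and second set values.

```python
def choose_initial_control_module(controls, regions, first_set_value=4, second_set_value=20):
    """Illustrative choice of the initial control module from the key and the joystick.

    `controls` is a list of control positions, `regions` the preset display areas the
    target interface is divided into; the two thresholds stand in for the first and
    second set values, whose concrete values the application leaves open.
    """
    # Preset display condition 1: some preset display area contains at least
    # `first_set_value` controls.
    for region in regions:
        if sum(1 for c in controls if region.contains(c)) >= first_set_value:
            return "joystick"
    # Preset display condition 2: the total number of controls on the target
    # interface is at least `second_set_value`.
    if len(controls) >= second_set_value:
        return "joystick"
    # Otherwise the layout is simple enough for key-based navigation.
    return "key"
```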
In a second aspect, embodiments of the present application further provide an apparatus for controlling a cursor through a handle, where the handle includes a key and a joystick, and the apparatus includes: a display control module, which displays a target interface through the terminal device; a cursor control module, which receives a first control instruction for the cursor through the key of the handle and controls the cursor to move on the target interface in a first movement mode, wherein the first movement mode comprises: controlling the cursor to move on the target interface from the control it currently indicates to the next control in the direction indicated by the first control instruction; a first switching module, which receives a second control instruction for the cursor through the joystick of the handle, controls the cursor to switch from the first movement mode to a second movement mode on the target interface, and controls the distance the cursor moves on the target interface according to the second control instruction, wherein the second movement mode comprises: determining the distance the cursor moves on the target interface along the direction indicated by the second control instruction according to the movement sensitivity and the duration indicated by the second control instruction; and a second switching module, which receives a third control instruction for the cursor through the key of the handle and controls the cursor to switch from the second movement mode back to the first movement mode on the target interface.
In a third aspect, an embodiment of the present application further provides an electronic device, including: a processor, a memory and a bus, wherein the memory stores machine-readable instructions executable by the processor, the processor and the memory communicate via the bus when the electronic device is running, and the machine-readable instructions are executed by the processor to perform the steps of the method for controlling a cursor by a handle in the first aspect or any one of the possible embodiments of the first aspect.
In a fourth aspect, this embodiment further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, performs the steps of the method for controlling a cursor through a handle in the first aspect or any one of the possible implementation manners of the first aspect.
The method, apparatus, electronic device and storage medium for controlling a cursor through a handle provided above support both joystick and key control modes, allow convenient switching between them, and avoid user confusion.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
FIG. 1 illustrates a flow chart of a method for controlling a cursor via a handle provided by an exemplary embodiment of the present application;
FIG. 2 is a flowchart illustrating steps provided by an exemplary embodiment of the present application to display a target interface;
FIG. 3 illustrates a schematic view of a handle provided by an exemplary embodiment of the present application;
FIG. 4 is a schematic diagram showing the structure of a device for controlling a cursor through a handle according to an exemplary embodiment of the present application;
fig. 5 shows a schematic structural diagram of an electronic device provided in an exemplary embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not intended to limit the scope of the present application. Further, it should be understood that the schematic drawings are not drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and that steps without logical context may be reversed in order or performed concurrently. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, the flowchart.
The terms "a", "an", "the" and "said" are used in this specification to denote the presence of one or more elements/components/parts/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first" and "second", etc. are used merely as labels, and are not limiting on the number of their objects.
It should be understood that in the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "Comprising A, B and/or C" means comprising any one, any two, or all three of A, B and C.
It should be understood that in the embodiments of the present application, "B corresponding to A", "A corresponds to B", or "B corresponds to A" means that B is associated with A and that B can be determined from A. Determining B from A does not mean determining B from A alone; B may also be determined from A and/or other information.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be obtained by a person skilled in the art without making any inventive step based on the embodiments of the present application, fall within the scope of protection of the present application.
In recent years, cloud games have taken an increasingly large share of the market, and dedicated game console platforms usually adopt the handle as the input device to satisfy users' requirements for operation sensitivity and hand feel. In addition, when playing games on a handheld game terminal or a computer, a user may be limited by screen size or by the flexibility of keyboard-and-mouse control and will sometimes choose a handle as the input device. However, when a handle is used as the input device, the following problem exists: the cross navigation key does not work while the handle's joystick is being used to move the cursor on the UI interface.
For example, a user can manipulate the cursor through the joystick and move it on the UI interface in a mouse-like manner. When many small UI icons or buttons are displayed on the UI interface, a mouse user playing a PC game can easily click even a small UI button icon, but for a handle user on a console platform it is difficult to land precisely on a particular small icon or button by steering the cursor with the joystick. Alternatively, the user can move the cursor on the UI interface through the cross navigation key, but if the player wants to press a button at the bottom-right corner starting from the top-left corner of the UI interface, the player has to repeat the operation many times (for example, pressing Dpad-Down and Dpad-Right repeatedly), which makes the operation very tedious.
To solve these problems, the present application provides a method, an apparatus, an electronic device and a storage medium for controlling a cursor through a handle, which not only support both joystick and key control modes but also allow seamless switching between the two, so that switching is quick and convenient and user confusion is avoided.
First, the terms used in the embodiments of the present application are briefly described.
In an embodiment of the present application, a graphical user interface may be provided by a terminal device, where:
Terminal device:
in an exemplary application scenario, the terminal device may be an intelligent device that is used for providing a game scenario and is capable of controlling a virtual character, and the terminal device may include, but is not limited to, any one of the following devices: smart phones, tablet computers, portable computers, desktop computers, game machines, personal Digital Assistants (PDAs), e-book readers, MP4 (Moving Picture Experts Group Audio Layer IV) players, and the like. The terminal device is installed and operated with an application program supporting a game scene, such as an application program supporting a three-dimensional game scene. The application program may include, but is not limited to, any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a MOBA Game, a multi-player gunfight type survival Game, a Third-person Shooting Game (TPS). Alternatively, the application may be a standalone version of the application, such as a standalone version of the 3D game program, or may be a network online version of the application.
Graphical user interface:
An interface display format for human-computer interaction. It allows a user to manipulate icons, logos or menu options on the screen with an input device such as a mouse, a keyboard or a handle, and also allows the user to manipulate icons or menu options by touch operations on the touch screen of a touch terminal, in order to select a command, start a program or perform other tasks. In a game scenario, a game scene interface and a game configuration interface can be displayed in the graphical user interface.
Virtual characters:
A virtual character is a character in a virtual environment (e.g., a game scene). It may be a character manipulated by a player and includes, but is not limited to, at least one of a virtual person, a virtual animal, a cartoon character, a non-player character (NPC), and a virtual object. A virtual object may be a static object in the virtual scene, for example a virtual prop, a virtual task in the virtual scene, a location in the virtual environment, terrain, a house, a bridge or vegetation. Static objects are usually not directly controlled by players, but can respond to the interaction behavior of a virtual character in the scene (e.g., attack, demolition); for example, the virtual character can demolish, pick up, drag or build on a building. Alternatively, a virtual object may not respond to the interaction behavior of the virtual character at all; for example, it may be a building, a door, a window or a plant in the game scene with which the virtual character cannot interact, e.g., a window that the virtual character cannot destroy or remove. When the virtual environment is a three-dimensional virtual environment, the virtual characters may be three-dimensional virtual models, each with its own shape and volume, occupying part of the space in the three-dimensional virtual environment. Optionally, a virtual character is a three-dimensional character constructed with three-dimensional human-skeleton technology and realizes different external appearances by wearing different skins. In some implementations, a virtual character may also be implemented with a 2.5-dimensional or 2-dimensional model, which is not limited in the embodiments of the present application.
There may be a plurality of virtual characters in the virtual scene. They may be characters manipulated by players (i.e., controlled through an input device or a touch screen) or artificial intelligence (AI) characters trained for battle in the virtual environment. Optionally, a virtual character is a character playing the game in the game scene. Optionally, the number of virtual characters in a game match is preset, or is determined dynamically according to the number of terminal devices participating in the match, which is not limited in the embodiments of the present application. In one possible implementation, the user can control the virtual character to move within the virtual scene, e.g., run, jump or crawl, and can also control it to fight other virtual characters using the skills, virtual props and the like provided by the application.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores the game program and is used for presenting the game screen. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed and run on the electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways; for example, the interface may be rendered on a display screen of the terminal device or provided to the player through holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game scene and the game configuration interface, and a processor for running the game, generating the graphical user interface, and controlling display of the graphical user interface on the display screen.
An application scenario to which the present application is applicable is now introduced. The present application can be applied to the technical field of games, in which a plurality of players participating in a game join the same virtual match.
Before entering the virtual match, a player can select different character attributes, such as identity attributes, for the virtual character in the match; the different character attributes determine different factions, and the player wins the game match by executing the tasks assigned by the game at different stages of the match, for example by "eliminating" a number of virtual characters with character attribute A during the match stages. Alternatively, when entering the virtual match, the character attributes may be randomly assigned to each virtual character participating in it.
An implementation environment provided by one embodiment of the present application may include: the system comprises a first terminal device, a server and a second terminal device, wherein the first terminal device and the second terminal device are respectively communicated with the server to realize data communication. In this embodiment, the first terminal device and the second terminal device are respectively installed with an application program for executing the method for controlling the cursor through the handle provided by the present application, and the server is a server side for executing the method for controlling the cursor through the handle provided by the present application. The first terminal device and the second terminal device are enabled to communicate with the server respectively through the application program.
Taking the first terminal device as an example, the first terminal device establishes communication with the server by running the application program. In an alternative embodiment, the server establishes the virtual match based on a game request from an application. The parameters of the virtual match may be determined according to parameters in the received game request, for example, the parameters of the virtual match may include the number of people participating in the virtual match, the level of the character participating in the virtual match, and the like. When the first terminal device receives a response of the game server, a game scene corresponding to the virtual battle is displayed through a graphical user interface of the first terminal device, the first terminal device is a device controlled by a first user, a virtual character displayed in the graphical user interface of the first terminal device is a player character (namely, a first virtual character) controlled by the first user, the first user inputs a character operation instruction through the graphical user interface to control the player character to execute corresponding operation in the game scene, and the first user also inputs a cursor control instruction through the graphical user interface to control a cursor to move on the graphical user interface.
Taking the second terminal device as an example, the second terminal device establishes communication with the server by running the application program. In an alternative embodiment, the server establishes the virtual match based on a game request from the application. The parameters of the virtual match may be determined according to parameters in the received game request, for example, the parameters of the virtual match may include the number of people participating in the virtual match, the level of the character participating in the virtual match, and the like. And when the second terminal equipment receives the response of the server, displaying the game scene corresponding to the virtual battle through the graphical user interface of the second terminal equipment. The second terminal device is a device controlled by a second user, the virtual character displayed in the graphical user interface of the second terminal device is a player character (i.e. a second virtual character) controlled by the second user, the second user inputs a character operation instruction through the graphical user interface to control the player character to execute corresponding operation in the virtual scene, and the second user also inputs a cursor control instruction through the graphical user interface to control the cursor to move on the graphical user interface.
The server performs data calculation according to the game data reported by the first terminal device and the second terminal device, and synchronizes the calculated game data to the first terminal device and the second terminal device, so that the first terminal device and the second terminal device control the graphical user interface to render a corresponding game scene and/or virtual character according to the synchronization data issued by the game server.
In the present embodiment, a first virtual character controlled by a first terminal device and a second virtual character controlled by a second terminal device are virtual characters in the same virtual match. The first virtual role controlled by the first terminal device and the second virtual role controlled by the second terminal device may have the same role attribute or different role attributes, and the first virtual role controlled by the first terminal device and the second virtual role controlled by the second terminal device may belong to the same camp or different camp.
In the virtual match, two or more virtual characters may be included, and different virtual characters may correspond to different terminal devices, that is, in the virtual match, two or more terminal devices may transmit and synchronize game data with the game server, respectively.
For the convenience of understanding of the present application, the method, the apparatus, the electronic device and the storage medium for controlling the cursor through the handle provided by the embodiments of the present application will be described in detail below.
Referring to fig. 1, a flowchart of a method for controlling a cursor through a handle according to an exemplary embodiment of the present application specifically includes:
step S101: and displaying the target interface through the terminal equipment.
In the embodiment of the application, the graphical user interface can be provided through the terminal equipment, and the target interface is an interface displayed on the graphical user interface.
In a preferred embodiment, the target interface may refer to an interface displayed on the terminal device for presenting the plurality of controls in a static form. For example, the presentation in a static form herein may be understood as that the display positions of the plurality of controls on the interface do not move, that is, the display positions of the controls do not move along with the operation performed by the user on the interface (for example, the user manipulates a cursor, and a specific key is triggered). In other words, the display positions of the multiple controls on the interface are fixed, and the multiple controls can be selected based on the operation performed by the user on the interface to trigger the function corresponding to the selected control.
Taking the above terminal device running a game program as an example, a game scene picture corresponding to a game may be displayed in the graphical user interface, in which case, the target interface may refer to a game configuration picture corresponding to a game running in the terminal device.
As an example, the game configuration screen may include any of the following: a character parameter configuration interface for a virtual character in the game, a task viewing interface for a virtual task in the game, a backpack viewing interface for the virtual backpack carried by the virtual character, and a battle parameter configuration interface for the virtual match.
Referring to fig. 2, one way of displaying the target interface is described by taking the target interface as a game configuration screen in the game.
FIG. 2 is a flowchart illustrating steps provided in an exemplary embodiment of the present application to display a target interface.
Referring to fig. 2, in step S201, a game scene screen corresponding to a virtual character located in a game scene is displayed in a graphical user interface provided by a terminal device.
In the embodiment of the present application, the game scene picture is a picture obtained by observing the game scene from the view angle corresponding to the virtual character; for example, it may be a first-person or third-person (also called god's-eye) view of the game scene. The game scene picture may include the game scene alone, or the game scene together with a virtual character located in it, and the virtual character is configured to execute virtual actions according to the game instructions received by the terminal device.
Illustratively, each virtual character has a one-to-one corresponding virtual camera in the virtual environment. In this step, the position of the virtual camera changes synchronously with the position of the virtual character, so the game scene pictures captured by the virtual camera change as the position of the virtual character changes.
Corresponding to the game configuration picture displayed in the static form, the game scene picture can be understood as an interface displayed in a dynamic form. In the game scene picture, along with the operation performed by the user on the interface, at least one of the following changes exists: the game scene changes, the position of the virtual character in the game scene changes, and the position and/or shape of the virtual object in the game scene changes.
In step S202, in response to a manipulation for the virtual character, the virtual character is controlled to perform an action corresponding to the manipulation in the virtual scene.
Here, the above-mentioned manipulation is issued by the game player to the terminal device, and is used for controlling the virtual character to execute the action instructed by the manipulation in the game scene of the graphical user interface.
Taking a movement operation as an example of the manipulation of the virtual character: in this step, the virtual character may be controlled to move in the game scene in response to the movement operation, and the range of the game scene displayed in the graphical user interface may be controlled to change correspondingly as the virtual character moves.
The movement operation in the embodiment of the application is used to control the virtual character to move in the game scene of the graphical user interface. In response to the movement operation, the terminal device can control the virtual character to move in the game scene; as the virtual character moves, its position in the game scene changes correspondingly, and the terminal device can also control the range of the game scene displayed in the graphical user interface to change correspondingly with the movement of the virtual character.
For example, the game scene picture displayed in the graphical user interface may be a picture obtained by observing the game scene with the virtual character as the observation center; when the virtual character is manipulated to move in the game scene, the game scene picture moves with it, that is, the observation center of the game scene picture is bound to the position of the virtual character and moves as that position moves. The present application is not limited to this, however; other observation positions in the game scene may serve as the observation center, as long as the range of the displayed game scene changes with the movement of the virtual character.
For example, the process of controlling the virtual character to move in the game scene may include: receiving the player's selection of the virtual character and controlling it to move in the game scene in response to a drag operation on the selected character; or receiving the player's selection of the virtual character and controlling it to move to a selected position in response to a position selection operation performed in the game scene. As an example, the movement operation may include, but is not limited to, at least one of the following: on a computer, clicking the virtual character with the left mouse button without releasing it and dragging the mouse to change the character's position in the game scene; on a mobile terminal, long-pressing the virtual character with a finger without releasing it and sliding the finger on the graphical user interface to change the character's position; or, on a console platform, selecting a direction by toggling the first joystick of the handle and controlling the virtual character to move in the direction pointed to by the first joystick, or to stop moving, by toggling the second joystick of the handle.
Furthermore, as the game scene range changes according to the movement of the virtual character, another virtual character may be included in the changed game scene (here, another virtual character may also be included in the game scene before the change), where the another virtual character is a virtual character controlled by another player in the current virtual game. Similarly, the terminal devices of other players respond to the movement operation thereof, and can also control another virtual character to move in the game scene, and along with the movement of the other virtual character, the position of the other virtual character in the game scene also changes correspondingly.
For example, the virtual character may perform tasks specified by the system in the game scenario in order to win, and so may the other virtual character. If the virtual character has the first character attribute and the other virtual character has the second character attribute, the virtual character may interfere with the other virtual character while it executes its task, eliminate the other virtual character, or complete its own assigned task while the other virtual character is executing its task; if the virtual character has the second character attribute and the other virtual character has the first character attribute, the same applies in reverse; if the two virtual characters have the same character attribute, they may execute tasks together or separately, and may also search, together or separately, for virtual characters with the other character attribute so as to interfere with those characters' tasks or eliminate them.
In step S203, in response to a preset trigger event, the screen displayed in the graphical user interface is controlled to be switched from the game scene screen to the target interface.
Here, the preset trigger event refers to an event for triggering interface switching.
In an alternative embodiment, the trigger event may include at least one of the following: the virtual character moving to a specific position in the game scene, the virtual character picking up a specific virtual item in the game scene, the virtual character triggering a specific virtual task in the game scene, or the virtual game entering a specific game stage. In any of these cases, the switching of the screen displayed in the graphical user interface is triggered.
In another alternative embodiment, the trigger event is a switching operation for triggering the switch from the game scene picture to the target interface; exemplary switching operations may include, but are not limited to, an operation on a configuration option displayed on the graphical user interface, which exits the game scene picture and displays the game configuration interface.
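A minimal, purely illustrative sketch of this event-driven switch to the target interface follows; the event names and the show_target_interface method are assumptions introduced here, not part of the application.

```python
# Illustrative trigger events; the event names and the UI method are assumptions.
TRIGGER_EVENTS = {
    "reached_specific_position",
    "picked_up_specific_item",
    "triggered_specific_task",
    "entered_specific_game_stage",
    "selected_configuration_option",  # the explicit switching operation
}

def on_game_event(event, ui):
    # Preset trigger event: switch the displayed picture from the game scene
    # picture to the target interface (e.g. the game configuration screen).
    if event in TRIGGER_EVENTS:
        ui.show_target_interface()
```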
The method of the embodiment of the present application is intended for controlling cursor movement through the handle on the game configuration interface; handle input within the game scene itself can be handled in other ways. In other words, the method of controlling a cursor through a handle described in the present application is not used within the game scene.
It should be understood that the above describes one way to enter the target interface, i.e., to switch from a game scene, but the application is not limited thereto, and the entering the target interface may be triggered by other ways.
Returning to FIG. 1, step S102: receiving a first control instruction for the cursor through a key of the handle, and controlling the cursor to move on the target interface in the first movement mode.
Here, the first movement mode includes: controlling the cursor to move on the target interface from the control it currently indicates to the next control in the direction indicated by the first control instruction. The cursor moving on the target interface may also be called a free cursor, and its initial position on the target interface may be set as required, which is not limited in the present application. Further, the change in the cursor's coordinates in the first movement mode is discrete rather than continuous; for example, the cursor jumps from the currently indicated control to the next control.
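As an illustration only of the first movement mode, one possible way to pick the next control in the indicated direction is sketched below; the nearest-control rule and the coordinate representation are assumptions introduced here and are not prescribed by the application.

```python
def focus_next_control(controls, current, direction):
    """Illustrative first-movement-mode step: jump from the currently indicated
    control to the nearest control lying in the indicated direction.

    `controls` is a list of (x, y) control positions, `current` an index into it,
    and `direction` a unit vector mapped to one of the extending ends of the key.
    """
    cx, cy = controls[current]
    dx, dy = direction
    best, best_dist = current, float("inf")
    for i, (x, y) in enumerate(controls):
        if i == current:
            continue
        # Keep only controls lying on the indicated side of the current control.
        if (x - cx) * dx + (y - cy) * dy <= 0:
            continue
        dist = (x - cx) ** 2 + (y - cy) ** 2
        if dist < best_dist:
            best, best_dist = i, dist
    return best  # the cursor jumps here, so its coordinates change discretely
```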
FIG. 3 illustrates a schematic view of a handle provided by an exemplary embodiment of the present application.
In this example, the handle 10 is provided with a plurality of mounting slots, such as the mounting slots 11, 12 and 13 shown in the figure. The handle 10 includes a key and a joystick; taking the handle shown in the figure, which includes two joysticks, as an example, a first joystick A1 is disposed in the mounting slot 11, a key B is disposed in the mounting slot 12, and a second joystick A2 is disposed in the mounting slot 13.
Under the target interface, the first joystick A1, the second joystick A2 and the key B on the handle 10 can all be used to control the movement of the cursor on the target interface. The joystick and the key differ in how they control the cursor as follows. The key has fixed, preset movement directions; for example, the key B is a cross key, and each extending end of the cross corresponds to one movement direction: the upper end moves the cursor upward on the target interface, the lower end moves it downward, the left end moves it leftward, and the right end moves it rightward. That is, by operating any extending end of the key B on the handle 10, the cursor is automatically navigated to the next focus point on the target interface along the direction indicated by that end, and each press of the key B causes the cursor to perform one automatic navigation action on the target interface.
For each joystick, the movement direction of the cursor on the target interface changes with the direction in which the joystick is toggled. The joystick can be pushed in any direction through a full 360 degrees, so control of the cursor's movement direction is very flexible.
It should be understood that, besides the joystick and the key, other mounting slots may be provided on the handle to accommodate various other function keys. A function key is used to trigger a preset function; for example, in a game scene, in response to an operation on a certain function key, the virtual character is controlled to execute a corresponding specific action, or a designated backpack is opened. In addition, the handle structure shown in the figure above is only an example; the present application is not limited to it, and the handle may be arranged in other ways as long as it includes a key and a joystick.
Taking the game program running in the terminal device as an example: when a game scene picture corresponding to the game is displayed in the graphical user interface provided by the terminal device, the virtual character can be controlled to execute actions in the game scene through the joystick and/or the key of the handle; when a game configuration picture corresponding to the game is displayed, the cursor can be controlled to move on the game configuration picture through the joystick or the key of the handle. Optionally, in the game configuration interface, no response is made to operations on the other function keys (keys other than the key and the joystick) arranged on the handle. The present application is not limited to this, however, and may also respond to operations on the other function keys.
In this way, both the joystick and key control modes of the handle are supported for the target interface, and switching between the joystick and the key on the handle can be realized.
Step S103: receiving a second control instruction for the cursor through the joystick of the handle, controlling the cursor to switch from the first movement mode to the second movement mode on the target interface, and controlling the distance the cursor moves on the target interface according to the second control instruction.
Here, the second movement mode includes: determining the distance the cursor moves on the target interface along the direction indicated by the second control instruction according to the movement sensitivity and the duration indicated by the second control instruction.
For example, the movement sensitivity may be determined from the pressure with which the joystick is toggled: when the joystick is toggled in a given direction, the corresponding pressure value can be detected; the larger the detected pressure value, the higher the movement sensitivity (i.e., the faster the cursor moves on the target interface), and the smaller the pressure value, the lower the sensitivity (i.e., the slower the cursor moves). It should be understood that this way of determining the movement sensitivity is only an example; the sensitivity may also be determined in other ways, which is not limited in the present application.
For example, the duration indicated by the second control instruction may be determined from how long the joystick is held toggled: the toggle duration may be taken directly as the duration indicated by the second control instruction, or the indicated duration may be set to be positively correlated with the toggle duration.
Further, the change in the cursor's coordinates in the second movement mode is continuous; for example, the cursor does not jump discretely from the currently indicated control to the next control. In the above step, the movement direction of the cursor on the target interface can be determined from the direction in which the joystick is toggled, and the movement distance of the cursor on the target interface is controlled according to the pressure value and the duration of the toggle.
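For illustration, the distance computation of the second movement mode might be sketched as follows; the linear pressure-to-sensitivity mapping and the base value of 600 pixels per second are assumptions, since the application leaves the exact determination of the movement sensitivity open.

```python
def stick_move(position, direction, pressure, duration, base_sensitivity=600.0):
    """Illustrative second-movement-mode step.

    `direction` is a unit vector taken from the joystick angle, `pressure` a
    normalized deflection value in [0, 1], `duration` the toggle time in seconds,
    and the base sensitivity of 600 pixels per second is an assumed value.
    """
    sensitivity = base_sensitivity * pressure      # larger pressure, faster cursor
    distance = sensitivity * duration              # distance from sensitivity and duration
    x, y = position
    dx, dy = direction
    return (x + dx * distance, y + dy * distance)  # coordinates change continuously
```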
In the embodiment of the present application, when a response is made to a preset input instruction, the response to a target input instruction is inhibited. Here, the preset input instruction is one of the first control instruction and the second control instruction, and the target input instruction is the other of the two.
That is, while responding to the first control instruction, the response to the second control instruction is inhibited, and/or, while responding to the second control instruction, the response to the first control instruction is inhibited. In other words, at any moment under the target interface, the terminal device responds to the input of only one control module on the handle; only the input from one of the joystick and the key is valid.
In the embodiment of the present application, the prohibition of the response to the target input instruction includes the following two cases.
In one case, the target input instruction is simply not detected, so no response to it occurs.
In this case, the switching operation for changing the cursor's movement mode may be triggered by other control modules on the handle, that is, by control modules other than those providing the first and second control instructions.
For example, if one function key on the handle is preset as the key that triggers the switching operation, then while the cursor is being moved on the target interface in the first movement mode through the key of the handle, receiving an operation on that preset function key triggers the switch, so that the cursor switches from the first movement mode to the second movement mode on the target interface.
In another case, a target input command is detected, but a response to the target input command is inhibited.
That is, while the cursor is being moved on the target interface according to the preset input instruction, the target input instruction can still be received, but no response is made to it; in other words, the cursor is not moved on the target interface in the movement mode indicated by the target input instruction.
In this case, the switching operation for changing the cursor's movement mode may be triggered by the control module on the handle that corresponds to the target input instruction, that is, the joystick on the handle triggers the switch from the first movement mode to the second movement mode.
For example, the second control instruction may include a first switching sub-instruction and a first manipulating sub-instruction, where the first switching sub-instruction is used to control the cursor to switch from the first movement mode to the second movement mode at the target interface, and the first manipulating sub-instruction is used to control the cursor to move in the second movement mode at the target interface.
In this embodiment, the first switching sub-instruction and the first manipulating sub-instruction may be generated by the joystick under the same triggering action, for example toggling the joystick in a certain direction. For example, if the user wants to switch the cursor from key control to joystick control, the user can simply toggle the joystick (corresponding to the first switching sub-instruction; the toggle direction and pressure need not be limited here) to activate it, and once the joystick is activated, the movement of the cursor on the target interface is controlled according to the joystick manipulation (corresponding to the first manipulating sub-instruction).
In addition, the first switching sub-instruction and the first manipulating sub-instruction may also be generated by the joystick under different triggering actions; for example, the first switching sub-instruction may be a press of the joystick, and the first manipulating sub-instruction may be a toggle of the joystick.
In the embodiment of the application, the joystick on the handle is used to trigger the switch from the first movement mode to the second movement mode; that is, the switching operation is triggered by the very control module being switched to, which effectively avoids switching confusion in which the user cannot tell which cursor control mode is currently active.
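As a purely illustrative sketch of the case in which the first switching sub-instruction and the first manipulating sub-instruction are generated by the joystick under the same triggering action, the following snippet treats the first joystick event as the switch and subsequent events as manipulation; the class and callback names are assumptions introduced here.

```python
class JoystickModeSwitcher:
    """Illustrative handling of the first switching and first manipulating
    sub-instructions when both are generated by the same joystick toggle action."""

    def __init__(self, move_cursor):
        self.move_cursor = move_cursor  # callback that moves the cursor in the second mode
        self.mode = "key"               # the key is assumed to be the active control module

    def on_stick_event(self, direction, pressure, duration):
        if self.mode == "key":
            # First switching sub-instruction: any toggle of the joystick activates it,
            # regardless of direction or pressure; the event is received but the
            # cursor is not yet moved by it.
            self.mode = "stick"
            return
        # First manipulating sub-instruction: once the joystick is active, toggles
        # move the cursor in the second movement mode.
        self.move_cursor(direction, pressure, duration)
```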
Step S104: receiving a third control instruction for the cursor through a key of the handle, and controlling the cursor to switch from the second movement mode back to the first movement mode on the target interface.
In the embodiment of the application, the key and the joystick of the handle can be switched back and forth: after the second control instruction is received, the cursor is switched from the first movement mode to the second movement mode; after that, when the third control instruction is received, the cursor is switched from the second movement mode back to the first movement mode; if a second control instruction for the cursor is then received again through the joystick of the handle, the cursor is switched from the first movement mode to the second movement mode once more, and so on.
In the case where the target input instruction is not detected, the switching operation for changing the movement mode of the cursor may be triggered by another control module on the handle, that is, by a control module other than the ones that input the second control instruction and the third control instruction.
For example, suppose another function key on the handle is preset as the key that triggers the switching operation. While the cursor is controlled through the joystick of the handle to move on the target interface in the second movement mode, receiving an operation on that preset function key triggers the switch, so that the cursor is controlled to switch from the second movement mode to the first movement mode on the target interface.
In the case where the target input instruction is detected, the switching operation for changing the movement mode of the cursor may be triggered by the control module on the handle that corresponds to the target input instruction; that is, a key on the handle triggers the switch from the second movement mode to the first movement mode.
For example, the third control instruction may include a second switching sub-instruction and a second manipulating sub-instruction, where the second switching sub-instruction is used to control the cursor to switch from the second movement mode to the first movement mode at the target interface, and the second manipulating sub-instruction is used to control the cursor to move in the first movement mode at the target interface.
In the embodiment of the present application, the second switching sub-instruction and the second manipulating sub-instruction may be instructions generated by the key under the same triggering action. For example, if the user wants to switch the cursor from joystick control to key control, the user can directly press the key (corresponding to the second switching sub-instruction; here any extension end of the key may be pressed) to activate the key, and then, once the key is activated, control the movement of the cursor on the target interface according to the manipulation of the key (corresponding to the second manipulating sub-instruction).
In addition, the second switching sub-instruction and the second manipulating sub-instruction may also be instructions generated by the key under different triggering actions; for example, the second switching sub-instruction may be a long-press operation performed on the key, and the second manipulating sub-instruction may be a short-press operation performed on the key. Alternatively, the second switching sub-instruction may be a predetermined number of pressing operations on the key within a predetermined time, and the second manipulating sub-instruction may be a short-press or long-press operation on the key.
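Taken together with the key-to-joystick direction described earlier, the back-and-forth switching can be viewed as a two-state machine. The sketch below is only one plausible reading; the event names and mode labels are assumptions.

```python
# Two movement modes; the cursor is always in exactly one of them.
KEY_MODE = "key_navigation"    # first movement mode: jump between controls
STICK_MODE = "free_cursor"     # second movement mode: free cursor driven by the stick


def next_mode(current: str, event: str) -> str:
    """Return the movement mode after one input event.

    A stick event while in key mode plays the role of the second control
    instruction (switch to the free cursor); a key event while in stick mode
    plays the role of the third control instruction (switch back to key
    navigation). Other events leave the mode unchanged, so the two control
    modules can be toggled back and forth any number of times.
    """
    if current == KEY_MODE and event == "STICK_DEFLECT":
        return STICK_MODE
    if current == STICK_MODE and event == "KEY_PRESS":
        return KEY_MODE
    return current


mode = KEY_MODE
for ev in ["KEY_PRESS", "STICK_DEFLECT", "STICK_DEFLECT", "KEY_PRESS"]:
    mode = next_mode(mode, ev)
print(mode)  # key_navigation
```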
In the embodiment of the present application, in a case that the key is a cross key, the switching operation may be implemented by using a specific extending end (e.g., dpad Right) of the cross key.
In a preferred embodiment, the method for controlling a cursor through a handle according to the embodiment of the present application may further include prompting, on the graphical user interface, switching of the moving mode of the cursor.
For example, after the switching of the cursor movement mode is completed, prompt information corresponding to the control module switched to is displayed on the graphical user interface. The prompt information may be, for example, text indicating the control module switched to, or a schematic image resembling the external appearance of that control module.
In a preferred embodiment, the display transparency of the prompt message on the graphical user interface can be adjusted to avoid blocking the content displayed on the graphical user interface.
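As a rough illustration of such a prompt, the following sketch bundles the hint text, an optional schematic icon, and a display transparency value; all field names, icon paths, and the 0-to-1 alpha convention are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ModeHint:
    text: str                   # textual hint, e.g. "Free cursor: stick"
    icon: Optional[str] = None  # optional schematic image resembling the control module
    alpha: float = 0.6          # display transparency; kept below 1.0 so interface content stays visible


def hint_for(module: str) -> ModeHint:
    """Build the prompt shown after a mode switch (names and paths are illustrative)."""
    if module == "stick":
        return ModeHint(text="Free cursor: stick", icon="icons/stick.png")
    return ModeHint(text="Key navigation: D-pad", icon="icons/dpad.png")


print(hint_for("stick"))  # ModeHint(text='Free cursor: stick', icon='icons/stick.png', alpha=0.6)
```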
It should be understood that, in the method for controlling a cursor through a handle according to the embodiment of the present application, the two control modules of the handle, namely the joystick and the keys, can be switched between arbitrarily. The method is not limited to the switching sequence key → joystick → key → joystick described above; switching between the joystick and the keys can be repeated any number of times, provided the switching condition is satisfied.
In a preferred embodiment, the control module that is initially active upon entry into the target interface may be selected based on the presentation properties of the plurality of controls displayed in the target interface. It should be understood that the present application is not limited thereto: an initial control module for the target interface may also be preset, with the preset initial control module activated each time the target interface is entered, after which switching between the rocker and the keys of the handle proceeds in the manner described above.
For example, the initial control module in an active state under the target interface may be determined by any one of: the control module selected by the user when the user exits the target interface for the last time, the default control module, the control module determined based on the display attributes of the controls displayed on the target interface, and the control module determined based on the number of times of use/frequency of use/duration of use of each control module by the user (for example, the control module with the largest number of times of use).
In a preferred embodiment, the step of determining an initial control module under the target interface based on the presentation properties of the plurality of controls displayed on the target interface may comprise: and determining an initial control module under the target interface from the keys and the rocker according to the display attributes of the plurality of controls on the target interface.
Specifically, whether the multiple controls meet the preset display condition or not can be determined according to the display attribute of each control on the target interface. Here, the preset presentation condition may be used to characterize the display complexity of the multiple controls on the target interface.
Illustratively, the presentation properties include a display layout and/or a display number of the plurality of controls on the target interface, i.e., the display complexity of the plurality of controls on the target interface is measured in terms of the display layout and/or the display number.
In this case, the preset presentation condition may include at least one of: the number of the controls displayed in any preset display area is not less than a first set value; the total number of the plurality of controls exposed on the target interface is not less than a second set value. Here, the target interface may be divided into a plurality of preset display areas, and the size of each preset display area may be the same or different.
If the preset display condition is met, the display complexity of the controls on the target interface is high; in this case, the rocker can be selected as the initial control module under the target interface, so that the cursor can be quickly positioned near the target control.
If the preset display condition is not met, the display complexity of the controls on the target interface is low; in this case, the key can be selected as the initial control module under the target interface, so that the cursor can be accurately positioned on the target control.
Therefore, any control can be selected quickly and accurately in the target interface, and the operation steps are simplified.
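A compact sketch of this selection logic is given below. The concrete threshold values, the region indexing, and the linear-scan counting are assumptions; the description above only requires that the layout be judged complex when at least one of the two conditions holds.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Control:
    x: float
    y: float
    region: int  # index of the preset display area the control falls in


FIRST_SET_VALUE = 5    # assumed threshold: controls per preset display area
SECOND_SET_VALUE = 20  # assumed threshold: total controls on the target interface


def initial_control_module(controls: List[Control], num_regions: int) -> str:
    """Pick the initially active control module for a target interface.

    If any preset display area holds at least FIRST_SET_VALUE controls, or the
    interface shows at least SECOND_SET_VALUE controls in total, the layout is
    considered complex and the stick (free cursor) is selected; otherwise the
    keys (control-to-control navigation) are selected.
    """
    per_region = [0] * num_regions
    for c in controls:
        per_region[c.region] += 1
    complex_layout = (
        any(n >= FIRST_SET_VALUE for n in per_region)
        or len(controls) >= SECOND_SET_VALUE
    )
    return "stick" if complex_layout else "keys"


# Example: a sparse settings page with 6 controls spread over 4 regions -> keys.
controls = [Control(0.1 * i, 0.2, region=i % 4) for i in range(6)]
print(initial_control_module(controls, num_regions=4))  # keys
```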
In an example, assume the initial control module under the target interface is the rocker of the handle. A control near the target control can then be selected with a single operation of the rocker, after which the target control is located precisely through key navigation. For example, a control around the target control can be selected by manipulating the joystick, and the operation is then switched to key manipulation for fine adjustment. In this way, from any selected position the keys can be used to navigate to small nearby buttons above, below, to the left of, or to the right of that position, which avoids both the limited pointing precision of the rocker and the need to press the navigation keys many times.
In this way, the user can freely use the rocker or key navigation to control the movement of the cursor according to the position of the target control in the target interface: key navigation suits controls that are nearby or small, while the rocker lets the cursor move quickly to controls that are farther away. One plausible realization of the key-navigation step is sketched below.
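The sketch picks, among the other controls on the interface, the nearest one lying in the pressed direction; the screen-coordinate convention (y grows downward), the scoring rule, and the control fields are assumptions.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class UiControl:
    name: str
    x: float
    y: float


# Screen coordinates are assumed: y grows downward, so "up" is (0, -1).
DIRS = {"up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0)}


def next_control(current: UiControl, others: List[UiControl], direction: str) -> Optional[UiControl]:
    """Return the nearest control lying in the pressed direction, or None.

    One plausible reading of the first movement mode: keep only candidates with
    a positive projection onto the pressed direction, then take the closest one.
    """
    dx, dy = DIRS[direction]
    best, best_dist = None, float("inf")
    for c in others:
        along = (c.x - current.x) * dx + (c.y - current.y) * dy
        if along <= 0:  # candidate does not lie in the pressed direction
            continue
        dist = (c.x - current.x) ** 2 + (c.y - current.y) ** 2
        if dist < best_dist:
            best, best_dist = c, dist
    return best


buttons = [UiControl("ok", 100, 200), UiControl("cancel", 180, 200), UiControl("close", 180, 120)]
print(next_control(buttons[0], buttons[1:], "right").name)  # cancel
```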
The method for controlling a cursor through a handle according to the embodiment of the application can thus support both key navigation and a rocker-based free cursor, reduce the operating difficulty a handle player faces on a complex UI interface, and allow seamless switching between the two modes at any time, giving the player greater freedom of operation.
Based on the same inventive concept, the embodiment of the present application further provides a device for controlling a cursor through a handle, corresponding to the method provided by the above embodiments. Since the principle by which the device solves the problem is similar to that of the method for controlling a cursor through a handle in the above embodiments, the implementation of the device may refer to the implementation of the method, and repeated details are not described again.
Fig. 4 is a schematic structural diagram of a device for controlling a cursor through a handle according to an exemplary embodiment of the present application. As shown in fig. 4, the device 300 for controlling a cursor by a handle includes:
The display control module 310 displays the target interface through the terminal device.
The cursor control module 320 receives a first control instruction for the cursor through the keys of the handle, and controls the cursor to move in the first movement mode on the target interface, where the first movement mode includes: controlling the cursor to move from the control currently indicated by the cursor to the next control in the direction indicated by the first control instruction on the target interface.
The first switching module 330 receives a second control instruction for the cursor through the rocker of the handle, controls the cursor to switch from the first movement mode to the second movement mode on the target interface, and controls the moving distance of the cursor on the target interface according to the second control instruction; the second movement mode includes: controlling the cursor to move, on the target interface and in the direction indicated by the second control instruction, by a distance determined according to the movement sensitivity and the duration indicated by the second control instruction (a sketch of this distance computation follows the module list).
The second switching module 340 receives a third control instruction for the cursor through the button of the handle, and controls the cursor to switch from the second moving mode to the first moving mode on the target interface.
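As referenced above, the distance moved by the free cursor can be read as growing with both the configured movement sensitivity and the time the stick is held. The linear scaling, the units, and the function name below are assumptions, not the patented formula.

```python
def free_cursor_displacement(axis_x: float, axis_y: float,
                             sensitivity: float, duration: float):
    """Displacement of the free cursor for one stick manipulation.

    axis_x / axis_y: direction indicated by the second control instruction
    (normalized stick deflection); sensitivity: movement sensitivity (assumed
    in pixels per second at full deflection); duration: how long the stick is
    held, i.e. the time length indicated by the second control instruction.
    """
    return (axis_x * sensitivity * duration,
            axis_y * sensitivity * duration)


# Example: stick pushed fully to the right for 0.5 s with sensitivity 300 px/s.
print(free_cursor_displacement(1.0, 0.0, sensitivity=300.0, duration=0.5))  # (150.0, 0.0)
```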
In one possible embodiment of the present application, the cursor control module 320 prohibits responding to the target input instruction while responding to the preset input instruction, and the first switching module 330 and the second switching module 340 perform the corresponding processing for the preset input instruction and the target input instruction. Here, the preset input instruction is one of the first control instruction and the second control instruction, and the target input instruction is the other of the two.
In one possible implementation of the present application, the module may receive the target input instruction but prohibit the response to it.
In one possible implementation manner of the present application, the second control instruction includes a first switching sub-instruction and a first manipulating sub-instruction, where the first switching sub-instruction is used to control the cursor to switch from the first moving manner to the second moving manner on the target interface, and the first manipulating sub-instruction is used to control the cursor to move in the second moving manner on the target interface.
In one possible implementation manner of the present application, the third control instruction includes a second switching sub-instruction and a second manipulating sub-instruction, where the second switching sub-instruction is used to control the cursor to switch from the second movement manner to the first movement manner at the target interface, and the second manipulating sub-instruction is used to control the cursor to move in the first movement manner at the target interface.
In one possible implementation of the application, the target interface is an interface displayed on the terminal device for presenting the plurality of controls in a static form.
In one possible implementation of the present application, the target interface includes an in-game configuration screen provided by the terminal device.
In one possible implementation manner of the present application, the display control module 310 displays, in a graphical user interface provided by a terminal device, a game scene picture corresponding to a virtual character located in a game scene; in response to a manipulation for the virtual character, controlling the virtual character to perform an action corresponding to the manipulation in the virtual scene; and responding to a preset trigger event, and controlling the picture displayed in the graphical user interface to be switched from the game scene picture to the target interface.
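To illustrate this picture switching, a trivial handler is sketched below; the event name "OPEN_CONFIG" (for example, a menu button press) stands in for the preset trigger event and is an assumption.

```python
GAME_SCENE = "game_scene_picture"
TARGET_INTERFACE = "target_interface"


def on_event(displayed: str, event: str) -> str:
    """Switch the picture shown in the graphical user interface.

    When the assumed preset trigger event 'OPEN_CONFIG' is received while the
    game scene picture is displayed, the display switches to the target
    interface (e.g. an in-game configuration screen).
    """
    if displayed == GAME_SCENE and event == "OPEN_CONFIG":
        return TARGET_INTERFACE
    return displayed


print(on_event(GAME_SCENE, "OPEN_CONFIG"))  # target_interface
```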
In one possible embodiment of the present application, the target interface includes a plurality of controls, and the cursor control module 320 can determine the initial control module under the target interface by: determining an initial control module under the target interface from the keys and the rocker according to the display attributes of the plurality of controls on the target interface.
In one possible implementation of the present application, the cursor control module 320 can determine the initial control module under the target interface by: determining whether the plurality of controls meet a preset display condition according to the display attributes, wherein the preset display condition is used for representing the display complexity of the plurality of controls on the target interface; if the preset display condition is met, selecting a rocker as an initial control module under a target interface; and if the preset display condition is not met, selecting the key as an initial control module under the target interface.
In one possible embodiment of the present application, the presentation property includes a display layout and/or a display number of the plurality of controls on the target interface, wherein the preset presentation condition includes at least one of the following items: the number of the controls displayed in any preset display area is not less than a first set value, and the target interface is divided into a plurality of preset display areas; the total number of the plurality of controls exposed on the target interface is not less than a second set value.
Through the device for controlling a cursor through a handle, both key navigation and a rocker-based free cursor can be supported, the operating difficulty a handle player faces on a complex UI interface can be reduced, and seamless switching between the two modes is possible at any time, giving the player greater freedom of operation.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present disclosure. As shown in fig. 5, the electronic device 400 includes a processor 410, a memory 420, and a bus 430.
The memory 420 stores machine-readable instructions executable by the processor 410, the processor 410 and the memory 420 communicate via the bus 430 when the electronic device 400 is running, and the machine-readable instructions, when executed by the processor 410, can perform the steps of the method for controlling a cursor by a handle according to any of the embodiments described above, specifically as follows:
displaying a target interface through the terminal equipment; receiving a first control instruction aiming at the cursor through a key of the handle, and controlling the cursor to move in a first moving mode on the target interface, wherein the first moving mode comprises the following steps: controlling the cursor to move from the control currently indicated by the cursor to the next control in the direction indicated by the first control instruction on the target interface; receiving a second control instruction aiming at the cursor through a rocker of the handle, controlling the cursor to switch from a first moving mode to a second moving mode on the target interface, and controlling the moving distance of the cursor on the target interface according to the second control instruction; wherein, the second mode of movement includes: controlling the cursor to determine the moving distance of the cursor in the direction indicated by the second control instruction on the target interface according to the moving sensitivity and the duration indicated by the second control instruction; and receiving a third control instruction aiming at the cursor through a key of the handle, and controlling the cursor to switch from the second movement mode to the first movement mode on the target interface.
In one possible embodiment of the present application, the processor 410 may further perform the following processing: when responding to the preset input instruction, forbidding responding to the target input instruction, wherein the preset input instruction is one of the first control instruction and the second control instruction, and the target input instruction is the other of the first control instruction and the second control instruction.
In one possible embodiment of the present application, the processor 410 may further perform the following processing: receiving a target input instruction, and forbidding responding to the target input instruction.
In one possible implementation manner of the present application, the second control instruction includes a first switching sub-instruction and a first manipulating sub-instruction, where the first switching sub-instruction is used to control the cursor to switch from the first moving manner to the second moving manner on the target interface, and the first manipulating sub-instruction is used to control the cursor to move in the second moving manner on the target interface.
In one possible implementation manner of the present application, the third control instruction includes a second switching sub-instruction and a second manipulating sub-instruction, where the second switching sub-instruction is used to control the cursor to switch from the second movement manner to the first movement manner at the target interface, and the second manipulating sub-instruction is used to control the cursor to move in the first movement manner at the target interface.
In one possible implementation manner of the application, the target interface is an interface displayed on the terminal device and used for showing the plurality of controls in a static form.
In one possible implementation of the present application, the target interface includes an in-game configuration screen provided by the terminal device.
In one possible implementation of the present application, the processor 410 may further perform the following processing: displaying a game scene picture corresponding to a virtual character in a game scene in a graphical user interface provided by the terminal equipment; in response to a manipulation for the virtual character, controlling the virtual character to perform an action corresponding to the manipulation in the virtual scene; and responding to a preset trigger event, and controlling the picture displayed in the graphical user interface to be switched from the game scene picture to the target interface.
In one possible embodiment of the present application, the target interface includes a plurality of controls, and the processor 410 may further execute the following processing to determine the initial control module under the target interface: determining an initial control module under the target interface from the keys and the rocker according to the display attributes of the plurality of controls on the target interface.
In one possible embodiment of the present application, the processor 410 may further perform the following process to determine an initial control module under the target interface: determining whether the plurality of controls meet a preset display condition according to the display attributes, wherein the preset display condition is used for representing the display complexity of the plurality of controls on the target interface; if the preset display condition is met, selecting the rocker as the initial control module under the target interface; and if the preset display condition is not met, selecting the key as the initial control module under the target interface.
In one possible embodiment of the present application, the presentation property includes a display layout and/or a display number of a plurality of controls on the target interface, wherein the preset presentation condition includes at least one of the following items: the number of the controls displayed in any preset display area is not less than a first set value, and the target interface is divided into a plurality of preset display areas; the total number of the plurality of controls shown on the target interface is not less than a second set value.
Through the electronic device, both key navigation and a rocker-based free cursor can be supported, the operating difficulty a handle player faces on a complex UI interface can be reduced, and seamless switching between the two modes is possible at any time, giving the player greater freedom of operation.
The embodiment of the present application further provides a computer-readable storage medium, where the storage medium stores a computer program, and when the computer program is executed by a processor, the computer program can perform the steps of the method for controlling a cursor through a handle in any of the above embodiments, specifically as follows:
displaying a target interface through the terminal equipment; receiving a first control instruction aiming at the cursor through a key of the handle, and controlling the cursor to move in a first moving mode on the target interface, wherein the first moving mode comprises the following steps: controlling the cursor to move from the control currently indicated by the cursor to the next control in the direction indicated by the first control instruction on the target interface; receiving a second control instruction aiming at the cursor through a rocker of the handle, controlling the cursor to switch from a first moving mode to a second moving mode on the target interface, and controlling the moving distance of the cursor on the target interface according to the second control instruction; wherein, the second mode of movement includes: controlling the cursor to determine the moving distance of the cursor in the direction indicated by the second control instruction on the target interface according to the moving sensitivity and the duration indicated by the second control instruction; and receiving a third control instruction aiming at the cursor through a key of the handle, and controlling the cursor to switch from the second movement mode to the first movement mode on the target interface.
In one possible embodiment of the present application, the processor may further perform the following processing: when responding to the preset input instruction, forbidding responding to the target input instruction, wherein the preset input instruction is one of the first control instruction and the second control instruction, and the target input instruction is the other one of the first control instruction and the second control instruction.
In one possible embodiment of the present application, the processor may further perform the following processing: receiving a target input instruction, and forbidding responding to the target input instruction.
In one possible implementation manner of the present application, the second control instruction includes a first switching sub-instruction and a first manipulating sub-instruction, wherein the first switching sub-instruction is used to control the cursor to switch from the first moving manner to the second moving manner at the target interface, and the first manipulating sub-instruction is used to control the cursor to move in the second moving manner at the target interface.
In one possible implementation manner of the present application, the third control instruction includes a second switching sub-instruction and a second manipulating sub-instruction, where the second switching sub-instruction is used to control the cursor to switch from the second movement manner to the first movement manner at the target interface, and the second manipulating sub-instruction is used to control the cursor to move in the first movement manner at the target interface.
In one possible implementation manner of the application, the target interface is an interface displayed on the terminal device and used for showing the plurality of controls in a static form.
In one possible implementation of the present application, the target interface includes an in-game configuration screen provided by the terminal device.
In one possible embodiment of the present application, the processor may further perform the following processing: displaying a game scene picture corresponding to a virtual character in a game scene in a graphical user interface provided by a terminal device; in response to a manipulation to the virtual character, controlling the virtual character to perform an action corresponding to the manipulation in the virtual scene; and responding to a preset trigger event, and controlling the picture displayed in the graphical user interface to be switched from the game scene picture to the target interface.
In one possible embodiment of the present application, the target interface includes a plurality of controls, and the processor may further execute the following processing to determine the initial control module under the target interface: determining an initial control module under the target interface from the keys and the rocker according to the display attributes of the plurality of controls on the target interface.
In one possible embodiment of the present application, the processor may further perform the following process to determine the initial control module under the target interface: determining whether the plurality of controls meet a preset display condition according to the display attributes, wherein the preset display condition is used for representing the display complexity of the plurality of controls on the target interface; if the preset display condition is met, selecting the rocker as the initial control module under the target interface; and if the preset display condition is not met, selecting the key as the initial control module under the target interface.
In one possible embodiment of the present application, the presentation property includes a display layout and/or a display number of the plurality of controls on the target interface, wherein the preset presentation condition includes at least one of the following items: the number of the controls displayed in any preset display area is not less than a first set value, and the target interface is divided into a plurality of preset display areas; the total number of the plurality of controls shown on the target interface is not less than a second set value.
The computer-readable storage medium can support both key navigation and a rocker-based free cursor, reduce the operating difficulty a handle player faces on a complex UI interface, and allow seamless switching between the two modes at any time, giving the player greater freedom of operation.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program codes.
The above description covers only specific embodiments of the present application, and the protection scope of the present application is not limited thereto. Any change or substitution that a person skilled in the art can readily conceive of within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. A method of controlling a cursor through a handle, the handle comprising a button and a rocker, the method comprising:
displaying a target interface through the terminal equipment;
receiving a first control instruction aiming at the cursor through the keys of the handle, and controlling the cursor to move in a first movement mode on the target interface, wherein the first movement mode comprises: controlling the cursor to move from the control currently indicated by the cursor to the next control in the direction indicated by the first control instruction on the target interface;
receiving a second control instruction aiming at the cursor through the rocker of the handle, controlling the cursor to be switched from a first moving mode to a second moving mode on the target interface, and controlling the moving distance of the cursor on the target interface according to the second control instruction; wherein the second moving manner comprises: controlling the cursor to determine the moving distance of the cursor in the target interface along the direction indicated by the second control instruction according to the moving sensitivity and the time length indicated by the second control instruction;
and receiving a third control instruction aiming at the cursor through the key of the handle, and controlling the cursor to be switched from the second movement mode to the first movement mode on the target interface.
2. The method of claim 1, further comprising:
and when responding to a preset input instruction, forbidding responding to a target input instruction, wherein the preset input instruction is one of the first control instruction and the second control instruction, and the target input instruction is the other one of the first control instruction and the second control instruction.
3. The method of claim 2, wherein inhibiting a response to a target input command comprises:
and receiving the target input instruction, and forbidding responding to the target input instruction.
4. The method of claim 1, wherein the second control instruction comprises a first switch sub-instruction and a first steer sub-instruction,
the first switching sub-instruction is used for controlling the cursor to switch from a first movement mode to a second movement mode on the target interface, and the first manipulating sub-instruction is used for controlling the cursor to move in the second movement mode on the target interface.
5. The method of claim 1, wherein the third control instruction comprises a second switching sub-instruction and a second steering sub-instruction,
the second switching sub-instruction is used for controlling the cursor to switch from the second moving mode to the first moving mode on the target interface, and the second control sub-instruction is used for controlling the cursor to move in the first moving mode on the target interface.
6. The method according to claim 1, wherein the target interface is an interface displayed on the terminal device for presenting a plurality of controls in a static form.
7. The method of claim 1, wherein the target interface comprises an in-game configuration screen provided via a terminal device.
8. The method of claim 7, wherein the step of displaying the target interface through the terminal device comprises:
displaying a game scene picture corresponding to a virtual character in a game scene in a graphical user interface provided by the terminal equipment;
in response to a manipulation to the virtual character, controlling the virtual character to perform an action corresponding to the manipulation in the virtual scene;
and responding to a preset trigger event, and controlling the picture displayed in the graphical user interface to be switched from the game scene picture to the target interface.
9. The method of claim 1, wherein the target interface includes a plurality of controls,
wherein the initial control module under the target interface is determined by:
and determining an initial control module under the target interface from the key and the rocker according to the display attributes of the controls on the target interface.
10. The method of claim 9, wherein the initial control module under the target interface is determined by:
determining whether the plurality of controls meet a preset display condition according to the display attributes, wherein the preset display condition is used for representing the display complexity of the plurality of controls on the target interface;
if the preset display condition is met, selecting the rocker as an initial control module under the target interface;
and if the preset display condition is not met, selecting the key as an initial control module under the target interface.
11. The method of claim 10, wherein the presentation properties include a display layout and/or a display quantity of the plurality of controls on the target interface,
wherein the preset display condition comprises at least one of the following items:
the number of the controls displayed in any preset display area is not less than a first set value, and the target interface is divided into a plurality of preset display areas;
the total number of the plurality of controls shown on the target interface is not less than a second set value.
12. A device for controlling a cursor through a handle, said handle comprising keys and a rocker, said device comprising:
the display control module displays a target interface through the terminal equipment;
the cursor control module receives a first control instruction aiming at the cursor through the keys of the handle and controls the cursor to move on the target interface in a first movement mode, wherein the first movement mode comprises: controlling the cursor to move from the control currently indicated by the cursor to the next control in the direction indicated by the first control instruction on the target interface;
the first switching module receives a second control instruction aiming at the cursor through the rocker of the handle, controls the cursor to switch from a first moving mode to a second moving mode on the target interface, and controls the moving distance of the cursor on the target interface according to the second control instruction; wherein the second movement mode comprises: controlling the cursor to determine the moving distance of the cursor in the direction indicated by the second control instruction according to the moving sensitivity and the time length indicated by the second control instruction;
and the second switching module is used for receiving a third control instruction aiming at the cursor through the keys of the handle and controlling the cursor to be switched from the second moving mode to the first moving mode on the target interface.
13. An electronic device, comprising: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating via the bus when the electronic device is operating, the processor executing the machine-readable instructions to perform the steps of the method according to any one of claims 1 to 11.
14. A computer-readable storage medium, having stored thereon a computer program which, when being executed by a processor, is adapted to carry out the steps of the method according to any one of claims 1 to 11.
CN202210714041.1A 2022-06-22 2022-06-22 Method and device for controlling cursor through handle, electronic equipment and storage medium Pending CN115155052A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210714041.1A CN115155052A (en) 2022-06-22 2022-06-22 Method and device for controlling cursor through handle, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210714041.1A CN115155052A (en) 2022-06-22 2022-06-22 Method and device for controlling cursor through handle, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115155052A true CN115155052A (en) 2022-10-11

Family

ID=83487057

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210714041.1A Pending CN115155052A (en) 2022-06-22 2022-06-22 Method and device for controlling cursor through handle, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115155052A (en)

Similar Documents

Publication Publication Date Title
EP2820528B1 (en) Systems and methods for presenting visual interface content
US10821360B2 (en) Data processing method and mobile terminal
JP5735472B2 (en) Game providing device
KR101662500B1 (en) Systems and methods for managing, selecting, and updating visual interface content using display-enabled keyboards, keypads, and/or other user input devices
CN110812838B (en) Virtual unit control method and device in game and electronic equipment
US20230241501A1 (en) Display method and apparatus for virtual prop, electronic device and storage medium
KR102610422B1 (en) Method and apparatus, device, and storage medium for processing avatar usage data
KR20200113834A (en) Apparatus and method for providing application information
CN115105831A (en) Virtual object switching method and device, storage medium and electronic device
CN115155052A (en) Method and device for controlling cursor through handle, electronic equipment and storage medium
JP5933069B2 (en) Game providing device
KR102557808B1 (en) Gaming service system and method for sharing memo therein
CN117899451A (en) Game processing method and device, electronic equipment and storage medium
EP3984608A1 (en) Method and apparatus for controlling virtual object, and terminal and storage medium
KR102609293B1 (en) Apparatus and method for determining game action
US20230306703A1 (en) Information processing system, program, and information processing method
KR20160126848A (en) Method for processing a gesture input of user
CN116339598A (en) Course display method, device, equipment and storage medium
CN116795269A (en) Virtual prop switching method, device, equipment and storage medium
CN115089968A (en) Operation guiding method and device in game, electronic equipment and storage medium
CN116459519A (en) Method and device for controlling virtual character in game, storage medium and electronic device
CN117298572A (en) Game control method, game control device, electronic equipment and storage medium
CN113877198A (en) Terminal operation method and device, electronic equipment and storage medium
JP2022131381A (en) program
CN116920372A (en) Game display method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination