CN117504278A - Interaction method, interaction device, computer equipment and computer readable storage medium - Google Patents

Interaction method, interaction device, computer equipment and computer readable storage medium

Info

Publication number
CN117504278A
CN117504278A (Application CN202311535009.8A)
Authority
CN
China
Prior art keywords
target
control
operation identifier
functional control
functional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311535009.8A
Other languages
Chinese (zh)
Inventor
江志基
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202311535009.8A priority Critical patent/CN117504278A/en
Publication of CN117504278A publication Critical patent/CN117504278A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308: Details of the user interface
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60: Methods for processing data by generating or executing the game program
    • A63F2300/6045: Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Position Input By Displaying (AREA)

Abstract

The embodiment of the application discloses an interaction method, an interaction device, computer equipment and a computer readable storage medium. The method comprises the following steps: displaying an operation identifier on a graphical user interface; controlling the operation identifier to move on the graphical user interface according to a first user input operation; determining a moving direction of the operation identifier on the graphical user interface; and, while the first user input operation continues, moving the operation identifier onto a target function control in the moving direction in response to a second user input operation. In this way, a user input operation through an external input device can control the operation identifier to move along the moving direction on the graphical user interface, and a further user input operation can make the operation identifier snap automatically onto the control closest to it in the moving direction. This saves the time and effort the user spends controlling the external input device, improves the precision and efficiency with which the operation identifier is controlled through the external input device, improves game interaction efficiency, and improves the player's game experience.

Description

Interaction method, interaction device, computer equipment and computer readable storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an interaction method, an interaction device, a computer device, and a computer readable storage medium.
Background
With the continuous development of computer and communication technology, terminals such as smart phones, tablet computers and notebook computers have become widely used and are now indispensable in daily life and work. To meet people's demand for entertainment, games that run on such terminals have been developed, for example role-playing games, tactical competitive games, shooting games, and massively multiplayer online role-playing games built on client or client-server architectures. In a game, the player manipulates a virtual character on the screen and can walk, run, jump, pick up props, fight and perform other related operations in the game scene from the first-person or third-person view of the character, experiencing the visual impact of the game in an immersive way. At present, players can play games through terminals such as mobile phones, computers and televisions, with the game picture displayed on the terminal device's display screen. For example, because a television terminal has a large, clear display, the game picture it presents is of high quality, and more and more players choose to play through a television terminal.
In the prior art, a player may play through an external input device such as a keyboard, a mouse and/or a gamepad. For example, after a gamepad or other external input device is connected, an operation identifier such as a virtual cursor is displayed on the graphical user interface provided by the television terminal's display screen, and the player can move the virtual cursor across the interface by moving the gamepad's joystick, thereby operating the interface through the gamepad. However, when the player uses the joystick to control the virtual cursor, for example to move it a long distance in order to select a control, the player must spend time and effort steering the cursor to the control's position. The precision with which the gamepad controls the virtual cursor is low, so the control efficiency of the virtual cursor is low and the game interactivity is poor.
Disclosure of Invention
The embodiment of the application provides an interaction method, an interaction device, computer equipment and a computer readable storage medium. A user can generate a user input operation through an external input device to control an operation identifier to move along a moving direction on a graphical user interface. While the operation identifier is moving, the user can generate a further user input operation through the external input device to make the operation identifier snap automatically onto the control closest to it in the moving direction. The user can thus quickly trigger, by a simple manual operation, the adsorption of the operation identifier onto the nearest control. This simplifies the steps needed to move the operation identifier, saves the time and effort the user spends controlling the external input device, improves the precision and efficiency with which the external input device controls the operation identifier, improves game interaction efficiency, and thereby improves the player's game experience.
The embodiment of the application provides an interaction method, which comprises the following steps:
displaying an operation identifier on a graphical user interface;
controlling the operation identifier to move on the graphical user interface according to a first user input operation;
determining a moving direction of the operation identifier on the graphical user interface;
and, while the first user input operation continues, moving the operation identifier onto a target function control in the moving direction in response to a second user input operation.
Correspondingly, the embodiment of the application also provides an interaction device, which comprises:
a display unit for displaying the operation identifier on the graphical user interface;
the control unit is used for controlling the operation identifier to move on the graphical user interface according to the first user input operation;
a determining unit, configured to determine a movement direction of the operation identifier in the graphical user interface;
and a moving unit, configured to respond to a second user input operation while the first user input operation continues, and move the operation identifier onto the target function control in the moving direction.
Accordingly, embodiments of the present application further provide a computer device, including a processor, a memory, and a computer program stored on the memory and capable of running on the processor, where the computer program when executed by the processor implements the steps of any of the interaction methods.
Accordingly, embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of any of the interaction methods.
The embodiment of the application provides an interaction method, an interaction device, computer equipment and a computer readable storage medium. An operation identifier is displayed on a graphical user interface; the operation identifier is then controlled to move on the graphical user interface according to a first user input operation; next, a moving direction of the operation identifier on the graphical user interface is determined; finally, while the first user input operation continues, the operation identifier is moved onto a target function control in the moving direction in response to a second user input operation. According to the embodiment of the application, the user can generate a user input operation through an external input device to control the operation identifier to move along the moving direction on the graphical user interface, and can generate a further user input operation while the identifier is moving to make it snap automatically onto the control closest to it in the moving direction. The user thus triggers, by a simple manual operation, the adsorption of the operation identifier onto the nearest control, which simplifies the moving steps, saves the user's time and effort in controlling the external input device, improves the control precision and efficiency of the external input device over the operation identifier, improves game interaction efficiency, and improves the player's game experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; other drawings may be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a schematic diagram of a game data processing system according to an embodiment of the present application.
Fig. 2 is a schematic flow chart of an interaction method provided in an embodiment of the present application.
Fig. 3 is a schematic view of a scenario of an interaction method provided in an embodiment of the present application.
Fig. 4 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 5 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 6 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 7 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 8 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 9 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 10 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 11 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 12 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 13 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 14 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 15 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 16 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 17 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 18 is another schematic view of an interaction method provided in an embodiment of the present application.
Fig. 19 is a schematic structural diagram of an interaction device according to an embodiment of the present application.
Fig. 20 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort fall within the scope of protection of the present application.
The embodiment of the application provides an interaction method, an interaction device, computer equipment and a computer readable storage medium. Specifically, the interaction method of the embodiment of the application may be performed by a computer device, where the computer device may be a terminal or a server. The terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC, Personal Computer), or a personal digital assistant (Personal Digital Assistant, PDA), and the terminal may further include a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs, big data and artificial intelligence platforms.
For example, when the interactive method is run on the terminal, the terminal device stores a game application and is used to present a virtual scene in a game screen. The terminal device is used for interacting with a user through a graphical user interface, for example, the terminal device downloads and installs a game application program and runs the game application program. The way in which the terminal device presents the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device, or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including game screens and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for running the game, generating the graphical user interface, responding to the operation instructions, and controlling the display of the graphical user interface on the touch display screen.
Referring to fig. 1, fig. 1 is a schematic view of a scenario of an interactive system according to an embodiment of the present application. The system may include at least one external input device, at least one terminal, at least one server, at least one database, and a network. The user can control a virtual cursor to move on the graphical user interface provided by the terminal through the external input device, and the terminal held by the user can connect to the servers of different games through the network. A terminal is any device having computing hardware capable of supporting and executing a software product corresponding to a game. In addition, when the system includes a plurality of terminals, a plurality of servers, and a plurality of networks, different terminals may be connected to each other through different networks and different servers. The external input device may be a mouse, a keyboard, a joystick, or the like, and the network may be a wireless network or a wired network; for example, the wireless network may be a wireless local area network (WLAN), a local area network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, or a 5G network. In addition, different terminals may connect to other terminals or to a server using their own Bluetooth network or hotspot network. For example, multiple users may be online through different terminals and connected and synchronized with each other through an appropriate network to support multiplayer games. In addition, the system may include multiple databases coupled to different servers, and information related to the game environment may be continuously stored in the databases while different users play multiplayer games online.
The embodiment of the application provides an interaction method which can be executed by a terminal or a server. The embodiments of the present application are described taking execution by a terminal as an example. The terminal may include a touch display screen and a processor (of course, the terminal may also use peripheral devices such as a mouse, a keyboard and a gamepad as input devices; the touch display screen is used here only as an example). The touch display screen is used to present a graphical user interface and to receive operation instructions generated by a user acting on the graphical user interface. When the user operates the graphical user interface through the touch display screen, the graphical user interface can control local content of the terminal in response to the received operation instructions, and can also control content of the opposite-end server in response to them. For example, the operation instructions generated by the user for the graphical user interface include an instruction for launching the game application, and the processor is configured to launch the game application after receiving that instruction. Further, the processor is configured to render and draw the graphical user interface associated with the game on the touch display screen. The touch display screen is a multi-touch-sensitive screen capable of sensing touch or slide operations performed simultaneously at multiple points on the screen. The user performs a touch operation on the graphical user interface with a finger, and when the graphical user interface detects the touch operation, it controls different virtual objects in the game's graphical user interface to perform actions corresponding to the touch operation.
It should be noted that the schematic view of the interactive system scenario shown in fig. 1 is merely an example. The interactive system and the scenario described in the embodiments of the present application are intended to describe the technical solutions of the embodiments more clearly and do not constitute a limitation on them; as a person of ordinary skill in the art will appreciate, with the emergence of new service scenarios the technical solutions provided in the embodiments of the present application are equally applicable to similar technical problems.
Based on the above problems, the embodiments of the present application provide an interaction method, an interaction device, a computer device, and a computer readable storage medium, which can improve the control precision and efficiency with which an external input device controls an operation identifier, and improve the interaction efficiency of a game. Detailed descriptions are given below; the order of the following descriptions is not intended to limit the preferred order of the embodiments.
The embodiment of the application provides an interaction method which can be executed by a terminal or a server; the embodiments are explained taking execution by a terminal as an example. The method may be performed by displaying an operation identifier on a graphical user interface; then controlling the operation identifier to move on the graphical user interface according to a first user input operation; next, determining a moving direction of the operation identifier on the graphical user interface; and finally, while the first user input operation continues, moving the operation identifier onto a target function control in the moving direction in response to a second user input operation. According to the embodiment of the application, the user can generate a user input operation through an external input device to control the operation identifier to move along the moving direction on the graphical user interface, and can generate a further user input operation while the identifier is moving to make it snap automatically onto the control closest to it in the moving direction. The user thus triggers, by a simple manual operation, the adsorption of the operation identifier onto the nearest control, which simplifies the moving steps, saves the user's time and effort in controlling the external input device, improves the control precision and efficiency of the external input device over the operation identifier, improves game interaction efficiency, and improves the player's game experience.
Referring to fig. 2, fig. 2 is a schematic flow chart of an interaction method according to an embodiment of the present application, and a specific flow of the interaction method may be as follows:
101, displaying an operation identifier on a graphical user interface.
For example, referring to fig. 3, a graphical user interface 10 may be provided through a computer device, and when the computer device is connected to an external input device, an operation identifier 11 may be displayed on the graphical user interface 10, where a first function control 12 and a second function control 13 are also displayed.
In this embodiment of the present application, the operation identifier may be a cursor identifier used by a gamepad to control a cursor control in the target terminal device, or a cursor identifier used by a mouse to control a cursor control in the target terminal device.
102, controlling the operation identifier to move on the graphical user interface according to the first user input operation.
In this embodiment of the present application, the user may trigger a first user input operation through an external input device, and the computer device may control the operation identifier 11 to move on the graphical user interface 10 according to the first user input operation.
103, determining the moving direction of the operation identifier on the graphical user interface.
For example, referring to fig. 4, in an embodiment of the present application, the computer device may determine, according to the first user input operation, a moving direction (a direction indicated by an arrow in fig. 4) of the operation identifier 11 in the graphical user interface 10, so as to determine a moving trend of the operation identifier 11.
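As a concrete illustration of this step, the moving direction can be derived by normalizing the displacement reported by the input device (e.g. a joystick deflection). The following is a minimal sketch; the function name and the displacement convention are assumptions for illustration, not part of the patent:

```python
import math

def moving_direction(dx, dy):
    """Normalize the displacement (dx, dy) reported by the external input
    device into a unit direction vector for the operation identifier.

    Returns None when there is no displacement, i.e. no movement trend."""
    length = math.hypot(dx, dy)
    if length == 0:
        return None
    return (dx / length, dy / length)
```

The returned unit vector is what later steps use to cast the target extension line.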
104, while the first user input operation continues, responding to a second user input operation and moving the operation identifier onto the target function control in the moving direction.
Specifically, while the first user input operation continues, the operation identifier is moved onto a target function control in the moving direction in response to a second user input operation, and the target function control is taken as the operation target.
In this embodiment of the present application, while the computer device is responding to the continuing first user input operation, if a second user input operation performed by the user through the external input device is detected, the computer device responds to it and moves the operation identifier 11 onto the target function control in the moving direction, implementing a shortcut snap operation.
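The interplay between the two input operations can be sketched as a small state machine: the snap is honoured only while the first input operation is still continuous. All names below are illustrative assumptions, not identifiers taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class CursorState:
    x: float = 0.0
    y: float = 0.0
    first_input_active: bool = False  # True while the first input operation continues

def on_first_input(state, dx, dy):
    """First user input operation: move the operation identifier freely."""
    state.first_input_active = True
    state.x += dx
    state.y += dy

def on_first_input_released(state):
    state.first_input_active = False

def on_second_input(state, snap_target):
    """Second user input operation: snap onto the target function control,
    but only while the first input operation is still continuous."""
    if state.first_input_active and snap_target is not None:
        state.x, state.y = snap_target
```

A second input arriving after the first input has ended is simply ignored, matching the "when the first user input operation is continuous" condition of step 104.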
In light of the foregoing description, the following examples further illustrate the interaction method of the present application; specific embodiments are described below.
In an embodiment, the step of "while the first user input operation continues, responding to the second user input operation and moving the operation identifier onto the target function control in the moving direction" may include:
while the first user input operation continues, responding to a second user input operation and determining a target extension line based on the current position of the operation identifier and the moving direction;
acquiring a functional control positioned on the target extension line, and determining a target functional control based on the relative position relation between the functional control and the operation identifier;
and moving the operation identifier to the target function control.
For example, referring to fig. 5 and fig. 6 together, a graphical user interface 10 may be provided through a computer device. When the computer device is connected to an external input device, an operation identifier 11 may be displayed on the graphical user interface 10, which also displays a first function control 12 and a second function control 13. While the first user input operation continues, the computer device may, in response to the second user input operation, determine a target extension line (indicated by the dashed line in fig. 5) based on the current position of the operation identifier 11 and the moving direction (indicated by the arrow in fig. 5). The computer device may then obtain the function controls located on the target extension line and determine the target function control based on the relative positional relationship between each function control and the operation identifier 11.
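One way to realise the target extension line is to treat it as a ray cast from the current position of the operation identifier along the moving direction, and to test which function controls (modelled here as axis-aligned rectangles) the ray passes through. This is a sketch under those modelling assumptions; the patent does not prescribe a particular geometry test:

```python
def on_extension_line(cursor, direction, rect, eps=1e-9):
    """Return True if the ray from `cursor` along `direction` intersects the
    axis-aligned rectangle `rect` = (left, top, right, bottom).

    Standard slab method: clip the ray parameter t >= 0 against the x and y
    extents of the rectangle; a non-empty interval means a hit."""
    cx, cy = cursor
    dx, dy = direction
    t_min, t_max = 0.0, float("inf")
    for origin, d, lo, hi in ((cx, dx, rect[0], rect[2]),
                              (cy, dy, rect[1], rect[3])):
        if abs(d) < eps:
            # Ray is parallel to this axis: must already lie within the slab.
            if not (lo <= origin <= hi):
                return False
        else:
            t1, t2 = (lo - origin) / d, (hi - origin) / d
            t_min = max(t_min, min(t1, t2))
            t_max = min(t_max, max(t1, t2))
    return t_min <= t_max
```

Controls behind the cursor are rejected because the interval starts at t = 0, so only controls ahead in the moving direction qualify.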
In another embodiment, the step of "obtaining the function control located on the target extension line, determining the target function control based on the relative positional relationship between the function control and the operation identifier", the method may include:
acquiring a plurality of functional controls positioned on the target extension line;
and determining a function control closest to the display position of the operation identifier from the plurality of function controls based on the relative position relation between each function control and the operation identifier, and taking the function control as a target function control.
For example, referring to fig. 7 and fig. 8 together, a graphical user interface 10 may be provided through a computer device. When the computer device is connected to an external input device, an operation identifier 11 may be displayed on the graphical user interface 10, which also displays a first function control 12 and a second function control 13. The computer device may obtain the first function control 12 and the second function control 13 located on the target extension line, and then, based on the relative positional relationship between each function control and the operation identifier 11, determine the function control closest to the display position of the operation identifier 11 as the target function control. Here, although the first function control 12 and the second function control 13 are both on the target extension line, the relative distance between the second function control 13 and the operation identifier 11 is smaller than that between the first function control 12 and the operation identifier 11; that is, the second function control 13 is the function control closest to the display position of the operation identifier 11, so the second function control 13 is taken as the target function control and the operation identifier 11 is automatically moved onto the second function control 13.
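Choosing among several candidate controls reduces to a minimum-distance selection. A minimal sketch, assuming each control is represented by its centre point (an illustrative simplification):

```python
import math

def closest_control(cursor, controls):
    """Among candidate controls (mapping: name -> centre point) already known
    to lie on the target extension line, pick the one whose display position
    is nearest the operation identifier."""
    cx, cy = cursor
    return min(controls, key=lambda name: math.hypot(controls[name][0] - cx,
                                                     controls[name][1] - cy))
```

In the fig. 7 example, the second function control would win because its distance to the operation identifier is smaller.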
Optionally, the step of acquiring the function control located on the target extension line and determining the target function control based on the relative positional relationship between the function control and the operation identifier may include:
acquiring a first function control and a second function control which are positioned on the target extension line;
and if the distance between the first functional control and the operation identifier is the same as the distance between the second functional control and the operation identifier, determining a target functional control based on preset selection logic, the display position of the first functional control and the display position of the second functional control.
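The equidistant case can be resolved with a simple geometric tie-break. The sketch below is illustrative only, assuming the same hypothetical `center` representation and hard-coding one possible preset selection logic (prefer the control on the left of the moving direction, in y-up mathematical coordinates; a screen with a y-down axis would flip the sign):

```python
def break_tie(identifier_pos, direction, first, second):
    """When two controls are equidistant from the operation identifier,
    apply a preset selection logic based on their display positions."""
    ix, iy = identifier_pos
    dx, dy = direction

    def cross(control):
        # 2-D cross product of the moving direction with the vector from
        # the identifier to the control; > 0 means the control lies to the
        # left of the direction (assuming y-up coordinates).
        vx = control["center"][0] - ix
        vy = control["center"][1] - iy
        return dx * vy - dy * vx

    # Preset selection logic from the example below: prefer the left side.
    return first if cross(first) > 0 else second
```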
For example, referring to fig. 9 and fig. 10 together, a graphical user interface 10 may be provided through a computer device, when the computer device is connected to an external input device, an operation identifier 11 may be displayed on the graphical user interface 10, the graphical user interface 10 further displays a first function control 12 and a second function control 13, the computer device may obtain the first function control 12 and the second function control 13 located on the target extension line, where the first function control 12 and the second function control 13 are both located on the target extension line, and a relative distance between the first function control 12 and the operation identifier 11 and a relative distance between the second function control 13 and the operation identifier 11 are the same, and at this time, the preset selection logic may be: the function control located at the left side of the target extension line is selected as the target function control, so that the first function control 12 can be determined as the target function control, and the operation identifier 11 can be automatically moved to the first function control 12.
In an embodiment, the method further comprises:
if no functional control exists on the target extension line, a judgment range is generated based on the target extension line, the functional control located within the judgment range is acquired, and the functional control within the judgment range is determined as the target functional control.
Specifically, in the step of determining the functional control within the determination range as the target functional control, the method may include:
if the judging range comprises a plurality of functional controls, determining the functional control closest to the target extension line from the plurality of functional controls based on the relative position relation between each functional control and the target extension line, and taking the functional control closest to the target extension line as a target functional control.
For example, referring to fig. 11, 12 and 13 together, while the first user input operation continues, the computer device responds to the second user input operation and determines, based on the current position of the operation identifier 11 and the moving direction, whether a function control exists on the target extension line. In this case, no function control is present on the target extension line, so a preset judgment range 14 is generated based on the display position of the target extension line, the function controls located within the preset judgment range 14 are obtained, and the first function control 12, which is closest to the target extension line, is taken as the target function control.
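One plausible shape for this fallback judgment range is a band around the extension line. The sketch below is an assumption-laden illustration (same hypothetical `center` field, band half-width chosen arbitrarily): candidates ahead of the identifier within the band are collected, and the one nearest the line wins:

```python
import math

def find_in_band(identifier_pos, direction, controls, band_half_width=50.0):
    """Fallback when no control sits exactly on the extension line:
    widen the line into a band (the judgment range) and return the
    control whose center is nearest to the line, or None."""
    ix, iy = identifier_pos
    dx, dy = direction
    norm = math.hypot(dx, dy)
    dx, dy = dx / norm, dy / norm
    best, best_off = None, band_half_width
    for control in controls:
        cx, cy = control["center"]
        t = (cx - ix) * dx + (cy - iy) * dy
        if t <= 0:
            continue  # only consider controls ahead of the identifier
        # Perpendicular distance from the control center to the line.
        off = abs((cx - ix) * dy - (cy - iy) * dx)
        if off <= best_off:
            best, best_off = control, off
    return best
```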
In another embodiment, after displaying the operation identifier on the graphical user interface, the method further comprises:
responding to a third user input operation, and generating a target judgment range based on the current display position of the operation identifier and a preset distance;
acquiring a functional control located in the target judgment range, and determining at least one functional control in the target judgment range as a target functional control;
and moving the operation identifier to the target function control.
For example, referring to fig. 14 and 15 together, a graphical user interface 10 may be provided through a computer device, when the computer device is connected with an external input device, an operation identifier 11 may be displayed on the graphical user interface 10, the graphical user interface 10 further displays a first function control 12 and a second function control 13, the computer device may generate, in response to a third user input operation, a target judgment range 15 based on a current display position of the operation identifier 11 and a preset distance, and acquire a function control located in the target judgment range 15, where the function control located in the target judgment range 15 is the second function control 13, so that the second function control 13 is used as a target function control, and the operation identifier 11 is moved to the second function control 13.
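A natural reading of "current display position plus a preset distance" is a circular judgment range centered on the identifier. The snippet below is only a sketch under that assumption, again with an illustrative `center` field:

```python
import math

def controls_within_radius(identifier_pos, controls, preset_distance):
    """Build the target judgment range as a circle of radius
    preset_distance around the operation identifier and return the
    controls whose centers fall inside it."""
    ix, iy = identifier_pos
    return [c for c in controls
            if math.hypot(c["center"][0] - ix,
                          c["center"][1] - iy) <= preset_distance]
```

In the example with figs. 14 and 15, only the second function control 13 would fall inside such a circle, so it becomes the target.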
Optionally, in the step of determining at least one functional control within the target judgment range as a target functional control, the method may include:
and selecting one functional control from the target judgment range as the target functional control based on the distance between the functional control in the target judgment range and the operation identifier and/or based on the size of the space occupied by the functional control in the target judgment range.
For example, when the first functional control and the second functional control exist in the target judgment range at the same time, determining the distance between the first functional control and the second functional control and the operation identifier, and if the distance between the first functional control and the operation identifier is closer than the distance between the second functional control and the operation identifier, taking the first functional control as the target functional control.
For another example, when the first functional control and the second functional control exist in the target judgment range at the same time, determining the space size occupied by the first functional control and the second functional control in the target judgment range respectively, and if the space size occupied by the first functional control in the target judgment range is larger than the space size occupied by the second functional control in the target judgment range, taking the first functional control as the target functional control.
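Both selection criteria described above fit a single helper. The sketch below assumes hypothetical `center`, `width`, and `height` fields on each candidate; the `prefer` switch is an illustrative stand-in for however a real implementation would configure the criterion:

```python
import math

def pick_target(identifier_pos, candidates, prefer="distance"):
    """Resolve several candidates inside the judgment range: either the
    control nearest the operation identifier, or the control occupying
    the largest area within the range."""
    ix, iy = identifier_pos
    if prefer == "distance":
        return min(candidates,
                   key=lambda c: math.hypot(c["center"][0] - ix,
                                            c["center"][1] - iy))
    # prefer == "size": choose the control with the largest occupied area.
    return max(candidates, key=lambda c: c["width"] * c["height"])
```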
Further, after the step of "moving the operation identifier onto the target function control in the moving direction", the method may include:
responding to a fourth user input operation, and generating a specified judgment range based on the display position of the target functional control;
acquiring the functional control within the specified judging range, and determining at least one functional control within the specified judging range as a new target functional control;
and moving the operation identifier to the new target functional control.
Specifically, the computer device may generate a specified judgment range based on the display position of the target function control in response to a fourth user input operation; then acquire the functional control located within the specified judgment range, and determine at least one functional control within the specified judgment range as a new target functional control; and finally move the operation identifier onto the new target functional control and take the new target functional control as the operation target.
For example, referring to fig. 16, the current display position of the operation identifier 11 (as shown by the operation identifier 11 with a solid line in fig. 16) is displayed on the second function control 13, at this time, the computer device generates a specified judgment range based on the display position of the target function control in response to the fourth user input operation, acquires the first function control 12 located within the specified judgment range, determines the first function control 12 as a new target function control, and moves the operation identifier 11 onto the first function control 12 (as shown by the operation identifier 11 with a dotted line in fig. 16).
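The repeated-press behaviour above amounts to hopping from the current control to the nearest neighbour inside a range built around it. The sketch below assumes a circular specified judgment range and the same illustrative `center` representation; if no other control falls inside the range, the identifier simply stays where it is:

```python
import math

def next_target(current_control, controls, specified_range):
    """After snapping, a further press builds a specified judgment range
    around the current target control and hops the identifier to the
    nearest other control inside it (or stays put if there is none)."""
    cx, cy = current_control["center"]
    candidates = [c for c in controls
                  if c is not current_control
                  and math.hypot(c["center"][0] - cx,
                                 c["center"][1] - cy) <= specified_range]
    if not candidates:
        return current_control
    return min(candidates,
               key=lambda c: math.hypot(c["center"][0] - cx,
                                        c["center"][1] - cy))
```

Pressing again with the new target as `current_control` would hop onward, which matches the fig. 16 example of moving from the second function control 13 back to the first function control 12.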
In light of the foregoing, the interaction method of the present application will be further described below by way of example with reference to fig. 8 and 17, taking the external input device as an example of the game pad 20, where the game pad 20 includes a joystick 21, the joystick 21 may be rotated through 360° to control the movement of the operation identifier 11, and the joystick 21 may also be pressed. A specific embodiment of the interaction method is as follows:
(1) In this embodiment of the present application, a graphical user interface 10 may be provided through a computer device. When the computer device is connected to the game pad 20, an operation identifier 11 may be displayed on the graphical user interface 10, where the graphical user interface 10 further displays a first function control 12 and a second function control 13. At this time, the user may dial the game rocker 21 in a target direction (such as the direction indicated by the long arrow on the game rocker 21 in fig. 17) to generate a first user input operation; the computer device controls the operation identifier 11 to move on the graphical user interface 10 according to the first user input operation, and may determine, according to the first user input operation, the moving direction of the operation identifier 11 on the graphical user interface 10 (such as the direction indicated by the arrow on the operation identifier 11 in fig. 17), so as to determine the moving trend of the operation identifier 11. Then, the user may generate a second user input operation by pressing the game rocker 21 (as indicated by the short arrow on the game rocker 21 in fig. 17) while keeping the game rocker 21 dialed toward the target direction (as indicated by the long arrow on the game rocker 21 in fig. 17). While the first user input operation continues, the computer device may, in response to the second user input operation, obtain the first function control 12 and the second function control 13 located on the target extension line, and then determine, from the plurality of function controls, the function control closest to the display position of the operation identifier 11 as the target function control. Although the first function control 12 and the second function control 13 are both on the target extension line, the relative distance between the second function control 13 and the operation identifier 11 is smaller than the relative distance between the first function control 12 and the operation identifier 11; that is, the second function control 13 is the function control closest to the display position of the operation identifier 11, so the second function control 13 is taken as the target function control and the operation identifier 11 is automatically moved onto the second function control 13.
(2) The user may press the game rocker 21 (as indicated by a short arrow on the game rocker 21 in fig. 17) to generate a fourth user input operation, the computer device generates a specified judgment range based on the display position of the target function control in response to the fourth user input operation, then obtains a function control located in the specified judgment range, determines a new target function control based on the relative positional relationship between the function control and the target function control, and finally moves the operation identifier 11 to the new target function control.
In light of the foregoing, the interaction method of the present application will be further described below by way of another example with reference to fig. 15 and 18, taking the external input device as an example of the game pad 20, where the game pad 20 includes a joystick 21, the joystick 21 may be rotated through 360° to control the movement of the operation identifier 11, and the joystick 21 may also be pressed. Another specific embodiment of the interaction method is as follows:
(1) In this embodiment of the present application, a graphical user interface 10 may be provided through a computer device, when the computer device is connected to a game pad 20, an operation identifier 11 may be displayed on the graphical user interface 10, where the graphical user interface 10 further displays a first function control 12 and a second function control 13, at this time, a user may perform a pressing operation on a game rocker 21 (as indicated by a short arrow on the game rocker 21 in fig. 18) to generate a third user input operation, and the computer device may respond to the third user input operation, generate a target judgment range 15 based on a current display position of the operation identifier 11 and a preset distance, and obtain a function control located in the target judgment range 15, at this time, the function control located in the target judgment range 15 is the second function control 13, so that the second function control 13 is used as a target function control, and move the operation identifier 11 to the second function control 13.
(2) The user may press the game rocker 21 (as indicated by a short arrow on the game rocker 21 in fig. 18) to generate a fourth user input operation, the computer device generates a specified judgment range based on the display position of the target function control in response to the fourth user input operation, then obtains a function control located in the specified judgment range, determines a new target function control based on the relative positional relationship between the function control and the target function control, and finally moves the operation identifier 11 to the new target function control.
In summary, the embodiments of the present application provide an interaction method: an operation identifier is displayed on a graphical user interface; the operation identifier is then controlled to move on the graphical user interface according to a first user input operation; next, the moving direction of the operation identifier on the graphical user interface is determined; and finally, while the first user input operation continues, the operation identifier is moved onto the target function control in the moving direction in response to a second user input operation. According to the embodiments of the application, the user can generate a user input operation through the external input device to control the operation identifier to move on the graphical user interface along the moving direction, and can generate a further user input operation through the external input device while the operation identifier is moving, so that the operation identifier automatically snaps to the control closest to it in the moving direction. This eliminates the manual operation otherwise required to move the operation identifier onto the nearest control, simplifies the steps for moving the operation identifier, saves the time and effort the user spends operating the external input device, improves the precision and efficiency with which the external input device controls the operation identifier, and thereby improves game interaction efficiency and the player's game experience.
In order to facilitate better implementation of the interaction method provided by the embodiment of the application, the embodiment of the application also provides an interaction device based on the interaction method. Where the meaning of a noun is the same as in the interaction method described above, specific implementation details may be referred to the description in the method embodiment.
Referring to fig. 19, fig. 19 is a schematic structural diagram of an interaction device according to an embodiment of the present application, where the interaction device includes:
a display unit 201 for displaying an operation identifier on a graphical user interface;
a control unit 202 for controlling the operation identifier to move on the graphical user interface according to a first user input operation;
a determining unit 203, configured to determine a movement direction of the operation identifier in the graphical user interface;
and the mobile unit 204 is used for responding to the second user input operation and moving the operation identifier to the target function control in the moving direction when the first user input operation is continuous.
In some embodiments, the interaction means comprises:
a first response subunit, configured to determine, in response to a second user input operation when the first user input operation continues, a target extension line based on a current position of the operation identifier and the movement direction;
The first acquisition subunit is used for acquiring the functional control on the target extension line and determining a target functional control based on the relative position relation between the functional control and the operation identifier;
and the first moving subunit is used for moving the operation identifier to the target function control.
In some embodiments, the interaction means comprises:
the second acquisition subunit is used for acquiring a plurality of function controls positioned on the target extension line;
the first determining subunit is configured to determine, from the plurality of functional controls, a functional control closest to a display position of the operation identifier as a target functional control based on a relative positional relationship between each functional control and the operation identifier.
In some embodiments, the interaction means comprises:
the third acquisition subunit is used for acquiring the first functional control and the second functional control which are positioned on the target extension line;
and the second determining subunit is configured to determine, if the distance between the first functional control and the operation identifier is the same as the distance between the second functional control and the operation identifier, a target functional control based on a preset selection logic, a display position of the first functional control, and a display position of the second functional control.
In some embodiments, the interaction means comprises:
and the third determination subunit is configured to generate a judgment range based on the target extension line if no functional control exists on the target extension line, acquire a functional control located in the judgment range, and determine the functional control in the judgment range as a target functional control.
In some embodiments, the interaction means comprises:
and the fourth determination subunit is configured to determine, if the judging range includes a plurality of functional controls, a functional control closest to the target extension line from the plurality of functional controls based on a relative positional relationship between each functional control and the target extension line, as a target functional control.
In some embodiments, the interaction means comprises:
the first generation subunit is used for responding to a third user input operation and generating a target judgment range based on the current display position of the operation identifier and a preset distance;
a fourth determining subunit, configured to obtain a functional control located in the target determination range, and determine at least one functional control in the target determination range as a target functional control;
and the second moving subunit is used for moving the operation identifier to the target function control.
In some embodiments, the interaction means comprises:
and the selecting subunit is used for selecting one functional control from the target judging range as the target functional control based on the distance between the functional control in the target judging range and the operation identifier and/or based on the size of the space occupied by the functional control in the target judging range.
In some embodiments, the interaction means comprises:
the second response subunit is used for responding to a fourth user input operation and generating a specified judging range based on the display position of the target functional control;
a fifth determining subunit, configured to obtain a functional control located in the specified determination range, and determine at least one functional control in the specified determination range as a new target functional control;
and the third moving subunit is used for moving the operation identifier to the new target functional control.
The embodiment of the application discloses an interaction device, where the display unit 201 displays an operation identifier on a graphical user interface; the control unit 202 controls the operation identifier to move on the graphical user interface according to a first user input operation; the determining unit 203 determines the moving direction of the operation identifier on the graphical user interface; and the mobile unit 204 moves the operation identifier onto the target function control in the moving direction in response to a second user input operation while the first user input operation continues. According to the embodiments of the application, the user can generate a user input operation through the external input device to control the operation identifier to move on the graphical user interface along the moving direction, and can generate a further user input operation through the external input device while the operation identifier is moving, so that the operation identifier automatically snaps to the control closest to it in the moving direction. This eliminates the manual operation otherwise required to move the operation identifier onto the nearest control, simplifies the steps for moving the operation identifier, saves the time and effort the user spends operating the external input device, improves the precision and efficiency with which the external input device controls the operation identifier, and thereby improves game interaction efficiency and the player's game experience.
Correspondingly, the embodiment of the application also provides a computer device, which may be a terminal or a server, where the terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC, Personal Computer), a personal digital assistant (Personal Digital Assistant, PDA), or the like. As shown in fig. 20, fig. 20 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 300 includes a processor 301 having one or more processing cores, a memory 302 having one or more computer-readable storage media, and a computer program stored on the memory 302 and executable on the processor. The processor 301 is electrically connected to the memory 302. It will be appreciated by those skilled in the art that the computer device structure shown in the figures is not limiting of the computer device, which may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
Processor 301 is a control center of computer device 300 and utilizes various interfaces and lines to connect various portions of the overall computer device 300, and to perform various functions of computer device 300 and process data by running or loading software programs and/or modules stored in memory 302 and invoking data stored in memory 302, thereby performing overall monitoring of computer device 300.
In the embodiment of the present application, the processor 301 in the computer device 300 loads the instructions corresponding to the processes of one or more application programs into the memory 302 according to the following steps, and the processor 301 executes the application programs stored in the memory 302, so as to implement various functions:
displaying an operation identifier on a graphical user interface;
controlling the operation mark to move on a graphical user interface according to a first user input operation;
determining a moving direction of the operation mark on the graphical user interface;
and when the first user input operation is continuous, responding to a second user input operation, and moving the operation identification to the target function control in the moving direction.
In an embodiment, the moving the operation identifier onto the target function control in the moving direction in response to the second user input operation while the first user input operation is continuous includes:
when the first user input operation is continuous, responding to a second user input operation, and determining a target extension line based on the current position of the operation mark and the moving direction;
acquiring a functional control positioned on the target extension line, and determining a target functional control based on the relative position relation between the functional control and the operation identifier;
And moving the operation identifier to the target function control.
In an embodiment, the obtaining the function control located on the target extension line, and determining the target function control based on the relative positional relationship between the function control and the operation identifier includes:
acquiring a plurality of functional controls positioned on the target extension line;
and determining a function control closest to the display position of the operation identifier from the plurality of function controls based on the relative position relation between each function control and the operation identifier, and taking the function control as a target function control.
In an embodiment, the obtaining the function control located on the target extension line, and determining the target function control based on the relative positional relationship between the function control and the operation identifier includes:
acquiring a first function control and a second function control which are positioned on the target extension line;
and if the distance between the first functional control and the operation identifier is the same as the distance between the second functional control and the operation identifier, determining a target functional control based on preset selection logic, the display position of the first functional control and the display position of the second functional control.
In an embodiment, the method further comprises:
if no functional control exists on the target extension line, a judgment range is generated based on the target extension line, the functional control located within the judgment range is acquired, and the functional control within the judgment range is determined as the target functional control.
In an embodiment, the determining the function control within the judging range as the target function control includes:
if the judging range comprises a plurality of functional controls, determining the functional control closest to the target extension line from the plurality of functional controls based on the relative position relation between each functional control and the target extension line, and taking the functional control closest to the target extension line as a target functional control.
In an embodiment, after displaying the operation identifier on the graphical user interface, the method further comprises:
responding to a third user input operation, and generating a target judgment range based on the current display position of the operation identifier and a preset distance;
acquiring a functional control located in the target judgment range, and determining at least one functional control in the target judgment range as a target functional control;
and moving the operation identifier to the target function control.
In an embodiment, determining at least one functional control within the target judgment range as a target functional control includes:
And selecting one functional control from the target judgment range as the target functional control based on the distance between the functional control in the target judgment range and the operation identifier and/or based on the size of the space occupied by the functional control in the target judgment range.
In an embodiment, after moving the operation identifier onto the target function control in the moving direction, the method further includes:
responding to a fourth user input operation, and generating a specified judgment range based on the display position of the target functional control;
acquiring the functional control within the specified judging range, and determining at least one functional control within the specified judging range as a new target functional control;
and moving the operation identifier to the new target functional control.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Optionally, as shown in fig. 20, the computer device 300 further includes: a touch display 303, a radio frequency circuit 304, an audio circuit 305, an input unit 306, and a power supply 307. The processor 301 is electrically connected to the touch display 303, the radio frequency circuit 304, the audio circuit 305, the input unit 306, and the power supply 307, respectively. Those skilled in the art will appreciate that the computer device structure shown in FIG. 20 is not limiting of the computer device and may include more or fewer components than shown, or may be combined with certain components, or a different arrangement of components.
The touch display 303 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display 303 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as the various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a liquid crystal display (LCD, Liquid Crystal Display), an organic light-emitting diode (OLED, Organic Light-Emitting Diode) display, or the like. The touch panel may be used to collect touch operations by the user on or near it (such as operations by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and generate corresponding operation instructions, which in turn execute corresponding programs. Alternatively, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the touch point coordinates to the processor 301, and can also receive and execute commands sent from the processor 301. The touch panel may overlay the display panel; upon detecting a touch operation on or near it, the touch panel passes the operation to the processor 301 to determine the type of touch event, and the processor 301 then provides a corresponding visual output on the display panel in accordance with the type of touch event.
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display 303 to implement the input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions respectively. That is, the touch display 303 may also implement an input function as part of the input unit 306.
In the present embodiment, a graphical user interface is generated on the touch display 303 by the processor 301 executing a gaming application. The touch display 303 is used for presenting the graphical user interface and receiving operation instructions generated by the user acting on the graphical user interface.
The radio frequency circuitry 304 may be used to transmit and receive radio frequency signals so as to establish wireless communication with a network device or other computer devices.
The audio circuit 305 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. On one hand, the audio circuit 305 may convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts collected sound signals into electrical signals, which are received by the audio circuit 305 and converted into audio data; the audio data are then output to the processor 301 for processing and sent, for example, to another computer device via the radio frequency circuit 304, or output to the memory 302 for further processing. The audio circuit 305 may also include an earphone jack to provide communication between peripheral earphones and the computer device.
The input unit 306 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 307 is used to power the various components of the computer device 300. Alternatively, the power supply 307 may be logically connected to the processor 301 through a power management system, so that functions such as charging, discharging, and power consumption management are performed through the power management system. The power supply 307 may further include one or more components such as a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown in fig. 20, the computer device 300 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which will not be described herein.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment displays an operation identifier on the graphical user interface; then controls the operation identifier to move on the graphical user interface according to a first user input operation; next determines the moving direction of the operation identifier on the graphical user interface; and finally, while the first user input operation continues, moves the operation identifier onto the target function control in the moving direction in response to a second user input operation. In the embodiment of the present application, the user can generate a user input operation through an external input device to control the operation identifier to move on the graphical user interface along the moving direction, and can generate another user input operation through the external input device while the operation identifier is moving, so that the operation identifier automatically snaps onto the control closest to it in the moving direction. This replaces the manual operations the user would otherwise perform to move the operation identifier onto the closest control, simplifies the steps for moving the operation identifier, saves the time and effort the user spends operating the external input device, improves the precision and efficiency with which the external input device controls the operation identifier, improves game interaction efficiency, and improves the player's game experience.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer readable storage medium having stored therein a plurality of computer programs that can be loaded by a processor to perform steps in any of the interaction methods provided by embodiments of the present application. For example, the computer program may perform the steps of:
displaying an operation identifier on a graphical user interface;
controlling the operation identifier to move on a graphical user interface according to a first user input operation;
determining a moving direction of the operation identifier on the graphical user interface;
and when the first user input operation is continuous, responding to a second user input operation, and moving the operation identifier to the target function control in the moving direction.
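As an illustration only (not part of the claimed method), determining the moving direction in the steps above can be sketched as follows. The function name, the screen coordinate convention, and the idea of sampling two successive display positions of the operation identifier are assumptions made for this sketch:

```python
import math

def movement_direction(prev_pos, cur_pos):
    # Derive the operation identifier's moving direction as a unit vector
    # from two successively sampled display positions (hypothetical scheme).
    dx, dy = cur_pos[0] - prev_pos[0], cur_pos[1] - prev_pos[1]
    length = math.hypot(dx, dy)
    if length == 0:
        return None  # the identifier has not moved, so there is no direction
    return (dx / length, dy / length)
```

A real implementation might instead read the direction straight from the joystick axes of the external input device; sampling positions is just one way to obtain it.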
In an embodiment, the moving the operation identifier onto the target function control in the moving direction in response to the second user input operation while the first user input operation is continuous includes:
when the first user input operation is continuous, responding to a second user input operation, and determining a target extension line based on the current position of the operation identifier and the moving direction;
acquiring a functional control positioned on the target extension line, and determining a target functional control based on the relative position relation between the functional control and the operation identifier;
and moving the operation identifier to the target function control.
In an embodiment, the obtaining the function control located on the target extension line, and determining the target function control based on the relative positional relationship between the function control and the operation identifier includes:
acquiring a plurality of functional controls positioned on the target extension line;
and determining a function control closest to the display position of the operation identifier from the plurality of function controls based on the relative position relation between each function control and the operation identifier, and taking the function control as a target function control.
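A minimal sketch of this nearest-control selection, assuming controls are represented by their center points in screen coordinates and that "on the extension line" is tested with a small perpendicular tolerance `tol` (both assumptions of this sketch, not of the patent):

```python
def nearest_control_on_line(identifier, direction, controls, tol=1.0):
    # Among controls lying (within tol) on the ray from `identifier` along
    # the unit vector `direction`, return the one closest to the identifier.
    best, best_dist = None, float("inf")
    for c in controls:
        vx, vy = c[0] - identifier[0], c[1] - identifier[1]
        proj = vx * direction[0] + vy * direction[1]  # distance along the ray
        if proj <= 0:
            continue  # behind the identifier, not in the moving direction
        perp = abs(vx * direction[1] - vy * direction[0])  # distance from line
        if perp <= tol and proj < best_dist:
            best, best_dist = c, proj
    return best
```

In practice controls occupy rectangles rather than points, so a production version would intersect the ray with control bounds instead of comparing centers.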
In an embodiment, the obtaining the function control located on the target extension line, and determining the target function control based on the relative positional relationship between the function control and the operation identifier includes:
acquiring a first function control and a second function control which are positioned on the target extension line;
and if the distance between the first functional control and the operation identifier is the same as the distance between the second functional control and the operation identifier, determining a target functional control based on preset selection logic, the display position of the first functional control and the display position of the second functional control.
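The "preset selection logic" is left open by the text; one plausible instance, shown purely for illustration, is to break a distance tie by display position, preferring the control drawn higher on screen and then the one further left:

```python
def choose_on_tie(first, second):
    # Hypothetical preset selection logic: on equal distance to the
    # operation identifier, prefer the control with the smaller y
    # (higher on screen), then the smaller x (further left).
    return min((first, second), key=lambda c: (c[1], c[0]))
```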
In an embodiment, the method further comprises:
if no functional control exists on the target extension line, generating a judgment range based on the target extension line, acquiring the functional control within the judgment range, and determining the functional control within the judgment range as the target functional control.
In an embodiment, the determining the functional control within the judgment range as the target functional control includes:
if the judgment range comprises a plurality of functional controls, determining the functional control closest to the target extension line from the plurality of functional controls based on the relative position relation between each functional control and the target extension line, and taking the functional control closest to the target extension line as the target functional control.
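One way to realize such a judgment range, sketched here under the assumption (not stated in the text) that it is a corridor of fixed half-width around the extension line, with the control nearest the line winning:

```python
def snap_in_corridor(identifier, direction, controls, half_width=40.0):
    # When no control lies exactly on the extension line, widen the line
    # into a corridor of `half_width` pixels on each side and pick the
    # control closest to the line itself. `half_width` is an assumed
    # tunable, not a value from the patent.
    best, best_perp = None, float("inf")
    for c in controls:
        vx, vy = c[0] - identifier[0], c[1] - identifier[1]
        proj = vx * direction[0] + vy * direction[1]
        if proj <= 0:
            continue  # only consider controls ahead, in the moving direction
        perp = abs(vx * direction[1] - vy * direction[0])
        if perp <= half_width and perp < best_perp:
            best, best_perp = c, perp
    return best
```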
In an embodiment, after displaying the operation identifier on the graphical user interface, the method further comprises:
responding to a third user input operation, and generating a target judgment range based on the current display position of the operation identifier and a preset distance;
acquiring a functional control located in the target judgment range, and determining at least one functional control in the target judgment range as a target functional control;
and moving the operation identifier to the target function control.
In an embodiment, determining at least one functional control within the target judgment range as a target functional control includes:
and selecting one functional control from the target judgment range as the target functional control based on the distance between the functional control in the target judgment range and the operation identifier and/or based on the size of the space occupied by the functional control in the target judgment range.
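The combined distance/size criterion above can be sketched as follows; modelling the target judgment range as a circle of radius equal to the preset distance, and breaking distance ties by on-screen area, are assumptions of this sketch:

```python
import math

def snap_within_radius(identifier, controls, radius):
    # Third-input behaviour: among controls within `radius` of the
    # identifier's current display position, pick the nearest; a larger
    # on-screen area breaks distance ties. `controls` maps a control's
    # center position to its area (a hypothetical representation).
    candidates = [
        (math.dist(identifier, pos), -area, pos)
        for pos, area in controls.items()
        if math.dist(identifier, pos) <= radius
    ]
    if not candidates:
        return None
    return min(candidates)[2]
```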
In an embodiment, after moving the operation identifier onto the target function control in the moving direction, the method further includes:
responding to a fourth user input operation, and generating a specified judgment range based on the display position of the target functional control;
acquiring the functional control within the specified judgment range, and determining at least one functional control within the specified judgment range as a new target functional control;
and moving the operation identifier to the new target functional control.
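For illustration, this hop to a new target can be sketched as building a judgment range around the control the identifier currently occupies and moving to the nearest other control inside it. The circular range and its radius are assumptions of the sketch:

```python
import math

def resnap(current_target, controls, radius=60.0):
    # Fourth-input behaviour: from the display position of the current
    # target control, find the nearest other control within an assumed
    # preset radius and make it the new target.
    others = [c for c in controls
              if c != current_target and math.dist(current_target, c) <= radius]
    if not others:
        return current_target  # nothing nearby; the identifier stays put
    return min(others, key=lambda c: math.dist(current_target, c))
```

Repeated fourth inputs would then let the identifier hop from control to control without the user steering it manually.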
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Wherein the storage medium may include: a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
Thanks to the computer program stored in the storage medium, an operation identifier can be displayed on the graphical user interface; then the operation identifier is controlled to move on the graphical user interface according to a first user input operation; next, the moving direction of the operation identifier on the graphical user interface is determined; and finally, while the first user input operation continues, the operation identifier is moved onto the target function control in the moving direction in response to a second user input operation. In the embodiment of the present application, the user can generate a user input operation through an external input device to control the operation identifier to move on the graphical user interface along the moving direction, and can generate another user input operation through the external input device while the operation identifier is moving, so that the operation identifier automatically snaps onto the control closest to it in the moving direction. This replaces the manual operations the user would otherwise perform to move the operation identifier onto the closest control, simplifies the steps for moving the operation identifier, saves the time and effort the user spends operating the external input device, improves the precision and efficiency with which the external input device controls the operation identifier, improves game interaction efficiency, and improves the player's game experience.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
The foregoing describes in detail the interaction method, apparatus, computer device and computer readable storage medium provided in the embodiments of the present application. Specific examples are used herein to illustrate the principles and implementations of the present disclosure, and the foregoing description of the embodiments is only intended to help understand the technical solutions of the present disclosure and their core ideas. Those of ordinary skill in the art will appreciate that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present disclosure.

Claims (12)

1. An interaction method, comprising:
displaying an operation identifier on a graphical user interface;
controlling the operation identifier to move on a graphical user interface according to a first user input operation;
determining a moving direction of the operation identifier on the graphical user interface;
and when the first user input operation is continuous, responding to a second user input operation, and moving the operation identifier to the target function control in the moving direction.
2. The interaction method according to claim 1, wherein the moving the operation identifier onto the target function control in the moving direction in response to a second user input operation while the first user input operation is continuous comprises:
when the first user input operation is continuous, responding to a second user input operation, and determining a target extension line based on the current position of the operation identifier and the moving direction;
acquiring a functional control positioned on the target extension line, and determining a target functional control based on the relative position relation between the functional control and the operation identifier;
and moving the operation identifier to the target function control.
3. The interaction method according to claim 2, wherein the obtaining the functionality control located on the target extension line, and determining the target functionality control based on the relative positional relationship between the functionality control and the operation identifier, includes:
acquiring a plurality of functional controls positioned on the target extension line;
and determining a function control closest to the display position of the operation identifier from the plurality of function controls based on the relative position relation between each function control and the operation identifier, and taking the function control as a target function control.
4. The interaction method according to claim 2, wherein the obtaining the functionality control located on the target extension line, and determining the target functionality control based on the relative positional relationship between the functionality control and the operation identifier, includes:
acquiring a first function control and a second function control which are positioned on the target extension line;
and if the distance between the first functional control and the operation identifier is the same as the distance between the second functional control and the operation identifier, determining a target functional control based on preset selection logic, the display position of the first functional control and the display position of the second functional control.
5. The method of interaction of claim 2, wherein the method further comprises:
if no functional control exists on the target extension line, generating a judgment range based on the target extension line, acquiring the functional control within the judgment range, and determining the functional control within the judgment range as the target functional control.
6. The interaction method according to claim 5, wherein the determining the functional control within the judgment range as the target functional control includes:
if the judgment range comprises a plurality of functional controls, determining the functional control closest to the target extension line from the plurality of functional controls based on the relative position relation between each functional control and the target extension line, and taking the functional control closest to the target extension line as the target functional control.
7. The interaction method of claim 1, further comprising, after displaying the operation identifier on the graphical user interface:
responding to a third user input operation, and generating a target judgment range based on the current display position of the operation identifier and a preset distance;
acquiring a functional control located in the target judgment range, and determining at least one functional control in the target judgment range as a target functional control;
and moving the operation identifier to the target function control.
8. The interaction method according to claim 7, wherein determining at least one functionality control within the target judgment range as a target functionality control comprises:
and selecting one functional control from the target judgment range as the target functional control based on the distance between the functional control in the target judgment range and the operation identifier and/or based on the size of the space occupied by the functional control in the target judgment range.
9. The interaction method according to claim 1 or claim 7, further comprising, after moving the operation identifier onto a target function control in the moving direction:
responding to a fourth user input operation, and generating a specified judgment range based on the display position of the target functional control;
acquiring the functional control within the specified judgment range, and determining at least one functional control within the specified judgment range as a new target functional control;
and moving the operation identifier to the new target functional control.
10. An interactive apparatus, comprising:
a display unit for displaying the operation identifier on the graphical user interface;
the control unit is used for controlling the operation identifier to move on the graphical user interface according to the first user input operation;
a determining unit, configured to determine a movement direction of the operation identifier in the graphical user interface;
and the mobile unit is used for responding to a second user input operation when the first user input operation is continuous, and moving the operation identifier to the target function control in the moving direction.
11. A computer device comprising a processor, a memory and a computer program stored on the memory and capable of running on the processor, which when executed by the processor implements the interaction method of any of claims 1 to 9.
12. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program which, when executed by a processor, implements the interaction method of any of claims 1 to 9.
CN202311535009.8A 2023-11-16 2023-11-16 Interaction method, interaction device, computer equipment and computer readable storage medium Pending CN117504278A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311535009.8A CN117504278A (en) 2023-11-16 2023-11-16 Interaction method, interaction device, computer equipment and computer readable storage medium


Publications (1)

Publication Number Publication Date
CN117504278A true CN117504278A (en) 2024-02-06

Family

ID=89758228


Country Status (1)

Country Link
CN (1) CN117504278A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination