CN110780788A - Method and equipment for executing touch operation - Google Patents

Method and equipment for executing touch operation

Info

Publication number
CN110780788A
Authority
CN
China
Prior art keywords
focus
application
movement
touch operation
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911020189.XA
Other languages
Chinese (zh)
Other versions
CN110780788B (en)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201911020189.XA
Publication of CN110780788A
Application granted
Publication of CN110780788B
Current legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text

Abstract

An object of the present application is to provide a method and an apparatus for performing a touch operation, the method comprising: acquiring a first touch operation of a user on a focus controller in a window of an application on a touch terminal, wherein the focus controller is used for controlling the movement of a focus indicator in the window of the application; moving the focus indicator in response to the first touch operation and determining one or more selectable objects pointed to by the focus indicator as target objects; and starting the target object to execute corresponding task information. With the method and the apparatus, while controlling the focus indicator through a touch operation on the focus controller, the user can continuously observe the selectable objects that obtain focus in the application window, which ensures that production activities highly sensitive to interruption are not disturbed.

Description

Method and equipment for executing touch operation
Technical Field
The present application relates to the field of communications, and in particular, to a technique for performing touch operations.
Background
On a touch terminal, existing touch interaction technology allows a user to select different target objects to execute tasks; for example, the user selects an object by touching it and then starts that target object to execute a task.
Disclosure of Invention
An object of the present application is to provide a method and apparatus for performing a touch operation.
According to an aspect of the present application, there is provided a method of performing a touch operation, the method including:
acquiring a first touch operation of a user on a focus controller in a window of an application on a touch terminal, wherein the focus controller is used for controlling the movement of a focus indicator in the window of the application;
moving the focus indicator in response to the first touch operation and determining one or more selectable objects pointed to by the focus indicator as target objects;
and starting the target object to execute corresponding task information.
According to an aspect of the present application, there is provided an apparatus for performing a touch operation, the apparatus including:
a first module, configured to acquire a first touch operation of a user on a focus controller in a window of an application on a touch terminal, wherein the focus controller is used for controlling the movement of a focus indicator in the window of the application;
a second module, configured to move the focus indicator in response to the first touch operation and determine one or more selectable objects pointed to by the focus indicator as target objects;
and a third module, configured to start the target object to execute corresponding task information.
According to an aspect of the present application, there is provided an apparatus for performing a touch operation, wherein the apparatus includes:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to:
acquiring a first touch operation of a user on a focus controller in a window of an application on a touch terminal, wherein the focus controller is used for controlling the movement of a focus indicator in the window of the application;
moving the focus indicator in response to the first touch operation and determining one or more selectable objects pointed to by the focus indicator as target objects;
and starting the target object to execute corresponding task information.
According to one aspect of the application, there is provided a computer-readable medium storing instructions that, when executed, cause a system to:
acquiring a first touch operation of a user on a focus controller in a window of an application on a touch terminal, wherein the focus controller is used for controlling the movement of a focus indicator in the window of the application;
moving the focus indicator in response to the first touch operation and determining one or more selectable objects pointed to by the focus indicator as target objects;
and starting the target object to execute corresponding task information.
Compared with the prior art, the present application controls the movement of the focus indicator through the focus controller and determines one or more selectable objects pointed to by the focus indicator as target objects to execute corresponding tasks. As a result, while controlling the focus indicator through a touch operation on the focus controller, the user can continuously observe the selectable objects that obtain focus in the application window, which ensures that production activities highly sensitive to interruption are not disturbed.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 illustrates a system topology diagram for performing touch operations according to some embodiments of the present application;
FIG. 2 illustrates a system topology diagram for performing touch operations according to some embodiments of the present application;
FIG. 3 illustrates a flow chart of a method of performing a touch operation according to some embodiments of the present application;
FIG. 4 illustrates a block diagram of an apparatus for performing touch operations according to some embodiments of the present application;
FIG. 5 illustrates an exemplary system that can be used to implement the various embodiments described in this application.
The same or similar reference numbers in the drawings identify the same or similar elements.
Detailed Description
The present application is described in further detail below with reference to the attached figures.
In a typical configuration of the present application, the terminal, the device serving the network, and the trusted party each include one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
The device referred to in this application includes, but is not limited to, a user device, a network device, or a device formed by integrating a user device and a network device through a network. The user device includes, but is not limited to, any mobile electronic product capable of human-computer interaction with a user (for example, through a touch panel), such as a smart phone or a tablet computer, and the mobile electronic product may run any operating system, such as an Android operating system or an iOS operating system. The network device includes an electronic device capable of automatically performing numerical calculation and information processing according to preset or stored instructions, and its hardware includes, but is not limited to, a microprocessor, an Application Specific Integrated Circuit (ASIC), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud of multiple servers; here, the cloud is composed of a large number of computers or network servers based on cloud computing, a kind of distributed computing in which a virtual supercomputer consists of a collection of loosely coupled computers. The network includes, but is not limited to, the Internet, a wide area network, a metropolitan area network, a local area network, a VPN network, a wireless ad hoc network (Ad Hoc network), and the like. Preferably, the device may also be a program running on the user device, the network device, or a device formed by integrating the user device and the network device, the touch terminal, or the network device and the touch terminal through a network.
Of course, those skilled in the art will appreciate that the foregoing is by way of example only, and that other existing or future devices, which may be suitable for use in the present application, are also encompassed within the scope of the present application and are hereby incorporated by reference.
In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
Fig. 1 illustrates a system topology diagram for performing touch operations according to some embodiments of the present application.
As shown in Fig. 1, the resident indicator is a focus indicator used to indicate one or more selected observation objects that obtain focus in the application window, and the wheel controller is a focus controller used to control the resident indicator. The user can move the resident indicator through a touch operation on the wheel controller, thereby selecting the observation object pointed to by the resident indicator as a target object. Then, if the user selects command button 1, the target object starts to execute command task 1 corresponding to command button 1; if the user selects command button 2, the target object starts to execute command task 2 corresponding to command button 2.
FIG. 2 illustrates a system topology diagram for performing touch operations according to some embodiments of the present application.
As shown in Fig. 2, a dotted rectangular area is added in the application window as a focus area. An observation object located within the focus area is a target object on which an action can be executed, and the resident indicator can be moved through a touch operation on the wheel controller so that the observation object pointed to by the resident indicator is selected as a target object to execute a corresponding instruction task. By moving or zooming the focus area, or by moving the current scene of the application, the user can displace the focus area relative to the current scene so that it covers a different portion of the scene; an observation object that was outside the focus area before the relative displacement and is inside it afterwards becomes a target object on which an action can be executed, while an observation object that was inside the focus area before the displacement and is outside it afterwards is no longer such a target.
In the prior art, if different observation objects need to be selected as target objects in an application window to execute different tasks, the observation object is usually selected manually through a touch operation (for example, by tapping the object), which is relatively imprecise and error-prone; moreover, in cases where the target object needs to be observed continuously, the touch operation itself may interfere with that continuous observation.
Compared with the prior art, the present application controls the movement of the focus indicator through the focus controller and determines one or more selectable objects pointed to by the focus indicator as target objects to execute corresponding tasks. As a result, while controlling the focus indicator through a touch operation on the focus controller, the user can continuously observe the selectable objects that obtain focus in the application window, which ensures that production activities highly sensitive to interruption are not disturbed.
Fig. 3 is a flowchart illustrating a method of performing a touch operation according to some embodiments of the present application, the method including steps S11, S12, and S13. In step S11, a user equipment obtains a first touch operation of a user on a focus controller in a window of an application on a touch terminal, where the focus controller is configured to control movement of a focus indicator in the window of the application; in step S12, the user device moves the focus indicator in response to the first touch operation, and determines one or more selectable objects pointed to by the focus indicator as target objects; in step S13, the user equipment starts the target object to execute the corresponding task information.
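To make the three steps concrete, the following is a minimal sketch in Python; it is an illustration only, and all names (SelectableObject, FocusIndicator, handle_first_touch, and so on) are hypothetical rather than taken from the patent.

```python
# Minimal sketch of steps S11-S13 (all names are hypothetical illustrations):
# a touch on the focus controller moves the focus indicator, the selectable
# object the indicator points to becomes the target, and the target is then
# started with its task information.
from dataclasses import dataclass

@dataclass
class SelectableObject:
    name: str
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        # True if the point (px, py) lies inside this object's display area.
        return self.x <= px <= self.x + self.width and self.y <= py <= self.y + self.height

@dataclass
class FocusIndicator:
    x: float = 0.0
    y: float = 0.0

    def move_by(self, dx: float, dy: float) -> None:
        self.x += dx
        self.y += dy

def handle_first_touch(indicator, dx, dy, selectable_objects):
    """Step S12: move the indicator and pick the object(s) it now points to."""
    indicator.move_by(dx, dy)
    return [obj for obj in selectable_objects if obj.contains(indicator.x, indicator.y)]

def start_targets(targets, task_info):
    """Step S13: start each target object with the corresponding task information."""
    for target in targets:
        print(f"{target.name} starts task: {task_info}")

# Usage: step S11 delivers a touch that moves the indicator by (120, 110).
objects = [SelectableObject("unit_a", 100, 100, 40, 40),
           SelectableObject("unit_b", 300, 200, 40, 40)]
indicator = FocusIndicator()
targets = handle_first_touch(indicator, 120, 110, objects)  # selects unit_a
start_targets(targets, "release skill S1")
```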
In step S11, the user equipment obtains a first touch operation of a user on a focus controller in a window of an application on the touch terminal, where the focus controller is configured to control movement of a focus indicator in the window of the application. In some embodiments, the focus controller may be a wheel controller and the focus indicator may be a resident indicator. The focus indicator is used to indicate one or more selected observation objects that obtain focus in the application window, the focus controller is used to control the focus indicator, and the user can move the focus indicator through a touch operation on the focus controller; the touch operation on the focus controller and the movement of the focus indicator form a mapping relationship, and the focus indicator moves synchronously with the touch operation on the focus controller according to an associated algorithm. For example, the user can control the moving direction and moving distance of the focus indicator through a sliding operation on the focus controller, where the moving direction of the focus indicator is the same as the sliding direction on the focus controller and the moving distance of the focus indicator is a fixed multiple (e.g., 10 times) of the sliding distance on the focus controller. In this way, the user controls the movement of the focus indicator in the larger application window, which is typically the area used to observe the target, through a touch operation on the smaller focus controller. Because the operation range of the focus controller is small, while controlling the focus indicator through a touch operation on the focus controller the user can continuously observe the observation object that obtains focus in the application window, which ensures that production activities highly sensitive to interruption are not disturbed.
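A small sketch of this slide-to-indicator mapping follows; the 10x scale factor mirrors the example above, while the function name and coordinate convention are illustrative assumptions.

```python
# Sketch of the touch-to-indicator mapping: the indicator moves in the same
# direction as the slide on the focus controller, scaled by a fixed multiple.
SCALE = 10.0  # indicator distance = SCALE * slide distance (assumed value)

def map_slide_to_indicator(slide_dx: float, slide_dy: float, scale: float = SCALE):
    """Return the indicator displacement for a slide on the focus controller."""
    return slide_dx * scale, slide_dy * scale

# A 5-pixel slide to the right on the small controller moves the indicator
# 50 pixels to the right in the larger application window.
print(map_slide_to_indicator(5, 0))  # (50.0, 0.0)
```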
In step S12, the user device moves the focus indicator in response to the first touch operation, and determines one or more selectable objects pointed to by the focus indicator as target objects. In some embodiments, the selectable objects include, but are not limited to, various static or dynamic observation objects displayed in the application window that can be selected, and a target object is a selected selectable object that obtains the focus of the application window. The user can move the focus indicator through a touch operation on the focus controller and determine one or more selectable objects pointed to by the focus indicator as target objects; a target object can be regarded as having obtained the focus of the application window and can be highlighted or framed to distinguish it from other observation objects. For example, the user may control the moving direction and moving distance of the focus indicator through a sliding operation on the focus controller, and when the focus indicator moves into the display area of a selectable object, that selectable object is determined as a target object; alternatively, if the focus indicator is a directional control, the selectable object pointed to by the focus indicator and closest to it may be determined as a target object through a single-click or double-click operation on the focus controller. For another example, the user may first perform a finger-down operation on the focus controller, then, while keeping the finger in contact, adjust the size of a virtual rectangular frame whose fixed vertex is the position of the finger-down operation through a sliding operation, and finally perform a finger-up operation; the one or more selectable objects within the display area of the virtual rectangular frame at the moment the finger is lifted are determined as target objects, and the virtual rectangular frame is then hidden.
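The virtual-rectangular-frame selection can be sketched as follows, reusing the SelectableObject type from the earlier sketch; the containment rule (an object counts as inside the frame if its center lies inside) is an assumption made for illustration.

```python
# Sketch of the virtual rectangular frame: the finger-down point is one fixed
# vertex, the finger-up point is the opposite vertex, and every selectable
# object whose center falls inside the frame becomes a target object.
def objects_in_frame(down_point, up_point, selectable_objects):
    (x1, y1), (x2, y2) = down_point, up_point
    left, right = min(x1, x2), max(x1, x2)
    top, bottom = min(y1, y2), max(y1, y2)
    targets = []
    for obj in selectable_objects:
        cx = obj.x + obj.width / 2   # object center used as a simple containment test
        cy = obj.y + obj.height / 2
        if left <= cx <= right and top <= cy <= bottom:
            targets.append(obj)
    return targets
```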
In step S13, the user equipment starts the target object to execute the corresponding task information. In some embodiments, the task information indicates the task to be performed by the target object, including but not limited to the task name, task content, task time, task object, task location, and so on; the task information may be preset or triggered in real time. For example, after determining the target object, the user selects the task information to be performed by the target object (e.g., releasing skill S1) and starts the target object to execute the selected task information, so that the target object releases skill S1.
In some embodiments, the focus controller comprises a wheel controller, and moving the focus indicator in response to the first touch operation comprises: in response to the first touch operation, performing a second movement of the focus indicator according to a first movement of the first touch operation on the wheel controller. In some embodiments, the user performs a finger-down operation in the wheel controller and then, while keeping the finger in contact, moves the wheel within the wheel controller through a sliding operation, thereby indirectly moving the focus indicator; the wheel movement and the focus indicator movement form a mapping relationship, and the focus indicator moves synchronously with the wheel based on an associated algorithm until the user performs a finger-up operation, at which point the focus indicator stops moving. For example, while the wheel is in the non-central area of the wheel controller, the focus indicator moves at a predetermined speed in the wheel's current real-time direction relative to the central area of the wheel controller, until the wheel returns to the central area or the user performs a finger-up operation, at which point the focus indicator stops moving.
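The wheel behaviour in this example can be sketched as a per-frame update; the dead-zone radius and the constant speed are illustrative assumptions, not values from the patent.

```python
# Sketch of the wheel behaviour: while the wheel sits outside the central area
# of the wheel controller, the indicator keeps moving in the wheel's current
# direction at a predetermined speed; inside the central area it stops.
import math

CENTER_RADIUS = 8.0   # wheel offsets smaller than this count as "central area" (assumed)
SPEED = 120.0         # indicator pixels per second (assumed constant speed)

def indicator_step(wheel_dx: float, wheel_dy: float, dt: float):
    """Displacement of the indicator for one frame of duration dt seconds."""
    distance = math.hypot(wheel_dx, wheel_dy)
    if distance < CENTER_RADIUS:        # wheel is in the central area: no movement
        return 0.0, 0.0
    ux, uy = wheel_dx / distance, wheel_dy / distance   # unit direction of the wheel
    return ux * SPEED * dt, uy * SPEED * dt

print(indicator_step(30, 0, 1 / 60))   # roughly (2.0, 0.0): move right at 120 px/s
```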
In some embodiments, the second movement satisfies at least one of:
1) The moving direction of the second movement is identical to the moving direction of the first movement
For example, a user moves the wheel in a direction within the wheel controller, and the focus indicator moves in that same direction. As another example, the user moves the wheel clockwise within the wheel controller, and the focus indicator correspondingly moves clockwise over the plurality of selectable objects.
2) The moving direction of the second movement is opposite to the moving direction of the first movement
For example, a user moves the wheel in a direction within the wheel controller, and the focus indicator moves in the opposite direction. As another example, the user moves the wheel clockwise within the wheel controller, and the focus indicator correspondingly moves counterclockwise over the plurality of selectable objects.
3) The displacement value of the second movement is positively correlated with the displacement value of the first movement
For example, the user moves the wheel within the wheel controller in direction L1 with a displacement of 50 pixels, and the focus indicator correspondingly moves 200 pixels in direction L1; the user then moves the wheel within the wheel controller in direction L2 with a displacement of 100 pixels, and the focus indicator correspondingly moves 400 pixels in direction L2.
In some embodiments, the focus indicator remains visible in a window of the application when the focus controller is in an active state. In some embodiments, the focus controller has two states, active and inactive, and the user can change the state of the focus controller from the inactive state to the active state through some touch operation (e.g., a gesture pressing operation by the user in the wheel controller), at which point the focus indicator remains visible in the application window, and then the user can also change the state of the focus controller from the active state to the inactive state through some touch operation (e.g., a gesture lifting operation by the user in the wheel controller), at which point the focus indicator remains hidden and invisible in the application window.
In some embodiments, the window of the application includes a focus area, the focus indicator is located within the focus area, and objects in the current scene of the application that are located within the focus area are selectable objects. In some embodiments, the application window may not display the entire area of the current scene of the application; in that case a focus area is added to the application window. The focus area may be a rectangular area or an area of any other shape; it may be visible, or it may normally be invisible and become visible only when the focus indicator moves to a specific position of the focus area (for example, its boundary). The size of the focus area is generally smaller than that of the application window. The focus indicator is located in the focus area and can move within its range, and an observation object located in the focus area in the current scene of the application is a selectable object that can be pointed to by the focus indicator and determined as a target object.
In some embodiments, objects in the current scene of the application that are outside the focus area are non-selectable objects. In some embodiments, observation objects in the current scene that are outside the focus area are not selectable and cannot be pointed to by the focus indicator or determined as target objects.
In some embodiments, the method further comprises: when the focus area and the current scene of the application undergo a relative displacement, setting an object newly entering the focus area as a selectable object, or setting an object newly leaving the focus area as a non-selectable object. In some embodiments, the focus area may be moved or zoomed, or the current scene of the application may be moved, so that the focus area is displaced relative to the current scene and covers a different portion of it. Observation objects that were outside the focus area before the relative displacement and are inside it afterwards are set as selectable objects that can be pointed to by the focus indicator and determined as target objects, while observation objects that were inside the focus area before the displacement and are outside it afterwards are set as non-selectable objects that cannot be pointed to by the focus indicator or determined as target objects.
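A sketch of this bookkeeping follows; representing the focus area as an axis-aligned rectangle and storing a selectable flag on each object are assumptions made for illustration.

```python
# Sketch of the focus-area bookkeeping: after the focus area and the current
# scene are displaced relative to each other, objects that newly enter the
# area become selectable and objects that newly leave it become non-selectable.
def update_selectable(objects, old_area, new_area):
    """old_area/new_area are (left, top, right, bottom) in scene coordinates."""
    def inside(obj, area):
        left, top, right, bottom = area
        return left <= obj.x <= right and top <= obj.y <= bottom

    for obj in objects:
        was_inside, is_inside = inside(obj, old_area), inside(obj, new_area)
        if not was_inside and is_inside:
            obj.selectable = True      # newly entered the focus area
        elif was_inside and not is_inside:
            obj.selectable = False     # newly left the focus area
```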
In some embodiments, the method further comprises: when the focus indicator is located at a preset position of the focus area, in response to a second touch operation of the user on the focus controller, adjusting the size of the focus area or moving the current scene of the application so as to produce a relative displacement between the focus area and the current scene of the application. In some embodiments, if the focus indicator is located at a preset position of the focus area, the preset position including but not limited to the boundary of the focus area, a certain point on the boundary, or the center of the focus area, then in response to a touch operation of the user on the focus controller (e.g., a finger-down operation, a sliding operation, a finger-up operation, etc.) the focus area is resized, moved, or indirectly moved by moving the current scene of the application. Preferably, if the focus indicator is located at the center of the focus area, or at a point on its boundary other than a vertex, the focus area may be moved through the touch operation or indirectly moved by moving the current scene of the application; if the focus indicator is located at a vertex of the focus area, the size of the focus area may be adjusted through the touch operation.
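A sketch of the vertex-versus-center behaviour follows; the hit tolerance and the rectangle representation are illustrative assumptions.

```python
# Sketch: a second touch operation resizes the focus area when the indicator
# sits on one of its vertices, and moves it (or, equivalently, scrolls the
# scene under it) when the indicator sits at its center or elsewhere.
def adjust_focus_area(area, indicator, drag_dx, drag_dy, tolerance=4.0):
    """area is [left, top, right, bottom]; returns the adjusted rectangle."""
    left, top, right, bottom = area
    on_vertex = any(
        abs(indicator[0] - vx) <= tolerance and abs(indicator[1] - vy) <= tolerance
        for vx in (left, right) for vy in (top, bottom)
    )
    if on_vertex:
        # Resize: drag the vertex under the indicator, keep the opposite vertex fixed.
        if abs(indicator[0] - left) <= tolerance:
            left += drag_dx
        else:
            right += drag_dx
        if abs(indicator[1] - top) <= tolerance:
            top += drag_dy
        else:
            bottom += drag_dy
    else:
        # Move the whole focus area relative to the current scene.
        left += drag_dx; right += drag_dx
        top += drag_dy; bottom += drag_dy
    return [left, top, right, bottom]
```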
In some embodiments, the focus controller is located outside the focus area. In some embodiments, because the focus controller is located outside the focus area, touch operations on the focus controller do not obstruct the focus area, which is particularly valuable for production activities that require continuous attention and are highly sensitive to interruption.
In some embodiments, step S13 comprises: in response to an execution-triggering operation of the user on an instruction button in the window of the application, starting the target object to execute the task information corresponding to the instruction button. In some embodiments, one or more instruction buttons are also presented in the application window, and in response to an execution-triggering operation on an instruction button by the user (e.g., a click operation), the one or more selected target objects that have obtained focus are started to execute the task information corresponding to that instruction button.
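As a final sketch, tapping an instruction button can be modelled as looking up that button's task information and starting each focused target with it; the button identifiers and the mapping are hypothetical illustrations.

```python
# Sketch of step S13 with instruction buttons: tapping a button starts the
# currently focused target objects with that button's task information.
COMMAND_BUTTONS = {
    "button_1": "instruction task 1",
    "button_2": "instruction task 2",
}

def on_button_tapped(button_id, target_objects):
    task_info = COMMAND_BUTTONS[button_id]
    for target in target_objects:
        print(f"{target.name} executes: {task_info}")
```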
Fig. 4 illustrates an apparatus for performing touch operations according to some embodiments of the present application, which includes a first module 11, a second module 12, and a third module 13. The first module 11 is configured to obtain a first touch operation of a user on a focus controller in a window of an application on a touch terminal, where the focus controller is configured to control movement of a focus indicator in the window of the application; the second module 12 is configured to move the focus indicator in response to the first touch operation and determine one or more selectable objects pointed to by the focus indicator as target objects; and the third module 13 is configured to start the target object to execute corresponding task information.
The first module 11 is configured to obtain a first touch operation of a user on a focus controller in a window of an application on a touch terminal, where the focus controller is configured to control movement of a focus indicator in the window of the application. In some embodiments, the focus controller may be a wheel controller and the focus indicator may be a resident indicator. The focus indicator is used to indicate one or more selected observation objects that obtain focus in the application window, the focus controller is used to control the focus indicator, and the user can move the focus indicator through a touch operation on the focus controller; the touch operation on the focus controller and the movement of the focus indicator form a mapping relationship, and the focus indicator moves synchronously with the touch operation on the focus controller according to an associated algorithm. For example, the user can control the moving direction and moving distance of the focus indicator through a sliding operation on the focus controller, where the moving direction of the focus indicator is the same as the sliding direction on the focus controller and the moving distance of the focus indicator is a fixed multiple (e.g., 10 times) of the sliding distance on the focus controller. In this way, the user controls the movement of the focus indicator in the larger application window, which is typically the area used to observe the target, through a touch operation on the smaller focus controller. Because the operation range of the focus controller is small, while controlling the focus indicator through a touch operation on the focus controller the user can continuously observe the observation object that obtains focus in the application window, which ensures that production activities highly sensitive to interruption are not disturbed.
The second module 12 is configured to move the focus indicator in response to the first touch operation and determine one or more selectable objects pointed to by the focus indicator as target objects. In some embodiments, the selectable objects include, but are not limited to, various static or dynamic observation objects displayed in the application window that can be selected, and a target object is a selected selectable object that obtains the focus of the application window. The user can move the focus indicator through a touch operation on the focus controller and determine one or more selectable objects pointed to by the focus indicator as target objects; a target object can be regarded as having obtained the focus of the application window and can be highlighted or framed to distinguish it from other observation objects. For example, the user may control the moving direction and moving distance of the focus indicator through a sliding operation on the focus controller, and when the focus indicator moves into the display area of a selectable object, that selectable object is determined as a target object; alternatively, if the focus indicator is a directional control, the selectable object pointed to by the focus indicator and closest to it may be determined as a target object through a single-click or double-click operation on the focus controller. For another example, the user may first perform a finger-down operation on the focus controller, then, while keeping the finger in contact, adjust the size of a virtual rectangular frame whose fixed vertex is the position of the finger-down operation through a sliding operation, and finally perform a finger-up operation; the one or more selectable objects within the display area of the virtual rectangular frame at the moment the finger is lifted are determined as target objects, and the virtual rectangular frame is then hidden.
The third module 13 is configured to start the target object to execute corresponding task information. In some embodiments, the task information indicates the task to be performed by the target object, including but not limited to the task name, task content, task time, task object, task location, and so on; the task information may be preset or triggered in real time. For example, after determining the target object, the user selects the task information to be performed by the target object (e.g., releasing skill S1) and starts the target object to execute the selected task information, so that the target object releases skill S1.
In some embodiments, the focus controller comprises a wheel controller; wherein the moving the focus indicator in response to the first touch operation comprises: in response to the first touch operation, performing a second movement of the focus indicator according to a first movement of the first touch operation on the wheel controller. Here, the related operations are the same as or similar to those of the embodiment shown in fig. 3, and therefore, the description thereof is omitted, and the related operations are incorporated herein by reference.
In some embodiments, the second movement satisfies at least one of:
1) The moving direction of the second movement is identical to the moving direction of the first movement
2) The moving direction of the second movement is opposite to the moving direction of the first movement
3) The displacement value of the second movement is positively correlated with the displacement value of the first movement
Here, the related operations are the same as or similar to those of the embodiment shown in fig. 3, and therefore are not described again, and are included herein by reference.
In some embodiments, the focus indicator remains visible in a window of the application when the focus controller is in an active state. Here, the related operations are the same as or similar to those of the embodiment shown in fig. 3, and therefore are not described again, and are included herein by reference.
In some embodiments, the window of the application includes a focus area, the focus indicator is located within the focus area, and objects in the current scene of the application that are located within the focus area are selectable objects. Here, the related operations are the same as or similar to those of the embodiment shown in fig. 3, and therefore are not described again, and are included herein by reference.
In some embodiments, objects in the current scene of the application that are outside the focal region are non-selectable objects. Here, the related operations are the same as or similar to those of the embodiment shown in fig. 3, and therefore are not described again, and are included herein by reference.
In some embodiments, the apparatus further comprises a fourth module 14 (not shown), configured to set an object newly entering the focus area as a selectable object, or set an object newly leaving the focus area as a non-selectable object, when the focus area and the current scene of the application undergo a relative displacement. Here, the related operations are the same as or similar to those of the embodiment shown in Fig. 3 and are therefore not described again, and are included herein by reference.
In some embodiments, the apparatus further comprises a fifth module 15 (not shown), configured to, when the focus indicator is located at a preset position of the focus area, respond to a second touch operation of the user on the focus controller by adjusting the size of the focus area or moving the current scene of the application, so as to produce a relative displacement between the focus area and the current scene of the application. Here, the related operations are the same as or similar to those of the embodiment shown in Fig. 3 and are therefore not described again, and are included herein by reference.
In some embodiments, the focus controller is located outside the focus area. Here, the related operations are the same as or similar to those of the embodiment shown in fig. 3, and therefore are not described again, and are included herein by reference.
In some embodiments, the third module 13 is configured to: in response to an execution-triggering operation of the user on an instruction button in the window of the application, start the target object to execute the task information corresponding to the instruction button. Here, the related operations are the same as or similar to those of the embodiment shown in Fig. 3 and are therefore not described again, and are included herein by reference.
FIG. 5 illustrates an exemplary system that can be used to implement the various embodiments described in this application.
In some embodiments, as shown in FIG. 5, the system 300 can be implemented as any of the devices in the various embodiments described. In some embodiments, system 300 may include one or more computer-readable media (e.g., system memory or NVM/storage 320) having instructions and one or more processors (e.g., processor(s) 305) coupled with the one or more computer-readable media and configured to execute the instructions to implement modules to perform the actions described herein.
For one embodiment, system control module 310 may include any suitable interface controllers to provide any suitable interface to at least one of processor(s) 305 and/or any suitable device or component in communication with system control module 310.
The system control module 310 may include a memory controller module 330 to provide an interface to the system memory 315. Memory controller module 330 may be a hardware module, a software module, and/or a firmware module.
System memory 315 may be used, for example, to load and store data and/or instructions for system 300. For one embodiment, system memory 315 may include any suitable volatile memory, such as suitable DRAM. In some embodiments, the system memory 315 may include a double data rate type four synchronous dynamic random access memory (DDR4 SDRAM).
For one embodiment, system control module 310 may include one or more input/output (I/O) controllers to provide an interface to NVM/storage 320 and communication interface(s) 325.
For example, NVM/storage 320 may be used to store data and/or instructions. NVM/storage 320 may include any suitable non-volatile memory (e.g., flash memory) and/or may include any suitable non-volatile storage device(s) (e.g., one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives).
NVM/storage 320 may include storage resources that are physically part of the device on which system 300 is installed or may be accessed by the device and not necessarily part of the device. For example, NVM/storage 320 may be accessible over a network via communication interface(s) 325.
Communication interface(s) 325 may provide an interface for system 300 to communicate over one or more networks and/or with any other suitable device. System 300 may wirelessly communicate with one or more components of a wireless network according to any of one or more wireless network standards and/or protocols.
For one embodiment, at least one of the processor(s) 305 may be packaged together with logic for one or more controller(s) (e.g., memory controller module 330) of the system control module 310. For one embodiment, at least one of the processor(s) 305 may be packaged together with logic for one or more controller(s) of the system control module 310 to form a System In Package (SiP). For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with logic for one or more controller(s) of the system control module 310. For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with logic for one or more controller(s) of the system control module 310 to form a system on a chip (SoC).
In various embodiments, system 300 may be, but is not limited to being: a server, a workstation, a desktop computing device, or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.). In various embodiments, system 300 may have more or fewer components and/or different architectures. For example, in some embodiments, system 300 includes one or more cameras, a keyboard, a Liquid Crystal Display (LCD) screen (including a touch screen display), a non-volatile memory port, multiple antennas, a graphics chip, an Application Specific Integrated Circuit (ASIC), and speakers.
The present application also provides a computer readable storage medium having stored thereon computer code which, when executed, performs a method as in any one of the preceding.
The present application also provides a computer program product, which when executed by a computer device, performs the method of any of the preceding claims.
The present application further provides a computer device, comprising:
one or more processors;
a memory for storing one or more computer programs;
the one or more computer programs, when executed by the one or more processors, cause the one or more processors to implement the method of any preceding claim.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware, for example, implemented using Application Specific Integrated Circuits (ASICs), general purpose computers or any other similar hardware devices. In one embodiment, the software programs of the present application may be executed by a processor to implement the steps or functions described above. Likewise, the software programs (including associated data structures) of the present application may be stored in a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. Additionally, some of the steps or functions of the present application may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
In addition, some of the present application may be implemented as a computer program product, such as computer program instructions, which when executed by a computer, may invoke or provide methods and/or techniques in accordance with the present application through the operation of the computer. Those skilled in the art will appreciate that the form in which the computer program instructions reside on a computer-readable medium includes, but is not limited to, source files, executable files, installation package files, and the like, and that the manner in which the computer program instructions are executed by a computer includes, but is not limited to: the computer directly executes the instruction, or the computer compiles the instruction and then executes the corresponding compiled program, or the computer reads and executes the instruction, or the computer reads and installs the instruction and then executes the corresponding installed program. Computer-readable media herein can be any available computer-readable storage media or communication media that can be accessed by a computer.
Communication media includes media by which communication signals, including, for example, computer readable instructions, data structures, program modules, or other data, are transmitted from one system to another. Communication media may include conductive transmission media such as cables and wires (e.g., fiber optics, coaxial, etc.) and wireless (non-conductive transmission) media capable of propagating energy waves such as acoustic, electromagnetic, RF, microwave, and infrared. Computer readable instructions, data structures, program modules, or other data may be embodied in a modulated data signal, for example, in a wireless medium such as a carrier wave or similar mechanism such as is embodied as part of spread spectrum techniques. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. The modulation may be analog, digital or hybrid modulation techniques.
By way of example, and not limitation, computer-readable storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media include, but are not limited to, volatile memory such as random access memory (RAM, DRAM, SRAM); and non-volatile memory such as flash memory, various read-only memories (ROM, PROM, EPROM, EEPROM), magnetic and ferromagnetic/ferroelectric memories (MRAM, FeRAM); and magnetic and optical storage devices (hard disk, tape, CD, DVD); or other now known media or later developed that can store computer-readable information/data for use by a computer system.
One embodiment according to the present application comprises an apparatus comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein, when the computer program instructions are executed by the processor, the apparatus is triggered to operate the methods and/or technical solutions according to the foregoing embodiments of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the apparatus claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.

Claims (22)

1. A method for executing touch operation, wherein the method comprises the following steps:
acquiring a first touch operation of a user on a focus controller in a window of an application on a touch terminal, wherein the focus controller is used for controlling the movement of a focus indicator in the window of the application;
moving the focus indicator in response to the first touch operation and determining one or more selectable objects pointed to by the focus indicator as target objects;
and starting the target object to execute corresponding task information.
2. The method of claim 1, wherein the focus controller comprises a wheel controller;
wherein the moving the focus indicator in response to the first touch operation comprises:
in response to the first touch operation, performing a second movement of the focus indicator according to a first movement of the first touch operation on the wheel controller.
3. The method of claim 2, wherein the second movement satisfies at least one of:
the moving direction of the second movement is identical to the moving direction of the first movement;
the moving direction of the second movement is opposite to the moving direction of the first movement;
the displacement value of the second movement is positively correlated with the displacement value of the first movement.
4. The method of any of claims 1-3, wherein the focus indicator remains visible in a window of the application when the focus controller is in an active state.
5. The method of any of claims 1-4, wherein the window of the application includes a focus area, the focus indicator is located within the focus area, and objects in the current scene of the application that are located within the focus area are selectable objects.
6. The method of claim 5, wherein objects in the current scene of the application that are outside the focus area are non-selectable objects.
7. The method of claim 6, wherein the method further comprises:
when the focus area and the current scene of the application generate relative displacement, setting an object newly entering the focus area as a selectable object, or setting an object newly leaving the focus area as a non-selectable object.
8. The method of claim 7, wherein the method further comprises:
when the focus indicator is located at a preset position of the focus area, in response to a second touch operation of the user on the focus controller, adjusting the size of the focus area or moving the current scene of the application to generate a relative displacement between the focus area and the current scene of the application.
9. The method of any of claims 5 to 8, wherein the focus controller is located outside the focus area.
10. The method of any of claims 1-9, wherein the initiating the target object to perform corresponding task information comprises:
and responding to the execution triggering operation of the user on an instruction button in the window of the application, and starting the target object to execute the task information corresponding to the instruction button.
11. An apparatus for performing a touch operation, wherein the apparatus comprises:
a first module, configured to acquire a first touch operation of a user on a focus controller in a window of an application on a touch terminal, wherein the focus controller is used for controlling the movement of a focus indicator in the window of the application;
a second module, configured to move the focus indicator in response to the first touch operation and determine one or more selectable objects pointed to by the focus indicator as target objects;
and a third module, configured to start the target object to execute corresponding task information.
12. The apparatus of claim 11, wherein the focus controller comprises a wheel controller;
wherein the moving the focus indicator in response to the first touch operation comprises:
in response to the first touch operation, performing a second movement of the focus indicator according to a first movement of the first touch operation on the wheel controller.
13. The device of claim 12, wherein the second movement satisfies at least one of:
the moving direction of the second movement is identical to the moving direction of the first movement;
the moving direction of the second movement is opposite to the moving direction of the first movement;
the displacement value of the second movement is positively correlated with the displacement value of the first movement.
14. The device of any of claims 11 to 13, wherein the focus indicator remains visible in a window of the application when the focus controller is in an active state.
15. The device of any of claims 11-14, wherein the window of the application includes a focus area, the focus indicator is located within the focus area, and objects in the current scene of the application that are located within the focus area are selectable objects.
16. The device of claim 15, wherein objects in the current scene of the application that are outside the focus area are non-selectable objects.
17. The apparatus of claim 16, wherein the apparatus further comprises:
a fourth module, configured to:
when the focus area and the current scene of the application generate relative displacement, set an object newly entering the focus area as a selectable object, or set an object newly leaving the focus area as a non-selectable object.
18. The apparatus of claim 17, wherein the apparatus further comprises:
a fifth module, configured to:
when the focus indicator is located at a preset position of the focus area, in response to a second touch operation of the user on the focus controller, adjusting the size of the focus area or moving the current scene of the application to generate a relative displacement between the focus area and the current scene of the application.
19. The apparatus of any of claims 15-18, wherein the focus controller is located outside the focus area.
20. The apparatus of any of claims 11 to 19, wherein the third module is configured to:
and responding to the execution triggering operation of the user on an instruction button in the window of the application, and starting the target object to execute the task information corresponding to the instruction button.
21. An apparatus for performing a touch operation, wherein the apparatus comprises:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the operations of the method of any of claims 1 to 10.
22. A computer-readable medium storing instructions that, when executed, cause a system to perform the operations of any of the methods of claims 1-10.
CN201911020189.XA 2019-10-24 2019-10-24 Method and device for executing touch operation Active CN110780788B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911020189.XA CN110780788B (en) 2019-10-24 2019-10-24 Method and device for executing touch operation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911020189.XA CN110780788B (en) 2019-10-24 2019-10-24 Method and device for executing touch operation

Publications (2)

Publication Number Publication Date
CN110780788A (en) 2020-02-11
CN110780788B CN110780788B (en) 2023-08-08

Family

ID=69387426

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911020189.XA Active CN110780788B (en) 2019-10-24 2019-10-24 Method and device for executing touch operation

Country Status (1)

Country Link
CN (1) CN110780788B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112162631A (en) * 2020-09-18 2021-01-01 聚好看科技股份有限公司 Interactive device, data processing method and medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101980191B (en) * 2010-10-14 2014-03-05 优视科技有限公司 Method and device for locking focus element in webpage browsing process

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013189225A1 (en) * 2012-06-19 2013-12-27 中兴通讯股份有限公司 System and method for optimizing large-screen touchscreen one-handed operation of mobile terminal, and mobile terminal
CN104238945A (en) * 2014-09-19 2014-12-24 联想(北京)有限公司 Method and system for controlling control and electronic device
CN105607851A (en) * 2015-12-18 2016-05-25 上海逗屋网络科技有限公司 Scene control method and device for touch terminal
CN107037945A (en) * 2016-02-04 2017-08-11 阿里巴巴集团控股有限公司 A kind of focus processing method, device and intelligent terminal
CN106648405A (en) * 2016-09-12 2017-05-10 上海斐讯数据通信技术有限公司 Method and device for controlling mobile terminal
CN108287650A (en) * 2017-12-15 2018-07-17 维沃移动通信有限公司 One-handed performance method based on mobile terminal and mobile terminal

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112162631A (en) * 2020-09-18 2021-01-01 聚好看科技股份有限公司 Interactive device, data processing method and medium
CN112162631B (en) * 2020-09-18 2023-05-16 聚好看科技股份有限公司 Interactive device, data processing method and medium

Also Published As

Publication number Publication date
CN110780788B (en) 2023-08-08

Similar Documents

Publication Publication Date Title
CN110333918B (en) Method and equipment for managing boarder programs
CN110321192B (en) Method and equipment for presenting hosted program
CN111488096B (en) Method and equipment for displaying interactive presentation information in reading application
CN110321189B (en) Method and equipment for presenting hosted program in hosted program
CN110413179B (en) Method and equipment for presenting session message
CN109656363B (en) Method and equipment for setting enhanced interactive content
CN110290557B (en) Method and equipment for loading page tags in application
CN111506232B (en) Method and equipment for controlling menu display in reading application
CN111125562B (en) Method and equipment for switching display tag pages
CN111382386A (en) Method and equipment for generating webpage screenshot
CN112799733A (en) Method and equipment for presenting application page
WO2023024871A1 (en) Interface interaction method and device
CN110825242B (en) Method and device for inputting
CN109947504B (en) Method and equipment for executing hosted program in hosted program
CN110430253B (en) Method and equipment for providing novel update notification information
CN110519250B (en) Method and equipment for providing information flow
CN110413183B (en) Method and equipment for presenting page
CN109254781B (en) Method and equipment for installing application on user equipment
CN110780788B (en) Method and device for executing touch operation
US20150074597A1 (en) Separate smoothing filter for pinch-zooming touchscreen gesture response
CN114153535B (en) Method, apparatus, medium and program product for jumping pages on an open page
CN112818719A (en) Method and device for identifying two-dimensional code
CN110780787B (en) Method and device for executing task scheduling
CN107301012B (en) Method and equipment for displaying description information of operation instruction in application
CN114666652A (en) Method, device, medium and program product for playing video

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant