CN113467695B - Task execution method and device, computing device and storage medium - Google Patents

Task execution method and device, computing device and storage medium

Info

Publication number
CN113467695B
CN113467695B (application CN202111029352.6A)
Authority
CN
China
Prior art keywords
task
touch
icon
gesture
icons
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111029352.6A
Other languages
Chinese (zh)
Other versions
CN113467695A (en)
Inventor
邹亚
曾伟
张勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Uniontech Software Technology Co Ltd
Original Assignee
Uniontech Software Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Uniontech Software Technology Co Ltd filed Critical Uniontech Software Technology Co Ltd
Priority to CN202111029352.6A (CN113467695B)
Priority to CN202111272524.2A (CN114020204B)
Publication of CN113467695A
Application granted
Publication of CN113467695B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention discloses a task execution method, a task execution device, a computing device and a storage medium, wherein the method comprises the following steps: detecting a touch operation based on a plurality of touch points on a screen; judging whether the number of touch points of the touch operation is a preset number, and if so, responding to the touch operation and acquiring the position information of each touch point; displaying one or more task icons at touch point positions on the screen according to the position information of the plurality of touch points, wherein each touch point position is suitable for displaying at most one task icon; and receiving a gesture trigger operation on a task icon, and executing the corresponding task according to the gesture trigger operation on the task icon. According to the task execution scheme, the user can custom-configure tasks and flexibly configure multiple gesture trigger operations, so that different tasks can be triggered and executed for the same task icon based on different gestures, which facilitates executing a wider variety of task actions through shortcut operations and improves task execution efficiency.

Description

Task execution method and device, computing device and storage medium
Technical Field
The present invention relates to the technical field of computers and operating systems, and in particular, to a method and an apparatus for task execution, a computing device, and a storage medium.
Background
With the development of mobile devices and touch screen devices, more and more functions are realized through gesture operations. However, at present, whether on mobile devices such as mobile phones and tablets or on touch screen devices based on a PC operating system, the mapping between gesture actions and the tasks they execute is relatively fixed, and the tasks executed under multi-point touch gesture conditions cannot be customized.
In the prior art, schemes for executing tasks based on gesture operations include: capturing the screen with a three-finger slide, sliding down from the top right to call up the control center, sliding down from the top left to view notifications, sliding up from the Home Indicator to return to the home screen, sliding up from the navigation bar at the bottom of the screen to enter split screen, and so on. These gesture operations all share a disadvantage: a specific gesture action can only implement a specific task. For example, when a three-finger slide is used to capture the screen, no other task can be implemented by a three-finger slide, so the operation is not flexible; likewise, when sliding up from the navigation bar at the bottom of the screen enters split screen, no other task can be executed by sliding up from that navigation bar.
For this reason, a task execution method is required to solve the problems in the above technical solutions.
Disclosure of Invention
To this end, the present invention provides a task execution method and apparatus in an attempt to solve or at least alleviate the above-presented problems.
According to an aspect of the present invention, there is provided a task execution method, executed in an operating system of a computing device, the method comprising the steps of: detecting a touch operation based on a plurality of touch points on a screen; judging whether the number of the touch points of the touch operation is a preset number or not, and if the number of the touch points of the touch operation is the preset number, responding to the touch operation and acquiring the position information of each touch point; displaying one or more task icons at touch point positions on a screen according to the position information of the plurality of touch points, wherein each touch point position is suitable for displaying at most one task icon; and receiving gesture trigger operation on the task icon, and executing the corresponding task according to the gesture trigger operation on the task icon.
Optionally, in the task execution method according to the present invention, the step of displaying one or more task icons on the screen according to the position information of the plurality of touch points includes: acquiring a configuration file, and determining one or more configured task icons based on the configuration file; and displaying the configured one or more task icons at corresponding touch point positions respectively based on the position order of the plurality of touch points.
Optionally, in the task execution method according to the present invention, the position order of the plurality of touch points is the ascending order of their abscissas (x-coordinates).
Optionally, in the task execution method according to the present invention, the gesture triggering operation includes one or more of a click operation, an upward sliding operation, and a downward sliding operation, and each gesture triggering operation of the task icon corresponds to a task.
Optionally, in the task execution method according to the present invention, the step of executing the corresponding task according to the gesture trigger operation on the task icon includes: determining a task execution script path corresponding to the gesture triggering operation of the task icon based on a configuration file, and acquiring a corresponding task execution script based on the task execution script path; and executing the corresponding task based on the task execution script.
Optionally, in the task execution method according to the present invention, the method further includes: acquiring a task execution script path which is configured for gesture trigger operation of one or more task icons on a configuration page; and generating a configuration file based on the one or more task icons and the task execution script path corresponding to the gesture trigger operation of each task icon.
Optionally, in the task execution method according to the present invention, the step of receiving a task execution script path configured for a gesture trigger operation of one or more task icons on a configuration page includes: responding to a request of a configuration task, and displaying a configuration page on a screen, wherein the configuration page comprises a plurality of task icons, each task icon comprises a plurality of configuration items, and each configuration item corresponds to a gesture trigger operation; and acquiring a task execution script path which is configured based on the configuration items of the one or more task icons and corresponds to the gesture trigger operation of the task icon.
Optionally, in the task execution method according to the present invention, before responding to the touch operation, the method further includes: timing the touch operation to determine the duration of the touch operation; responding to the touch operation when the duration time is determined to reach the preset time.
Alternatively, in the task execution method according to the present invention, the predetermined number is 5, and the predetermined time is 2 seconds.
Optionally, in the task execution method according to the present invention, the task includes deleting a file stored in a predetermined location, opening a predetermined application, and modifying read-write permission of the predetermined file.
According to an aspect of the present invention, there is provided a task execution device residing in an operating system, the device including: a detection module adapted to detect a touch operation based on a plurality of touch points on a screen, determine whether the number of touch points of the touch operation is a predetermined number, and, if so, respond to the touch operation and acquire the position information of each touch point; a display module adapted to display one or more task icons at touch point positions on the screen according to the position information of the plurality of touch points, wherein each touch point position is adapted to display at most one task icon; and a processing module adapted to receive a gesture trigger operation on a task icon and execute the corresponding task according to the gesture trigger operation on the task icon.
According to an aspect of the present invention, there is provided a computing device comprising: at least one processor; and a memory storing program instructions, wherein the program instructions are configured to be executed by the at least one processor, the program instructions comprising instructions for performing the task execution method as described above.
According to an aspect of the present invention, there is provided a readable storage medium storing program instructions which, when read and executed by a computing device, cause the computing device to perform the method as described above.
According to the technical scheme, the task execution method can pre-configure tasks corresponding to a plurality of task icons, wherein a plurality of gesture trigger operations can be configured for each task icon, and one task is configured for each gesture trigger operation. The user can call up a plurality of task icons through a multi-touch operation (such as a five-finger touch operation), and the task icons are displayed at the positions of the user's own touch points when the operation is triggered, so that the user can operate the task icons more accurately. Moreover, based on the pre-configuration, the user can trigger different tasks on the same task icon with different gesture trigger operations. Therefore, according to the technical scheme of the invention, the user can custom-configure tasks and flexibly configure multiple gesture trigger operations, so that different tasks are triggered and executed based on different gestures, which facilitates executing a wider variety of task actions through shortcut operations and improves task execution efficiency.
Drawings
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings, which are indicative of various ways in which the principles disclosed herein may be practiced, and all aspects and equivalents thereof are intended to be within the scope of the claimed subject matter. The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description read in conjunction with the accompanying drawings. Throughout this disclosure, like reference numerals generally refer to like parts or elements.
FIG. 1 shows a schematic diagram of a computing device 100, according to one embodiment of the invention;
FIG. 2 shows a flow diagram of a task execution method 200 according to one embodiment of the invention;
FIG. 3 is a diagram illustrating a touch operation based on multiple touch points on a screen according to an embodiment of the present invention;
FIG. 4 shows a schematic diagram of displaying task icons on a screen according to one embodiment of the invention;
FIG. 5 illustrates a configuration page diagram according to one embodiment of the invention; and
fig. 6 shows a schematic diagram of a task performing device 600 according to an embodiment of the invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
According to the task execution scheme, a user can configure various tasks in a user-defined mode, and various gesture triggering operations can be flexibly configured for each task icon, so that the user can trigger execution of different tasks by operating the task icons through different gestures, execution of more kinds of task actions can be realized through quick operation, and task execution efficiency is improved.
Fig. 1 is a schematic block diagram of an example computing device 100.
As shown in FIG. 1, in a basic configuration 102, a computing device 100 typically includes a system memory 106 and one or more processors 104. A memory bus 108 may be used for communication between the processor 104 and the system memory 106.
Depending on the desired configuration, the processor 104 may be any type of processor, including but not limited to: a microprocessor (μP), a microcontroller (μC), a digital signal processor (DSP), or any combination thereof. The processor 104 may include one or more levels of cache, such as a level one cache 110 and a level two cache 112, a processor core 114, and registers 116. The example processor core 114 may include an Arithmetic Logic Unit (ALU), a Floating Point Unit (FPU), a digital signal processing core (DSP core), or any combination thereof. The example memory controller 118 may be used with the processor 104, or in some implementations the memory controller 118 may be an internal part of the processor 104.
Depending on the desired configuration, system memory 106 may be any type of memory, including but not limited to: volatile memory (such as RAM), non-volatile memory (such as ROM, flash memory, etc.), or any combination thereof. System memory 106 may include an operating system 120, one or more applications 122, and program data 124. In some implementations, the application 122 can be arranged to execute instructions on an operating system with program data 124 by one or more processors 104.
Computing device 100 also includes a storage device 132, storage device 132 including removable storage 136 and non-removable storage 138.
Computing device 100 may also include a storage interface bus 134. The storage interface bus 134 enables communication from the storage devices 132 (e.g., removable storage 136 and non-removable storage 138) to the basic configuration 102 via the bus/interface controller 130. At least a portion of the operating system 120, applications 122, and data 124 may be stored on removable storage 136 and/or non-removable storage 138, and loaded into system memory 106 via storage interface bus 134 and executed by the one or more processors 104 when the computing device 100 is powered on or the applications 122 are to be executed.
Computing device 100 may also include an interface bus 140 that facilitates communication from various interface devices (e.g., output devices 142, peripheral interfaces 144, and communication devices 146) to the basic configuration 102 via the bus/interface controller 130. The example output device 142 includes a graphics processing unit 148 and an audio processing unit 150. They may be configured to facilitate communication with various external devices, such as a display or speakers, via one or more a/V ports 152. Example peripheral interfaces 144 may include a serial interface controller 154 and a parallel interface controller 156, which may be configured to facilitate communication with external devices such as input devices (e.g., keyboard, mouse, pen, voice input device, touch input device) or other peripherals (e.g., printer, scanner, etc.) via one or more I/O ports 158. An example communication device 146 may include a network controller 160, which may be arranged to facilitate communications with one or more other computing devices 162 over a network communication link via one or more communication ports 164.
A network communication link may be one example of a communication medium. Communication media may typically be embodied by computer readable instructions, data structures, or program modules in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery media. A "modulated data signal" may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of non-limiting example, communication media may include wired media such as a wired network or a dedicated wired connection, and various wireless media such as acoustic, Radio Frequency (RF), microwave, Infrared (IR), or other wireless media. The term computer readable media as used herein may include both storage media and communication media.
Computing device 100 may be implemented as a personal computer, including both desktop and notebook configurations. Of course, computing device 100 may also be implemented as part of a small-form-factor portable (or mobile) electronic device, such as a cellular telephone, a digital camera, a Personal Digital Assistant (PDA), a personal media player device, a wireless web-watch device, a personal headset device, an application-specific device, or a hybrid device that includes any of the above functions. It may even be implemented as a server, such as a file server, a database server, an application server, or a WEB server. The embodiments of the present invention are not limited thereto.
In an embodiment in accordance with the invention, the operating system 120 of the computing device 100 is configured to perform a task execution method 200 in accordance with the invention. The operating system 120 of the computing device 100 includes a plurality of program instructions thereon for executing the task execution method 200 of the present invention, so that the task execution method 200 of the present invention can be executed in the operating system 120 of the computing device 100.
According to an embodiment of the present invention, the task execution device 600 is included on the operating system 120, and the task execution device 600 includes a plurality of program instructions for executing the task execution method 200 of the present invention, so that the task execution method 200 of the present invention can be executed in the task execution device 600.
FIG. 2 shows a flow diagram of a task execution method 200 according to one embodiment of the invention. The task execution method 200 may be performed in an operating system of a computing device (e.g., the computing device 100 described above). Where computing device 100 includes a touch screen connected to an operating system.
It should be noted that the task execution method 200 according to the present invention can be widely applied to a shortcut task execution scheme of various operating systems including the UOS.
In one embodiment, the operating system of computing device 100 may be a desktop operating system, such as a UOS operating system, which may interface with a touch screen. However, it should be noted that the present invention is not limited to the specific types of computing devices and operating systems, and the task execution method 200 of the present invention may also be executed in an operating system of a mobile terminal such as a mobile phone or tablet.
As shown in fig. 2, the method 200 begins at step S210.
In step S210, a touch operation of a user on a screen based on a plurality of touch points is detected. For example, a user may touch the screen with multiple fingers simultaneously to form multiple touch points, and the operating system of the computing device 100 may detect the touch operation based on these touch points. Here, refer to fig. 3, which illustrates a schematic diagram of a touch operation based on a plurality of touch points on a screen according to an embodiment of the present invention.
Subsequently, in step S220, it is determined whether the number of touch points of the touch operation is a predetermined number, and if it is determined that the number is the predetermined number, the position information of each touch point is acquired in response to the touch operation. Here, it is determined whether the detected touch operation based on the plurality of touch points satisfies the multi-touch condition by determining whether the number of touch points is a predetermined number. When the number of touch points of the touch operation is a predetermined number, that is, when the touch operation includes the predetermined number of touch points, it is determined that the touch operation satisfies the multi-touch condition, so that it is possible to display a task icon according to the position information of each touch point in response to the touch operation, specifically, by continuing to perform the following step S230.
In one embodiment, the predetermined number is 5, that is, when the detected touch operation includes 5 touch points, the touch operation may be responded to. As shown in fig. 3, a user can touch the screen with five fingers simultaneously to form a touch operation including 5 touch points, and at this time, the operating system can determine that the touch operation satisfies the multi-touch condition and respond to the touch operation.
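As a concrete illustration of this detection step, the following minimal Python sketch tests whether a touch operation satisfies the predetermined-number condition. The patent does not publish source code, so the event representation and the function name are assumptions made only for illustration.

```python
PREDETERMINED_COUNT = 5  # number of touch points that triggers the feature (per the embodiment above)

def on_touch_event(active_touches):
    """active_touches: dict mapping touch id -> (x, y) of currently pressed points (assumed structure)."""
    if len(active_touches) != PREDETERMINED_COUNT:
        return None  # not a qualifying multi-touch operation; ignore it
    # The multi-touch condition is met: collect the position of every touch point
    # so that task icons can later be displayed at these positions.
    return list(active_touches.values())
```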
In step S230, one or more task icons are displayed at the touch point positions on the screen according to the position information of the plurality of touch points (i.e., the predetermined number of touch points). FIG. 4 shows a schematic diagram of displaying task icons on a screen according to one embodiment of the present invention, in which each touch point position is adapted to display at most one task icon. It should be noted that when the number of configured task icons equals the number of touch points, one task icon may be displayed at each touch point position; when the number of configured task icons is less than the number of touch points, the configured task icons are displayed at some of the touch point positions, so that one or more touch point positions do not display any task icon.
Each task icon may correspond to a different task to be executed, and the present invention is not limited to the specific task corresponding to each task icon. It should be understood that, because the task icons are displayed at the positions of the user's own touch points when the operation is triggered, the display better matches the user's operating habits and allows the user to operate the task icons more accurately.
It should also be noted that one or more of the task icons and corresponding specific tasks shown herein may be pre-configured. In one embodiment, by obtaining a configuration file generated by pre-configuration, one or more task icons pre-configured may be determined based on the configuration file, and the configured one or more task icons are respectively displayed at corresponding touch point positions.
In one embodiment, the configured one or more task icons may be respectively displayed at corresponding touch point positions based on the position sequence of the plurality of touch points, and each touch point position may display at most one task icon. Here, the position order of the plurality of touch points may be an order of abscissa of the plurality of touch points from small to large, based on which the configured one or more task icons may be displayed at the corresponding touch point positions in an order of abscissa of the plurality of touch points from small to large, respectively.
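A hedged sketch of this ordering and placement logic follows; the function and icon representation are hypothetical names, and the sketch simply pairs the configured icons with the touch points taken in ascending x-coordinate order.

```python
def place_icons(touch_points, configured_icons):
    """touch_points: list of (x, y); configured_icons: icons read from the configuration file."""
    # Order the touch points by ascending x-coordinate (left to right on the screen).
    ordered_points = sorted(touch_points, key=lambda p: p[0])
    # Each touch point shows at most one icon; when fewer icons than points are
    # configured, zip() simply leaves the remaining points without an icon.
    return list(zip(configured_icons, ordered_points))
```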
It should be noted that the present invention is not limited to a specific number of pre-configured task icons. When the number of pre-configured task icons equals the number of touch points (i.e., the predetermined number), each task icon can be allocated to one touch point position, so that one task icon is displayed at each touch point position. If the number of configured task icons is less than the predetermined number, then after the configured task icons are displayed at the corresponding touch point positions based on the position order of the plurality of touch points, one or more touch point positions will not display any task icon.
In one embodiment, after it is determined that the number of touch points of the touch operation is the predetermined number and before the touch operation is responded to, the touch operation based on the predetermined number of touch points is also timed to determine its duration. When the duration reaches a predetermined time (i.e., the touch operation based on the predetermined number of touch points has lasted for the predetermined time), the touch operation is determined to satisfy the multi-touch condition, and one or more task icons are then displayed at the touch point positions on the screen in response to the touch operation.
In one embodiment, the predetermined number is 5, and the predetermined time is 2 seconds. That is, when the detected touch operation includes 5 touch points and the duration of the touch operation based on the 5 touch points reaches 2 seconds (greater than or equal to 2 seconds), it is determined that the touch operation satisfies the multi-touch condition, and in response to the touch operation, one or more task icons are displayed at the touch point position on the screen.
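A minimal sketch of this duration check is shown below, assuming a monotonic clock is available. The five-point and two-second values match the embodiment above, but the class and method names are illustrative assumptions.

```python
import time

PREDETERMINED_COUNT = 5
PREDETERMINED_DURATION = 2.0  # seconds

class MultiTouchTimer:
    def __init__(self):
        self.started_at = None

    def update(self, touch_point_count):
        if touch_point_count == PREDETERMINED_COUNT:
            # Start timing the first time the predetermined number of points is seen.
            if self.started_at is None:
                self.started_at = time.monotonic()
        else:
            # The qualifying touch was interrupted; reset the timer.
            self.started_at = None

    def should_respond(self):
        # Respond once the qualifying touch has lasted at least the predetermined time.
        return (self.started_at is not None and
                time.monotonic() - self.started_at >= PREDETERMINED_DURATION)
```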
Finally, in step S240, a gesture trigger operation on a task icon is received, and the corresponding task is executed according to the gesture trigger operation on the task icon.
Here, it should be noted that the gesture triggering operation for each task icon may include one or more types, and for each task icon, a task may be configured for each gesture triggering operation, so that each gesture triggering operation of the task icon corresponds to an executable task. In this way, for the same task icon, different tasks may be triggered to be executed based on different gesture trigger operations.
In one embodiment, the gesture triggering operation on the task icon includes one or more of a click operation, an upward sliding operation and a downward sliding operation, for example, so that each gesture triggering operation (click operation, upward sliding operation and downward sliding operation) of the task icon can correspond to one task respectively. It should be noted that the present invention is not limited to the types of gesture trigger operations enumerated above.
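A rough sketch of how the three gesture types on an icon might be distinguished from the touch-down and touch-up coordinates follows; the movement threshold is an assumption, not a value taken from the patent.

```python
MOVE_THRESHOLD = 10  # pixels of vertical travel before a touch counts as a slide (assumed value)

def classify_gesture(down_y, up_y):
    """Return 'click', 'slide_up' or 'slide_down' from the touch-down / touch-up y positions."""
    dy = up_y - down_y
    if abs(dy) < MOVE_THRESHOLD:
        return "click"
    # Screen coordinates usually grow downward, so a negative dy is an upward slide.
    return "slide_up" if dy < 0 else "slide_down"
```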
According to one embodiment, when a corresponding task is executed according to a gesture trigger operation on a task icon, the configuration file is acquired, the task execution script path corresponding to that gesture trigger operation of the task icon is determined based on the configuration file, and the corresponding task execution script is acquired based on the task execution script path. The corresponding task may then be performed based on the task execution script.
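One way this lookup-and-execute step could be realized is sketched below; the JSON configuration format, the file location, and the function name are assumptions rather than details taken from the patent.

```python
import json
import os
import subprocess

def execute_task(icon_name, gesture, config_path="~/.config/gesture-tasks.json"):
    """Look up and run the task execution script configured for (icon, gesture)."""
    with open(os.path.expanduser(config_path)) as f:
        config = json.load(f)
    # The configuration file maps each icon's gesture trigger operations to a script path.
    script_path = config[icon_name][gesture]
    # The task itself lives entirely in the referenced task execution script.
    subprocess.run([script_path], check=True)
```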
In one implementation, the tasks that the method 200 can quickly execute according to the pre-configuration include: deleting a file stored at a predetermined location (e.g., emptying the files in the recycle bin), opening a predetermined application, modifying the read and write permissions of a predetermined file, and so on; however, the present invention is not limited to the kinds of executable tasks listed here.
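By way of a hedged example, one of the tasks listed above, modifying the read-write permission of a predetermined file, could be carried out by a task execution script along the following lines; the target path is purely illustrative.

```python
import os
import stat

TARGET = "/tmp/example-report.txt"  # hypothetical predetermined file

# Grant the owner read and write access to the predetermined file.
os.chmod(TARGET, stat.S_IRUSR | stat.S_IWUSR)
```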
According to the embodiment of the invention, corresponding tasks can be configured for the gesture triggering operation of one or more task icons according to the request of a user. Specifically, upon receiving a request for a configuration task sent by a user, a configuration page may be displayed on a screen in response to the request for the configuration task, so that the user configures the task based on the configuration page. In one implementation, when the method 200 is executed on a UOS operating system (connected to a touch screen), a user may call up a configuration page at a control center to flexibly configure tasks based on the configuration page.
Here, refer to fig. 5, which is a schematic diagram of a configuration page according to an embodiment of the present invention. After the configuration page is displayed, the user can configure corresponding task execution script paths for gesture triggering operations of one or more task icons on the configuration page. It should be noted that, for each task icon, multiple gesture trigger operations may be configured, and one task may be configured for each gesture trigger operation, so that a corresponding task execution script path may be configured for each gesture trigger operation of each task icon. After the configuration operation of the user is completed, a task execution script path configured by the user for the gesture triggering operation of the one or more task icons on the configuration page may be acquired, and a configuration file is generated and stored in the computing device based on the one or more task icons configured by the user on the configuration page and the task execution script path corresponding to the gesture triggering operation of each task icon.
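The configuration file produced by this step might, for example, map the configuration items of each task icon to task execution script paths. The JSON schema, icon keys, and script paths below are assumptions used only to illustrate the idea.

```python
import json
import os

# Hypothetical result of the user's choices on the configuration page.
configuration = {
    "icon_1": {
        "click":      "/usr/local/share/gesture-tasks/empty-trash.sh",
        "slide_up":   "/usr/local/share/gesture-tasks/open-terminal.sh",
        "slide_down": "/usr/local/share/gesture-tasks/lock-screen.sh",
    },
    "icon_2": {
        "click": "/usr/local/share/gesture-tasks/take-screenshot.sh",
    },
}

# Persist the configuration file so the task execution step can look it up later.
path = os.path.expanduser("~/.config/gesture-tasks.json")
os.makedirs(os.path.dirname(path), exist_ok=True)
with open(path, "w") as f:
    json.dump(configuration, f, indent=2)
```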
In this way, when the task corresponding to a gesture trigger operation on a task icon is executed in response to the user's gesture trigger operation on that icon, the task execution script path corresponding to the gesture trigger operation can be acquired from the configuration file, the task execution script can be acquired based on that path, and the corresponding task can be executed through the task execution script, thereby realizing the corresponding task execution action. It should be understood that configuring a corresponding task execution script path for a gesture trigger operation of a task icon amounts to configuring a corresponding task for that gesture trigger operation. Because a task execution script path is configured separately for each gesture trigger operation of a task icon, the user can trigger the execution of different tasks by operating the same task icon with different gestures.
In one implementation, as shown in fig. 5, the configuration page includes a plurality of task icons, each task icon includes a plurality of configuration items, and each configuration item corresponds to a gesture trigger operation. For example, each task icon shown in fig. 5 includes a single-click configuration item, a slide-up configuration item, and a slide-down configuration item, so that when configuring tasks for a task icon, a task execution script path corresponding to each gesture trigger operation of the icon can be configured in the corresponding configuration item. Specifically, the task execution script path corresponding to the click operation on the task icon is configured in the click configuration item, so that the user can trigger the corresponding task by clicking the task icon; the task execution script path corresponding to the upward sliding operation is configured in the slide-up configuration item, so that the user can trigger the corresponding task by sliding up on the task icon; and the task execution script path corresponding to the downward sliding operation is configured in the slide-down configuration item, so that the user can trigger the corresponding task by sliding down on the task icon.
After the user's configuration operation is completed, the task execution script paths that the user configured in the configuration items of the one or more task icons (each path corresponding to a gesture trigger operation of a task icon) are acquired, and a configuration file is generated based on the one or more task icons and the task execution script path corresponding to each gesture trigger operation of each task icon.
Thus, according to the method 200 of the present invention, the user can custom-configure tasks and flexibly configure various gesture trigger operations, so that different tasks can be triggered and executed based on different gestures, which facilitates executing a wider variety of task actions through shortcut operations and improves task execution efficiency.
Fig. 6 shows a schematic diagram of a task performing device 600 according to an embodiment of the invention. The task performing device 600 resides in an operating system of a computing device and is adapted to perform the task performing method 200 of the present invention.
The task execution device 600 includes a detection module 610, a display module 620, and a processing module 630. The detection module 610 is configured to detect a touch operation based on a plurality of touch points on a screen, determine whether the number of touch points of the touch operation is a predetermined number, and, if so, respond to the touch operation and obtain the position information of each touch point. The display module 620 displays one or more task icons at touch point positions on the screen according to the position information of the plurality of touch points, wherein each touch point position is adapted to display at most one task icon. The processing module 630 receives a gesture trigger operation on a task icon and executes the corresponding task according to the gesture trigger operation on the task icon.
It should be noted that the detecting module 610 is configured to perform the aforementioned steps S210 to S220, the displaying module 620 is configured to perform the aforementioned step S230, and the processing module 630 is configured to perform the aforementioned step S240. Here, for the specific execution logic of the detecting module 610, the displaying module 620 and the processing module 630, reference is made to the detailed description of steps S210 to S240 in the method 200, and details are not repeated here.
According to the task execution scheme, tasks corresponding to a plurality of task icons can be configured in advance, wherein a plurality of gesture trigger operations can be configured for each task icon, and one task is configured for each gesture trigger operation. The user can call up a plurality of task icons through a multi-touch operation (such as a five-finger touch operation), and the task icons are displayed at the positions of the user's own touch points when the operation is triggered, so that the user can operate the task icons more accurately. Moreover, based on the pre-configuration, the user can trigger different tasks on the same task icon with different gesture trigger operations. Therefore, according to the technical scheme of the invention, the user can custom-configure various tasks and flexibly configure multiple gesture trigger operations, so that different tasks are triggered and executed based on different gestures, which facilitates executing a wider variety of task actions through shortcut operations and improves task execution efficiency.
The various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as removable hard drives, USB flash drives, floppy disks, CD-ROMs, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. The memory is configured to store program code; the processor is configured to execute the task execution method of the present invention according to instructions in said program code stored in the memory.
By way of example, and not limitation, readable media may comprise readable storage media and communication media. Readable storage media store information such as computer readable instructions, data structures, program modules or other data. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. Combinations of any of the above are also included within the scope of readable media.
In the description provided herein, algorithms and displays are not inherently related to any particular computer, virtual system, or other apparatus. Various general purpose systems may also be used with examples of this invention. The required structure for constructing such a system will be apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments.
Furthermore, some of the described embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.

Claims (9)

1. A task execution method, performed in an operating system of a computing device, the method comprising the steps of:
detecting a touch operation based on a plurality of touch points on a screen;
judging whether the number of the touch points of the touch operation is a preset number or not, and if the number of the touch points of the touch operation is the preset number, responding to the touch operation and acquiring the position information of each touch point;
displaying one or more task icons at touch point positions on a screen according to the position information of the plurality of touch points, wherein each touch point position is suitable for displaying at most one task icon; and
receiving gesture trigger operation on a task icon, and executing a corresponding task according to the gesture trigger operation on the task icon, wherein the gesture trigger operation comprises one or more of click operation, upward sliding operation and downward sliding operation, and each gesture trigger operation of the task icon corresponds to one task.
2. The method of claim 1, wherein the displaying one or more task icons on the screen according to the position information of the plurality of touch points comprises:
acquiring a configuration file, and determining one or more configured task icons based on the configuration file;
and displaying the configured one or more task icons at corresponding touch point positions respectively based on the position order of the plurality of touch points.
3. The method of claim 2, wherein the position order of the plurality of touch points is the ascending order of the abscissas of the plurality of touch points.
4. The method of any of claims 1-3, wherein performing the respective task in accordance with the gesture-triggered operation on the task icon comprises:
determining a task execution script path corresponding to the gesture triggering operation of the task icon based on a configuration file, and acquiring a corresponding task execution script based on the task execution script path;
and executing the corresponding task based on the task execution script.
5. A method according to any one of claims 1-3, further comprising the step of:
acquiring a task execution script path which is configured for gesture trigger operation of one or more task icons on a configuration page;
and generating a configuration file based on the one or more task icons and the task execution script path corresponding to the gesture trigger operation of each task icon.
6. The method of claim 5, wherein the acquiring of the task execution script path configured for the gesture trigger operation of one or more task icons on the configuration page comprises:
responding to a request of a configuration task, and displaying a configuration page on a screen, wherein the configuration page comprises a plurality of task icons, each task icon comprises a plurality of configuration items, and each configuration item corresponds to a gesture trigger operation;
and acquiring a task execution script path which is configured based on the configuration items of the one or more task icons and corresponds to the gesture trigger operation of the task icon.
7. A task execution device residing in an operating system, the device comprising:
a detection module adapted to detect a touch operation based on a plurality of touch points on a screen, determine whether the number of touch points of the touch operation is a predetermined number, and, if so, respond to the touch operation and acquire the position information of each touch point;
a display module adapted to display one or more task icons at touch point positions on a screen according to position information of a plurality of touch points, wherein each touch point position is adapted to display at most one task icon; and
the processing module is suitable for receiving gesture trigger operation on the task icon and executing the corresponding task according to the gesture trigger operation on the task icon, wherein the gesture trigger operation comprises one or more of click operation, upward sliding operation and downward sliding operation, and each gesture trigger operation of the task icon corresponds to one task.
8. A computing device, comprising:
at least one processor; and
a memory storing program instructions, wherein the program instructions are configured to be adapted to be executed by the at least one processor, the program instructions comprising instructions for performing the method of any of claims 1-6.
9. A readable storage medium storing program instructions that, when read and executed by a computing device, cause the computing device to perform the method of any of claims 1-6.
CN202111029352.6A 2021-09-03 2021-09-03 Task execution method and device, computing device and storage medium Active CN113467695B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111029352.6A CN113467695B (en) 2021-09-03 2021-09-03 Task execution method and device, computing device and storage medium
CN202111272524.2A CN114020204B (en) 2021-09-03 2021-09-03 Task execution method, device, computing equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111029352.6A CN113467695B (en) 2021-09-03 2021-09-03 Task execution method and device, computing device and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202111272524.2A Division CN114020204B (en) 2021-09-03 2021-09-03 Task execution method, device, computing equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113467695A (en) 2021-10-01
CN113467695B (en) 2021-12-07

Family

ID=77868019

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111029352.6A Active CN113467695B (en) 2021-09-03 2021-09-03 Task execution method and device, computing device and storage medium
CN202111272524.2A Active CN114020204B (en) 2021-09-03 2021-09-03 Task execution method, device, computing equipment and storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202111272524.2A Active CN114020204B (en) 2021-09-03 2021-09-03 Task execution method, device, computing equipment and storage medium

Country Status (1)

Country Link
CN (2) CN113467695B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107209625A (en) * 2014-12-17 2017-09-26 得利捷美国股份有限公司 The soft trigger of floating of touch display on electronic equipment
CN107450800A (en) * 2017-07-25 2017-12-08 维沃移动通信有限公司 A kind of task method to set up, mobile terminal and computer-readable recording medium
CN109062464A (en) * 2018-06-27 2018-12-21 Oppo广东移动通信有限公司 touch operation method, device, storage medium and electronic equipment
CN112286358A (en) * 2020-11-02 2021-01-29 恒大新能源汽车投资控股集团有限公司 Screen operation method and device, electronic equipment and computer-readable storage medium

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7509588B2 (en) * 2005-12-30 2009-03-24 Apple Inc. Portable electronic device with interface reconfiguration mode
US9311061B2 (en) * 2011-02-10 2016-04-12 International Business Machines Corporation Designing task execution order based on location of the task icons within a graphical user interface
US8508494B2 (en) * 2011-06-01 2013-08-13 Motorola Mobility Llc Using pressure differences with a touch-sensitive display screen
KR102148809B1 (en) * 2013-04-22 2020-08-27 삼성전자주식회사 Apparatus, method and computer readable recording medium for displaying shortcut window
US20140372896A1 (en) * 2013-06-14 2014-12-18 Microsoft Corporation User-defined shortcuts for actions above the lock screen
US10747422B2 (en) * 2014-09-18 2020-08-18 Drupe Mobile Ltd. Client terminal user interface for interacting with contacts
US10741273B1 (en) * 2015-07-15 2020-08-11 OHUM Healthcare Solutions Inc. User friendly medical records systems, apparatuses and methods
CN105912173B (en) * 2016-04-11 2019-04-30 中电莱斯信息系统有限公司 A kind of adaptive 3 recognition methods suitable for optical touch panel
CN107786717A (en) * 2016-08-24 2018-03-09 深圳市鼎芯无限科技有限公司 A kind of control method of intelligent lamp
CN107203326A (en) * 2017-06-05 2017-09-26 瓦戈科技(上海)有限公司 A kind of floated push-button operation control method
CN108268194A (en) * 2017-08-29 2018-07-10 广州市动景计算机科技有限公司 Using the display methods of app channel menus, device and mobile terminal
CN110275665A (en) * 2019-05-23 2019-09-24 深圳龙图腾创新设计有限公司 A kind of operating method of touch panel, electronic equipment and storage medium
CN111338554A (en) * 2020-02-14 2020-06-26 深圳小佳科技有限公司 Suspension ball operating system and method based on large-screen touch

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107209625A (en) * 2014-12-17 2017-09-26 得利捷美国股份有限公司 The soft trigger of floating of touch display on electronic equipment
CN107450800A (en) * 2017-07-25 2017-12-08 维沃移动通信有限公司 A kind of task method to set up, mobile terminal and computer-readable recording medium
CN109062464A (en) * 2018-06-27 2018-12-21 Oppo广东移动通信有限公司 touch operation method, device, storage medium and electronic equipment
CN112286358A (en) * 2020-11-02 2021-01-29 恒大新能源汽车投资控股集团有限公司 Screen operation method and device, electronic equipment and computer-readable storage medium

Also Published As

Publication number Publication date
CN114020204A (en) 2022-02-08
CN114020204B (en) 2023-07-07
CN113467695A (en) 2021-10-01

Similar Documents

Publication Publication Date Title
CN104794105A (en) Touch event processing for web pages
US10521248B2 (en) Electronic device and method thereof for managing applications
US9383858B2 (en) Method and device for executing an operation on a mobile device
CN107370874B (en) Application starting method, mobile terminal and storage medium
CN109683841B (en) Control display method and computing device in multi-display environment
US20140113688A1 (en) Method for operating mobile device using vibration sensor
US9588607B2 (en) Method for improving touch recognition and electronic device thereof
WO2016173307A1 (en) Message copying method and device, and smart terminal
CN111966260B (en) Window display method and computing device
CN113342452A (en) Window display method, computing device and readable storage medium
CN113467695B (en) Task execution method and device, computing device and storage medium
CN112256178A (en) Application icon adjusting method, device and system
WO2023093661A1 (en) Interface control method and apparatus, and electronic device and storage medium
CN114153538A (en) Window switching method, computing device and storage medium
US20200150820A1 (en) Method and apparatus for processing edge of touch screen
CN113885993A (en) System desktop split-screen display method, split-screen display control device and computing equipment
CN113190340B (en) Task switching method and computing device
WO2022057141A1 (en) Icon arrangement method and apparatus, user terminal, and storage medium
CN114270298A (en) Touch event processing method and device, mobile terminal and storage medium
CN114461113A (en) Interaction method based on taskbar, processing device and computing equipment
CN110321050B (en) Interactive operation method and device, electronic equipment and storage medium
CN113741758B (en) Working area switching method, computing device and storage medium
CN113467690A (en) Mouse control method and computing device
CN113741777A (en) Method for opening file based on drag operation, computing device and storage medium
CN113568732A (en) Application program switching method, computing device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant