CN110780787A - Method and equipment for executing task scheduling - Google Patents

Method and equipment for executing task scheduling

Info

Publication number
CN110780787A
CN110780787A · Application CN201911020187.0A · Grant CN110780787B
Authority
CN
China
Prior art keywords
objects
task information
window
application
focus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911020187.0A
Other languages
Chinese (zh)
Other versions
CN110780787B (en)
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201911020187.0A
Publication of CN110780787A
Application granted
Publication of CN110780787B
Legal status: Active
Anticipated expiration

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 — based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484 — for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 — Scrolling or panning
    • G06F 3/0487 — using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 — using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 — for inputting data by handwriting, e.g. gesture or text
    • G06F 9/00 — Arrangements for program control, e.g. control units
    • G06F 9/06 — using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 — Multiprogramming arrangements
    • G06F 9/48 — Program initiating; Program switching, e.g. by interrupt
    • G06F 9/4806 — Task transfer initiation or dispatching
    • G06F 9/4843 — by program, e.g. task dispatcher, supervisor, operating system
    • G06F 9/485 — Task life-cycle, e.g. stopping, restarting, resuming execution

Abstract

The purpose of the application is to provide a method and equipment for executing task scheduling. The method comprises: while a plurality of first objects in a window of an application on a touch terminal are executing first task information, in response to a user's touch operation on a focus controller in the window of the application, selecting one or more second objects from the plurality of first objects by moving a focus indicator, wherein the focus controller is used to control the movement of the focus indicator in the window of the application; and stopping the one or more second objects from executing the first task information, or starting the one or more second objects executing second task information, while the first objects other than the one or more second objects among the plurality of first objects continue to execute the first task information. The method and the device can meet the need to schedule the execution tasks of a plurality of observation objects and to operate separately on selected observation objects within the set.

Description

Method and equipment for executing task scheduling
Technical Field
The present application relates to the field of communications, and in particular, to a technique for performing task scheduling.
Background
In a touch terminal, existing touch-interaction technology allows a user to select one or more target objects to execute the same task. For example, the user taps an object to select it and start it executing a task, or the user draws an area by sliding on the touch interface, takes all objects within that area as target objects, and starts those target objects executing a task.
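The area selection described above amounts to a rectangle hit test over object positions. The sketch below is an illustration only, not part of the patent; the object names, coordinates, and `GameObject` type are assumptions:

```python
from dataclasses import dataclass

@dataclass
class GameObject:
    name: str
    x: float
    y: float

def objects_in_rect(objects, x1, y1, x2, y2):
    """Return the objects whose position falls inside the dragged rectangle.

    The two corners may arrive in any order, as with a touch drag.
    """
    left, right = min(x1, x2), max(x1, x2)
    top, bottom = min(y1, y2), max(y1, y2)
    return [o for o in objects
            if left <= o.x <= right and top <= o.y <= bottom]

objs = [GameObject("A", 10, 10), GameObject("B", 50, 50), GameObject("C", 90, 90)]
selected = objects_in_rect(objs, 0, 0, 60, 60)   # drag from (0, 0) to (60, 60)
print([o.name for o in selected])                # → ['A', 'B']
```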
Disclosure of Invention
An object of the present application is to provide a method and apparatus for performing task scheduling.
According to an aspect of the present application, there is provided a method of performing task scheduling, the method including:
while a plurality of first objects in a window of an application on a touch terminal are executing first task information, in response to a user's touch operation on a focus controller in the window of the application, selecting one or more second objects from the plurality of first objects by moving a focus indicator, wherein the focus controller is used to control the movement of the focus indicator in the window of the application;
stopping the one or more second objects from executing the first task information, or starting the one or more second objects executing second task information, wherein the first objects other than the one or more second objects among the plurality of first objects continue to execute the first task information.
According to an aspect of the present application, there is provided an apparatus for performing task scheduling, the apparatus including:
a first module, configured to, while a plurality of first objects in a window of an application on a touch terminal are executing first task information, respond to a user's touch operation on a focus controller in the window of the application by selecting one or more second objects from the plurality of first objects through movement of a focus indicator, where the focus controller is configured to control movement of the focus indicator in the window of the application;
and a second module, configured to stop the one or more second objects from executing the first task information or start the one or more second objects executing second task information, where the first objects other than the one or more second objects among the plurality of first objects continue to execute the first task information.
According to an aspect of the present application, there is provided an apparatus for performing task scheduling, wherein the apparatus includes:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to:
while a plurality of first objects in a window of an application on a touch terminal are executing first task information, in response to a user's touch operation on a focus controller in the window of the application, select one or more second objects from the plurality of first objects by moving a focus indicator, wherein the focus controller is used to control the movement of the focus indicator in the window of the application;
stop the one or more second objects from executing the first task information, or start the one or more second objects executing second task information, wherein the first objects other than the one or more second objects among the plurality of first objects continue to execute the first task information.
According to one aspect of the application, there is provided a computer-readable medium storing instructions that, when executed, cause a system to:
while a plurality of first objects in a window of an application on a touch terminal are executing first task information, in response to a user's touch operation on a focus controller in the window of the application, select one or more second objects from the plurality of first objects by moving a focus indicator, wherein the focus controller is used to control the movement of the focus indicator in the window of the application;
stop the one or more second objects from executing the first task information, or start the one or more second objects executing second task information, wherein the first objects other than the one or more second objects among the plurality of first objects continue to execute the first task information.
Compared with the prior art, the present application controls the movement of a focus indicator through a focus controller and determines one or more first observation objects, pointed to by the focus indicator while executing a first task, as second observation objects that stop executing the first task or start executing a second task. While a set of observation objects executes the same task, any observation object in the set can be selected through a touch operation on the focus controller to interrupt its current task or execute a new one, while the other observation objects in the set continue executing the original task to its end. This copes with complex production environments in scenarios that use a touch terminal: it meets the need to schedule the execution tasks of a plurality of observation objects and to operate separately on selected observation objects within the set, without affecting the remaining, unselected observation objects, which continue to execute the original task. At the same time, the observation object holding focus in the application window can be observed continuously, so production behaviours that are extremely sensitive to interruption are not disturbed.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading the following detailed description of non-limiting embodiments, made with reference to the accompanying drawings, in which:
FIG. 1 illustrates a system topology diagram for performing task scheduling according to some embodiments of the present application;
FIG. 2 illustrates a flow diagram of a method of performing task scheduling according to some embodiments of the present application;
FIG. 3 illustrates a schematic diagram of performing task scheduling according to some embodiments of the present application;
FIG. 4 illustrates a schematic diagram of performing task scheduling according to some embodiments of the present application;
FIG. 5 illustrates a schematic diagram of performing task scheduling according to some embodiments of the present application;
FIG. 6 illustrates a block diagram of an apparatus for performing task scheduling according to some embodiments of the present application;
FIG. 7 illustrates an exemplary system that can be used to implement the various embodiments described in this application.
The same or similar reference numbers in the drawings identify the same or similar elements.
Detailed Description
The present application is described in further detail below with reference to the attached figures.
In a typical configuration of the present application, the terminal, the device serving the network, and the trusted party each include one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
The device referred to in this application includes, but is not limited to, a user device, a network device, or a device formed by integrating a user device and a network device through a network. The user equipment includes, but is not limited to, any mobile electronic product capable of human-computer interaction with a user (e.g., through a touch panel), such as a smart phone or a tablet computer, and the mobile electronic product may employ any operating system, such as the Android operating system or the iOS operating system. The network device includes an electronic device capable of automatically performing numerical calculation and information processing according to preset or stored instructions; its hardware includes, but is not limited to, a microprocessor, an application-specific integrated circuit (ASIC), a programmable logic device (PLD), a field-programmable gate array (FPGA), a digital signal processor (DSP), an embedded device, and the like. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud of multiple servers; here, the cloud is composed of a large number of computers or web servers based on cloud computing, a kind of distributed computing in which one virtual supercomputer consists of a collection of loosely coupled computers. The network includes, but is not limited to, the Internet, a wide area network, a metropolitan area network, a local area network, a VPN, a wireless ad hoc network, etc. Preferably, the device may also be a program running on the user device, the network device, or a device formed by integrating the user device and the network device, the touch terminal, or the network device and the touch terminal through a network.
Of course, those skilled in the art will appreciate that the foregoing is by way of example only, and that other existing or future devices, which may be suitable for use in the present application, are also encompassed within the scope of the present application and are hereby incorporated by reference.
In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
FIG. 1 illustrates a system topology diagram for performing task scheduling according to some embodiments of the present application.
As shown in fig. 1, the resident indicator is a focus indicator that points to one or more selected observation objects holding focus in the application window, and the wheel controller is a focus controller for controlling the resident indicator. While a plurality of observation objects execute the current task, the user can move the resident indicator through a touch operation on the focus controller, so that the observation object pointed to by the resident indicator is selected as a second observation object. Then, if the user selects instruction button 1, the second observation object stops executing the current task and starts executing instruction task 1 corresponding to instruction button 1; or, if the user selects instruction button 2, the second observation object stops executing the current task and starts executing instruction task 2 corresponding to instruction button 2. The observation objects other than the second observation object continue executing the current task unaffected.
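The fig. 1 flow, in which only the selected objects switch to the instruction task while the rest keep the current task, can be sketched as follows. This is an illustrative sketch under assumed names (`Scheduler`, the task labels, and the object names are not from the patent):

```python
class Scheduler:
    def __init__(self, objects, current_task):
        # every observation object starts out executing the same current task
        self.tasks = {obj: current_task for obj in objects}

    def press_instruction_button(self, selected, new_task):
        # only the selected (second) observation objects switch tasks;
        # the unselected objects keep running the original task
        for obj in selected:
            self.tasks[obj] = new_task

sched = Scheduler(["Target1", "Target2", "Target3"], "task_current")
sched.press_instruction_button({"Target1"}, "instruction_task_1")
print(sched.tasks)
# → {'Target1': 'instruction_task_1', 'Target2': 'task_current', 'Target3': 'task_current'}
```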
In the prior art, if different observation objects in an application window need to be selected as target objects to execute different tasks, the observation objects must be manually selected through a touch operation (for example, tapping an object to select it), which is inaccurate and error-prone. In some cases, if a target object needs continuous attention, the touch operation may interrupt continuous observation of it. Moreover, when multiple target objects in a set are performing the same task, there is a need to individually select one or more of them to stop executing the current task or start executing a new task.
Compared with the prior art, the present application controls the movement of a focus indicator through a focus controller and determines one or more first observation objects, pointed to by the focus indicator while executing a first task, as second observation objects that stop executing the first task or start executing a second task. While a set of observation objects executes the same task, any observation object in the set can be selected through a touch operation on the focus controller to interrupt its current task or execute a new one, while the other observation objects in the set continue executing the original task to its end. This copes with complex production environments in scenarios that use a touch terminal: it meets the need to schedule the execution tasks of a plurality of observation objects and to operate separately on selected observation objects within the set, without affecting the remaining, unselected observation objects, which continue to execute the original task. At the same time, the observation object holding focus in the application window can be observed continuously, so production behaviours that are extremely sensitive to interruption are not disturbed.
FIG. 2 is a flowchart illustrating a method of performing task scheduling according to an embodiment of the present application; the method includes step S11 and step S12. In step S11, while a plurality of first objects in a window of an application on the touch terminal are executing first task information, the user equipment responds to a user's touch operation on a focus controller in the window of the application by selecting one or more second objects from the plurality of first objects through movement of a focus indicator, where the focus controller is used to control the movement of the focus indicator in the window of the application. In step S12, the user equipment stops the one or more second objects from executing the first task information, or starts the one or more second objects executing second task information, while the first objects other than the one or more second objects continue to execute the first task information.
In step S11, while a plurality of first objects in a window of an application on the touch terminal are executing first task information, the user equipment selects one or more second objects from the plurality of first objects by moving a focus indicator in response to a user's touch operation on a focus controller in the window of the application; the focus controller is used to control the movement of the focus indicator in the window of the application. In some embodiments, the focus controller may be a wheel controller. The focus indicator indicates one or more selected observation objects that have gained focus in the application window, and the focus controller controls the focus indicator. The user can move the focus indicator through a touch operation on the focus controller and select the first observation objects that are executing the first task and are pointed to by the focus indicator as second observation objects, which can be regarded as holding the focus of the application window. The second observation objects can be displayed highlighted, framed, or in a similar style, to distinguish them from the other observation objects. The touch operation on the focus controller and the movement of the focus indicator map onto each other: the focus indicator moves synchronously, according to a related algorithm, with the touch operation on the focus controller.
For example, the user may control the direction and distance of the focus indicator's movement by a sliding operation on the focus controller, and a first observation object is determined to be a second observation object when the focus indicator moves into its display region; or the user may determine a first observation object to be a second observation object by a single-tap or double-tap operation on the focus controller while the focus indicator is within that object's display region; or, if the focus indicator is directional, the first observation object that the focus indicator points to and that is closest to it is determined to be the second observation object. As another example, the user may first perform a finger-down operation on the focus controller, then, keeping the finger in contact, perform a sliding operation to adjust the size of a virtual rectangular frame whose fixed vertex is the position of the finger-down operation, and then lift the finger; the one or more first observation objects inside the display area of the virtual rectangular frame at the moment of lift-off are determined to be second observation objects, and the virtual rectangular frame is hidden. In this way, the user can control the movement of the focus indicator in the larger application window through touch operations on the smaller focus controller. Because the focus area is normally the area where the target is observed, and the operation range of the focus controller is small, the user can keep observing the focused observation object in the application window while controlling the focus indicator, which ensures that production behaviours extremely sensitive to interruption are not disturbed.
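One possible reading of the controller-to-indicator mapping and the "nearest pointed object" rule is sketched below. The gain factor, coordinates, and names are assumptions for illustration; the patent only requires that the indicator move synchronously with the touch operation according to a related algorithm:

```python
import math

def move_indicator(indicator, drag_dx, drag_dy, gain=4.0):
    """Map a small drag on the focus controller to a larger indicator move
    in the application window (gain is an assumed scaling factor)."""
    return (indicator[0] + drag_dx * gain, indicator[1] + drag_dy * gain)

def nearest_object(indicator, objects):
    """Pick the observation object closest to the focus indicator."""
    return min(objects, key=lambda o: math.dist(indicator, o[1]))

indicator = (0.0, 0.0)
indicator = move_indicator(indicator, 5, 0)          # slide right on the controller
objects = [("Target1", (18.0, 1.0)), ("Target2", (90.0, 40.0))]
print(nearest_object(indicator, objects)[0])         # → Target1
```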
In step S12, the user equipment stops the one or more second objects from executing the first task information, or starts the one or more second objects executing second task information, while the first objects other than the one or more second objects continue to execute the first task information. In some embodiments, task information indicates the task to be executed by a target object, including but not limited to the task name, task content, task time, task object, and task location. The selected one or more second observation objects may simply stop executing the first task they were executing, or may stop executing the first task and start executing a second task; the second task may be preset, or triggered in real time afterwards. The first observation objects other than the second observation objects still continue to execute the first task until its execution completes or is suspended. In some embodiments, a target object performs a task by running a predetermined task program to achieve a predetermined result; for example, a task may move the target object (e.g., from location L1 to location L2), make it perform some action (e.g., release a skill, or attack some enemy), or modify its attribute information (e.g., attack +1).
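A minimal container for the task-information fields listed above might look as follows. The field names and types are illustrative assumptions; the patent does not prescribe any particular data layout:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TaskInfo:
    """Illustrative task information: name, content, and optional
    location / attribute fields for the task kinds mentioned above."""
    name: str
    content: str
    target_location: Optional[Tuple[float, float]] = None  # for movement tasks
    attribute_delta: Optional[dict] = None                  # for attribute changes

move = TaskInfo("move", "move from L1 to L2", target_location=(10.0, 20.0))
buff = TaskInfo("buff", "attack +1", attribute_delta={"attack": 1})
print(move.name, buff.attribute_delta)   # → move {'attack': 1}
```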
For example, while first observation objects Target1, Target2, and Target3 are executing a first task (e.g., releasing skill S1), the user moves the focus indicator through a touch operation on the focus controller and thereby selects Target1 as a second observation object. Target1 stops releasing skill S1 and starts executing a second task (e.g., releasing skill S2); Target2 and Target3 are not affected and continue releasing skill S1 until its release is completed or suspended. Thus, while a set of observation objects executes the same task, the user can move the focus indicator through touch operations on the focus controller to select any observation object in the set and interrupt its current task or start a new one, while the other observation objects in the set continue executing the original task to its end. In some embodiments, the method further comprises, before step S11, step S13 (not shown) and step S14 (not shown). In step S13, the user equipment selects the plurality of first objects from one or more selectable objects by moving the focus indicator in response to a touch operation of the focus controller by the user; in step S14, the user equipment starts the plurality of first objects executing the first task information.
In some embodiments, the user may first perform a finger-down operation on the focus controller, then, keeping the finger in contact, adjust the size of a virtual rectangular frame whose fixed vertex is the position of the finger-down operation by a sliding operation, and then lift the finger. The selectable objects within the display area of the virtual rectangular frame at the moment of lift-off are determined to be the first observation objects, and the virtual rectangular frame is hidden. Selectable objects include, but are not limited to, the various static or dynamic observation objects displayed in the application window that can be selected. The plurality of first observation objects are then started simultaneously on the same task, i.e., the first task.
In some embodiments, step S13 includes: the user equipment, in response to the user's touch operation on the focus controller, selects the plurality of first objects from one or more selectable objects by moving the focus indicator, and sets each first object to a selected state. Step S14 includes: the user equipment starts the plurality of first objects executing the first task information and updates the state of each of the plurality of first objects to a selectable state. In some embodiments, a plurality of first observation objects are determined from one or more selectable objects and the state of each is set to selected; after the first observation objects are started on the first task, the state of each is updated to selectable, so that a first observation object can later be selected again to stop executing the first task or to execute a second task different from the first task. Observation objects in different states can be displayed differently, for example highlighted or framed, so that the user can distinguish objects in the selectable state from objects in the selected state.
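The selectable/selected transitions described above can be modelled as a small state machine. This is an illustrative sketch; the class and method names are assumptions:

```python
from enum import Enum

class State(Enum):
    SELECTABLE = "selectable"
    SELECTED = "selected"

class ObservationObject:
    def __init__(self, name):
        self.name = name
        self.state = State.SELECTABLE
        self.task = None

    def select(self):
        # choosing the object via the focus indicator marks it as selected
        self.state = State.SELECTED

    def start_task(self, task):
        # starting a task returns the object to the selectable state, so it
        # can be picked again later to interrupt or replace this task
        self.task = task
        self.state = State.SELECTABLE

obj = ObservationObject("Target1")
obj.select()
obj.start_task("first_task")
print(obj.state.name, obj.task)   # → SELECTABLE first_task
```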
In some embodiments, step S11 includes: while a plurality of first objects in a window of an application on the touch terminal are executing first task information, the user equipment responds to a user's touch operation on a focus controller in the window of the application by selecting one or more second objects from the plurality of first objects through movement of a focus indicator, where the focus controller is used to control the movement of the focus indicator in the window of the application, and the state of each of the one or more second objects is updated from selectable to selected. Step S12 includes: the user equipment stops the one or more second objects from executing the first task information, or starts the one or more second objects executing second task information, and updates the state of each of the one or more second objects to selectable, while the first objects other than the one or more second objects continue to execute the first task information. In some embodiments, one or more second observation objects are determined from the plurality of first observation objects and the state of each is updated from selectable to selected; the selected second observation objects then stop executing the first task, start executing the second task, and have their state changed back from selected to selectable, while the first observation objects other than the second observation objects continue executing the first task until it completes or is suspended.
In some embodiments, the method further comprises step S15 (not shown) and step S16 (not shown). In step S15, during the execution of the second task information by the one or more second objects, the user equipment selects one or more third objects from the one or more second objects by moving the focus indicator in response to the user's touch operation on the focus controller; in step S16, the user equipment stops the one or more third objects from executing the second task information or starts the one or more third objects to execute third task information, wherein the second objects other than the one or more third objects among the one or more second objects continue to execute the second task information. In some embodiments, while one or more second objects are executing the second task, the user equipment, in response to a touch operation of the user on the focus controller, moves the focus indicator and selects the second objects pointed to by the focus indicator as third objects; the selected third objects then stop executing the second task they were executing and start executing the third task, and the second objects other than the third objects continue to execute the second task until its execution is completed or suspended. As shown in fig. 3, the first objects Target1, Target2 and Target3 are currently executing the first task (e.g., releasing skill S1). The user moves the focus indicator by a touch operation on the focus controller, thereby selecting Target1 and Target2 as second objects; Target1 and Target2 stop releasing skill S1 and start executing the second task (e.g., releasing skill S2), while Target3 is not affected and continues to release skill S1 until the release of skill S1 is completed or suspended. The user then continues to move the focus indicator by a touch operation on the focus controller and selects Target1 from among Target1 and Target2 as the third object; Target1 stops releasing skill S2 and starts executing the third task (e.g., releasing skill S3), while Target2 is not affected and continues to release skill S2 until the release of skill S2 is completed or suspended.
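The repeated narrowing in this example can be sketched as a small illustration (not part of the original disclosure; function and key names are assumptions): each reassignment touches only the newly selected subset, leaving every other object on its current task.

```python
# Hypothetical walk-through of the fig. 3 scenario: Target1..Target3 start on
# S1; the focus selection narrows to {Target1, Target2} for S2, then to
# {Target1} for S3, and non-selected objects keep their task unchanged.
def reassign(tasks, names, new_task):
    """Switch only the named objects to new_task; all others are untouched."""
    return {n: (new_task if n in names else t) for n, t in tasks.items()}

tasks = {"Target1": "S1", "Target2": "S1", "Target3": "S1"}
tasks = reassign(tasks, {"Target1", "Target2"}, "S2")  # second objects
tasks = reassign(tasks, {"Target1"}, "S3")             # third object
print(tasks)  # {'Target1': 'S3', 'Target2': 'S2', 'Target3': 'S1'}
```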
In some embodiments, the step S15 includes: during the execution of the second task information by the one or more second objects, the user equipment selects one or more third objects from the selectable objects in the window of the application by moving the focus indicator in response to the touch operation of the user on the focus controller, wherein the selectable objects in the window of the application comprise the one or more second objects. In some embodiments, while the one or more second objects are executing the second task, the user moves the focus indicator through a touch operation on the focus controller and selects one or more third objects from all selectable objects in the application window; the selected third objects stop executing the second task they were executing and start executing the third task, and the second objects other than the third objects continue to execute the second task until its execution is completed or suspended, wherein the selectable objects in the application window include the second objects that are executing the second task. For example, the selectable objects within the application window include Target1, Target2 and Target3, where Target2 and Target3 are executing the second task as second objects; the user moves the focus indicator through a touch operation on the focus controller, and Target2 is selected from the selectable objects as the third object.
In some embodiments, the selectable objects in the window of the application further include the first objects other than the one or more second objects among the plurality of first objects. In some embodiments, besides the second objects, the selectable objects within the application window also include the other first objects, which the user may select as third objects. For example, the selectable objects within the application window include Target1, Target2 and Target3, where Target1 is a first object that may still be executing the first task, may have completed it, or may have been interrupted, while Target2 and Target3 are executing the second task as second objects; the user moves the focus indicator through a touch operation on the focus controller, and Target1 is selected from the selectable objects as the third object.
In some embodiments, the step S11 includes: while a plurality of first objects within a window of an application on a touch terminal are executing first task information, the user equipment, in response to a touch operation of the user on a focus controller in the window of the application, selects one or more second objects from the plurality of first objects and the other selectable objects in the window of the application by moving a focus indicator, wherein the focus controller is used for controlling the movement of the focus indicator in the window of the application, and at least one second object is selected from the plurality of first objects. In some embodiments, while the plurality of first objects are executing the first task, the user may move the focus indicator through a touch operation on the focus controller and select one or more second objects from the plurality of first objects and from the other selectable objects in the application window, wherein at least one second object is selected from the plurality of first objects. As shown in fig. 4, the first objects Target1 and Target2 are currently executing the first task (e.g., releasing skill S1), and Target3 is another selectable object in the application window besides Target1 and Target2. The user moves the focus indicator through a touch operation on the focus controller and selects Target1 as the second object; Target1 stops releasing skill S1 and starts executing the second task (e.g., releasing skill S2), while Target2 is not affected and continues to release skill S1 until the release of skill S1 is completed or suspended.
In some embodiments, at least one of the second objects is selected from the other selectable objects in the window of the application. In some embodiments, while the plurality of first objects are executing the first task, the user may move the focus indicator through a touch operation on the focus controller and select one or more second objects from the plurality of first objects and from the other selectable objects in the application window, wherein at least one second object is selected from the other selectable objects in the application window. As shown in fig. 5, the first objects Target1 and Target2 are currently executing the first task (e.g., releasing skill S1), and Target3 is another selectable object in the application window besides Target1 and Target2. The user moves the focus indicator through a touch operation on the focus controller and selects Target2 and Target3 as second objects; Target2 stops releasing skill S1, Target2 and Target3 start executing the second task (e.g., releasing skill S2), and Target1 is not affected and continues to release skill S1 until the release of skill S1 is completed or suspended.
In some embodiments, the step S12 includes: the user equipment, in response to an execution triggering operation of the user on an instruction button in the window of the application, stops the one or more second objects from executing the first task information if the instruction button corresponds to the first task information, and otherwise starts the one or more second objects to execute the second task information corresponding to the instruction button, wherein the first objects other than the one or more second objects among the plurality of first objects continue to execute the first task information. In some embodiments, the user clicks or double-clicks an instruction button in the application window; if the instruction button is used to suspend the first task, the one or more second objects stop executing the first task, and if the instruction button is used to start the second task, the one or more second objects stop executing the first task and start executing the second task, while the first objects other than the second objects continue to execute the first task unaffected. For example, the first objects Target1, Target2 and Target3 are currently executing the first task (e.g., releasing skill S1), and the user moves the focus indicator through a touch operation on the focus controller, thereby selecting Target1 as the second object. If the user then clicks an instruction button Btn1 in the application window that suspends the first task, Target1 stops releasing skill S1; if the user clicks an instruction button Btn2 in the application window that starts the second task, Target1 stops releasing skill S1 and starts executing the second task (e.g., releasing skill S2). Target2 and Target3 are not affected and continue to release skill S1 until the release of skill S1 is completed or suspended.
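A minimal sketch of this instruction-button branch, under the assumption of hypothetical names (`on_button`, and the Btn1/Btn2 semantics as described; none of this code appears in the original disclosure):

```python
# Hypothetical dispatch for the instruction-button variant of step S12:
# if the pressed button corresponds to the first task, merely suspend it;
# otherwise start the second task bound to that button. Only the selected
# second objects are touched.
def on_button(selected_tasks, first_task, button_task):
    """selected_tasks maps each selected object's name to its current task."""
    for name in selected_tasks:
        if button_task == first_task:
            selected_tasks[name] = None        # Btn1: suspend the first task
        else:
            selected_tasks[name] = button_task  # Btn2: switch to the second task
    return selected_tasks

print(on_button({"Target1": "S1"}, "S1", "S1"))  # {'Target1': None}
print(on_button({"Target1": "S1"}, "S1", "S2"))  # {'Target1': 'S2'}
```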
Fig. 3 illustrates a user equipment for performing task scheduling according to some embodiments of the present application, which includes a one-one module 11 and a one-two module 12. The one-one module 11 is configured to, while a plurality of first objects within a window of an application on a touch terminal are executing first task information, select one or more second objects from the plurality of first objects by moving a focus indicator in response to a touch operation of the user on a focus controller in the window of the application, wherein the focus controller is configured to control the movement of the focus indicator in the window of the application; the one-two module 12 is configured to stop the one or more second objects from executing the first task information or start the one or more second objects to execute second task information, wherein the first objects other than the one or more second objects among the plurality of first objects continue to execute the first task information.
The one-one module 11 is configured to, while a plurality of first objects within a window of an application on a touch terminal are executing first task information, select one or more second objects from the plurality of first objects by moving a focus indicator in response to a touch operation of the user on a focus controller in the window of the application, wherein the focus controller is configured to control the movement of the focus indicator in the window of the application. In some embodiments, the focus controller may be a wheel controller, and the focus indicator indicates the one or more selected objects in the application window that have gained focus. The focus controller controls the focus indicator: the user may move the focus indicator through a touch operation on the focus controller and select the first objects that are executing the first task and are pointed to by the focus indicator as second objects, which can be regarded as having gained the focus of the application window. The second objects may be displayed in a highlighted or framed manner, or the like, so as to distinguish them from the other objects. The touch operation on the focus controller and the movement of the focus indicator form a mutual mapping relation, and the focus indicator moves synchronously with the touch operation on the focus controller according to a related algorithm.
For example, the user may control the moving direction and moving distance of the focus indicator by a sliding operation on the focus controller, and a first object may be determined as a second object when the focus indicator moves into that object's display region, or by a single-click or double-click operation on the focus controller while the focus indicator is within the object's display region; alternatively, if the focus indicator is a directional control, the first object that is closest to the focus indicator and pointed to by it may be determined as the second object. As another example, the user may first perform a finger-down operation on the focus controller, then perform a sliding operation while keeping the finger in contact to adjust the size of a virtual rectangular frame whose fixed vertex is the position of the finger-down operation, and then perform a finger-up operation; the one or more first objects within the display area of the virtual rectangular frame at the moment of the finger-up are determined as second objects, and the virtual rectangular frame is hidden from display.
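The two selection gestures above reduce to simple geometry, sketched here for illustration only (the rectangle representation and all function names are assumptions, not part of the original disclosure):

```python
# Hypothetical geometry for the selection gestures: point hit-testing when the
# focus indicator enters an object's display region, and rectangle containment
# for the press-drag-release virtual rectangular frame. Regions are (x, y, w, h).
def hit_test(point, region):
    """True if the focus indicator point lies inside an object's display region."""
    (px, py), (x, y, w, h) = point, region
    return x <= px <= x + w and y <= py <= y + h

def marquee_select(rect, regions):
    """Names of objects whose display region lies fully inside the virtual frame."""
    rx, ry, rw, rh = rect
    return [name for name, (x, y, w, h) in regions.items()
            if rx <= x and ry <= y and x + w <= rx + rw and y + h <= ry + rh]

regions = {"Target1": (10, 10, 20, 20), "Target2": (50, 10, 20, 20)}
print(hit_test((15, 15), regions["Target1"]))   # True
print(marquee_select((0, 0, 40, 40), regions))  # ['Target1']
```

Whether partially covered objects should also be selected by the frame is a design choice the description leaves open; the sketch uses full containment.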
The one-two module 12 is configured to stop the one or more second objects from executing the first task information or start the one or more second objects to execute second task information, wherein the first objects other than the one or more second objects among the plurality of first objects continue to execute the first task information. In some embodiments, the task information is used to indicate a task to be executed by a target object, including but not limited to information such as the task name, task content, task time, task object and task location. The selected one or more second objects may merely stop executing the first task they were executing, or may stop executing the first task and start executing the second task, where the second task may be preset or triggered in real time afterwards; the first objects other than the second objects still continue to execute the first task until its execution is completed or suspended. In some embodiments, the target object performs the task by running a predetermined task program to achieve a predetermined task result; for example, the task may move the target object (e.g., from location L1 to location L2), make the target object perform some action (e.g., release a skill or attack some enemy), or make the target object modify its attribute information (e.g., attack power + 1).
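The task-information fields enumerated above can be sketched as a small record type; this is an illustration only, and every field name is an assumption rather than part of the original disclosure:

```python
# Hypothetical record for the task information described above: task name,
# content, time, object and location, with optional fields left unset when a
# task does not need them.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TaskInfo:
    name: str                                    # e.g. "release skill S1"
    content: str                                 # what the task program does
    time: Optional[float] = None                 # scheduled time, if any
    target: Optional[str] = None                 # e.g. an enemy to attack
    location: Optional[Tuple[int, int]] = None   # e.g. destination L2

move = TaskInfo(name="move", content="move object from L1 to L2", location=(120, 80))
print(move.location)  # (120, 80)
```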
For example, when the first objects Target1, Target2 and Target3 are currently executing the first task (e.g., releasing skill S1) and the user moves the focus indicator through a touch operation on the focus controller, thereby selecting Target1 as the second object, Target1 stops releasing skill S1 and starts executing the second task (e.g., releasing skill S2), while Target2 and Target3 are not affected and continue to release skill S1 until the release of skill S1 is completed or suspended. Thus, when a set of objects is executing the same task, the user can move the focus indicator through a touch operation on the focus controller to select any objects in the set and interrupt their current task or have them execute a new task, while the other objects in the set continue to execute the original task to its end.
In some embodiments, the apparatus further comprises a one-three module 13 (not shown) and a one-four module 14 (not shown). The one-three module 13 is configured to select the plurality of first objects from one or more selectable objects by moving the focus indicator in response to a touch operation of the user on the focus controller; the one-four module 14 is configured to start the plurality of first objects to execute the first task information. Here, the specific implementations of the one-three module 13 and the one-four module 14 are the same as or similar to those of the embodiments related to steps S13 and S14 in fig. 2, and are therefore not described again but are incorporated herein by reference.
In some embodiments, the one-three module 13 is configured to: select the plurality of first objects from one or more selectable objects by moving the focus indicator in response to a touch operation of the user on the focus controller, wherein each first object is set to a selected state; the one-four module 14 is configured to: start the plurality of first objects to execute the first task information, and update the state of each first object in the plurality of first objects to a selectable state. Here, the related operations are the same as or similar to those of the embodiment shown in fig. 2 and are therefore not described again but are incorporated herein by reference.
In some embodiments, the one-one module 11 is configured to: while a plurality of first objects within a window of an application on a touch terminal are executing first task information, select one or more second objects from the plurality of first objects by moving a focus indicator in response to a touch operation of the user on a focus controller in the window of the application, wherein the focus controller is used for controlling the movement of the focus indicator in the window of the application, and the state of each of the one or more second objects is updated from a selectable state to a selected state; the one-two module 12 is configured to: stop the one or more second objects from executing the first task information or start the one or more second objects to execute second task information, and update the state of each of the one or more second objects to a selectable state, wherein the first objects other than the one or more second objects among the plurality of first objects continue to execute the first task information. Here, the related operations are the same as or similar to those of the embodiment shown in fig. 2 and are therefore not described again but are incorporated herein by reference.
In some embodiments, the apparatus further comprises a one-five module 15 (not shown) and a one-six module 16 (not shown). The one-five module 15 is configured to select one or more third objects from the one or more second objects by moving the focus indicator in response to a touch operation of the user on the focus controller during the execution of the second task information by the one or more second objects; the one-six module 16 is configured to stop the one or more third objects from executing the second task information or start the one or more third objects to execute third task information, wherein the second objects other than the one or more third objects among the one or more second objects continue to execute the second task information. Here, the specific implementations of the one-five module 15 and the one-six module 16 are the same as or similar to those of the embodiments related to steps S15 and S16 in fig. 2, and are therefore not described again but are incorporated herein by reference.
In some embodiments, the one-five module 15 is configured to: during the execution of the second task information by the one or more second objects, select one or more third objects from the selectable objects in the window of the application by moving the focus indicator in response to a touch operation of the user on the focus controller, wherein the selectable objects in the window of the application comprise the one or more second objects. Here, the related operations are the same as or similar to those of the embodiment shown in fig. 2 and are therefore not described again but are incorporated herein by reference.
In some embodiments, the selectable objects in the window of the application further include the first objects other than the one or more second objects among the plurality of first objects. Here, the related operations are the same as or similar to those of the embodiment shown in fig. 2 and are therefore not described again but are incorporated herein by reference.
In some embodiments, the one-one module 11 is configured to: while a plurality of first objects within a window of an application on a touch terminal are executing first task information, select one or more second objects from the plurality of first objects and the other selectable objects in the window of the application by moving a focus indicator in response to a touch operation of the user on a focus controller in the window of the application, wherein the focus controller is used for controlling the movement of the focus indicator in the window of the application, and at least one second object is selected from the plurality of first objects. Here, the related operations are the same as or similar to those of the embodiment shown in fig. 2 and are therefore not described again but are incorporated herein by reference.
In some embodiments, at least one of the second objects is selected from the other selectable objects in the window of the application. Here, the related operations are the same as or similar to those of the embodiment shown in fig. 2 and are therefore not described again but are incorporated herein by reference.
In some embodiments, the one-two module 12 is configured to: in response to an execution triggering operation of the user on an instruction button in the window of the application, stop the one or more second objects from executing the first task information if the instruction button corresponds to the first task information, and otherwise start the one or more second objects to execute the second task information corresponding to the instruction button, wherein the first objects other than the one or more second objects among the plurality of first objects continue to execute the first task information. Here, the related operations are the same as or similar to those of the embodiment shown in fig. 2 and are therefore not described again but are incorporated herein by reference.
FIG. 4 illustrates an exemplary system that can be used to implement the various embodiments described in this application.
In some embodiments, as shown in FIG. 4, the system 300 can be implemented as any of the devices in the various embodiments described. In some embodiments, system 300 may include one or more computer-readable media (e.g., system memory or NVM/storage 320) having instructions and one or more processors (e.g., processor(s) 305) coupled with the one or more computer-readable media and configured to execute the instructions to implement modules to perform the actions described herein.
For one embodiment, system control module 310 may include any suitable interface controllers to provide any suitable interface to at least one of processor(s) 305 and/or any suitable device or component in communication with system control module 310.
The system control module 310 may include a memory controller module 330 to provide an interface to the system memory 315. Memory controller module 330 may be a hardware module, a software module, and/or a firmware module.
System memory 315 may be used, for example, to load and store data and/or instructions for system 300. For one embodiment, system memory 315 may include any suitable volatile memory, such as suitable DRAM. In some embodiments, the system memory 315 may include a double data rate type four synchronous dynamic random access memory (DDR4 SDRAM).
For one embodiment, system control module 310 may include one or more input/output (I/O) controllers to provide an interface to NVM/storage 320 and communication interface(s) 325.
For example, NVM/storage 320 may be used to store data and/or instructions. NVM/storage 320 may include any suitable non-volatile memory (e.g., flash memory) and/or may include any suitable non-volatile storage device(s) (e.g., one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives).
NVM/storage 320 may include storage resources that are physically part of the device on which system 300 is installed or may be accessed by the device and not necessarily part of the device. For example, NVM/storage 320 may be accessible over a network via communication interface(s) 325.
Communication interface(s) 325 may provide an interface for system 300 to communicate over one or more networks and/or with any other suitable device. System 300 may wirelessly communicate with one or more components of a wireless network according to any of one or more wireless network standards and/or protocols.
For one embodiment, at least one of the processor(s) 305 may be packaged together with logic for one or more controller(s) (e.g., memory controller module 330) of the system control module 310. For one embodiment, at least one of the processor(s) 305 may be packaged together with logic for one or more controller(s) of the system control module 310 to form a System In Package (SiP). For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with logic for one or more controller(s) of the system control module 310. For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with logic for one or more controller(s) of the system control module 310 to form a system on a chip (SoC).
In various embodiments, system 300 may be, but is not limited to: a server, a workstation, a desktop computing device, or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.). In various embodiments, system 300 may have more or fewer components and/or different architectures. For example, in some embodiments, system 300 includes one or more cameras, a keyboard, a Liquid Crystal Display (LCD) screen (including a touch screen display), a non-volatile memory port, multiple antennas, a graphics chip, an Application Specific Integrated Circuit (ASIC), and speakers.
The present application also provides a computer readable storage medium having stored thereon computer code which, when executed, performs a method as in any one of the preceding.
The present application also provides a computer program product, which when executed by a computer device, performs the method of any of the preceding claims.
The present application further provides a computer device, comprising:
one or more processors;
a memory for storing one or more computer programs;
the one or more computer programs, when executed by the one or more processors, cause the one or more processors to implement the method of any preceding claim.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware, for example, implemented using Application Specific Integrated Circuits (ASICs), general purpose computers or any other similar hardware devices. In one embodiment, the software programs of the present application may be executed by a processor to implement the steps or functions described above. Likewise, the software programs (including associated data structures) of the present application may be stored in a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. Additionally, some of the steps or functions of the present application may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
In addition, some of the present application may be implemented as a computer program product, such as computer program instructions, which when executed by a computer, may invoke or provide methods and/or techniques in accordance with the present application through the operation of the computer. Those skilled in the art will appreciate that the form in which the computer program instructions reside on a computer-readable medium includes, but is not limited to, source files, executable files, installation package files, and the like, and that the manner in which the computer program instructions are executed by a computer includes, but is not limited to: the computer directly executes the instruction, or the computer compiles the instruction and then executes the corresponding compiled program, or the computer reads and executes the instruction, or the computer reads and installs the instruction and then executes the corresponding installed program. Computer-readable media herein can be any available computer-readable storage media or communication media that can be accessed by a computer.
Communication media includes media by which communication signals, including, for example, computer readable instructions, data structures, program modules, or other data, are transmitted from one system to another. Communication media may include conductive transmission media such as cables and wires (e.g., fiber optics, coaxial, etc.) and wireless (non-conductive transmission) media capable of propagating energy waves such as acoustic, electromagnetic, RF, microwave, and infrared. Computer readable instructions, data structures, program modules, or other data may be embodied in a modulated data signal, for example, in a wireless medium such as a carrier wave or similar mechanism such as is embodied as part of spread spectrum techniques. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. The modulation may be analog, digital or hybrid modulation techniques.
By way of example, and not limitation, computer-readable storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media include, but are not limited to, volatile memory such as random access memory (RAM, DRAM, SRAM); and non-volatile memory such as flash memory, various read-only memories (ROM, PROM, EPROM, EEPROM), magnetic and ferromagnetic/ferroelectric memories (MRAM, FeRAM); and magnetic and optical storage devices (hard disk, tape, CD, DVD); or other now known media or later developed that can store computer-readable information/data for use by a computer system.
One embodiment according to the present application comprises an apparatus comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein, when the computer program instructions are executed by the processor, the apparatus is triggered to operate the methods and/or technical solutions based on the foregoing embodiments of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the apparatus claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.

Claims (22)

1. A method of performing task scheduling, wherein the method comprises:
in the process of executing first task information by a plurality of first objects in a window of an application on a touch terminal, responding to touch operation of a user on a focus controller in the window of the application, and selecting one or more second objects from the plurality of first objects by moving a focus indicator, wherein the focus controller is used for controlling the movement of the focus indicator in the window of the application;
stopping the one or more second objects from executing the first task information or starting the one or more second objects to execute second task information, wherein the other first objects except the one or more second objects in the plurality of first objects continue to execute the first task information.
2. The method of claim 1, wherein before the selecting one or more second objects from the plurality of first objects by moving a focus indicator in response to a touch operation of the user on the focus controller in the window of the application during execution of the first task information by the plurality of first objects in the window of the application on the touch terminal, the method further comprises:
selecting the plurality of first objects from one or more selectable objects by moving the focus indicator in response to a touch operation of the focus controller by the user;
starting the plurality of first objects to execute the first task information.
3. The method of claim 2, wherein said selecting the plurality of first objects from one or more selectable objects by moving the focus indicator in response to the user's touch operation of the focus controller comprises:
selecting the plurality of first objects from one or more selectable objects by moving the focus indicator in response to a touch operation of the user on the focus controller, wherein each first object is set to a selected state;
the starting the plurality of first objects to execute the first task information comprises:
starting the plurality of first objects to execute the first task information, and updating the state of each first object in the plurality of first objects to a selectable state.
4. The method according to any one of claims 1 to 3, wherein the selecting one or more second objects from the plurality of first objects by moving a focus indicator in response to a user's touch operation of a focus controller in a window of an application during execution of first task information by the plurality of first objects within the window of the application on the touch terminal, wherein the focus controller is configured to control movement of the focus indicator in the window of the application, comprises:
in the process of executing first task information by a plurality of first objects in a window of an application on a touch terminal, responding to the touch operation of a user on a focus controller in the window of the application, and selecting one or more second objects from the plurality of first objects by moving a focus indicator, wherein the focus controller is used for controlling the movement of the focus indicator in the window of the application, and the state of each second object in the one or more second objects is updated from a selectable state to a selected state;
the stopping the one or more second objects from executing the first task information or starting the one or more second objects to execute second task information, wherein the other first objects except the one or more second objects in the plurality of first objects continue to execute the first task information, comprises:
stopping the one or more second objects from executing the first task information or starting the one or more second objects to execute the second task information, and updating the state of each of the one or more second objects to a selectable state, wherein the other first objects except the one or more second objects in the plurality of first objects continue to execute the first task information.
5. The method of any of claims 1-4, wherein the method further comprises:
selecting one or more third objects from the one or more second objects by moving the focus indicator in response to a touch operation of the user on the focus controller during execution of the second task information by the one or more second objects;
stopping the one or more third objects from executing the second task information or starting the one or more third objects to execute third task information, wherein the other second objects except the one or more third objects in the one or more second objects continue to execute the second task information.
6. The method of claim 5, wherein the selecting one or more third objects from the one or more second objects by moving the focus indicator in response to the user's touch operation of the focus controller during the execution of the second task information by the one or more second objects comprises:
in the process of executing the second task information by the one or more second objects, responding to the touch operation of the user on the focus controller, and selecting one or more third objects from the selectable objects in the window of the application by moving the focus indicator, wherein the selectable objects in the window of the application comprise the one or more second objects.
7. The method of claim 6, wherein the selectable objects in the window of the application further comprise the other first objects except the one or more second objects in the plurality of first objects.
8. The method according to any one of claims 1 to 7, wherein the selecting one or more second objects from the plurality of first objects by moving a focus indicator in response to a user's touch operation of a focus controller in a window of an application during execution of first task information by the plurality of first objects within the window of the application on the touch terminal, wherein the focus controller is configured to control movement of the focus indicator in the window of the application, comprises:
in the process of executing first task information by a plurality of first objects in a window of an application on a touch terminal, responding to touch operation of a user on a focus controller in the window of the application, and selecting one or more second objects from the plurality of first objects and other selectable objects in the window of the application by moving a focus indicator, wherein the focus controller is used for controlling the movement of the focus indicator in the window of the application, and at least one second object is selected from the plurality of first objects.
9. The method of claim 8, wherein at least one second object is selected from the other selectable objects in the window of the application.
10. The method of any of claims 1 to 9, wherein the stopping the one or more second objects from executing the first task information or starting the one or more second objects to execute second task information, wherein the other first objects except the one or more second objects in the plurality of first objects continue to execute the first task information, comprises:
in response to an execution trigger operation of the user on an instruction button in the window of the application, stopping the one or more second objects from executing the first task information if the instruction button corresponds to the first task information, or otherwise starting the one or more second objects to execute the second task information corresponding to the instruction button, wherein the other first objects except the one or more second objects in the plurality of first objects continue to execute the first task information.
11. An apparatus for performing task scheduling, wherein the apparatus comprises:
a first module, configured to, in a process of executing first task information by a plurality of first objects in a window of an application on a touch terminal, in response to a touch operation of a user on a focus controller in the window of the application, select one or more second objects from the plurality of first objects by moving a focus indicator, wherein the focus controller is configured to control movement of the focus indicator in the window of the application;
and a second module, configured to stop the one or more second objects from executing the first task information or start the one or more second objects to execute second task information, wherein the other first objects except the one or more second objects in the plurality of first objects continue to execute the first task information.
12. The apparatus of claim 11, wherein the apparatus further comprises:
a third module for selecting the plurality of first objects from one or more selectable objects by moving the focus indicator in response to a touch operation of the focus controller by the user;
a fourth module for initiating the plurality of first objects to execute the first task information.
13. The apparatus of claim 12, wherein the third module is configured to:
selecting the plurality of first objects from one or more selectable objects by moving the focus indicator in response to a touch operation of the user on the focus controller, wherein each first object is set to a selected state;
the fourth module is configured to:
starting the plurality of first objects to execute the first task information, and updating the state of each first object in the plurality of first objects to a selectable state.
14. The apparatus of any of claims 11 to 13, wherein the first module is configured to:
in the process of executing first task information by a plurality of first objects in a window of an application on a touch terminal, responding to the touch operation of a user on a focus controller in the window of the application, and selecting one or more second objects from the plurality of first objects by moving a focus indicator, wherein the focus controller is used for controlling the movement of the focus indicator in the window of the application, and the state of each second object in the one or more second objects is updated from a selectable state to a selected state;
the second module is configured to:
stopping the one or more second objects from executing the first task information or starting the one or more second objects to execute the second task information, and updating the state of each of the one or more second objects to a selectable state, wherein the other first objects except the one or more second objects in the plurality of first objects continue to execute the first task information.
15. The apparatus of any of claims 11 to 14, wherein the apparatus further comprises:
a fifth module, configured to select one or more third objects from the one or more second objects by moving the focus indicator in response to a touch operation of the user on the focus controller during execution of the second task information by the one or more second objects;
a sixth module, configured to stop the one or more third objects from executing the second task information or start the one or more third objects to execute third task information, wherein the other second objects except the one or more third objects in the one or more second objects continue to execute the second task information.
16. The apparatus of claim 15, wherein the fifth module is configured to:
in the process of executing the second task information by the one or more second objects, responding to the touch operation of the user on the focus controller, and selecting one or more third objects from the selectable objects in the window of the application by moving the focus indicator, wherein the selectable objects in the window of the application comprise the one or more second objects.
17. The apparatus of claim 16, wherein the selectable objects in the window of the application further comprise the other first objects except the one or more second objects in the plurality of first objects.
18. The apparatus of any of claims 11 to 17, wherein the first module is configured to:
in the process of executing first task information by a plurality of first objects in a window of an application on a touch terminal, responding to touch operation of a user on a focus controller in the window of the application, and selecting one or more second objects from the plurality of first objects and other selectable objects in the window of the application by moving a focus indicator, wherein the focus controller is used for controlling the movement of the focus indicator in the window of the application, and at least one second object is selected from the plurality of first objects.
19. The apparatus of claim 18, wherein at least one second object is selected from the other selectable objects in the window of the application.
20. The apparatus of any of claims 11 to 19, wherein the second module is configured to:
in response to an execution trigger operation of the user on an instruction button in the window of the application, stopping the one or more second objects from executing the first task information if the instruction button corresponds to the first task information, or otherwise starting the one or more second objects to execute the second task information corresponding to the instruction button, wherein the other first objects except the one or more second objects in the plurality of first objects continue to execute the first task information.
21. An apparatus for performing task scheduling, wherein the apparatus comprises:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the operations of the method of any of claims 1 to 10.
22. A computer-readable medium storing instructions that, when executed, cause a system to perform the operations of any of the methods of claims 1-10.
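The selection-and-dispatch behavior recited in claims 1, 4, and 10 can be illustrated with a minimal, hypothetical sketch. All names here (`SceneObject`, `Window`, `move_focus_over`, `dispatch`, the task strings) are illustrative assumptions for explanation only, not part of the claimed method or apparatus:

```python
# Hypothetical model of the claimed scheduling behavior: objects in an
# application window execute task information; a subset swept by the focus
# indicator can be stopped or switched to other task information while the
# remaining objects continue their original task.

SELECTABLE, SELECTED = "selectable", "selected"

class SceneObject:
    def __init__(self, name):
        self.name = name
        self.state = SELECTABLE   # claim 3/4: objects toggle between states
        self.task = None          # currently executed task information, if any

class Window:
    def __init__(self, objects):
        self.objects = list(objects)
        self.focus = set()        # objects currently under the focus indicator

    def move_focus_over(self, names):
        """Simulate the focus indicator sweeping over objects (claim 1):
        swept objects become the selected second objects."""
        self.focus = {o for o in self.objects if o.name in names}
        for o in self.focus:
            o.state = SELECTED

    def dispatch(self, task):
        """Apply an instruction button (claim 10): if a focused object already
        executes `task`, stop it; otherwise start it on `task`. Objects outside
        the focus keep executing their current task information."""
        for o in self.focus:
            o.task = None if o.task == task else task
            o.state = SELECTABLE  # claim 4: state reverts after dispatch
        self.focus = set()
```

Under this reading, a focus sweep marks a subset as selected, and dispatching re-tasks only that subset (stopping it when the instruction corresponds to its current task), while unselected objects continue uninterrupted.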
CN201911020187.0A 2019-10-24 2019-10-24 Method and device for executing task scheduling Active CN110780787B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911020187.0A CN110780787B (en) 2019-10-24 2019-10-24 Method and device for executing task scheduling

Publications (2)

Publication Number Publication Date
CN110780787A true CN110780787A (en) 2020-02-11
CN110780787B CN110780787B (en) 2023-05-09

Family

ID=69387424

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911020187.0A Active CN110780787B (en) 2019-10-24 2019-10-24 Method and device for executing task scheduling

Country Status (1)

Country Link
CN (1) CN110780787B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101086711A (en) * 2006-06-11 2007-12-12 上海全成通信技术有限公司 Mission management method of multiple-mission operation system
JP2011242828A (en) * 2010-05-14 2011-12-01 Sharp Corp Operation device, electronic device and image processor provided with operation device, and information display method in operation device
CN103970474A (en) * 2013-01-31 2014-08-06 三星电子株式会社 Method and apparatus for multitasking
CN109308213A (en) * 2017-07-27 2019-02-05 南京南瑞继保电气有限公司 Based on the multitask breakpoint debugging method for improving Task Scheduling Mechanism
CN109408286A (en) * 2018-09-17 2019-03-01 北京京东金融科技控股有限公司 Data processing method, device, system, computer readable storage medium

Also Published As

Publication number Publication date
CN110780787B (en) 2023-05-09

Similar Documents

Publication Publication Date Title
CN110333918B (en) Method and equipment for managing boarder programs
KR20160025905A (en) Electronic device including touch sensitive display and method for operating the same
CN110321192B (en) Method and equipment for presenting hosted program
CN110321189B (en) Method and equipment for presenting hosted program in hosted program
US10853152B2 (en) Touch application programming interfaces
CN110413179B (en) Method and equipment for presenting session message
CN111488096B (en) Method and equipment for displaying interactive presentation information in reading application
CN110290557B (en) Method and equipment for loading page tags in application
EP3125101A1 (en) Screen controlling method and electronic device for supporting the same
WO2016101816A1 (en) Method and device for information display in instant messaging
KR102618480B1 (en) Electronic device and method for operating thereof
KR20170033656A (en) Electronic device and Method for processing a touch input of the same
WO2019139725A1 (en) Feature usage prediction using shell application feature telemetry
CN111506232B (en) Method and equipment for controlling menu display in reading application
CN111125562B (en) Method and equipment for switching display tag pages
CN110430253B (en) Method and equipment for providing novel update notification information
CN109947504B (en) Method and equipment for executing hosted program in hosted program
KR20180051002A (en) Method for cotrolling launching of an application in a electronic device using a touch screen and the electronic device thereof
CN110519250B (en) Method and equipment for providing information flow
CN110413183B (en) Method and equipment for presenting page
CN110290058B (en) Method and equipment for presenting session message in application
CN110825242A (en) Input method and device
CN110780788B (en) Method and device for executing touch operation
CN110333810B (en) Information display method, device and medium applied to electronic equipment and computing equipment
CN110780787B (en) Method and device for executing task scheduling

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant