CN117873728A - Task processing method, electronic device, storage medium, and program product - Google Patents

Task processing method, electronic device, storage medium, and program product

Info

Publication number
CN117873728A
Authority
CN
China
Prior art keywords: processed, tasks, task, threshold, waiting time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410139038.0A
Other languages
Chinese (zh)
Inventor
张飞宇
林雨琦
王倪东
丁辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Mobvoi Information Technology Co ltd
Original Assignee
Shanghai Mobvoi Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Mobvoi Information Technology Co ltd filed Critical Shanghai Mobvoi Information Technology Co ltd
Priority to CN202410139038.0A
Publication of CN117873728A


Abstract

The present disclosure provides a task processing method, an electronic device, a storage medium, and a program product. The task processing method comprises the following steps: acquiring the number of tasks to be processed in a task set to be processed and a first waiting time of at least one task to be processed in the set; judging whether the number of tasks to be processed is greater than a first number threshold and whether the first waiting time is greater than a first time length threshold; if the number of tasks to be processed is greater than the first number threshold or the first waiting time is greater than the first time length threshold, acquiring a batch of tasks to be processed from the set according to the number of tasks to be processed and the first waiting time; and performing task processing on the batch of tasks to be processed.

Description

Task processing method, electronic device, storage medium, and program product
Technical Field
The present disclosure relates to a task processing method, an electronic device, a storage medium, and a program product.
Background
With the development of computer network technology, the volume of tasks carried by various services keeps increasing. Taking a large-scale data inference service as an example, its tasks are characterized by high concurrency, strong real-time requirements, and a large amount of computation.
To improve the processing efficiency of such tasks, a batch processing mode is generally adopted, that is, tasks are grouped into a batch of a fixed size before being processed.
However, because the batch size is fixed, processing can only start once that number of tasks has accumulated; otherwise the tasks can only wait, and the hardware sits idle during the wait, so hardware resources are wasted and their utilization rate is low.
Disclosure of Invention
The present disclosure provides a task processing method, an electronic device, a storage medium, and a program product.
According to one aspect of the present disclosure, there is provided a task processing method including:
acquiring the number of tasks to be processed of a task set to be processed and the first waiting time of at least one task to be processed in the task set to be processed;
judging whether the number of tasks to be processed is larger than a first number threshold and whether the first waiting time is larger than a first time length threshold;
if the number of tasks to be processed is larger than the first number threshold or the first waiting time is larger than the first time length threshold, acquiring a batch of tasks to be processed from the task set to be processed according to the number of tasks to be processed and the first waiting time;
and performing task processing according to the batch of tasks to be processed.
According to at least one embodiment of the present disclosure, before the determining whether the number of tasks to be processed is greater than a first number threshold and whether the first waiting time is greater than a first time length threshold, the task processing method further includes:
acquiring a first priority of at least one task to be processed in the set of tasks to be processed;
the determining whether the number of tasks to be processed is greater than a first number threshold and whether the first waiting time is greater than a first time length threshold includes: judging, according to the first priority, whether the number of tasks to be processed is greater than the first number threshold and whether the first waiting time is greater than the first time length threshold.
According to a task processing method of at least one embodiment of the present disclosure, the determining, according to a first priority, whether the number of tasks to be processed is greater than a first number threshold and whether the first waiting time is greater than a first time length threshold includes:
updating the first time length threshold according to the first priority to obtain a second duration threshold;
and judging whether the number of the tasks to be processed is larger than a first number threshold and whether the first waiting time is larger than a second duration threshold.
The task processing method according to at least one embodiment of the present disclosure further includes:
acquiring performance data of the task processing;
and adjusting the first number threshold and/or the first time length threshold according to the performance data.
The task processing method according to at least one embodiment of the present disclosure further includes:
judging whether the first waiting time exceeds a third duration threshold;
the processing of the timeout task is terminated,
the overtime task is a task of which the first waiting time in the set of tasks to be processed exceeds a third time threshold.
According to a task processing method of at least one embodiment of the present disclosure, if the number of tasks to be processed is greater than the first number threshold or the first waiting time is greater than the first time threshold, obtaining a batch of tasks to be processed from the set of tasks to be processed according to the number of tasks to be processed and the first waiting time includes:
if the number of the tasks to be processed is larger than the first number threshold, acquiring the first number threshold of tasks from the task set to be processed as batch tasks to be processed;
and if the first waiting time is greater than the first time length threshold, taking all tasks to be processed currently in the task set to be processed as the batch of tasks to be processed.
According to a task processing method of at least one embodiment of the present disclosure, when the task set to be processed is managed by a first-in first-out (FIFO) queue, the obtaining of the number of tasks to be processed in the task set to be processed and the first waiting time of at least one task to be processed in the set includes:
acquiring, from the first-in first-out queue, the number of tasks to be processed in the set and the first waiting time of the head-of-queue task.
According to another aspect of the present disclosure, there is provided an electronic device including: a memory storing execution instructions; and a processor executing the execution instructions stored in the memory, so that the processor executes the task processing method according to any embodiment of the present disclosure.
According to yet another aspect of the present disclosure, there is provided a readable storage medium having stored therein execution instructions which, when executed by a processor, implement the task processing method of any one of the embodiments of the present disclosure.
According to yet another aspect of the present disclosure, there is provided a computer program product comprising a computer program/instruction which, when executed by a processor, implements the task processing method of any of the embodiments of the present disclosure.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the disclosure and together with the description serve to explain the principles of the disclosure.
Fig. 1 is a flowchart of a task processing method provided in embodiment 1 of the present disclosure.
Fig. 2 is a second flowchart of a task processing method provided in embodiment 1 of the present disclosure.
Fig. 3 is a flowchart of a method for determining whether the first condition is satisfied in the task processing method shown in fig. 2.
Fig. 4 is a third flowchart of a task processing method provided in embodiment 1 of the present disclosure.
Fig. 5 is a fourth flowchart of a task processing method provided in embodiment 1 of the present disclosure.
Fig. 6 is a schematic structural diagram of a task processing device according to embodiment 2 of the present disclosure.
Fig. 7 is a second schematic structural diagram of a task processing device according to embodiment 2 of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the drawings and the embodiments. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant content and not limiting of the present disclosure. It should be further noted that, for convenience of description, only a portion relevant to the present disclosure is shown in the drawings.
In addition, embodiments of the present disclosure and features of the embodiments may be combined with each other without conflict. The technical aspects of the present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Unless otherwise indicated, the exemplary implementations/embodiments shown are to be understood as providing exemplary features of various details of some ways in which the technical concepts of the present disclosure may be practiced. Thus, unless otherwise indicated, features of the various implementations/embodiments may be additionally combined, separated, interchanged, and/or rearranged without departing from the technical concepts of the present disclosure.
The use of cross-hatching and/or shading in the drawings is typically used to clarify the boundaries between adjacent components. As such, the presence or absence of cross-hatching or shading does not convey or represent any preference or requirement for a particular material, material property, dimension, proportion, commonality between illustrated components, and/or any other characteristic, attribute, property, etc. of a component, unless indicated. In addition, in the drawings, the size and relative sizes of elements may be exaggerated for clarity and/or descriptive purposes. While the exemplary embodiments may be variously implemented, the specific process sequences may be performed in a different order than that described. For example, two consecutively described processes may be performed substantially simultaneously or in reverse order from that described. Moreover, like reference numerals designate like parts.
When an element is referred to as being "on" or "over", "connected to" or "coupled to" another element, it can be directly on, connected or coupled to the other element or intervening elements may be present. However, when an element is referred to as being "directly on," "directly connected to," or "directly coupled to" another element, there are no intervening elements present. For this reason, the term "connected" may refer to physical connections, electrical connections, and the like, with or without intermediate components.
The terminology used herein is for the purpose of describing particular embodiments and is not intended to limit the scope of the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Furthermore, when the terms "comprises" and/or "comprising," and variations thereof, are used in the present specification, the presence of stated features, integers, steps, operations, elements, components, and/or groups thereof is described, but the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof is not precluded. It is also noted that, as used herein, the terms "substantially," "about," and other similar terms are used as approximation terms and not as degree terms, and as such, are used to explain the inherent deviations of measured, calculated, and/or provided values that would be recognized by one of ordinary skill in the art.
The task processing method of the present disclosure is applicable to the task processing device of the present disclosure, which can be configured on a server.
The task processing method of the present disclosure is described in detail below with reference to fig. 1 to 5.
Example 1:
fig. 1 is a flow chart of a task processing method of one embodiment of the present disclosure.
Referring to fig. 1, a task processing method M100 according to the present embodiment includes:
step S110, the number of the tasks to be processed in the task set to be processed and the first waiting time of at least one task to be processed in the task set to be processed are obtained.
Step S120, determining whether the number of tasks to be processed is greater than a first number threshold and whether the first waiting time is greater than a first time length threshold.
Step S130, if the number of tasks to be processed is greater than the first number threshold or the first waiting time is greater than the first time length threshold, obtaining a batch of tasks to be processed from the task set to be processed according to the number of tasks to be processed and the first waiting time.
And step S140, performing task processing according to the batch of tasks to be processed.
According to the task processing method, a batch of tasks to be processed is obtained from the task set to be processed according to the number of tasks to be processed and the first waiting time, and task processing is performed on that batch. Because both the number of tasks to be processed in the set and the first waiting time are used to decide whether to form a batch, the size of each batch can be adjusted dynamically. This reduces resource idling and overload, increases task processing speed, lowers latency, raises the utilization rate of hardware resources, improves throughput, reduces the overall cost of task processing, and improves user experience. It thereby solves the problem in the prior art that, with a fixed batch size, processing can only start once the required number of tasks has accumulated, otherwise the tasks can only wait while the hardware sits idle, wasting hardware resources and keeping their utilization rate low. Moreover, because the batch size is bounded by the number threshold, processing bottlenecks caused by an oversized single batch can be prevented, and fluctuations in task processing can be reduced.
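By way of illustration only, the following is a minimal Python sketch of the decision made in steps S110 to S130, assuming tasks are appended to a pending list in arrival order; the names PendingTask, should_form_batch, first_number_threshold and first_duration_threshold are labels chosen for this sketch rather than terms of the disclosure.

```python
import time
from dataclasses import dataclass, field
from typing import List


@dataclass
class PendingTask:
    payload: object
    enqueue_time: float = field(default_factory=time.monotonic)

    def waiting_time(self) -> float:
        # First waiting time: how long this task has been in the pending set.
        return time.monotonic() - self.enqueue_time


def should_form_batch(pending: List[PendingTask],
                      first_number_threshold: int,
                      first_duration_threshold: float) -> bool:
    """Steps S110-S120: form a batch when either threshold is exceeded."""
    if not pending:
        return False
    pending_count = len(pending)                    # number of tasks to be processed
    first_waiting_time = pending[0].waiting_time()  # earliest task, given arrival order
    return (pending_count > first_number_threshold
            or first_waiting_time > first_duration_threshold)
```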
In some embodiments of the present disclosure, step S110 may be performed after the previous batch of tasks has been processed; to increase the task processing speed, step S110 may also be performed periodically, with the period determined according to the nature of the tasks, the performance of the task processing facility, and the like.
In some embodiments of the present disclosure, the tasks in the task set to be processed in step S110 may be managed by parallel task management or the like; to improve task management efficiency, a queue may be used to manage them, where the queue may be an ordinary first-in first-out (FIFO) queue, a priority queue, and the like. The at least one task in step S110 may be a single task, such as the earliest task; several tasks, such as the earliest tasks or higher-priority tasks; or all tasks in the set.
In particular, when the task set to be processed is managed by a FIFO queue, the head-of-queue task is the task that entered the queue earliest among the current tasks to be processed, so it suffices to judge whether the waiting time of the head task is greater than the first time length threshold. In this case, step S110 of obtaining the number of tasks to be processed and the first waiting time may specifically be: acquiring, from the first-in first-out queue, the number of tasks to be processed in the set and the first waiting time of the head task.
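A possible reading of this FIFO variant, again as an illustrative sketch reusing the PendingTask class above; the queue_state helper is an assumption of this sketch, not part of the disclosure.

```python
from collections import deque


def queue_state(fifo: "deque[PendingTask]"):
    """FIFO case: the head task entered earliest, so its waiting time is the
    first waiting time used in the threshold check of step S120."""
    pending_count = len(fifo)                                      # tasks currently queued
    first_waiting_time = fifo[0].waiting_time() if fifo else 0.0   # head-of-queue task
    return pending_count, first_waiting_time


# Usage sketch:
# fifo = deque()
# fifo.append(PendingTask(payload="request-1"))
# count, wait = queue_state(fifo)
```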
In some embodiments of the present disclosure, the two judgments in step S120 may be performed simultaneously or sequentially. The first time length threshold may be a single threshold shared by all tasks, or a separate threshold for each task; in the latter case, as shown in fig. 2, before step S120 the task processing method provided by the present disclosure may further include:
step S150, a first priority of at least one task to be processed in the set of tasks to be processed is obtained.
In some embodiments of the present disclosure, the first priority of a task may be determined in step S150 according to the urgency, importance, required hardware resources, and the like of the task.
At this time, step S120 specifically determines, according to the first priority, whether the number of tasks to be processed is greater than the first number threshold and whether the first waiting time is greater than the first time length threshold.
In some embodiments of the present disclosure, as shown in fig. 3, determining in step S120, according to the first priority, whether the number of tasks to be processed is greater than the first number threshold and whether the first waiting time is greater than the first time length threshold may include: step S121, updating the first time length threshold according to the first priority to obtain a second duration threshold; step S122, determining whether the number of tasks to be processed is greater than the first number threshold and whether the first waiting time is greater than the second duration threshold. In step S121, the first time length threshold may be updated by, for example, shortening the threshold for high-priority tasks and lengthening it for low-priority tasks; the update may preset a duration threshold for each priority and modify the threshold according to that correspondence, or preset a ratio between the duration threshold and each priority and modify the threshold according to that ratio, as sketched below.
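The ratio-based variant mentioned above might look like the following sketch; the PRIORITY_FACTORS mapping and its values are purely hypothetical.

```python
# Hypothetical correspondence between priority level and a scaling factor for the
# first time length threshold; the disclosure leaves the exact mapping or ratio open.
PRIORITY_FACTORS = {0: 0.5, 1: 1.0, 2: 2.0}   # 0 = highest priority


def second_duration_threshold(first_duration_threshold: float, first_priority: int) -> float:
    """Step S121: shorten the threshold for high-priority tasks and lengthen it
    for low-priority tasks, here by a preset per-priority ratio."""
    return first_duration_threshold * PRIORITY_FACTORS.get(first_priority, 1.0)
```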
In some embodiments of the present disclosure, adjusting the duration threshold according to task priority ensures that critical tasks are executed first, improving overall task processing quality and user experience.
In some embodiments of the present disclosure, the batch of tasks to be processed may be obtained in step S130 as follows: if step S120 determines that the number of tasks to be processed is greater than the first number threshold, the first number threshold of tasks is taken from the task set to be processed as the batch; if step S120 determines that the first waiting time is greater than the first time length threshold, all tasks currently in the set are taken as the batch. When taking the first number threshold of tasks from the set, they may be selected in order of waiting time, so that tasks that have waited longer are processed first; a sketch of this selection follows.
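A sketch of this batch-formation step, under the same assumptions as the earlier sketches; the count_exceeded flag is assumed to carry the outcome of the step S120 judgment.

```python
def take_batch(pending, first_number_threshold, count_exceeded):
    """Step S130: form the batch of tasks to be processed.

    If the count threshold was exceeded, take the first_number_threshold
    longest-waiting tasks so that tasks that have waited longer go first;
    otherwise (the wait-time threshold was exceeded) take every pending task.
    """
    if count_exceeded:
        ordered = sorted(pending, key=lambda t: t.waiting_time(), reverse=True)
        batch = ordered[:first_number_threshold]
    else:
        batch = list(pending)
    for task in batch:          # remove the batched tasks from the pending set
        pending.remove(task)
    return batch
```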
In some embodiments of the present disclosure, if step S110 is performed after the previous batch has been processed, step S140 may process the batch as soon as it is obtained through step S130; if step S110 is performed periodically, step S140 may process the batch after the previous batch finishes. The tasks processed are the batch of tasks to be processed obtained in step S130; for convenience of processing, the batches generated through step S130 may be managed through a processing queue.
In some embodiments of the present disclosure, step S140 may perform task processing on the batch of tasks to be processed through a local task processing module; alternatively, step S140 may send the batch of tasks to be processed to a task processing device, so that the batch is processed by that device.
Further, in order to improve the utilization rate of hardware resources and prevent overload, as shown in fig. 4, the task processing method provided by the present disclosure may further include:
step S160, performance data of task processing is acquired.
In some embodiments of the present disclosure, step S160 may acquire performance data by monitoring the usage of hardware resources involved in task processing, such as CPU utilization, memory usage, disk I/O and network bandwidth, as well as the task processing duration (determined from the task request time and the task processing completion time), and the like.
Step S170, the first number threshold and/or the first time length threshold are adjusted according to the performance data.
In some embodiments of the present disclosure, when the performance data shows that performance is in surplus, step S170 may increase the first number threshold and/or decrease the first time length threshold; when the performance data shows that performance is insufficient, step S170 may decrease the first number threshold and/or increase the first time length threshold, as sketched below.
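One possible form of this adjustment, sketched with the third-party psutil package standing in for whatever resource monitor the deployment provides; the load bounds and step sizes are illustrative, not values taken from the disclosure.

```python
import psutil  # assumed third-party dependency; any resource monitor would do


def adjust_thresholds(first_number_threshold, first_duration_threshold,
                      low_load=40.0, high_load=90.0):
    """Steps S160-S170: enlarge batches when resources are in surplus and
    shrink them when resources are saturated."""
    load = max(psutil.cpu_percent(interval=None), psutil.virtual_memory().percent)
    if load < low_load:    # performance in surplus: bigger batches, shorter waits
        return first_number_threshold + 1, first_duration_threshold * 0.9
    if load > high_load:   # performance insufficient: smaller batches, longer waits
        return max(1, first_number_threshold - 1), first_duration_threshold * 1.1
    return first_number_threshold, first_duration_threshold
```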
In some embodiments of the present disclosure, performance data is obtained through step S160 and the first number threshold and/or the first time length threshold are adjusted through step S170, so that the thresholds adapt to the load on the hardware resources involved in task processing. This improves the utilization rate of hardware resources, prevents overload, makes task processing more flexible, and accommodates different scenarios and load fluctuations.
Further, as shown in fig. 5, the task processing method provided by the present disclosure may further include:
step S180, judging whether the first waiting time exceeds a third duration threshold.
In some embodiments of the present disclosure, the third duration threshold in step S180 is generally greater than the first time length threshold.
Step S190, the processing of the timeout task is terminated.
In some embodiments of the present disclosure, the timeout task in step S190 is a task in the set of tasks to be processed whose first waiting time exceeds the third duration threshold. To improve the user experience, after the processing of the timeout task is terminated in step S190, a termination notification may be generated and sent to an interested party, such as the user who submitted the timeout task, a system administrator, a third-party service administrator, or a logging system; a sketch follows.
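An illustrative sketch of the timeout-exit mechanism of steps S180 and S190; the notify callback is a placeholder for the termination notification described above.

```python
def terminate_timed_out(pending, third_duration_threshold, notify=print):
    """Steps S180-S190: terminate tasks whose waiting time exceeds the third
    duration threshold; notify stands in for whatever channel reaches the user,
    an administrator, or a logging system."""
    timed_out = [t for t in pending if t.waiting_time() > third_duration_threshold]
    for task in timed_out:
        pending.remove(task)
        notify(f"task terminated after waiting {task.waiting_time():.1f} s")
    return timed_out
```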
In some embodiments of the present disclosure, the timeout-exit mechanism prevents hardware resources from being occupied for a long time, ensures the robustness and efficiency of the task processing system, keeps task processing continuous even under extreme load, and improves resistance to abnormal conditions.
The task processing method provided by the present disclosure dynamically balances batch size and processing time: under low load, batches are formed based on waiting time, while under high load they are formed based on the number of tasks, and batching can further be adjusted according to task priority, improving response speed and processing efficiency under different loads.
Fig. 6-7 illustrate exemplary diagrams of task processing devices employing hardware implementations of processing systems.
Example 2:
the embodiment of the disclosure provides a task processing device.
The apparatus may include corresponding modules that perform the steps of the flowcharts described above. Thus, each step or several steps in the flowcharts described above may be performed by respective modules, and the apparatus may include one or more of these modules. A module may be one or more hardware modules specifically configured to perform the respective steps, or be implemented by a processor configured to perform the respective steps, or be stored within a computer-readable medium for implementation by a processor, or be implemented by some combination.
The hardware architecture may be implemented using a bus architecture. The bus architecture may include any number of interconnecting buses and bridges depending on the specific application of the hardware and the overall design constraints. Bus 1100 connects together various circuits including one or more processors 1200, memory 1300, and/or hardware modules. Bus 1100 may also connect various other circuits 1400, such as peripherals, voltage regulators, power management circuits, external antennas, and the like.
Bus 1100 may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. Buses may be divided into address buses, data buses, control buses, and so on. For ease of illustration, only one connection line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
Any process or method description in a flowchart or otherwise described herein may be understood as: a module, segment, or portion of code, which comprises one or more executable instructions for implementing the steps of a specified logical function(s) or process (es). The scope of the preferred embodiments of the present disclosure may include other implementations in which functions may be performed out of the order described, for example, in a substantially simultaneous manner or in an opposite order depending on the function involved, as would be understood by one of skill in the art. The processor may be used to perform the various methods and processes described above. For example, method embodiments in the present disclosure may be implemented as a software program stored on a computer readable storage medium, such as a memory. In some embodiments, part or all of the software program may be loaded and/or installed via memory and/or a communication interface. One or more of the steps of the methods described above may be performed when the software program is loaded and executed by a processor. Alternatively, in other embodiments, the processor may be configured to perform one of the methods described above in any other suitable manner (e.g., by means of firmware).
Logic and/or steps represented in the flowcharts or otherwise described herein may be embodied in any readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
For the purposes of this description, a "readable storage medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the readable storage medium include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). In addition, the readable storage medium could even be paper or another suitable medium on which the program is printed, since the program can be captured electronically, for example by optically scanning the paper or other medium, and then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and stored in a memory.
It should be understood that portions of the present disclosure may be implemented in hardware, software, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, may be implemented using any one or combination of the following techniques, as is well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application specific integrated circuits having suitable combinational logic gates, programmable Gate Arrays (PGAs), field Programmable Gate Arrays (FPGAs), and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps implementing the methods of the embodiments described above may be implemented by a program to instruct related hardware. The program may be stored in a readable storage medium. The program, when executed, includes one or a combination of steps for implementing the method.
Furthermore, each functional unit in each embodiment of the present disclosure may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product. The storage medium may be a read-only memory, a magnetic disk or optical disk, etc.
As shown in fig. 6, a task processing device according to the present disclosure may include:
the condition acquisition module 1010 is configured to acquire a number of tasks to be processed in the task set to be processed and a first waiting time of at least one task to be processed in the task set to be processed.
The condition determining module 1020 is configured to determine whether the number of tasks to be processed is greater than a first number threshold and whether the first waiting time is greater than a first time length threshold.
The batch obtaining module 1030 is configured to obtain batch of tasks to be processed from the set of tasks to be processed according to the number of tasks to be processed and the first waiting time if the number of tasks to be processed is greater than the first number threshold or the first waiting time is greater than the first time threshold.
And the batch processing module 1040 is used for processing tasks according to batch tasks to be processed.
In some embodiments of the present disclosure, the batch processing module 1040 may be configured to perform task processing on the batch of tasks to be processed locally, or to send the batch to a task processing device so that the device performs the task processing.
Further, the task processing device provided by the present disclosure may further include:
the priority obtaining module 1050 is configured to obtain a first priority of at least one task to be processed in the set of tasks to be processed.
At this time, the condition determining module 1020 is specifically configured to determine, according to the first priority, whether the number of tasks to be processed is greater than a first number threshold and whether the first waiting time is greater than a first time length threshold.
The condition determination module 1020 may include:
and the updating submodule 1021 is used for updating the first time length threshold according to the first priority to obtain a second time length threshold.
The judging submodule 1022 is configured to judge whether the number of tasks to be processed is greater than a first number threshold and whether the first waiting time is greater than a second duration threshold.
Further, the task processing device provided by the present disclosure may further include:
and the performance acquisition module 1060 is configured to acquire performance data of task processing.
The threshold adjustment module 1070 is configured to adjust the first number threshold and/or the first time length threshold according to the performance data.
Further, the task processing device provided by the present disclosure may further include:
a time determination module 1080, configured to determine whether the first waiting time exceeds a third duration threshold.
A timeout exit module 1090 for terminating processing of timeout tasks.
In some embodiments of the present disclosure, the timeout task in the timeout exit module 1090 is a task in the set of pending tasks for which the first waiting time exceeds the third duration threshold.
In some embodiments of the present disclosure, the task processing procedure implemented by the above modules/sub-modules is similar to that provided in embodiment 1 of the present disclosure and will not be described in detail here. Fig. 7 shows the structure of a task processing device that includes all of the above modules/sub-modules; when the device includes only some of them, the structure is similar to that shown in fig. 7 and is likewise not described in detail here.
According to the task processing device, a batch of tasks to be processed is obtained from the task set to be processed according to the number of tasks to be processed and the first waiting time, and task processing is performed on that batch. Because both the number of tasks to be processed and the first waiting time are used to decide whether to form a batch, the batch size can be adjusted dynamically, which reduces resource idling and overload, increases processing speed, lowers latency, raises hardware utilization, improves throughput, reduces the overall cost of task processing, and improves user experience. The device thereby solves the prior-art problem that, with a fixed batch size, processing can only start once the required number of tasks has accumulated, otherwise the tasks can only wait while the hardware sits idle, wasting hardware resources and keeping their utilization rate low. Moreover, because the batch size is bounded by the number threshold, processing bottlenecks caused by an oversized single batch can be prevented, and fluctuations in task processing can be reduced.
The present disclosure also provides an electronic device, including: a memory storing execution instructions; and a processor or other hardware module that executes the execution instructions stored in the memory, such that the processor or other hardware module performs the task processing method described above.
The disclosure also provides a readable storage medium, in which execution instructions are stored, the execution instructions being used to implement the task processing method when executed by a processor.
The present disclosure also provides a computer program product comprising a computer program/instruction which, when executed by a processor, implements the task processing method described above.
In the description of this specification, reference to the terms "one embodiment/example," "some embodiments/examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. In this specification, schematic descriptions using these terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, the various embodiments or examples described in this specification, and the features of those embodiments or examples, may be combined by persons skilled in the art provided they do not contradict each other.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present application, the meaning of "plurality" is at least two, such as two, three, etc., unless explicitly defined otherwise.
It will be appreciated by those skilled in the art that the above-described embodiments are merely for clarity of illustration of the disclosure, and are not intended to limit the scope of the disclosure. Other variations or modifications will be apparent to persons skilled in the art from the foregoing disclosure, and such variations or modifications are intended to be within the scope of the present disclosure.

Claims (10)

1. A method of task processing, comprising:
acquiring the number of tasks to be processed of a task set to be processed and the first waiting time of at least one task to be processed in the task set to be processed;
judging whether the number of tasks to be processed is larger than a first number threshold and whether the first waiting time is larger than a first time length threshold;
if the number of the tasks to be processed is larger than the first number threshold or the first waiting time is larger than the first time length threshold, acquiring batch tasks to be processed from the task set to be processed according to the number of the tasks to be processed and the first waiting time; and
and performing task processing according to the batch of tasks to be processed.
2. The task processing method according to claim 1, characterized in that,
before the step of judging whether the number of tasks to be processed is greater than a first number threshold and whether the first waiting time is greater than a first time length threshold, the method further comprises:
acquiring a first priority of at least one task to be processed in the set of tasks to be processed;
the determining whether the number of tasks to be processed is greater than a first number threshold and whether the first waiting time is greater than a first time length threshold includes: judging, according to the first priority, whether the number of tasks to be processed is greater than the first number threshold and whether the first waiting time is greater than the first time length threshold.
3. The task processing method according to claim 2, wherein the determining whether the number of tasks to be processed is greater than a first number threshold and whether the first waiting time is greater than a first time length threshold according to a first priority includes:
updating the first time length threshold according to the first priority to obtain a second duration threshold; and
and judging whether the number of the tasks to be processed is larger than a first number threshold and whether the first waiting time is larger than a second duration threshold.
4. A task processing method according to any one of claims 1 to 3, further comprising:
acquiring performance data of the task processing; and
and adjusting the first number threshold and/or the first time length threshold according to the performance data.
5. A task processing method according to any one of claims 1 to 3, further comprising:
judging whether the first waiting time exceeds a third duration threshold; and
terminating the processing of a timeout task,
the timeout task being a task in the set of tasks to be processed whose first waiting time exceeds the third duration threshold.
6. A task processing method according to any one of claims 1 to 3, wherein if the number of tasks to be processed is greater than the first number threshold or the first waiting time is greater than the first time threshold, acquiring a batch of tasks to be processed from the set of tasks to be processed according to the number of tasks to be processed and the first waiting time includes:
if the number of the tasks to be processed is larger than the first number threshold, acquiring the first number threshold of tasks from the task set to be processed as batch tasks to be processed; and
and if the first waiting time is greater than the first time length threshold, taking all tasks to be processed currently in the task set to be processed as the batch of tasks to be processed.
7. A task processing method according to any one of claims 1 to 3, wherein when the task set to be processed is managed by a fifo queue, the obtaining the number of tasks to be processed of the task set to be processed and the first waiting time of at least one task to be processed of the task set to be processed includes:
and acquiring the number of the tasks to be processed of the task set to be processed and the first waiting time of the head task according to the first-in first-out queue.
8. An electronic device, comprising:
a memory storing execution instructions; and
a processor executing the execution instructions stored in the memory, implementing the task processing method according to any one of claims 1 to 7.
9. A readable storage medium having stored therein execution instructions which, when executed by a processor, implement the task processing method according to any one of claims 1 to 7.
10. A computer program product comprising computer programs/instructions which, when executed by a processor, implement the task processing method of any one of claims 1 to 7.
CN202410139038.0A (priority date 2024-01-31, filing date 2024-01-31): Task processing method, electronic device, storage medium, and program product; published as CN117873728A; status: Pending

Priority Applications (1)

Application number: CN202410139038.0A; publication: CN117873728A; priority date: 2024-01-31; filing date: 2024-01-31; title: Task processing method, electronic device, storage medium, and program product


Publications (1)

Publication number: CN117873728A; publication date: 2024-04-12

Family

ID=90581117

Family Applications (1)

Application number: CN202410139038.0A; status: Pending; publication: CN117873728A; priority date: 2024-01-31; filing date: 2024-01-31; title: Task processing method, electronic device, storage medium, and program product

Country Status (1)

Country: CN; publication: CN117873728A


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination