CN113159602A - Task distribution method, device, equipment and readable storage medium - Google Patents


Info

Publication number
CN113159602A
Authority
CN
China
Prior art keywords
task
preset
queue
tasks
dispatched
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110482209.6A
Other languages
Chinese (zh)
Inventor
万明霞
李敬文
宋雨
Current Assignee
Bank of China Ltd
Original Assignee
Bank of China Ltd
Priority date
Filing date
Publication date
Application filed by Bank of China Ltd
Priority to CN202110482209.6A
Publication of CN113159602A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06Q — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 — Administration; Management
    • G06Q10/06 — Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 — Operations research, analysis or management
    • G06Q10/0631 — Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06316 — Sequencing of tasks or work
    • G06Q40/00 — Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/02 — Banking, e.g. interest calculation or account maintenance


Abstract

The embodiment of the application provides a task dispatching method, device, equipment and readable storage medium. In response to receiving a task to be dispatched, the task to be dispatched is stored into a first preset queue according to its ordinal in the first preset queue. In response to reaching a preset buffering opportunity, a candidate task set is obtained from the first preset queue, the candidate task set comprising N candidate tasks, where N is a preset value greater than 1, and the N candidate tasks are stored into a second preset queue. In response to reaching the preset dispatching opportunity of a target task, the target task is dispatched to a preset executive party of the target task. Because the dispatching of tasks and the ordered storage of tasks are executed on two separate queues, the sorting of tasks and the dispatching of tasks can be executed concurrently; that is, a task can be dispatched without waiting for tasks to be dispatched to finish being stored in order into the first preset queue, which improves the dispatching efficiency of tasks.

Description

Task distribution method, device, equipment and readable storage medium
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method, an apparatus, a device, and a readable storage medium for dispatching a task.
Background
At present, a bank background system uses a task scheduling platform to implement task scheduling. The scheduling process includes: sorting the received tasks and storing them into a dispatching task queue, and, when the dispatching time is reached, reading tasks to be dispatched out of the dispatching task queue for dispatching. Since storing into and reading from the same queue cannot be executed in parallel, if the dispatching time of a task is reached while tasks are being sorted, the task cannot be dispatched until the sorting process is completed; tasks are therefore easily blocked, and the dispatching efficiency of tasks is low.
Disclosure of Invention
The application provides a task dispatching method, device, equipment and readable storage medium, aiming to improve task dispatching efficiency, as follows:
a task dispatching method comprises the following steps:
in response to receiving a task to be dispatched, storing the task to be dispatched into a first preset queue according to the sequence of the task to be dispatched in the first preset queue;
in response to reaching a preset buffering opportunity, acquiring a candidate task set from the first preset queue, wherein the candidate task set comprises N candidate tasks, and N is a preset numerical value larger than 1;
storing the N candidate tasks into a second preset queue;
and in response to reaching a preset dispatching opportunity of the target task, dispatching the target task to a preset executive party of the target task, wherein the target task is a task in the second preset queue.
Optionally, in response to receiving a task to be dispatched, storing the task to be dispatched into a first preset queue according to an ordinal of the task to be dispatched in the first preset queue, where the method includes:
in response to receiving the task to be dispatched, acquiring a target sequence according to a preset priority of the task to be dispatched, wherein the target sequence is the sequence of the task to be dispatched in the first preset queue, and the higher the preset priority is, the smaller the target sequence is;
and storing the tasks to be dispatched into the first preset queue according to the target sequence.
Optionally, the N candidate tasks are the N tasks with the smallest ordinals (i.e., the first N positions) in the first preset queue;
the storing the N candidate tasks into a second preset queue includes:
and storing the candidate tasks into the second preset queue one by one from small to large according to the ordinal of the candidate tasks in the first preset queue.
Optionally, after storing the N candidate tasks into a second preset queue, the method further includes:
and deleting the candidate task from the first preset queue, and updating the sequence of the task in the first preset queue.
Optionally, the preset buffering opportunity includes:
at least one of: the number of tasks in the second preset queue being lower than a first preset threshold; the number of tasks in the first preset queue exceeding a second preset threshold; and the buffering time indicated by a preset buffering period being reached.
Optionally, after the dispatching the target task to the preset executor of the target task in response to reaching the preset dispatching opportunity of the target task, the method further includes:
and deleting the target task from the second preset queue, and updating the sequence of the task in the second preset queue.
A task dispatching apparatus comprising:
the task receiving unit is used for responding to the received task to be dispatched and storing the task to be dispatched into a first preset queue according to the sequence of the task to be dispatched in the first preset queue;
the candidate task acquisition unit is used for acquiring a candidate task set from the first preset queue in response to reaching a preset buffering opportunity, wherein the candidate task set comprises N candidate tasks, and N is a preset value larger than 1;
the candidate task storage unit is used for storing the N candidate tasks into a second preset queue;
and the task dispatching unit is used for responding to a preset dispatching opportunity of a target task, dispatching the target task to a preset executive party of the target task, wherein the target task is a task in the second preset queue.
Optionally, the task receiving unit, configured to store, in response to receiving a task to be dispatched, the task to be dispatched into a first preset queue according to an order of the task to be dispatched in the first preset queue, includes: the task receiving unit is specifically configured to:
in response to receiving the task to be dispatched, acquiring a target sequence according to a preset priority of the task to be dispatched, wherein the target sequence is the sequence of the task to be dispatched in the first preset queue, and the higher the preset priority is, the smaller the target sequence is;
and storing the tasks to be dispatched into the first preset queue according to the target sequence.
A task dispatching device, comprising: a memory and a processor;
the memory is used for storing programs;
the processor is used for executing the program and realizing each step of the task dispatching method.
A readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method for dispatching tasks.
According to the technical scheme, the task dispatching method, device, equipment and readable storage medium provided by the embodiment of the application, in response to receiving a task to be dispatched, store the task to be dispatched into a first preset queue according to its ordinal in the first preset queue; in response to reaching a preset buffering opportunity, obtain a candidate task set from the first preset queue, the candidate task set comprising N candidate tasks, where N is a preset value greater than 1; store the N candidate tasks into a second preset queue; and, in response to reaching the preset dispatching opportunity of a target task, dispatch the target task to a preset executive party of the target task. Because the dispatching of tasks and the ordered storage of tasks are executed on two separate queues, the sorting of tasks and the dispatching of tasks can be executed concurrently; that is, a task can be dispatched without waiting for tasks to be dispatched to finish being stored in order into the first preset queue, which improves the dispatching efficiency of tasks.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic flowchart of a specific implementation of a task dispatching method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a task dispatching process applied to a task scheduling platform according to an embodiment of the present application;
fig. 3 is a flowchart illustrating a task dispatching method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a task dispatching device according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a task distribution device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the field of computer science, sorting algorithms for tasks are time-consuming, with a time complexity generally on the order of O(n^2). Moreover, insertion and deletion operations on the same queue cannot be performed in parallel; that is, the operation of inserting a task and the operation of dispatching a task cannot be performed simultaneously. In the prior art, a single queue is used for both sorting and dispatching, so if the dispatching opportunity of a task is reached while tasks are being sorted, the task can be dispatched only after the sorting process is completed; the task is therefore blocked, and the dispatching efficiency of tasks is low. In addition, because the complexity of sorting and dispatching is high, using a single queue for both task dispatching and task sorting has a large impact on system performance. For example, when the task scheduling platform receives tasks concurrently at 50 TPS and the processing (sorting or dispatching) speed is 5 minutes per task, the number of tasks in the queue can reach tens of thousands; frequent task insertion and task sorting are time-consuming, and if task dispatching is interleaved with them, the dispatching efficiency and code performance are extremely low.
The task dispatching method provided in this embodiment is applied to, but not limited to, a task scheduling platform, where the task scheduling platform is configured in a banking system in advance, and the banking system runs on a client or a server.
Fig. 1 illustrates a specific implementation process of a task distribution method, where the method specifically includes: s101 to S108.
S101, a task receiving queue and a task dispatching queue are built.
In this embodiment, this step is only executed in an initial state, the constructed task receiving queue and the constructed task dispatching queue are both empty queues, the data scale of the task dispatching queue and the data scale of the task receiving queue are preset according to an actual application scenario, and the data scale of the task dispatching queue is much smaller than the data scale of the task receiving queue.
S102, in response to the received task to be dispatched, determining the sequence of the task to be dispatched in the task receiving queue according to the preset priority of the task to be dispatched, and recording the sequence as a target sequence.
In this embodiment, the process of determining the ordinal of the task to be dispatched in the task receiving queue includes: the process of sorting the tasks to be dispatched and the stored tasks in the task receiving queue according to the priority, where the priority is preset according to the preset dispatching time and/or the preset importance degree of each task (the tasks to be dispatched and the stored tasks), which may be specifically referred to in the prior art.
It should be noted that, if the task receiving queue is an empty queue (for example, in an initial state), after receiving the task to be dispatched, the task to be dispatched may be directly inserted into the head of the task receiving queue (that is, the ordinal is 1).
It should be noted that the specific sorting process can be referred to in the prior art.
S103, inserting the tasks to be dispatched into the task receiving queue according to the target sequence, and updating the sequence of the tasks in the task receiving queue.
In this embodiment, the task receiving queue is an empty queue in the initial state, and each time a task to be dispatched is received, the ordinals in the task receiving queue are updated once. Specifically, after the task to be dispatched is inserted into the task receiving queue at its target ordinal, the ordinal of every task after it is increased by one.
It should be noted that the higher the preset priority of a task, the smaller its ordinal in the queue, i.e., the closer it is to the head of the queue.
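The priority-ordered insertion of S102-S103 (higher priority ⇒ smaller ordinal) can be sketched as follows. This is a minimal illustration under assumed names, not the patent's implementation; the queue is represented as a Python list of (priority, task) pairs kept sorted with the head at index 0:

```python
import bisect

def insert_by_priority(queue, task, priority):
    """Insert (priority, task) so that higher priority sits nearer the
    queue head (smaller ordinal).  `queue` is a list sorted by descending
    priority; index 0 corresponds to ordinal 1."""
    # bisect works on ascending keys, so negate priorities to keep
    # higher-priority tasks at smaller ordinals.  Rebuilding the key
    # list is O(n) and only for clarity of the sketch.
    keys = [-p for p, _ in queue]
    pos = bisect.bisect_right(keys, -priority)
    queue.insert(pos, (priority, task))
    return pos + 1  # 1-based ordinal of the newly inserted task
```

Using `bisect_right` keeps tasks of equal priority in arrival order, which matches a stable sort; every task already behind the insertion point implicitly has its ordinal increased by one.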
And S104, in response to reaching the preset buffering opportunity, reading the first N tasks (by ordinal) in the task receiving queue as candidate tasks.
In this embodiment, N is a preset value, and the preset value is greater than 1 and smaller than the capacity of the task receiving queue (denoted as the preset capacity). Presetting a buffering opportunity comprises: the number of the tasks in the task dispatching queue is lower than a first preset threshold value.
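The preset buffering opportunity described here, together with the optional conditions mentioned later in the description, can be expressed as a single predicate. The parameter names and the time representation are illustrative assumptions:

```python
def buffering_opportunity(dispatch_count, receive_count, now,
                          first_threshold, second_threshold,
                          next_buffer_time):
    """True when any preset buffering condition holds:
    - the task dispatching queue holds fewer tasks than the first threshold,
    - the task receiving queue holds more tasks than the second threshold, or
    - the time indicated by the preset buffering period has arrived."""
    return (dispatch_count < first_threshold
            or receive_count > second_threshold
            or now >= next_buffer_time)
```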
And S105, sending the candidate tasks to the task dispatching queue one by one, in order of their ordinals in the task receiving queue from small to large.
And S106, deleting the candidate tasks from the task receiving queue, and updating the ordinals of the tasks in the task receiving queue.
In this embodiment, updating the task receiving queue includes: subtracting N from the ordinal of each task whose ordinal was after the Nth position.
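Steps S104-S106 amount to moving the first N tasks from the head of one ordered queue to the tail of another; with a deque, the ordinals of the remaining tasks shift down by N automatically when the head elements are removed. A sketch (names are illustrative, not from the patent):

```python
from collections import deque

def buffer_candidates(receive_q, dispatch_q, n):
    """Move up to n head tasks from the task receiving queue to the task
    dispatching queue.  Both queues are deques ordered head-first, so
    appending preserves the priority order established in the receive
    queue, and popleft() deletes the candidate from the receive queue."""
    moved = 0
    while receive_q and moved < n:
        dispatch_q.append(receive_q.popleft())
        moved += 1
    return moved
```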
And S107, responding to the preset dispatching opportunity of the target task, and dispatching the target task to a preset executive party of the target task.
In this embodiment, the target task is a task with the first ordinal position in the task dispatch queue.
And S108, deleting the target task from the task dispatching queue, and updating the sequence of the task in the task dispatching queue.
In this embodiment, updating the task dispatch queue includes: and subtracting 1 from the ordinal of the task with the ordinal being behind the target task.
In fig. 1, S102 to S103 are processes of receiving a new task to be dispatched, sorting the task to be dispatched (referred to as a sorting process for short), and updating the task receiving queue, S104 to S106 are processes of reading a candidate task from the task receiving queue to the task dispatching queue (referred to as a buffering process for short), and S107 to S108 are processes of dispatching the task to a preset executive (referred to as a dispatching process for short). The method does not limit the execution sequence of S107 to S108 and S102 to S103, i.e., the sorting process and the dispatching process can be executed in parallel.
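The parallel arrangement summarized above (sorting into one queue while dispatching from another) can be sketched with a lock-protected receive structure and a thread-safe dispatch queue. This is a minimal single-file illustration under assumed names, not the platform's code; a priority heap replaces the explicit ordinal bookkeeping of S102-S103:

```python
import heapq, itertools, queue, threading

class TwoQueueDispatcher:
    """Sketch of the two-queue scheme: a heap plays the task receiving
    queue, a thread-safe FIFO plays the task dispatching queue.  Receiving
    and dispatching touch different structures, so they can run in
    parallel threads."""

    def __init__(self, n=4, low_water=2):
        self._receive = []                  # heap of (-priority, seq, task)
        self._receive_lock = threading.Lock()
        self._dispatch = queue.Queue()      # thread-safe dispatch queue
        self._seq = itertools.count()       # tie-breaker: FIFO among equals
        self._n = n                         # batch size N
        self._low_water = low_water         # first preset threshold

    def receive(self, task, priority):
        """S102-S103: insert into the receive queue by priority."""
        with self._receive_lock:
            heapq.heappush(self._receive, (-priority, next(self._seq), task))

    def buffer_if_needed(self):
        """S104-S106: when the dispatch queue runs low, move N candidates
        in priority order.  (qsize() is approximate under contention.)"""
        if self._dispatch.qsize() >= self._low_water:
            return 0
        moved = 0
        with self._receive_lock:
            while self._receive and moved < self._n:
                self._dispatch.put(heapq.heappop(self._receive)[2])
                moved += 1
        return moved

    def dispatch(self, executor):
        """S107-S108: pop the head target task and hand it to the executor;
        raises queue.Empty when no target task is buffered."""
        task = self._dispatch.get_nowait()
        executor(task)
        return task
```

Because `receive` only takes the receive-queue lock and `dispatch` only touches the dispatch queue, the sorting process and the dispatching process never block each other, which is the point of the design.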
According to the technical scheme, the task dispatching method provided by the embodiment of the application has at least the following beneficial effects:
1. The received tasks to be dispatched are first inserted into the task receiving queue, and the dispatched target tasks are tasks in the task dispatching queue, so the task sorting process and the task dispatching process can be executed in parallel, and tasks are dispatched without waiting for task sorting.
2. The complexity of reading N (N > 1) tasks from the task receiving queue into the task dispatching queue at one time is lower than that of dispatching the N tasks to the executive party one by one, further improving task dispatching efficiency.
3. The ordinals of the N tasks read from the task receiving queue are the first N ordinals in the task receiving queue, and the N tasks are inserted into the task dispatching queue in order of those ordinals from small to large, so the scheme ensures that the tasks in both queues are ordered by priority, which improves the accuracy of task dispatching.
Fig. 2 is a schematic diagram of a task dispatching process applied to a task scheduling platform according to an embodiment of the present application. As shown in FIG. 2, the task scheduling platform includes a task receiving module and a task dispatching module.
The receiving module 201: the system is used for receiving tasks sent by each upstream system, calculating the priority of the tasks according to the sorting factors (including the urgency degree of the tasks, the task processing time and the grade of a client), and sorting the tasks in the task receiving queue according to the priority.
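A weighted score over the three sorting factors named here could look like the following; the weights and scales are assumptions for illustration only, not values from the patent:

```python
def task_priority(urgency, waiting_minutes, customer_grade,
                  weights=(0.5, 0.3, 0.2)):
    """Combine the sorting factors (task urgency, task processing/waiting
    time, customer grade) into one priority score; a higher score means
    an ordinal closer to the head of the task receiving queue.  The
    weight values here are hypothetical."""
    w_u, w_t, w_g = weights
    return w_u * urgency + w_t * waiting_minutes + w_g * customer_grade
```

Any monotone combination works with the queue logic above, since only the relative order of scores matters for the ordinals.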
The dispatching module 202: used for sending tasks in the task dispatching queue to an executive party (generally an account logged in by a customer service person), and, when the number of tasks in the task dispatching queue is lower than the first preset threshold, obtaining a plurality of (N) tasks in batch from the task receiving queue.
It should be noted that, the implementation process of the specific functions of the receiving module and the dispatching module can refer to the flow shown in fig. 1.
In summary, the scheme uses two queues (namely the task receiving queue and the task dispatching queue) to solve the low-performance problem caused by frequently receiving data and dispatching data in task scheduling. Specifically, the task receiving queue is dedicated to receiving tasks and storing them in order, while the task dispatching queue is dedicated to dispatching tasks. Generally, when the number of tasks in the task dispatching queue is smaller than the first preset threshold, a plurality of tasks are obtained in batch from the task receiving queue.
It should be noted that fig. 1 only illustrates one specific process of task dispatching, and the present application also includes other specific implementation processes. For example, the preset buffering opportunity further includes: the number of tasks in the task receiving queue exceeding a second preset threshold; and/or the buffering time indicated by a preset buffering period being reached. For another example, S102-S103 is an optional implementation process for receiving tasks to be dispatched and storing them into the task receiving queue in order.
In summary, the task dispatching method provided by the present application is summarized as a flow shown in fig. 3, and as shown in fig. 3, the method includes:
s301, in response to receiving the task to be dispatched, storing the task to be dispatched into a first preset queue according to the sequence position of the task to be dispatched in the first preset queue.
In this embodiment, the ordinal of the task to be dispatched in the first preset queue indicates the position of the task to be dispatched in the first preset queue, and in any queue, the smaller the ordinal of the task to be dispatched is, the closer the position of the task to be dispatched in the queue is to the head of the queue.
It should be noted that, for an optional specific implementation of storing the task to be dispatched into the first preset queue according to its ordinal in the first preset queue, reference may be made to the foregoing embodiment.
S302, in response to the preset buffering time, acquiring a candidate task set from a first preset queue.
In this embodiment, the candidate task set includes N candidate tasks, where N is a preset value greater than 1. The first preset queue comprises a plurality of tasks to be dispatched after sequencing.
Optionally, the first preset queue is the task receiving queue in the above embodiment, and the N candidate tasks are the N tasks with the smallest ordinals in the first preset queue; for the specific process by which the task receiving queue receives tasks to be dispatched and sorts them, reference may be made to the above embodiment.
S303, storing the N candidate tasks into a second preset queue.
In this embodiment, the second preset queue is the task dispatch queue in the above embodiment, and the method for storing the N candidate tasks into the second preset queue includes multiple kinds, and optionally, the candidate tasks are stored into the second preset queue one by one from small to large according to the sequence of the candidate tasks in the first preset queue. For details, reference may be made to the above embodiments, which are not described herein again.
S304, in response to the preset dispatching opportunity of the target task is reached, the target task is dispatched to a preset executive party of the target task.
In this embodiment, the target task is a task in the second preset queue, optionally, the target task is a task with a first ordinal position in the second preset queue, and a specific task dispatching method may refer to the prior art.
It can be seen from the foregoing technical solutions that the task dispatching method provided in this embodiment, in response to receiving a task to be dispatched, stores the task to be dispatched into a first preset queue according to its ordinal in the first preset queue; in response to reaching a preset buffering opportunity, obtains a candidate task set from the first preset queue, where the candidate task set includes N candidate tasks and N is a preset value greater than 1; stores the N candidate tasks into a second preset queue; and, in response to reaching the preset dispatching opportunity of the target task, dispatches the target task to a preset executive party of the target task. Because the dispatching of tasks and the ordered storage of tasks are executed on two separate queues, the sorting of tasks and the dispatching of tasks can be executed concurrently; that is, a task can be dispatched without waiting for tasks to be dispatched to finish being stored in order into the first preset queue, which improves the dispatching efficiency of tasks.
Fig. 4 is a schematic structural diagram illustrating a task dispatching device according to an embodiment of the present application, where as shown in fig. 4, the task dispatching device may include:
the task receiving unit 401 is configured to, in response to receiving a task to be dispatched, store the task to be dispatched in a first preset queue according to an order of the task to be dispatched in the first preset queue;
a candidate task obtaining unit 402, configured to, in response to reaching a preset buffering opportunity, obtain a candidate task set from the first preset queue, where the candidate task set includes N candidate tasks, and N is a preset value greater than 1;
a candidate task storing unit 403, configured to store the N candidate tasks into a second preset queue;
and a task dispatching unit 404, configured to, in response to reaching a preset dispatching opportunity of a target task, dispatch the target task to a preset executive party of the target task, where the target task is a task in the second preset queue.
Optionally, the task receiving unit, configured to store, in response to receiving a task to be dispatched, the task to be dispatched into a first preset queue according to an order of the task to be dispatched in the first preset queue, includes: the task receiving unit is specifically configured to:
in response to receiving the task to be dispatched, acquiring a target sequence according to a preset priority of the task to be dispatched, wherein the target sequence is the sequence of the task to be dispatched in the first preset queue, and the higher the preset priority is, the smaller the target sequence is;
and storing the tasks to be dispatched into the first preset queue according to the target sequence.
Optionally, the N candidate tasks are the N tasks with the smallest ordinals (i.e., the first N positions) in the first preset queue;
the candidate task storage unit is used for storing the N candidate tasks into a second preset queue, and comprises: the candidate task storage unit is specifically configured to:
and storing the candidate tasks into the second preset queue one by one from small to large according to the ordinal of the candidate tasks in the first preset queue.
Optionally, the apparatus further comprises: and the first queue updating unit is used for deleting the candidate tasks from the first preset queue and updating the sequence of the tasks in the first preset queue after the N candidate tasks are stored in the second preset queue.
Optionally, the preset buffering opportunity includes at least one of: the number of tasks in the second preset queue being lower than a first preset threshold; the number of tasks in the first preset queue exceeding a second preset threshold; and the buffering time indicated by a preset buffering period being reached.
Optionally, the apparatus further comprises: a second queue updating unit, configured to delete the target task from the second preset queue and update the ordinals of the tasks in the second preset queue after the target task is dispatched to the preset executive party of the target task in response to reaching the preset dispatching opportunity of the target task.
Fig. 5 shows a schematic structural diagram of a task dispatching device, which may include: at least one processor 501, at least one communication interface 502, at least one memory 503, and at least one communication bus 504;
in the embodiment of the present application, the number of the processor 501, the communication interface 502, the memory 503 and the communication bus 504 is at least one, and the processor 501, the communication interface 502 and the memory 503 complete the communication with each other through the communication bus 504;
the processor 501 may be a central processing unit CPU, or an application Specific Integrated circuit asic, or one or more Integrated circuits configured to implement embodiments of the present invention, etc.;
the memory 503 may include a high-speed RAM memory, and may further include a non-volatile memory (non-volatile memory) or the like, such as at least one disk memory;
the memory stores a program, and the processor can execute the program stored in the memory to implement the steps of the task dispatching method provided by the embodiment of the application, as follows:
a task dispatching method comprises the following steps:
in response to receiving a task to be dispatched, storing the task to be dispatched into a first preset queue according to the sequence of the task to be dispatched in the first preset queue;
in response to reaching a preset buffering opportunity, acquiring a candidate task set from the first preset queue, wherein the candidate task set comprises N candidate tasks, and N is a preset numerical value larger than 1;
storing the N candidate tasks into a second preset queue;
and in response to reaching a preset dispatching opportunity of the target task, dispatching the target task to a preset executive party of the target task, wherein the target task is a task in the second preset queue.
Optionally, storing the task to be dispatched into a first preset queue according to the ordinal position of the task to be dispatched in the first preset queue, in response to receiving the task to be dispatched, comprises:
in response to receiving the task to be dispatched, determining a target ordinal position according to a preset priority of the task to be dispatched, wherein the target ordinal position is the ordinal position of the task to be dispatched in the first preset queue, and a higher preset priority corresponds to a smaller target ordinal position; and
storing the task to be dispatched into the first preset queue at the target ordinal position.
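The priority-to-ordinal mapping just described can be sketched with a sorted insert; this is a minimal illustration, and `store_by_priority` and the example task names are assumptions, not part of the patent:

```python
import bisect

# First preset queue as a list ordered by descending preset priority:
# a higher priority yields a smaller (earlier) ordinal position.
first_queue = []  # list of (priority, task), highest priority at index 0

def store_by_priority(task, priority):
    # Bisect on negated priorities keeps the list sorted descending by
    # priority while preserving arrival order among equal priorities.
    keys = [-p for p, _ in first_queue]
    idx = bisect.bisect_right(keys, -priority)
    first_queue.insert(idx, (priority, task))
    return idx  # the target ordinal position (0-based)

store_by_priority("routine report", 1)
store_by_priority("urgent transfer", 5)
print([t for _, t in first_queue])  # ['urgent transfer', 'routine report']
```

Using `bisect_right` means a newly received task with the same priority as an existing one lands after it, so equal-priority tasks still leave the queue in arrival order.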
Optionally, the N candidate tasks are the first N tasks by ordinal position in the first preset queue;
storing the N candidate tasks into a second preset queue comprises:
storing the candidate tasks into the second preset queue one by one, in ascending order of their ordinal positions in the first preset queue.
Optionally, after storing the N candidate tasks into the second preset queue, the method further comprises:
deleting the candidate tasks from the first preset queue and updating the ordinal positions of the remaining tasks in the first preset queue.
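The transfer of the first N tasks and their subsequent deletion from the first queue can be sketched together; `buffer_top_n` and the sample task labels are hypothetical names for illustration:

```python
from collections import deque

def buffer_top_n(first_queue, second_queue, n):
    """Move the first N tasks (by ordinal position) from the first preset
    queue into the second preset queue, one by one and smallest ordinal
    position first, then delete them from the first queue so that the
    remaining tasks' ordinal positions shift forward."""
    batch = first_queue[:n]     # first N tasks, already in queue order
    second_queue.extend(batch)  # appended one by one, in that order
    del first_queue[:n]         # remaining ordinal positions update implicitly
    return second_queue

fq = ["A", "B", "C", "D"]       # first preset queue, ordered by ordinal position
sq = deque()                    # second preset queue
buffer_top_n(fq, sq, n=3)
print(list(sq), fq)  # ['A', 'B', 'C'] ['D']
```

Deleting the batch as a slice keeps the two steps (copy out, then remove) atomic from the caller's point of view, which mirrors the patent's "delete and update the ordinal positions" in one operation.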
Optionally, the preset buffering opportunity comprises at least one of:
the number of tasks in the second preset queue falling below a first preset threshold, the number of tasks in the first preset queue exceeding a second preset threshold, and the buffering time reaching a preset buffering period.
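The three triggers can be combined into a single predicate, as in the following sketch; `buffering_opportunity`, the watermark names, and the polling comment are assumptions for illustration only:

```python
import time

def buffering_opportunity(first_queue, second_queue, last_buffer_time,
                          low_watermark, high_watermark, period_s):
    """A preset buffering opportunity is reached when ANY of the three
    conditions holds (the description allows any combination of them)."""
    return (len(second_queue) < low_watermark           # dispatch queue running dry
            or len(first_queue) > high_watermark        # receiving queue backing up
            or time.monotonic() - last_buffer_time >= period_s)  # period elapsed

# A dispatcher would poll this predicate and, when it returns True,
# move the next N candidates into the second preset queue.
print(buffering_opportunity([1] * 10, [], time.monotonic(), 3, 5, 60.0))  # True
```

Treating the conditions as an OR means the dispatch queue is refilled both reactively (when it drains or the backlog grows) and proactively (once per buffering period).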
Optionally, after dispatching the target task to the preset executor of the target task in response to reaching the preset dispatch opportunity of the target task, the method further comprises:
deleting the target task from the second preset queue and updating the ordinal positions of the remaining tasks in the second preset queue.
An embodiment of the present application further provides a readable storage medium storing a computer program executable by a processor; when the computer program is executed by the processor, the steps of the task dispatching method provided in the embodiments of the present application are implemented, as follows:
a task dispatching method, comprising:
in response to receiving a task to be dispatched, storing the task to be dispatched into a first preset queue according to the ordinal position of the task to be dispatched in the first preset queue;
in response to reaching a preset buffering opportunity, acquiring a candidate task set from the first preset queue, wherein the candidate task set comprises N candidate tasks, and N is a preset value greater than 1;
storing the N candidate tasks into a second preset queue; and
in response to reaching a preset dispatch opportunity of a target task, dispatching the target task to a preset executor of the target task, wherein the target task is a task in the second preset queue.
Optionally, storing the task to be dispatched into a first preset queue according to the ordinal position of the task to be dispatched in the first preset queue, in response to receiving the task to be dispatched, comprises:
in response to receiving the task to be dispatched, determining a target ordinal position according to a preset priority of the task to be dispatched, wherein the target ordinal position is the ordinal position of the task to be dispatched in the first preset queue, and a higher preset priority corresponds to a smaller target ordinal position; and
storing the task to be dispatched into the first preset queue at the target ordinal position.
Optionally, the N candidate tasks are the first N tasks by ordinal position in the first preset queue;
storing the N candidate tasks into a second preset queue comprises:
storing the candidate tasks into the second preset queue one by one, in ascending order of their ordinal positions in the first preset queue.
Optionally, after storing the N candidate tasks into the second preset queue, the method further comprises:
deleting the candidate tasks from the first preset queue and updating the ordinal positions of the remaining tasks in the first preset queue.
Optionally, the preset buffering opportunity comprises at least one of:
the number of tasks in the second preset queue falling below a first preset threshold, the number of tasks in the first preset queue exceeding a second preset threshold, and the buffering time reaching a preset buffering period.
Optionally, after dispatching the target task to the preset executor of the target task in response to reaching the preset dispatch opportunity of the target task, the method further comprises:
deleting the target task from the second preset queue and updating the ordinal positions of the remaining tasks in the second preset queue.
Finally, it should also be noted that, herein, relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between those entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a/an" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
The embodiments in this description are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts the embodiments may be referred to one another.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A task dispatching method, characterized by comprising:
in response to receiving a task to be dispatched, storing the task to be dispatched into a first preset queue according to the ordinal position of the task to be dispatched in the first preset queue;
in response to reaching a preset buffering opportunity, acquiring a candidate task set from the first preset queue, wherein the candidate task set comprises N candidate tasks, and N is a preset value greater than 1;
storing the N candidate tasks into a second preset queue; and
in response to reaching a preset dispatch opportunity of a target task, dispatching the target task to a preset executor of the target task, wherein the target task is a task in the second preset queue.
2. The method according to claim 1, wherein storing the task to be dispatched into a first preset queue according to the ordinal position of the task to be dispatched in the first preset queue, in response to receiving the task to be dispatched, comprises:
in response to receiving the task to be dispatched, determining a target ordinal position according to a preset priority of the task to be dispatched, wherein the target ordinal position is the ordinal position of the task to be dispatched in the first preset queue, and a higher preset priority corresponds to a smaller target ordinal position; and
storing the task to be dispatched into the first preset queue at the target ordinal position.
3. The method according to claim 2, wherein the N candidate tasks are the first N tasks by ordinal position in the first preset queue;
storing the N candidate tasks into a second preset queue comprises:
storing the candidate tasks into the second preset queue one by one, in ascending order of their ordinal positions in the first preset queue.
4. The method according to claim 1 or 3, wherein after storing the N candidate tasks into a second preset queue, the method further comprises:
deleting the candidate tasks from the first preset queue and updating the ordinal positions of the remaining tasks in the first preset queue.
5. The method according to claim 1, wherein the preset buffering opportunity comprises at least one of:
the number of tasks in the second preset queue falling below a first preset threshold, the number of tasks in the first preset queue exceeding a second preset threshold, and the buffering time reaching a preset buffering period.
6. The method according to claim 1 or 5, wherein after dispatching the target task to the preset executor of the target task in response to reaching the preset dispatch opportunity of the target task, the method further comprises:
deleting the target task from the second preset queue and updating the ordinal positions of the remaining tasks in the second preset queue.
7. A task dispatching apparatus, characterized by comprising:
a task receiving unit, configured to store, in response to receiving a task to be dispatched, the task to be dispatched into a first preset queue according to the ordinal position of the task to be dispatched in the first preset queue;
a candidate task acquisition unit, configured to acquire a candidate task set from the first preset queue in response to reaching a preset buffering opportunity, wherein the candidate task set comprises N candidate tasks, and N is a preset value greater than 1;
a candidate task storage unit, configured to store the N candidate tasks into a second preset queue; and
a task dispatching unit, configured to dispatch, in response to reaching a preset dispatch opportunity of a target task, the target task to a preset executor of the target task, wherein the target task is a task in the second preset queue.
8. The apparatus according to claim 7, wherein the task receiving unit is specifically configured to:
in response to receiving the task to be dispatched, determine a target ordinal position according to a preset priority of the task to be dispatched, wherein the target ordinal position is the ordinal position of the task to be dispatched in the first preset queue, and a higher preset priority corresponds to a smaller target ordinal position; and
store the task to be dispatched into the first preset queue at the target ordinal position.
9. A task dispatching device, characterized by comprising: a memory and a processor;
the memory is configured to store a program;
the processor is configured to execute the program to implement the steps of the task dispatching method according to any one of claims 1 to 6.
10. A readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the task dispatching method according to any one of claims 1 to 6.
CN202110482209.6A 2021-04-30 2021-04-30 Task distribution method, device, equipment and readable storage medium Pending CN113159602A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110482209.6A CN113159602A (en) 2021-04-30 2021-04-30 Task distribution method, device, equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110482209.6A CN113159602A (en) 2021-04-30 2021-04-30 Task distribution method, device, equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN113159602A true CN113159602A (en) 2021-07-23

Family

ID=76873093

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110482209.6A Pending CN113159602A (en) 2021-04-30 2021-04-30 Task distribution method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN113159602A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113656178A (en) * 2021-08-19 2021-11-16 中国银行股份有限公司 Data processing method, device, equipment and readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102542379A (en) * 2010-12-20 2012-07-04 中国移动通信集团公司 Method, system and device for processing scheduled tasks
CN106775977A (en) * 2016-12-09 2017-05-31 北京小米移动软件有限公司 Method for scheduling task, apparatus and system
CN110096353A (en) * 2019-05-14 2019-08-06 厦门美图之家科技有限公司 Method for scheduling task and device
US20200104165A1 (en) * 2018-09-28 2020-04-02 Atlassian Pty Ltd Systems and methods for scheduling tasks
CN112286704A (en) * 2020-11-19 2021-01-29 每日互动股份有限公司 Processing method and device of delay task, computer equipment and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102542379A (en) * 2010-12-20 2012-07-04 中国移动通信集团公司 Method, system and device for processing scheduled tasks
CN106775977A (en) * 2016-12-09 2017-05-31 北京小米移动软件有限公司 Method for scheduling task, apparatus and system
US20200104165A1 (en) * 2018-09-28 2020-04-02 Atlassian Pty Ltd Systems and methods for scheduling tasks
CN110096353A (en) * 2019-05-14 2019-08-06 厦门美图之家科技有限公司 Method for scheduling task and device
CN112286704A (en) * 2020-11-19 2021-01-29 每日互动股份有限公司 Processing method and device of delay task, computer equipment and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113656178A (en) * 2021-08-19 2021-11-16 中国银行股份有限公司 Data processing method, device, equipment and readable storage medium
CN113656178B (en) * 2021-08-19 2024-02-27 中国银行股份有限公司 Data processing method, device, equipment and readable storage medium

Similar Documents

Publication Publication Date Title
CN110096353B (en) Task scheduling method and device
US7937257B2 (en) Estimating performance of application based on automatic resizing of shared memory for messaging
US8903925B2 (en) Scheduled messages in a scalable messaging system
EP2038746B1 (en) Methods, apparatus and computer programs for managing persistence
CN111917863B (en) Message pushing method and device, television equipment and computer storage medium
CN109309712B (en) Data transmission method based on interface asynchronous call, server and storage medium
CN110231995B (en) Task scheduling method, device and storage medium based on Actor model
CN110262883B (en) Timer-based task processing method and device and electronic equipment
US11055027B1 (en) Dynamic queue management
CN115858180B (en) Time slicing method and device and electronic equipment
CN112395067A (en) Task scheduling method, system, device and medium
CN109842621A (en) A kind of method and terminal reducing token storage quantity
CN111507608A (en) Work order early warning method and device and storage medium
CN113159602A (en) Task distribution method, device, equipment and readable storage medium
CN112040001A (en) Request processing method and device based on distributed storage
CN111190541B (en) Flow control method of storage system and computer readable storage medium
CN112306827A (en) Log collection device, method and computer readable storage medium
CN116455832A (en) Method for consuming sequence of message queue based on lane mode
CN111679895A (en) Execution method, device and equipment of distributed timing task and readable storage medium
CN110825342B (en) Memory scheduling device and system, method and apparatus for processing information
CN112181645A (en) Resource scheduling method, device, equipment and storage medium
CN112035460A (en) Identification distribution method, device, equipment and storage medium
CN114443246B (en) Intelligent scheduling method, device, equipment and computer readable storage medium
CN115080435A (en) Case testing method and device
CN115202842A (en) Task scheduling method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination