CN113176931A - Task flow processing method and device, storage medium and electronic equipment - Google Patents

Task flow processing method and device, storage medium and electronic equipment

Info

Publication number
CN113176931A
Authority
CN
China
Prior art keywords
task
identifier
packing
execution sequence
packet
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110343053.3A
Other languages
Chinese (zh)
Other versions
CN113176931B (en)
Inventor
马云存
王晨
闻英友
窦丽莉
葛东
吕昕东
李盼
何涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Neusoft Cloud Technology Co ltd
Neusoft Corp
Original Assignee
Neusoft Cloud Technology Co ltd
Neusoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Neusoft Cloud Technology Co ltd, Neusoft Corp filed Critical Neusoft Cloud Technology Co ltd
Priority to CN202110343053.3A priority Critical patent/CN113176931B/en
Publication of CN113176931A publication Critical patent/CN113176931A/en
Application granted granted Critical
Publication of CN113176931B publication Critical patent/CN113176931B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/48Program initiating; Program switching, e.g. by interrupt
    • G06F9/4806Task transfer initiation or dispatching
    • G06F9/4843Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
    • G06F9/4881Scheduling strategies for dispatcher, e.g. round robin, multi-level priority queues
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/40Transformation of program code
    • G06F8/41Compilation
    • G06F8/44Encoding

Abstract

The present disclosure relates to a task flow processing method and apparatus, a storage medium, and an electronic device. The method includes: executing the packing functions according to their order in the packing code, where the call parameter of each packing function is the task identifier of at least one task in the task flow, the packing function packs the task identifiers serving as its call parameters into a task packet, the task packet includes an execution sequence number and the task identifiers, and the execution sequence number is determined according to the execution order of the packing functions; and sequentially executing the tasks corresponding to the task identifiers included in each task packet according to the execution sequence numbers of the task packets. Execution code expressing the sequential or parallel relationships between tasks does not need to be written manually, which reduces coding complexity.

Description

Task flow processing method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to electronic information technology, and in particular, to a method and an apparatus for processing a task flow, a storage medium, and an electronic device.
Background
A service is generally composed of a plurality of independent tasks that have sequential or parallel execution relationships, and a task group formed by arranging these tasks is a task flow. Before the service is executed in the background, the sequential or parallel relationships between the tasks of the task flow need to be coded manually to obtain execution code that embodies those relationships; each task in the task flow is then executed by running the manually written execution code in the background.
Because the sequential or parallel execution relationships differ from one task flow to another, execution code embodying the relationships between the tasks has to be written manually for every task flow, which makes coding complex.
Disclosure of Invention
The purpose of the present disclosure is to provide a task flow processing method and apparatus, a storage medium, and an electronic device that do not require manually writing execution code expressing the relationships between tasks and thus reduce coding complexity.
In order to achieve the above object, in a first aspect, the present disclosure provides a task flow processing method, including:
executing the packing functions according to the sequence of the packing functions in the packing codes, wherein the calling parameter of each packing function is the task identifier of at least one task in the task stream, the packing functions are used for packing the task identifiers serving as the calling parameters into task packets, the task packets comprise execution sequence numbers and task identifiers, and the execution sequence numbers are determined according to the execution sequence of the packing functions;
and sequentially executing the tasks corresponding to the task identifiers included in each task packet according to the execution sequence numbers of the task packets.
Optionally, for each packing function, the packing function packs the task identifier as the call parameter of the packing function into a task package by:
creating a task package;
under the condition that the task cache is empty, taking the initial execution sequence number as the execution sequence number of the task packet created this time;
under the condition that the task cache is not empty, taking the sum of the execution sequence number of the task packet corresponding to the last stored task in the task cache and a preset numerical value as the execution sequence number of the task packet created this time;
the packing function is further used for: storing the task corresponding to the task identifier serving as the call parameter into the task cache;
the sequentially executing the tasks corresponding to the task identifiers included in each task packet according to the execution sequence numbers of the task packets includes: sequentially taking the tasks out of the task cache according to the task identifiers in each task packet, in the order of the execution sequence numbers of the task packets, and executing them.
Optionally, the task flow includes a parallel branch, and task identifiers corresponding to a plurality of tasks on the parallel branch are used as a same call parameter to call the packing function;
the packing function is further used for: under the condition that the call parameter of the packing function includes task identifiers of a plurality of tasks, setting, for each task identifier behind the first task identifier in each branch, a pre-task identifier corresponding to the task identifier; and
setting a post-task identifier corresponding to the task identifier for each task identifier before the last task identifier in each branch.
Optionally, the sequentially fetching the task from the task cache according to the task identifier in each task packet according to the execution sequence number of the task packet to execute includes:
when a task packet comprising a plurality of task identifiers is executed, determining a task execution sequence between the tasks corresponding to the plurality of task identifiers according to the pre-task identifiers and the post-task identifiers, and sequentially taking out the plurality of tasks from the task cache for execution according to the task execution sequence.
Optionally, the sequentially executing the tasks corresponding to the task identifiers included in each task packet includes:
placing, by a task message processor corresponding to the task flow, the task corresponding to the task identifier in the task packet into a running task pool of a task manager according to the execution sequence number of the task packet, wherein the task manager is used for executing the task in the running task pool, monitoring the execution state of the task in the running task pool, and notifying the task message processor after monitoring that the task in the running task pool is completed;
and the task message processor adds the task corresponding to the task identifier in the next task packet into the running task pool after receiving the notification of the task manager.
In a second aspect, the present disclosure provides a task flow processing apparatus, the apparatus comprising:
a packing module, used for executing the packing functions according to the sequence of the packing functions in the packing code, wherein the calling parameter of each packing function is the task identifier of at least one task in a task flow, the packing function is used for packing the task identifiers serving as the calling parameters into task packets, each task packet comprises an execution sequence number and task identifiers, and the execution sequence number is determined according to the execution sequence of the packing functions;
and the execution module is used for sequentially executing the tasks corresponding to the task identifiers included in each task packet according to the execution sequence numbers of the task packets.
Optionally, for each packing function, the packing function packs the task identifier as the call parameter of the packing function into a task package by:
creating a task package;
under the condition that the task cache is empty, taking the initial execution sequence number as the execution sequence number of the task packet created this time;
under the condition that the task cache is not empty, taking the sum of the execution sequence number of the task packet corresponding to the last stored task in the task cache and a preset numerical value as the execution sequence number of the task packet created this time;
the packing function is further used for: storing the task corresponding to the task identifier serving as the call parameter into the task cache;
and the execution module is specifically used for taking out the tasks from the task cache for execution according to the execution sequence numbers of the task packets and the task identifiers in each task packet in sequence.
Optionally, the task flow includes a parallel branch, and task identifiers corresponding to a plurality of tasks on the parallel branch are used as a same call parameter to call the packing function;
the packing function is further used for: under the condition that the call parameter of the packing function includes task identifiers of a plurality of tasks, setting, for each task identifier behind the first task identifier in each branch, a pre-task identifier corresponding to the task identifier; and
setting a post-task identifier corresponding to the task identifier for each task identifier before the last task identifier in each branch.
In a third aspect, the present disclosure provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of any of the first aspects described above.
In a fourth aspect, the present disclosure provides an electronic device comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method of any of the first aspects above.
According to the above technical solution, the packing functions in the packing code pack the task identifiers serving as the call parameters into task packets, each task packet includes an execution sequence number and task identifiers, and the execution sequence number is determined according to the execution order of the packing functions. Therefore, for different task flows, only the packing code needs to be changed: the packing functions automatically pack the task identifiers corresponding to the tasks in the task flow and set the execution sequence numbers of the task packets. When the tasks are executed, the tasks corresponding to the task identifiers in each task packet are executed according to the execution sequence numbers of the task packets. Execution code embodying the sequential or parallel relationships between tasks does not need to be written manually for each task flow, which reduces coding complexity.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure without limiting the disclosure. In the drawings:
fig. 1 is a flowchart illustrating a task flow processing method according to an exemplary embodiment of the present disclosure.
FIG. 2 is a task flow diagram shown in accordance with an exemplary embodiment of the present disclosure.
FIG. 3 is another task flow diagram shown in accordance with an exemplary embodiment of the present disclosure.
Fig. 4 is a flow chart illustrating an implementation of a packing function according to an exemplary embodiment of the present disclosure.
Fig. 5 is another schematic diagram illustrating a task flow processing method according to an exemplary embodiment of the present disclosure.
Fig. 6 is a block diagram illustrating a task flow processing device according to an example embodiment.
FIG. 7 is a block diagram illustrating an electronic device in accordance with an example embodiment.
Detailed Description
The following detailed description of specific embodiments of the present disclosure is provided in connection with the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present disclosure, are given by way of illustration and explanation only, not limitation.
When a service with complex processing logic is handled, it can be divided into a plurality of independent and usually reusable logic processing units, and these processing units are called tasks. A task group formed by arranging a plurality of tasks is a task flow, and the tasks have sequential or parallel execution relationships. Before the service is processed, the sequential or parallel relationships between the tasks of the task flow need to be coded to obtain execution code that embodies those relationships, and each task in the task flow is then executed by running the written execution code in the background. In the related art, however, the sequential or parallel relationships between the tasks of each task flow are usually coded manually, which results in high coding complexity, especially for task flows with complex execution relationships, such as those including parallel branches.
In view of this, the present disclosure provides a task flow processing method and apparatus, a storage medium, and an electronic device. For different task flows, the packing functions in the packing code automatically pack the task identifiers of the tasks in the task flow and set the execution sequence number of each task packet. When the tasks are executed, the tasks corresponding to the task identifiers in each task packet are executed according to the execution sequence numbers of the task packets. Execution code embodying the sequential or parallel relationships between tasks therefore does not need to be written manually for each task flow, which reduces coding complexity.
Fig. 1 is a flowchart illustrating a task flow processing method according to an exemplary embodiment of the present disclosure. Referring to fig. 1, the task flow processing method may include the steps of:
Step 101, executing the packing functions according to their order in the packing code, where the call parameter of each packing function is the task identifier of at least one task in the task flow, the packing function packs the task identifiers serving as its call parameters into a task packet, the task packet includes an execution sequence number and the task identifiers, and the execution sequence number is determined according to the execution order of the packing functions.
Step 102, sequentially executing the tasks corresponding to the task identifiers included in each task packet according to the execution sequence numbers of the task packets.
For example, the task flow processing method provided by the embodiment of the disclosure can be used for business processing in the field of automatic monitoring operation and maintenance.
Illustratively, FIG. 2 is a task flow diagram shown according to an exemplary embodiment of the present disclosure. In fig. 2, the tasks are executed in the direction of the arrows. The task identifiers of task1, task2, task3, task4, task5, task6, and task7 are task1, task2, task3, task4, task5, task6, and task7, respectively. The task flow shown in fig. 2 is referred to below as task flow example 1 and is used for further explanation.
For example, for the task stream shown in fig. 2, the packing encoding may be:
TaskFlow flow = new TaskFlow("task flow instance 1")
Task task1 = new Task("task1")
Task task2 = new Task("task2")
Task task3 = new Task("task3")
Task task4 = new Task("task4")
Task task5 = new Task("task5")
Task task6 = new Task("task6")
List list = new List()
list.add(task3).add(task4).add(task5).add(task6)
Task task7 = new Task("task7")
flow.add(task1)
flow.add(task2)
flow.addParallelTasks(list)
flow.add(task7)
In the above packing code, both flow.add() and flow.addParallelTasks() are packing functions: flow.add() is a packing function whose call parameter includes only one task identifier, and flow.addParallelTasks() is a packing function whose call parameter includes a plurality of task identifiers. Specifically, task1 is the call parameter of flow.add(task1); task2 is the call parameter of flow.add(task2); list is the call parameter of flow.addParallelTasks(list); and task7 is the call parameter of flow.add(task7).
It should be understood that the packing functions in the packing code are executed in the order in which they are written. Therefore, in the above packing code, the packing functions are executed in the order flow.add(task1), flow.add(task2), flow.addParallelTasks(list), and flow.add(task7).
The task packets obtained in turn by the packing functions flow.add(task1), flow.add(task2), flow.addParallelTasks(list), and flow.add(task7) are task packet 1, task packet 2, task packet 3, and task packet 4, where task packet 1 includes task1, task packet 2 includes task2, task packet 3 includes task3, task4, task5, and task6, and task packet 4 includes task7. Since the execution sequence number of a task packet is determined according to the execution order of the packing functions, the execution sequence number of task packet 1 precedes that of task packet 2, which precedes that of task packet 3, which precedes that of task packet 4. That is, the task packets are executed in the order: task packet 1, task packet 2, task packet 3, task packet 4.
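Purely as an illustration, the four task packets above can be pictured with a small Java sketch; the class name TaskPackage, its fields, and the use of sequence numbers 0 to 3 (assuming an initial sequence number of 0 and an increment of 1) are assumptions made here for clarity and are not defined by the disclosure:

import java.util.Arrays;
import java.util.List;

// Hypothetical container for one task packet: an execution sequence number plus
// the task identifiers packed by a single packing-function call.
class TaskPackage {
    final int executionSequenceNumber;   // decides when the packet is executed
    final List<String> taskIdentifiers;  // identifiers packed into this packet

    TaskPackage(int executionSequenceNumber, List<String> taskIdentifiers) {
        this.executionSequenceNumber = executionSequenceNumber;
        this.taskIdentifiers = taskIdentifiers;
    }
}

class PackingExample1 {
    public static void main(String[] args) {
        // Packets produced in turn by flow.add(task1), flow.add(task2),
        // flow.addParallelTasks(list) and flow.add(task7).
        List<TaskPackage> packets = Arrays.asList(
                new TaskPackage(0, Arrays.asList("task1")),                             // task packet 1
                new TaskPackage(1, Arrays.asList("task2")),                             // task packet 2
                new TaskPackage(2, Arrays.asList("task3", "task4", "task5", "task6")),  // task packet 3
                new TaskPackage(3, Arrays.asList("task7")));                            // task packet 4

        for (TaskPackage p : packets) {
            System.out.println(p.executionSequenceNumber + " -> " + p.taskIdentifiers);
        }
    }
}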
It should be noted that the packing code is edited manually, and only the packing code needs to be manually changed for different task streams.
Illustratively, fig. 3 is another task flow diagram shown according to an exemplary embodiment of the present disclosure. The task flow shown in fig. 3 is task flow example 2. For fig. 3, only the packing code used for fig. 2 needs to be modified manually in order to obtain task packets with the execution order corresponding to the task flow shown in fig. 3. For example, the packing code corresponding to the task flow shown in fig. 3 may be:
TaskFlow flow = new TaskFlow("task flow instance 2")
Task task1 = new Task("task1")
List list = new List()
list.add(task2).add(task3).add(task4)
Task task5 = new Task("task5")
flow.add(task1)
flow.addParallelTasks(list)
flow.add(task5)
In this way, the tasks can be executed according to their sequential or parallel relationships simply by adapting the packing code, which effectively reduces coding complexity and difficulty, especially for services with complex task logic.
In this way, it is only necessary to manually edit the packing code and set the call parameters of the packing functions; the packing functions then automatically generate task packets with an execution order, and the tasks corresponding to the task identifiers in the task packets are executed according to that order to complete the service. Execution code embodying the sequential or parallel relationships between tasks does not need to be written manually, which reduces coding complexity.
In a possible approach, the packing function is also used for storing the task corresponding to the task identifier serving as its call parameter into the task cache. Correspondingly, for each packing function, the packing function packs the task identifiers serving as its call parameters into a task packet by the following method:
a task package is created.
And under the condition that the task cache is empty, taking the initial execution sequence number as the execution sequence number of the task packet created this time.
And under the condition that the task cache is not empty, taking the sum of the execution sequence number of the task packet corresponding to the last stored task in the task cache and a preset numerical value as the execution sequence number of the task packet created this time.
Illustratively, fig. 4 is a flow chart illustrating an implementation of a packing function according to an exemplary embodiment of the present disclosure. Referring to fig. 4, the method includes the following steps:
step 401, a task package is created.
Step 402, determine whether the task cache is empty.
Step 403, if the task cache is empty, take the initial execution sequence number as the execution sequence number of the task packet created this time.
Step 404, if the task cache is not empty, take the sum of the execution sequence number of the task packet corresponding to the last stored task in the task cache and a preset value as the execution sequence number of the task packet created this time.
Step 405, store the task corresponding to the task identifier serving as the call parameter into the task cache.
Step 406, store the task identifier serving as the call parameter into the task packet created this time.
In the present disclosure, the task cache is used for storing the task corresponding to the task identifier.
For example, the initial execution sequence number may be a preset number such as 0, and the preset value is a natural number greater than 0, so that the execution sequence number of the task packet produced by the next packing function is greater than that of the task packet produced by the previous packing function.
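As a minimal sketch of the steps of fig. 4, the following Java fragment assumes an initial execution sequence number of 0 and a preset increment of 1 (both only examples of the preset values mentioned above); the class, field, and method names are illustrative and do not come from the disclosure:

import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Illustrative packing step of fig. 4: create a packet, pick its execution
// sequence number from the state of the task cache, and store the tasks.
class PackingSketch {
    private static final int INITIAL_SEQUENCE = 0; // example initial execution sequence number
    private static final int STEP = 1;             // example preset value, any natural number > 0

    // Task cache: task identifier -> task body, kept in insertion order.
    private final Map<String, Runnable> taskCache = new LinkedHashMap<>();
    // Task packets: execution sequence number -> task identifiers of one packet.
    private final Map<Integer, List<String>> packets = new TreeMap<>();
    private int lastSequence;

    // One packing-function call whose call parameters are the given tasks.
    void pack(Map<String, Runnable> tasksById) {
        // Step 401: create a task packet; steps 402-404: choose its sequence number.
        int sequence = taskCache.isEmpty()
                ? INITIAL_SEQUENCE       // task cache empty: use the initial number
                : lastSequence + STEP;   // otherwise: last packet's number plus the preset value
        lastSequence = sequence;

        // Step 405: store the tasks corresponding to the identifiers in the task cache.
        taskCache.putAll(tasksById);

        // Step 406: store the identifiers in the packet created this time.
        packets.put(sequence, new ArrayList<>(tasksById.keySet()));
    }

    public static void main(String[] args) {
        PackingSketch sketch = new PackingSketch();
        sketch.pack(Map.of("task1", (Runnable) () -> {})); // packet with sequence number 0
        sketch.pack(Map.of("task2", (Runnable) () -> {})); // packet with sequence number 1
        System.out.println(sketch.packets);                // {0=[task1], 1=[task2]}
    }
}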
In the case that the task corresponding to the task identifier serving as the call parameter is stored in the task cache, step 102 illustrated in fig. 1 may include: sequentially taking the tasks out of the task cache according to the task identifiers in each task packet, in the order of the execution sequence numbers of the task packets, and executing them.
In a possible manner, the task flow includes parallel branches, and the task identifiers corresponding to the plurality of tasks on the parallel branches are used as the same call parameter to call the packing function. The packing function is also used for: when its call parameter includes the task identifiers of a plurality of tasks, setting, for each task identifier after the first task identifier in each branch, a pre-task identifier corresponding to that task identifier; and setting, for each task identifier before the last task identifier in each branch, a post-task identifier corresponding to that task identifier.
Illustratively, taking the task flow shown in fig. 2 as an example, task3 and task4 form one branch and task5 and task6 form another, parallel branch, and the task identifiers of task3, task4, task5, and task6 are used as the same call parameter to call the packing function.
For the branches shown in fig. 2, task3 and task4 are in the same branch. In this branch, the task identifier of task3 is the first task identifier, so a pre-task identifier needs to be set for the task identifier of task4, and the pre-task identifier corresponding to the task identifier of task4 is the task identifier of task3; the task identifier of task4 is the last task identifier in the branch, so a post-task identifier needs to be set for the task identifier of task3, and the post-task identifier corresponding to the task identifier of task3 is the task identifier of task4.
Task5 and task6 are in the same branch. In this branch, the task identifier of task5 is the first task identifier, so a pre-task identifier needs to be set for the task identifier of task6, and the pre-task identifier corresponding to the task identifier of task6 is the task identifier of task5; the task identifier of task6 is the last task identifier in the branch, so a post-task identifier needs to be set for the task identifier of task5, and the post-task identifier corresponding to the task identifier of task5 is the task identifier of task6.
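The setting of the pre- and post-task identifiers for these two branches can be sketched as follows; representing each branch as an ordered list of identifiers, and the method name addParallelTasks over such lists, are assumptions made here purely for illustration:

import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative recording of pre-/post-task identifiers for parallel branches.
class ParallelBranchSketch {
    // pre-task identifier for every identifier that is not first in its branch
    final Map<String, String> preTaskId = new HashMap<>();
    // post-task identifier for every identifier that is not last in its branch
    final Map<String, String> postTaskId = new HashMap<>();

    void addParallelTasks(List<List<String>> branches) {
        for (List<String> branch : branches) {
            for (int i = 1; i < branch.size(); i++) {
                preTaskId.put(branch.get(i), branch.get(i - 1));  // identifier after the first one
                postTaskId.put(branch.get(i - 1), branch.get(i)); // identifier before the last one
            }
        }
    }

    public static void main(String[] args) {
        ParallelBranchSketch sketch = new ParallelBranchSketch();
        // Branches of fig. 2: task3 -> task4 and task5 -> task6.
        sketch.addParallelTasks(List.of(List.of("task3", "task4"),
                                        List.of("task5", "task6")));
        System.out.println(sketch.preTaskId);  // pre-task identifiers: task4 -> task3, task6 -> task5
        System.out.println(sketch.postTaskId); // post-task identifiers: task3 -> task4, task5 -> task6
    }
}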
In the case that pre-task identifiers and post-task identifiers are set, sequentially taking the tasks out of the task cache according to the task identifiers in each task packet and the execution sequence numbers of the task packets may include: when a task packet including a plurality of task identifiers is executed, determining the execution order of the corresponding tasks according to the pre-task identifiers and post-task identifiers, and taking the tasks out of the task cache in that order for execution. The execution of a task packet with a plurality of task identifiers is further explained below, taking tasks 3 to 6 shown in fig. 2 as an example.
If a task identifier has no pre-task identifier, the task corresponding to that task identifier is the first to be executed in its branch. Therefore, when the task packet is executed, task3 and task5, whose task identifiers task3 and task5 have no pre-task identifier, are first fetched from the task cache, and task3 and task5 can be executed in parallel.
Then, after task3 or task5 completes, the post-task identifier of the completed task is obtained from the task packet. Taking the completion of task3 as an example, task4, corresponding to the post-task identifier task4 of the task identifier of task3, is fetched and executed.
In this way, the execution order of the tasks corresponding to the task identifiers in the same task packet is represented by the pre-task identifiers and the post-task identifiers, which makes it convenient to process task packets that include a plurality of task identifiers.
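A simple sketch, under the same illustrative assumptions as above, of how the tasks of one multi-identifier packet (task packet 3 of fig. 2) could be ordered: identifiers without a pre-task identifier start first, and each completion releases the task named by the post-task identifier. The serial queue below stands in for execution that could in practice be parallel:

import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;
import java.util.Map;

// Illustrative ordering of one multi-identifier task packet using the
// pre-/post-task identifiers recorded during packing (names are assumptions).
class PacketExecutionSketch {
    public static void main(String[] args) {
        List<String> packet = List.of("task3", "task4", "task5", "task6");
        Map<String, String> preTaskId = Map.of("task4", "task3", "task6", "task5");
        Map<String, String> postTaskId = Map.of("task3", "task4", "task5", "task6");

        // Every identifier without a pre-task identifier may run first (in parallel).
        Deque<String> runnable = new ArrayDeque<>();
        for (String id : packet) {
            if (!preTaskId.containsKey(id)) {
                runnable.add(id); // task3 and task5
            }
        }

        // When a task finishes, its post-task identifier (if any) becomes runnable.
        while (!runnable.isEmpty()) {
            String finished = runnable.poll();
            System.out.println("executed " + finished);
            String next = postTaskId.get(finished);
            if (next != null) {
                runnable.add(next); // task4 after task3, task6 after task5
            }
        }
    }
}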
In a possible manner, sequentially executing the tasks corresponding to the task identifiers included in each task packet may include: a task message processor corresponding to the task flow places the task corresponding to the task identifier in the task packet into a running task pool of a task manager according to the execution sequence number of the task packet; the task manager executes the task in the running task pool, monitors its execution state, and notifies the task message processor after detecting that the task has completed. The execution of the tasks corresponding to the task identifiers in each task packet is further explained below with reference to fig. 2 and 5.
When the task flow shown in fig. 2 is packed by the above packing code, four task packets are obtained: task packet 1, task packet 2, task packet 3, and task packet 4, executed in that order. The task execution process for this task flow includes the following steps:
The task message processor first puts task1, corresponding to the task identifier task1 in task packet 1, into the running task pool of the task manager, and the task manager executes task1 in the running task pool. The task manager monitors the execution state of task1 and notifies the task message processor when it detects that task1 has completed. After receiving the notification, the task message processor adds task2, corresponding to the task identifier task2 in the next task packet (task packet 2), to the running task pool, and so on until the tasks corresponding to the task identifiers of all the task packets have completed.
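The hand-off between the task message processor and the task manager described above can be summarized with the following single-threaded Java sketch; the loop, the queue types, and the names are simplifications assumed here for illustration, and the ordering inside a multi-identifier packet (handled via pre-/post-task identifiers) is omitted:

import java.util.ArrayDeque;
import java.util.List;
import java.util.Queue;

// Illustrative hand-off for the task packets of fig. 2: the message processor
// fills the running task pool packet by packet, the manager executes tasks,
// and its completion notification triggers loading of the next packet.
class HandOffSketch {
    public static void main(String[] args) {
        // Task packets in execution-sequence order.
        Queue<List<String>> packets = new ArrayDeque<>(List.of(
                List.of("task1"),
                List.of("task2"),
                List.of("task3", "task4", "task5", "task6"),
                List.of("task7")));

        Queue<String> runningTaskPool = new ArrayDeque<>();

        while (!packets.isEmpty() || !runningTaskPool.isEmpty()) {
            if (runningTaskPool.isEmpty()) {
                // Task message processor: load the next packet's tasks into the pool.
                runningTaskPool.addAll(packets.poll());
            }
            // Task manager: execute one task from the pool and report completion.
            String finished = runningTaskPool.poll();
            System.out.println("completed " + finished);
            // The next packet is loaded only after the pool has drained (condition above).
        }
    }
}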
In addition, when a plurality of task flows need to run simultaneously, the storage space of the running task pool is limited and cannot hold the task packets of a large number of task flows at once. When a task packet includes a plurality of task identifiers, the corresponding tasks have an execution order among them and cannot all be executed at the same time. If all of these tasks were stored in the running task pool, the tasks that cannot yet be executed would occupy storage space, preventing immediately executable tasks of other task flows from being stored and thus reducing the execution efficiency of those task flows. Therefore, in a possible manner, when a task packet contains a plurality of task identifiers, the corresponding tasks may be put into the running task pool in the following way.
Specifically, the task message processor first puts the tasks corresponding to the task identifiers that are first in the execution order of the task packet into the running task pool; the task manager monitors the execution state of the tasks in the running task pool, notifies the task message processor after detecting that a task has completed, and deletes the completed task from the running task pool. After receiving the notification from the task manager, the task message processor adds the task corresponding to the post-task identifier of the completed task's identifier to the running task pool. Task packet 3 of the task flow shown in fig. 2 is described below as an example.
Specifically, the task message processor corresponding to the task flow places the tasks that are first in the execution order of the task packet, i.e., task3 and task5 (whose task identifiers are task3 and task5), into the running task pool, and the task manager monitors the execution states of task3 and task5. When the task manager detects that task3 has completed, it notifies the task message processor; according to the post-task identifier task4 of the task identifier of task3, the task message processor places the corresponding task, task4, into the running task pool and deletes task3. Likewise, when the task message processor is notified that task5 has completed, it places task6, corresponding to the post-task identifier task6 of the task identifier of task5, into the running task pool and deletes task5.
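For task packet 3, the pool handling described above can be illustrated with the sketch below; the completion notification is simulated by a direct method call, and all names are assumptions made here for illustration only:

import java.util.LinkedHashSet;
import java.util.Map;
import java.util.Set;

// Illustrative running-task-pool handling for task packet 3 of fig. 2: only
// immediately executable tasks are kept in the pool; on a completion
// notification the finished task is removed and its post-task is added.
class RunningPoolSketch {
    static final Map<String, String> POST_TASK_ID =
            Map.of("task3", "task4", "task5", "task6");
    static final Set<String> runningTaskPool = new LinkedHashSet<>();

    // Called when the task manager reports that a task has completed.
    static void onTaskCompleted(String finished) {
        runningTaskPool.remove(finished);      // free the pool slot
        String next = POST_TASK_ID.get(finished);
        if (next != null) {
            runningTaskPool.add(next);         // now immediately executable
        }
        System.out.println(finished + " done, pool = " + runningTaskPool);
    }

    public static void main(String[] args) {
        // Initially only the first task of each branch is put into the pool.
        runningTaskPool.add("task3");
        runningTaskPool.add("task5");
        onTaskCompleted("task3"); // pool becomes [task5, task4]
        onTaskCompleted("task5"); // pool becomes [task4, task6]
    }
}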
In this way, all the tasks stored in the running task pool can be executed immediately, so that when a plurality of task flows need to be processed, the execution efficiency of the other task flows is not affected.
Based on the same inventive concept, an embodiment of the present disclosure further provides a task flow processing apparatus, and referring to fig. 6, the task flow processing apparatus 600 may include:
a packing module 601, configured to execute a packing function according to a sequence of each packing function in a packing code, where a call parameter of each packing function is a task identifier of at least one task in a task stream, the packing function is configured to pack the task identifier as the call parameter into a task packet, the task packet includes an execution sequence number and a task identifier, and the execution sequence number is determined according to an execution sequence of the packing function;
and the execution module 602 is configured to sequentially execute the task corresponding to the task identifier included in each task packet according to the execution sequence number of the task packet.
Optionally, for each packing function, the packing function packs the task identifier as the call parameter of the packing function into a task package by:
creating a task package;
under the condition that the task cache is empty, taking the initial execution sequence number as the execution sequence number of the task packet created this time;
under the condition that the task cache is not empty, taking the sum of the execution sequence number of the task packet corresponding to the last stored task in the task cache and a preset numerical value as the execution sequence number of the task packet created this time;
the packing function is further used for: storing the task corresponding to the task identifier serving as the call parameter into the task cache;
the execution module 602 is specifically configured to, according to the execution sequence number of the task packet, sequentially fetch a task from the task cache according to the task identifier in each task packet and execute the task.
Optionally, the task flow includes a parallel branch, and task identifiers corresponding to a plurality of tasks on the parallel branch are used as a same call parameter to call the packing function;
the packing function is further used for: under the condition that the call parameter of the packing function includes task identifiers of a plurality of tasks, setting, for each task identifier behind the first task identifier in each branch, a pre-task identifier corresponding to the task identifier; and
setting a post-task identifier corresponding to the task identifier for each task identifier before the last task identifier in each branch.
Optionally, the execution module 602 is specifically configured to, when a task packet including multiple task identifiers is executed, determine a task execution sequence between tasks corresponding to the multiple task identifiers according to the pre-task identifier and the post-task identifier, and sequentially fetch the multiple tasks from the task cache according to the task execution sequence for execution.
Optionally, the execution module includes an execution submodule, configured to place, by using a task message processor corresponding to the task flow, a task corresponding to a task identifier in the task packet into a running task pool of a task manager according to the execution sequence number of the task packet, where the task manager is configured to execute the task in the running task pool, monitor the execution state of the task in the running task pool, and notify the task message processor after monitoring that the task in the running task pool is completed;
and the task message processor adds the task corresponding to the task identifier in the next task packet into the running task pool after receiving the notification of the task manager.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Based on the same inventive concept, the embodiments of the present disclosure further provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of any of the task flow processing methods in the above method embodiments.
Based on the same inventive concept, an embodiment of the present disclosure further provides an electronic device, including:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the task flow processing method in any of the above method embodiments.
FIG. 7 is a block diagram illustrating an electronic device in accordance with an example embodiment. As shown in fig. 7, the electronic device 700 may include: a processor 701 and a memory 702. The electronic device 700 may also include one or more of a multimedia component 703, an input/output (I/O) interface 704, and a communication component 705.
The processor 701 is configured to control the overall operation of the electronic device 700 so as to complete all or part of the steps in the task flow processing method. The memory 702 is used to store various types of data to support operation of the electronic device 700, such as instructions for any application or method operating on the electronic device 700 and application-related data, such as contact data, transmitted and received messages, pictures, audio, video, and the like. The memory 702 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk. The multimedia component 703 may include a screen and an audio component. The screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals. The received audio signal may further be stored in the memory 702 or transmitted through the communication component 705. The audio component also includes at least one speaker for outputting audio signals. The I/O interface 704 provides an interface between the processor 701 and other interface modules, such as a keyboard, a mouse, or buttons. These buttons may be virtual buttons or physical buttons. The communication component 705 is used for wired or wireless communication between the electronic device 700 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, or 4G, or a combination of one or more of them, and the corresponding communication component 705 may accordingly include a Wi-Fi module, a Bluetooth module, and an NFC module.
In an exemplary embodiment, the electronic Device 700 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for performing the above-described task flow Processing method.
In another exemplary embodiment, there is also provided a computer readable storage medium including program instructions which, when executed by a processor, implement the steps of the task flow processing method described above. For example, the computer readable storage medium may be the memory 702 described above comprising program instructions executable by the processor 701 of the electronic device 700 to perform the task flow processing method described above.
The preferred embodiments of the present disclosure are described in detail with reference to the accompanying drawings, however, the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solution of the present disclosure within the technical idea of the present disclosure, and these simple modifications all belong to the protection scope of the present disclosure.
It should be noted that the various features described in the above embodiments may be combined in any suitable manner without departing from the scope of the invention. In order to avoid unnecessary repetition, various possible combinations will not be separately described in this disclosure.
In addition, any combination of various embodiments of the present disclosure may be made, and the same should be considered as the disclosure of the present disclosure, as long as it does not depart from the spirit of the present disclosure.

Claims (10)

1. A method for processing a task stream, the method comprising:
executing the packing functions according to the sequence of the packing functions in the packing codes, wherein the calling parameter of each packing function is the task identifier of at least one task in the task stream, the packing functions are used for packing the task identifiers serving as the calling parameters into task packets, the task packets comprise execution sequence numbers and task identifiers, and the execution sequence numbers are determined according to the execution sequence of the packing functions;
and sequentially executing the tasks corresponding to the task identifiers included in each task packet according to the execution sequence numbers of the task packets.
2. The method of claim 1, wherein for each of the packing functions, the packing function packs the task identifier as a call parameter of the packing function into a task package by:
creating a task package;
under the condition that the task cache is empty, taking the initial execution sequence number as the execution sequence number of the task packet created this time;
under the condition that the task cache is not empty, taking the sum of the execution sequence number of the task packet corresponding to the last stored task in the task cache and a preset numerical value as the execution sequence number of the task packet created this time;
the packing function is further used for: storing the task corresponding to the task identifier serving as the call parameter into the task cache;
the sequentially executing the tasks corresponding to the task identifiers included in each task packet according to the execution sequence number of the task packet includes: and according to the execution sequence number of the task packet, sequentially taking out the tasks from the task cache according to the task identifier in each task packet for execution.
3. The method according to claim 2, wherein the task flow comprises a parallel branch, and task identifiers corresponding to a plurality of tasks on the parallel branch are used for calling the packing function as a same calling parameter;
the packing function is further used for: under the condition that the calling parameter of the packing function comprises task identifiers of a plurality of tasks, setting, for each task identifier behind the first task identifier in each branch, a pre-task identifier corresponding to the task identifier; and
setting a post-task identifier corresponding to the task identifier for each task identifier before the last task identifier in each branch.
4. The method according to claim 3, wherein said sequentially fetching tasks from the task cache for execution according to the task identifiers in each of the task packets according to the execution sequence numbers of the task packets comprises:
when a task packet comprising a plurality of task identifiers is executed, determining a task execution sequence between the tasks corresponding to the plurality of task identifiers according to the pre-task identifiers and the post-task identifiers, and sequentially taking out the plurality of tasks from the task cache for execution according to the task execution sequence.
5. The method according to claim 1, wherein the sequentially executing the tasks corresponding to the task identifiers included in each task packet according to the execution sequence numbers of the task packets comprises:
placing the task corresponding to the task identifier in the task packet into an operation task pool of a task manager according to an execution sequence number of the task packet by a task message processor corresponding to the task flow, wherein the task manager is used for executing the task in the operation task pool, monitoring the execution state of the task in the operation task pool, and notifying the task message processor after monitoring that the task in the operation task pool is completed;
and the task message processor adds the task corresponding to the task identifier in the next task packet into the running task pool after receiving the notification of the task manager.
6. A task flow processing apparatus, characterized in that the apparatus comprises:
a packing module, used for executing the packing functions according to the sequence of the packing functions in the packing code, wherein the calling parameter of each packing function is the task identifier of at least one task in a task flow, the packing function is used for packing the task identifiers serving as the calling parameters into task packets, each task packet comprises an execution sequence number and task identifiers, and the execution sequence number is determined according to the execution sequence of the packing functions;
and the execution module is used for sequentially executing the tasks corresponding to the task identifiers included in each task packet according to the execution sequence numbers of the task packets.
7. The apparatus of claim 6, wherein for each of the packing functions, the packing function packs the task identifier as a call parameter of the packing function into a task package by:
creating a task package;
under the condition that the task cache is empty, taking the initial execution sequence number as the execution sequence number of the task packet created this time;
under the condition that the task cache is not empty, taking the sum of the execution sequence number of the task packet corresponding to the last stored task in the task cache and a preset numerical value as the execution sequence number of the task packet created this time;
the packing function is further used for: storing the task corresponding to the task identifier serving as the call parameter into the task cache;
and the execution module is specifically used for taking out the tasks from the task cache for execution according to the execution sequence numbers of the task packets and the task identifiers in each task packet in sequence.
8. The apparatus according to claim 7, wherein the task flow includes a parallel branch, and task identifiers corresponding to a plurality of tasks on the parallel branch are used for calling the packing function as a same calling parameter;
the packing function is further used for: under the condition that the calling parameter of the packing function comprises task identifiers of a plurality of tasks, setting, for each task identifier behind the first task identifier in each branch, a pre-task identifier corresponding to the task identifier; and
setting a post-task identifier corresponding to the task identifier for each task identifier before the last task identifier in each branch.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
10. An electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to carry out the steps of the method of any one of claims 1 to 5.
CN202110343053.3A 2021-03-30 2021-03-30 Task stream processing method and device, storage medium and electronic equipment Active CN113176931B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110343053.3A CN113176931B (en) 2021-03-30 2021-03-30 Task stream processing method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110343053.3A CN113176931B (en) 2021-03-30 2021-03-30 Task stream processing method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN113176931A true CN113176931A (en) 2021-07-27
CN113176931B CN113176931B (en) 2024-04-05

Family

ID=76922647

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110343053.3A Active CN113176931B (en) 2021-03-30 2021-03-30 Task stream processing method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113176931B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170178056A1 (en) * 2015-12-18 2017-06-22 International Business Machines Corporation Flexible business task flow
WO2018236691A1 (en) * 2017-06-20 2018-12-27 Vfunction, Inc. Systems and methods for running software applications on distributed application frameworks
CN108182131A (en) * 2017-12-13 2018-06-19 东软集团股份有限公司 Monitor method, apparatus, storage medium and the electronic equipment of application operation state
CN108958832A (en) * 2018-06-12 2018-12-07 北京蜂盒科技有限公司 The method, apparatus and storage medium and electronic equipment of custom service process
CN110716748A (en) * 2019-09-24 2020-01-21 深圳中集智能科技有限公司 Service processing method and device, computer readable medium and electronic equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Yu Yuhan; Liu Feng: "Research on a Business Rule Management System Based on Complex Event Processing", Computer Knowledge and Technology, no. 10 *
Tong Ling; Zhou Chuansheng: "Design and Implementation of an XML-Based Task Flow Editor", Journal of Shenyang Normal University (Natural Science Edition), no. 04 *

Also Published As

Publication number Publication date
CN113176931B (en) 2024-04-05

Similar Documents

Publication Publication Date Title
CN106484509B (en) Output method and device of popup window and terminal
CN105573755A (en) Method and device for acquiring application Activity rendering time
CN107203465A (en) System interface method of testing and device
EP3299954A1 (en) Method for communicating between views in android system
CN107302493B (en) Message processing method, message processing device and intelligent terminal
CN109614164A (en) Realize plug-in unit configurable method, apparatus, equipment and readable storage medium storing program for executing
CN113176931B (en) Task stream processing method and device, storage medium and electronic equipment
CN110750419B (en) Offline task processing method and device, electronic equipment and storage medium
CN104298507A (en) Resource processing method and device
CN105260233A (en) Application container creating method and apparatus
CN110806878B (en) Service management method, system and storage medium
CN111176708B (en) SDK file processing method and device, electronic equipment and storage medium
CN113722022A (en) State management system and method
CN108023966B (en) Method, device and storage medium for processing universal gateway interface request
CN113312390A (en) Service data calling method and device, storage medium and electronic equipment
US9934035B2 (en) Device and method for tracing updated predicate values
US8811587B2 (en) Selectively filtering incoming communications events in a communications device
CN106452807B (en) Network processor and method for acquiring message processing data
CN110647405A (en) System message processing method, electronic device, and computer-readable storage medium
CN113852686A (en) Block chain network communication method, device, equipment and readable storage medium
CN115080276B (en) Application program function dynamic switching method and device, storage medium and electronic equipment
CN109933334B (en) Program execution method, device, equipment and medium
CN106557359B (en) Task scheduling method and system
CN114817006A (en) Stack information writing method, device, equipment and medium
CN113778573A (en) Method and system for realizing page in user interface

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant