CN110704170A - Batch task processing method and device, computer equipment and storage medium - Google Patents

Batch task processing method and device, computer equipment and storage medium

Info

Publication number
CN110704170A
CN110704170A (application number CN201910847800.XA)
Authority
CN
China
Prior art keywords
task
batch
tasks
sequence
level
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910847800.XA
Other languages
Chinese (zh)
Inventor
安凯旋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
OneConnect Financial Technology Co Ltd Shanghai
Original Assignee
OneConnect Financial Technology Co Ltd Shanghai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OneConnect Financial Technology Co Ltd Shanghai filed Critical OneConnect Financial Technology Co Ltd Shanghai
Priority to CN201910847800.XA priority Critical patent/CN110704170A/en
Publication of CN110704170A publication Critical patent/CN110704170A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/48 Program initiating; Program switching, e.g. by interrupt
    • G06F 9/4806 Task transfer initiation or dispatching
    • G06F 9/4843 Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
    • G06F 9/4881 Scheduling strategies for dispatcher, e.g. round robin, multi-level priority queues

Abstract

The application relates to the technical field of test management, and in particular to a batch task processing method, apparatus, computer device, and storage medium. The method comprises the following steps: generating a plurality of batch task packages; arranging the batch tasks without dependency relationships from high level to low level according to their task levels to generate a first-level priority task sequence; decoupling each batch task package to obtain a second-level priority task sequence; acquiring the arrival time at the task processing server and the execution duration of each batch task in the second-level priority task sequence, and rearranging them to obtain a to-be-executed batch task sequence; and starting a thread, executing the first task in the to-be-executed batch task sequence, and determining the execution mode of the subsequent tasks. The method and apparatus allow batch tasks with dependency relationships to be processed in a centralized manner, save threads, and improve the speed at which the computer processes batch tasks.

Description

Batch task processing method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of test management technologies, and in particular, to a method and an apparatus for processing batch tasks, a computer device, and a storage medium.
Background
With the rapid development of internet financial services, and in order to compete for the diversified products issued by financial and non-financial institutions in the market, a platform system needs to make reasonable use of core system resources so as to provide good system performance and rapid response in high-concurrency scenarios with large numbers of users; for some non-real-time online transactions, the platform system therefore processes them as batch tasks. However, as the number of batch tasks increases, the management of the batch task logic becomes disordered and the configuration items for configuring timed tasks are stored in many different places, so the code becomes bloated, the coupling degree becomes too high, and the subsequent maintenance workload becomes burdensome.
At present, when batch tasks are processed, batch data with dependency relationships cannot be handled effectively, which reduces the efficiency of batch task processing and leaves the system's threads poorly utilized.
Disclosure of Invention
Accordingly, it is necessary to provide a batch task processing method, apparatus, computer device, and storage medium that solve the problems that batch data with dependency relationships cannot be processed efficiently, that batch task processing efficiency is reduced, and that the system's threads are not well utilized.
A batch task processing method, comprising:
acquiring a batch task sequence to be processed, and extracting task attributes of each batch task in the batch task sequence;
acquiring the dependency relationship among the task attributes, classifying and packaging the batch tasks with the dependency relationship, and generating a plurality of batch task packages;
extracting task level information in the task attributes, and arranging batch tasks without dependency relationship from high level to low level according to different task levels to generate a first-level priority task sequence;
decoupling each batch task package, and respectively inserting the batch tasks obtained after decoupling into corresponding positions of the first-level priority task sequences according to different task levels to obtain second-level priority task sequences;
acquiring, for each batch task in the second-level priority task sequence, its arrival time at the task processing server and its execution duration, and rearranging the batch tasks at the same level according to their arrival times at the task processing server to obtain a to-be-executed batch task sequence;
starting a thread, executing a first task in the to-be-executed batch task sequence, detecting the state of the first task when the execution time corresponding to the first task ends, and determining the execution mode of the subsequent tasks according to the running state of the first task.
In a possible embodiment, the obtaining of the batch task sequence to be processed and the extracting of the task attribute of each batch task in the batch task sequence include:
acquiring a batch task processing request of a requester, wherein the batch task processing request comprises a requester identifier and task information of batch tasks;
extracting keywords in the task information, and determining initial task attributes of the task information according to the keywords;
and acquiring historical data corresponding to the requester identifier, and modifying the initial task attribute according to the level feature words in the historical data to obtain a final task attribute.
In a possible embodiment, the obtaining of the dependency relationship between the task attributes, and after classifying and packaging the batch tasks with the dependency relationship, generating a plurality of batch task packages includes:
initiating a task display request for the batch tasks in the batch task sequence to acquire node identifiers after the batch tasks are displayed;
respectively recursively searching an upstream dependent task node and a downstream dependent task node by taking any batch task as a starting point according to a preset task tree model;
and summarizing the dependent batch tasks corresponding to the upstream dependent task nodes or the downstream dependent task nodes, packaging the dependent batch tasks, and generating a batch task package.
In a possible embodiment, the decoupling each batch task packet, and after inserting the batch tasks obtained after decoupling into corresponding positions of the first-level priority task sequence according to task levels, respectively, obtaining a second-level priority task sequence includes:
acquiring a task tree model of the batch tasks in the batch task package, and, taking a node in-degree value of zero in the task tree model as the decoupling condition, decoupling any batch task with a dependency relationship as the starting-point batch task;
and sequentially decoupling the upstream batch tasks or the downstream batch tasks of the starting point batch tasks, and after all the batch tasks on the task tree model are fully decoupled, inserting the tasks into corresponding positions of the first-level priority task sequences according to the task level of each decoupled batch task to obtain the second-level priority task sequences.
In a possible embodiment, the starting the thread, executing a first task in the to-be-executed batch task sequence, detecting a state of the first task when an execution time corresponding to the first task is finished, and determining an execution mode of a subsequent task according to an operation state of the first task, includes:
determining a thread for executing the first task according to the task attribute of the first task;
starting the thread to execute the first task, and recording the starting time and the execution time of the first task;
obtaining the expected termination time of the first task according to the starting time and the execution time, and querying a characteristic value in the first task when the expected time arrives; if the characteristic value exists, the first task has not finished executing, otherwise its execution has finished;
and if the execution of the first task has finished, the same thread is used to execute the subsequent tasks in sequence; otherwise, a new thread is started to execute the subsequent tasks.
In a possible embodiment, obtaining the expected termination time of the first task according to the starting time and the execution time, querying the characteristic value in the first task when the expected time arrives, and determining that the first task has not finished executing if the characteristic value exists and that it has finished otherwise, includes:
acquiring the execution function of the thread port executing the first task and the to-be-executed input parameter value of the execution function;
and when the expected time arrives, acquiring the output value of the execution function, performing the inverse operation on the output value to obtain the current input parameter value, and comparing the current input parameter value with the to-be-executed input parameter value; if they are consistent, the first task has not finished executing, otherwise its execution has finished.
In a possible embodiment, the thread is started, a first task in the batch task sequence to be executed is executed, when an execution time corresponding to the first task is finished, a state of the first task is detected, and after an execution mode of a subsequent task is determined according to an operation state of the first task, the method further includes:
when any batch task in the batch task sequence to be processed is subjected to data updating, marking the batch task subjected to data updating as a task to be changed, and extracting the task to be changed from the second-level priority task sequence;
acquiring the updating time of the task to be changed and the change condition of the task level;
and adjusting the arrival time according to the change time, determining the updated position of the task to be changed according to the change condition of the task level, and importing the updated task to be changed into the second-level priority task sequence according to the position.
A batch task processing device comprising the following modules:
a batch task package module, configured to acquire a batch task sequence to be processed and extract the task attribute of each batch task in the batch task sequence; and to acquire the dependency relationships among the task attributes, classify and package the batch tasks with dependency relationships, and generate a plurality of batch task packages;
a level priority sequence module, configured to extract the task level information in the task attributes and arrange the batch tasks without dependency relationships from high level to low level according to their task levels to generate a first-level priority task sequence; and to decouple each batch task package and insert the batch tasks obtained after decoupling into the corresponding positions of the first-level priority task sequence according to their task levels to obtain a second-level priority task sequence;
and a batch task running module, configured to acquire, for each batch task in the second-level priority task sequence, its arrival time at the task processing server and its execution duration, and rearrange the batch tasks at the same level according to their arrival times to obtain a to-be-executed batch task sequence; and to start a thread, execute a first task in the to-be-executed batch task sequence, detect the state of the first task when the execution time corresponding to the first task ends, and determine the execution mode of the subsequent tasks according to the running state of the first task.
A computer device, the device comprising: at least one processor, memory, and transceiver; the memory is used for storing program codes, and the processor is used for calling the program codes stored in the memory to execute the batch task processing method.
A storage medium having stored thereon computer-readable instructions which, when executed by one or more processors, cause the one or more processors to perform the steps of the batch task processing method described above.
Compared with the existing mechanism, in the present application a batch task sequence to be processed is acquired and the task attribute of each batch task in the batch task sequence is extracted; the dependency relationships among the task attributes are acquired, and the batch tasks with dependency relationships are classified and packaged to generate a plurality of batch task packages; the task level information in the task attributes is extracted, and the batch tasks without dependency relationships are arranged from high level to low level according to their task levels to generate a first-level priority task sequence; each batch task package is decoupled, and the batch tasks obtained after decoupling are inserted into the corresponding positions of the first-level priority task sequence according to their task levels to obtain a second-level priority task sequence; the arrival time at the task processing server and the execution duration of each batch task in the second-level priority task sequence are acquired, and the batch tasks at the same level are rearranged according to their arrival times to obtain a to-be-executed batch task sequence; a thread is started, a first task in the to-be-executed batch task sequence is executed, the state of the first task is detected when the execution time corresponding to the first task ends, and the execution mode of the subsequent tasks is determined according to the running state of the first task. In this way, batch tasks with dependency relationships can be processed in a centralized manner, threads are saved, and the speed at which the computer processes batch tasks is improved.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application.
FIG. 1 is a flow diagram illustrating an overall method for batch task processing according to one embodiment of the present application;
FIG. 2 is a diagram illustrating the structure of the second-level priority task sequence according to the present application;
FIG. 3 is a diagram illustrating a task attribute obtaining process in a batch task processing method according to an embodiment of the present application;
FIG. 4 is a diagram illustrating a batch task package generation process in a batch task processing method according to an embodiment of the present application;
FIG. 5 is a block diagram of a batch task processing device according to one embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Fig. 1 is an overall flowchart of a batch task processing method according to an embodiment of the present application, and as shown in fig. 1, the batch task processing method includes the following steps:
S1, acquiring a batch task sequence to be processed, and extracting the task attribute of each batch task in the batch task sequence;
specifically, the batch task sequence to be processed may be already collected by the user, or may be obtained by sending a batch task processing instruction to each data terminal. When a batch task processing instruction is sent to each data terminal, the identity of each data terminal needs to be verified, namely, a mode of verifying the ID of the data terminal can be adopted, the batch task sent by the data terminal is received only when the ID of the data terminal is the ID in a preset ID list, and otherwise, the batch task is not received. Through analyzing the data terminal ID, the problem that the server load is too large and the working efficiency is influenced due to the fact that a large amount of batch data are gathered to one server can be avoided.
In this embodiment, the task attributes refer to the task type, task level, task running time, and so on. For example, the task types may include corporate account opening, batch generation and issuance, and account-opening notification, with task levels of 3, 2, and 5 and running times of 2, 6, and 5, respectively. When the batch tasks are sorted, they are arranged from the highest level to the lowest level so that the server processes the most urgent tasks first.
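As a rough illustration of how such task attributes might be held in memory and how the level-first ordering of step S3 could be applied to them, the following Python sketch uses hypothetical field names together with the example values above; none of this structure is prescribed by the application, and the assumption that a larger level value means a more urgent task simply follows the "highest level first" wording.

from dataclasses import dataclass, field

@dataclass
class BatchTask:
    """Minimal task-attribute record; the field names are illustrative assumptions."""
    name: str
    task_type: str      # e.g. "corporate account opening"
    level: int          # assumption: a larger value means a more urgent task
    run_time: float     # expected running time
    depends_on: list = field(default_factory=list)  # names of prerequisite tasks

tasks = [
    BatchTask("open_account", "corporate account opening", level=3, run_time=2),
    BatchTask("generate_notes", "batch generation and issuance", level=2, run_time=6),
    BatchTask("notify_opening", "account-opening notification", level=5, run_time=5),
]

# Independent tasks are arranged from the highest level to the lowest level (S3),
# so the server handles the most urgent tasks first.
independent = [t for t in tasks if not t.depends_on]
first_level_sequence = sorted(independent, key=lambda t: t.level, reverse=True)
print([t.name for t in first_level_sequence])
# ['notify_opening', 'open_account', 'generate_notes']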
S2, obtaining the dependency relationship among the task attributes, classifying and packaging the batch tasks with the dependency relationship, and generating a plurality of batch task packages;
specifically, the dependency relationship is also referred to as a "logical relationship". In project management, a relationship is referred to that indicates that a change in one of two activities (a leading activity and a following activity) will affect the other activity. The typical dependencies between activities include three forms, namely mandatory dependencies (inherent in the work done), freely processable dependencies (determined by the project team), and external dependencies (between the project activity and the non-project activity).
When determining the dependency relationships, a function analysis method may be adopted: the system interface called by each batch task is obtained; when two different batch tasks call the same system interface, the function of that interface is analyzed, and if the calculation results produced by the called function for the two batch tasks are consistent, a dependency relationship exists between the two batch tasks.
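A minimal sketch of this function-analysis idea follows, under the assumption that each batch task records which system interface it called and the calculation result that call produced; the record structure, values, and helper name are hypothetical and only illustrate the comparison described above.

from collections import defaultdict

# Hypothetical records: (task_name, interface_called, result_of_the_called_function)
call_records = [
    ("task_a", "balance_api", 1200),
    ("task_b", "balance_api", 1200),   # same interface, consistent result: dependency
    ("task_c", "transfer_api", 77),
]

def find_dependent_pairs(records):
    """Mark two batch tasks as dependent when they call the same system interface
    and the called function yields consistent calculation results."""
    by_interface = defaultdict(list)
    for task, interface, result in records:
        by_interface[interface].append((task, result))
    pairs = []
    for entries in by_interface.values():
        for i in range(len(entries)):
            for j in range(i + 1, len(entries)):
                (t1, r1), (t2, r2) = entries[i], entries[j]
                if r1 == r2:
                    pairs.append((t1, t2))
    return pairs

print(find_dependent_pairs(call_records))   # [('task_a', 'task_b')]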
S3, extracting task level information in the task attributes, and arranging batch tasks without dependency relationship from high level to low level according to different task levels to generate a first-level priority task sequence;
specifically, the level information in the task attribute is dynamically changed, that is, after the previous batch task is processed, the level of the subsequent batch task is changed correspondingly along with the time reaching the server, and at this time, the task sequence needs to be sorted again.
S4, decoupling each batch task packet, and inserting the batch tasks obtained after decoupling into corresponding positions of the first-level priority task sequence according to different task levels to obtain a second-level priority task sequence;
specifically, when decoupling is performed on batch tasks with dependency relationships, a tree model mode may be adopted, that is, a batch task corresponding to any one node in the batch task tree model is used as an initiation point of decoupling operation, and then decoupling is performed sequentially on upstream and downstream nodes.
This decoupling mode makes it convenient to schedule by in-degree value. In-degree is an important concept in graph-theory algorithms and generally refers to the number of edges in a directed graph that end at a given node. When the in-degree of a batch task is zero, no other batch task's output is used as its input, so the batch task can be separated from the task tree model, which achieves the purpose of decoupling.
After the batch tasks in a batch task package have been decoupled, they can be inserted into the positions of the corresponding levels of the first task sequence according to their levels, in the manner shown in FIG. 2. As shown in FIG. 2, a batch task decoupled from the batch task package at level 1 is inserted between the level-1 and level-2 batch tasks of the first task sequence, a decoupled batch task at level 2 is inserted between the level-2 and level-3 batch tasks, and so on, until all decoupled batch tasks have been added to the first task sequence, at which point the second task sequence is obtained.
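Read literally, the insertion rule of FIG. 2 could be sketched as follows, assuming the first task sequence lists level-1 batch tasks first, then level-2, and so on; the data layout and names are illustrative assumptions, not part of the application.

import bisect

# The first-level priority task sequence, assumed (as in FIG. 2) to list
# level-1 batch tasks first, then level-2, then level-3, and so on.
first_sequence = [("t1", 1), ("t2", 1), ("t3", 2), ("t4", 3)]   # (name, level)

def insert_by_level(sequence, decoupled_tasks):
    """Insert each decoupled batch task after the existing tasks of its own level,
    i.e. between the level-k and level-(k+1) tasks of the first sequence."""
    result = list(sequence)
    for name, level in decoupled_tasks:
        levels = [lvl for _, lvl in result]
        pos = bisect.bisect_right(levels, level)   # first index past the level-k block
        result.insert(pos, (name, level))
    return result

second_sequence = insert_by_level(first_sequence, [("d1", 1), ("d2", 2)])
print(second_sequence)
# [('t1', 1), ('t2', 1), ('d1', 1), ('t3', 2), ('d2', 2), ('t4', 3)]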
S5, acquiring, for each batch task in the second-level priority task sequence, its arrival time at the task processing server and its execution duration, and rearranging the batch tasks at the same level according to their arrival times at the task processing server to obtain the to-be-executed batch task sequence;
specifically, batch tasks of the same level will have the server time reached earlier before the arrival time later. And for the batch tasks with the same level and the same time, sequencing according to the running time, namely sequencing the batch tasks with short running time to the batch tasks with long running time, so as to prevent the batch tasks with large running time from occupying the threads of the server for a long time to cause that other batch tasks cannot be processed.
And S6, starting a thread, executing a first task in the to-be-executed batch task sequence, detecting the state of the first task when the execution time corresponding to the first task ends, and determining the execution mode of the subsequent tasks according to the running state of the first task.
Optionally, in an embodiment, the S6, starting a thread, executing a first task in the batch task sequence to be executed, detecting a state of the first task when an execution time corresponding to the first task is ended, and determining an execution manner of a subsequent task according to an operation state of the first task, may include the following steps:
S61, determining a thread for executing the first task according to the task attribute of the first task;
specifically, different task attributes correspond to different threads, for example, a public task and a private task are put into two different threads for processing.
S62, starting the thread to execute the first task, and recording the starting time and the execution time of the first task;
S63, obtaining the expected termination time of the first task according to the starting time and the execution time, and querying the characteristic value in the first task when the expected time arrives; if the characteristic value exists, the first task has not finished executing, otherwise its execution has finished;
And S64, if the execution of the first task has finished, the same thread is used to execute the subsequent tasks in sequence; otherwise, a new thread is started to execute the subsequent tasks.
Specifically, when the execution of the first task finishes and its state is marked "finished", it is withdrawn from the to-be-executed batch task sequence and its remaining running time is set to 0; if it is in any other state, its remaining running time is not 0. If the remaining time is not 0 and the first task's priority is lower than that of the other batch tasks in the to-be-executed batch task sequence, a higher-priority batch task is selected to run.
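The S61 to S64 flow might be sketched as below. The "characteristic value" check is represented by a simple flag, the thread-reuse branch simply runs the subsequent tasks in sequence without starting a new thread, and every name and timing value is an assumption made for illustration rather than part of the application.

import threading
import time

def run_task(task, state):
    state["feature_value_present"] = True      # stand-in for the characteristic value
    time.sleep(task["run_time"])               # simulate the batch work
    state["feature_value_present"] = False     # cleared once the task really finishes

def execute_first_and_continue(first_task, subsequent_tasks):
    state = {}
    start = time.time()
    worker = threading.Thread(target=run_task, args=(first_task, state))
    worker.start()                                         # S62: start the thread, record start time

    expected_end = start + first_task["expected_time"]     # S63: expected termination time
    time.sleep(max(0.0, expected_end - time.time()))
    still_running = state.get("feature_value_present", False)

    if not still_running:                                  # S64: finished, no new thread needed
        worker.join()
        for t in subsequent_tasks:                         # subsequent tasks run in sequence here
            run_task(t, {})
    else:                                                  # not finished: new thread for the rest
        threading.Thread(
            target=lambda: [run_task(t, {}) for t in subsequent_tasks]
        ).start()

execute_first_and_continue(
    {"name": "first", "run_time": 0.1, "expected_time": 0.2},
    [{"name": "next", "run_time": 0.05}],
)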
According to the method and the device, the grades of the batch tasks are divided, and each batch task is executed according to different task attributes, so that the batch tasks with dependency can be processed in a centralized mode, threads are saved, and the speed of processing the batch tasks by a computer is improved.
FIG. 3 is a schematic diagram of the task attribute obtaining process in a batch task processing method in an embodiment of the present application. As shown in the drawing, S1, obtaining a to-be-processed batch task sequence and extracting the task attribute of each batch task in the batch task sequence, includes:
S11, acquiring a batch task processing request of a requester, wherein the batch task processing request comprises a requester identifier and task information of batch tasks;
specifically, for the processing process of the enterprise cash batch approval transaction, the requesting party may specifically be a customer of a bank, and this step may be implemented by receiving a batch task processing request sent from a front end in a background of a bank system. The batch task processing request may be, for example, a processing request for a batch transfer task, a batch remittance task, or the like, which is made by a customer, and includes task information of the batch task, such as task-related data, a total task amount, or the like, and a requester identifier, which may be related information capable of uniquely identifying the requester, such as an account, a name, a customer number, or the like.
S12, extracting keywords in the task information, and determining the initial task attribute of the task information according to the keywords;
according to the example in the previous step, the keyword may be a user name, for example, if the user name is "zhang san", the task attribute is a personal service, and if the user name is "company a", the corresponding service is an anti-public service, and in this example, "anti-public" and "anti-private" constitute an initial task attribute. Different batch tasks can be roughly classified according to the initial task attributes, so that different threads are used for processing.
And S13, acquiring historical data corresponding to the requester identifier, and modifying the initial task attribute according to the level feature words in the historical data to obtain a final task attribute.
Different requester identifiers correspond to different priority levels. Historical data on the financial activities of different users are recorded in the database, and users are generally graded according to their deposit amounts and credit records. For example, Zhang San is an individual user, but Zhang San's annual deposits at the bank where the server is located reach 10 million yuan, whereas Company A's annual deposits reach only 1 million yuan; the task attribute of Zhang San can therefore be modified to "corporate", so that more processing time is allotted to Zhang San's batch tasks.
In this embodiment, the initial task attribute is corrected through the historical data, so that the accuracy of task attribute judgment is improved.
FIG. 4 is a schematic diagram of the batch task package generation process in a batch task processing method according to an embodiment of the present application. As shown in the drawing, step S2, obtaining the dependency relationships among the task attributes and generating a plurality of batch task packages after classifying and packaging the batch tasks with dependency relationships, includes:
S21, initiating a task display request for the batch tasks in the batch task sequence, and acquiring the node identifiers after the batch tasks are displayed;
because the batch tasks in the batch task sequence have a dependency relationship, each batch task can be regarded as a task node, and each task node may have a dependency relationship with one or more task nodes. Connecting a plurality of nodes with dependency relationship to form a dependency relationship tree model, and marking each node in the dependency relationship tree model as '1', '2', '3' and the like.
S22, respectively recursively searching upstream dependent task nodes and downstream dependent task nodes by taking any batch task as a starting point according to a preset task tree model;
the results processed by the task nodes 1, 2 and 3 are transmitted downwards to the task node 5, the results processed by the task node 5 are transmitted downwards to the task node 7, the results processed by the task node 7 are transmitted downwards to the task nodes 10 and 11, the data processed by the task node 0 is transmitted downwards to the task node 3, the results processed by the task nodes 3 and 4 are transmitted downwards to the task node 6, the results processed by the task node 6 are transmitted downwards to the task nodes 7, 8 and 9, the results processed by the task node 8 are transmitted downwards to the task node 12, and the like in other cases.
And S23, summarizing the dependent batch tasks corresponding to the upstream dependent task node or the downstream dependent task node, packaging the dependent batch tasks, and generating a batch task package.
Specifically, taking task node 6 as an example, the upstream task nodes and downstream task nodes of task node 6 are separated from the dependency task tree model; after their marks are removed, they are packaged according to their upstream-to-downstream positional relationship to generate a batch task package.
In the embodiment, through the task tree model, the dependency relationship among all the batch tasks is quickly obtained, and the efficiency of processing the batch tasks by a computer can be improved.
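Using the node numbers from the example above, the recursive upstream and downstream search of S22 and the packaging of S23 could look like the following sketch; the adjacency-map encoding of the task tree and the helper name are assumptions made for illustration.

# Directed edges "x -> y" mean the result of task node x is transmitted down to y,
# following the example above (0->3; 1,2,3->5; 3,4->6; 5->7; 6->7,8,9; 7->10,11; 8->12).
downstream = {0: [3], 1: [5], 2: [5], 3: [5, 6], 4: [6], 5: [7],
              6: [7, 8, 9], 7: [10, 11], 8: [12], 9: [], 10: [], 11: [], 12: []}
upstream = {n: [] for n in downstream}
for src, dsts in downstream.items():
    for dst in dsts:
        upstream[dst].append(src)

def collect(node, edges, seen=None):
    """Recursively gather all task nodes reachable through the given edge map."""
    seen = set() if seen is None else seen
    for nxt in edges.get(node, []):
        if nxt not in seen:
            seen.add(nxt)
            collect(nxt, edges, seen)
    return seen

start = 6
package_nodes = {start} | collect(start, upstream) | collect(start, downstream)
print(sorted(package_nodes))   # the dependent batch tasks bundled into one package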
In an embodiment, the step S4 of decoupling each batch task package, and after inserting the batch tasks obtained after decoupling into corresponding positions of the first-level priority task sequence according to task levels, obtaining a second-level priority task sequence includes:
acquiring a task tree model of the batch tasks in the batch task package, and, taking a node in-degree value of zero in the task tree model as the decoupling condition, decoupling any batch task with a dependency relationship as the starting-point batch task;
The decoupling condition is that the in-degree value of a task node in the task dependency tree model is zero, and a ready task set is established to dynamically describe the tasks that can be scheduled in parallel at a given moment. The ready set is defined as the set of tasks that, at a given moment, either have no predecessor task or whose direct predecessor tasks have all completed, i.e., whose in-degree value is zero. The tasks in the set have the same priority and no fixed order, and any member can be scheduled as long as the required storage resources are available. When a task in the set finishes executing and its dependent data has been transmitted to its successor tasks, the task is deleted from the ready task set, and the corresponding task node and all directed edges issuing from it are hidden or logically deleted from the dependency graph. All tasks whose in-degree is 0 in the new graph, with the hidden edges and task nodes removed from view, are then added to the ready task set, until all tasks have been scheduled or no task nodes remain visible in the dependency graph.
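This ready-set procedure is essentially in-degree-based (Kahn-style) topological scheduling. A compact sketch with an illustrative graph follows; the task names and the FIFO scheduling order inside the ready set are assumptions.

from collections import deque

def decouple_by_in_degree(downstream):
    """Repeatedly move in-degree-zero task nodes into the ready set, 'execute' them,
    and logically delete them together with their outgoing edges."""
    in_degree = {n: 0 for n in downstream}
    for dsts in downstream.values():
        for d in dsts:
            in_degree[d] += 1

    ready = deque(n for n, deg in in_degree.items() if deg == 0)
    order = []                       # the order in which tasks are decoupled/scheduled
    while ready:
        node = ready.popleft()       # any ready member may be scheduled; FIFO is arbitrary
        order.append(node)
        for nxt in downstream[node]: # hide this node's outgoing edges
            in_degree[nxt] -= 1
            if in_degree[nxt] == 0:  # newly in-degree-zero tasks join the ready set
                ready.append(nxt)
    return order

downstream = {"A": ["C"], "B": ["C"], "C": ["D"], "D": []}
print(decouple_by_in_degree(downstream))   # ['A', 'B', 'C', 'D']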
And sequentially decoupling the upstream batch tasks or the downstream batch tasks of the starting point batch tasks, and after all the batch tasks on the task tree model are fully decoupled, inserting the tasks into corresponding positions of the first-level priority task sequences according to the task level of each decoupled batch task to obtain the second-level priority task sequences.
Specifically, after all nodes on the task dependency tree have been decoupled, the actual running time of each decoupled task is extracted and the original task level is adjusted accordingly. For example, if the original level of task A is "3", its running time with the dependency relationship is 6, and its running time after decoupling drops to 4, its level is changed to "4"; that is, when the running time is shortened, the corresponding task level is promoted. The decoupled tasks are then inserted into the first-level priority task sequence at the positions of their new task levels to form the second-level priority task sequence.
According to the embodiment, the second task sequence is generated by decoupling the batch task packages, so that the batch tasks can be effectively sequenced and combined, and the efficiency of processing the batch tasks by a computer is improved.
In an embodiment, obtaining the expected termination time of the first task according to the starting time and the execution time, querying the characteristic value in the first task when the expected time arrives, and determining that the first task has not finished executing if the characteristic value exists and that it has finished otherwise, includes:
acquiring the execution function of the thread port executing the first task and the to-be-executed input parameter value of the execution function;
and when the expected time arrives, acquiring the output value of the execution function, performing the inverse operation on the output value to obtain the current input parameter value, and comparing the current input parameter value with the to-be-executed input parameter value; if they are consistent, the first task has not finished executing, otherwise its execution has finished.
According to this embodiment, whether the execution of the first task has finished is judged effectively, which saves computer resources and reduces the waste of threads.
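One possible reading of this inverse-operation check uses a deliberately simple invertible function standing in for the real execution function; the function, its parameter values, and the helper names are all assumptions made for illustration, not part of the application.

def execution_function(x):
    """Hypothetical invertible execution function of the thread port."""
    return 2 * x + 1

def inverse(y):
    """Inverse operation recovering the input parameter from the output value."""
    return (y - 1) / 2

def first_task_finished(latest_output, param_to_be_executed):
    """At the expected termination time, invert the latest output to get the current
    input parameter; if it equals the parameter still to be executed, the first task
    has not finished, otherwise it has."""
    current_param = inverse(latest_output)
    return current_param != param_to_be_executed

# Illustrative values: the output observed at the expected time versus the parameter
# the task is still expected to process.
print(first_task_finished(latest_output=execution_function(4), param_to_be_executed=4))  # False: not finished
print(first_task_finished(latest_output=execution_function(5), param_to_be_executed=4))  # True: finished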
In an embodiment, after S6 (starting a thread, executing a first task in the to-be-executed batch task sequence, detecting the state of the first task when the execution time corresponding to the first task ends, and determining the execution mode of the subsequent tasks according to the running state of the first task), the method further includes:
when any batch task in the batch task sequence to be processed is subjected to data updating, marking the batch task subjected to data updating as a task to be changed, and extracting the task to be changed from the second-level priority task sequence;
acquiring the updating time of the task to be changed and the change condition of the task level;
and adjusting the arrival time according to the change time, determining the updated position of the task to be changed according to the change condition of the task level, and importing the updated task to be changed into the second-level priority task sequence according to the position.
A batch task processing device comprising the following modules:
a batch task package module, configured to acquire a batch task sequence to be processed and extract the task attribute of each batch task in the batch task sequence; and to acquire the dependency relationships among the task attributes, classify and package the batch tasks with dependency relationships, and generate a plurality of batch task packages;
a level priority sequence module, configured to extract the task level information in the task attributes and arrange the batch tasks without dependency relationships from high level to low level according to their task levels to generate a first-level priority task sequence; and to decouple each batch task package and insert the batch tasks obtained after decoupling into the corresponding positions of the first-level priority task sequence according to their task levels to obtain a second-level priority task sequence;
and a batch task running module, configured to acquire, for each batch task in the second-level priority task sequence, its arrival time at the task processing server and its execution duration, and rearrange the batch tasks at the same level according to their arrival times to obtain a to-be-executed batch task sequence; and to start a thread, execute a first task in the to-be-executed batch task sequence, detect the state of the first task when the execution time corresponding to the first task ends, and determine the execution mode of the subsequent tasks according to the running state of the first task.
In one embodiment, a computer device is provided, comprising at least one processor, a memory, and a transceiver;
the memory is used for storing program codes, and the processor is used for calling the program codes stored in the memory to execute the steps of the batch task processing method in the embodiments.
In one embodiment, a storage medium is provided, in which computer readable instructions are stored, and when executed by one or more processors, the one or more processors are caused to execute the steps of the batch task processing method in the above embodiments. Wherein the storage medium may be a non-volatile storage medium.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable storage medium, and the storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic or optical disk, or the like.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-described embodiments merely express some implementations of the present application, and although they are described in considerable detail, they are not to be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A method for batch task processing, comprising:
acquiring a batch task sequence to be processed, and extracting task attributes of each batch task in the batch task sequence;
acquiring the dependency relationship among the task attributes, classifying and packaging the batch tasks with the dependency relationship, and generating a plurality of batch task packages;
extracting task level information in the task attributes, and arranging batch tasks without dependency relationship from high level to low level according to different task levels to generate a first-level priority task sequence;
decoupling each batch task package, and respectively inserting the batch tasks obtained after decoupling into corresponding positions of the first-level priority task sequences according to different task levels to obtain second-level priority task sequences;
acquiring, for each batch task in the second-level priority task sequence, its arrival time at the task processing server and its execution duration, and rearranging the batch tasks at the same level according to their arrival times at the task processing server to obtain a to-be-executed batch task sequence;
starting a thread, executing a first task in the to-be-executed batch task sequence, detecting the state of the first task when the execution time corresponding to the first task ends, and determining the execution mode of the subsequent tasks according to the running state of the first task.
2. The batch task processing method according to claim 1, wherein obtaining a batch task sequence to be processed and extracting a task attribute of each batch task in the batch task sequence comprises:
acquiring a batch task processing request of a requester, wherein the batch task processing request comprises a requester identifier and task information of batch tasks;
extracting keywords in the task information, and determining initial task attributes of the task information according to the keywords;
and acquiring historical data corresponding to the requester identifier, and modifying the initial task attribute according to the level feature words in the historical data to obtain a final task attribute.
3. The method of claim 1, wherein the obtaining of the dependency relationship among the task attributes and the generating of the plurality of batch task packages after classifying and packaging the batch tasks with the dependency relationship comprise:
initiating a task display request for the batch tasks in the batch task sequence to acquire node identifiers after the batch tasks are displayed;
respectively recursively searching an upstream dependent task node and a downstream dependent task node by taking any batch task as a starting point according to a preset task tree model;
and summarizing the dependent batch tasks corresponding to the upstream dependent task nodes or the downstream dependent task nodes, packaging the dependent batch tasks, and generating a batch task package.
4. The batch task processing method of claim 3, wherein the decoupling each batch task packet, and inserting the batch tasks obtained after decoupling into corresponding positions of the first-level priority task sequence according to task levels to obtain a second-level priority task sequence comprises:
acquiring a task tree model of the batch tasks in the batch task package, and, taking a node in-degree value of zero in the task tree model as the decoupling condition, decoupling any batch task with a dependency relationship as the starting-point batch task;
and sequentially decoupling the upstream batch tasks or the downstream batch tasks of the starting point batch tasks, and after all the batch tasks on the task tree model are fully decoupled, inserting the tasks into corresponding positions of the first-level priority task sequences according to the task level of each decoupled batch task to obtain the second-level priority task sequences.
5. The batch task processing method according to claim 1, wherein the starting of the thread, executing a first task in the batch task sequence to be executed, detecting a state of the first task when an execution time corresponding to the first task is completed, and determining an execution mode of a subsequent task according to an operation state of the first task, includes:
determining a thread for executing the first task according to the task attribute of the first task;
starting the thread to execute the first task, and recording the starting time and the execution time of the first task;
obtaining the expected termination time of the first task according to the starting time and the execution time, and querying a characteristic value in the first task when the expected time arrives; if the characteristic value exists, the first task has not finished executing, otherwise its execution has finished;
and if the execution of the first task has finished, the same thread is used to execute the subsequent tasks in sequence; otherwise, a new thread is started to execute the subsequent tasks.
6. The batch task processing method according to claim 5, wherein obtaining the expected termination time of the first task according to the starting time and the execution time, querying the characteristic value in the first task when the expected time arrives, and determining that the first task has not finished executing if the characteristic value exists and that it has finished otherwise, includes:
acquiring the execution function of the thread port executing the first task and the to-be-executed input parameter value of the execution function;
and when the expected time arrives, acquiring the output value of the execution function, performing the inverse operation on the output value to obtain the current input parameter value, and comparing the current input parameter value with the to-be-executed input parameter value; if they are consistent, the first task has not finished executing, otherwise its execution has finished.
7. The batch task processing method according to claim 1, wherein the thread is started, a first task in the batch task sequence to be executed is executed, when an execution time corresponding to the first task is finished, a state of the first task is detected, and after an execution mode of a subsequent task is determined according to an operation state of the first task, the method further comprises:
when any batch task in the batch task sequence to be processed is subjected to data updating, marking the batch task subjected to data updating as a task to be changed, and extracting the task to be changed from the second-level priority task sequence;
acquiring the updating time of the task to be changed and the change condition of the task level;
and adjusting the arrival time according to the change time, determining the updated position of the task to be changed according to the change condition of the task level, and importing the updated task to be changed into the second-level priority task sequence according to the position.
8. A batch task processing apparatus, comprising:
a batch task package module, configured to acquire a batch task sequence to be processed and extract the task attribute of each batch task in the batch task sequence; and to acquire the dependency relationships among the task attributes, classify and package the batch tasks with dependency relationships, and generate a plurality of batch task packages;
a level priority sequence module, configured to extract the task level information in the task attributes and arrange the batch tasks without dependency relationships from high level to low level according to their task levels to generate a first-level priority task sequence; and to decouple each batch task package and insert the batch tasks obtained after decoupling into the corresponding positions of the first-level priority task sequence according to their task levels to obtain a second-level priority task sequence;
and a batch task running module, configured to acquire, for each batch task in the second-level priority task sequence, its arrival time at the task processing server and its execution duration, and rearrange the batch tasks at the same level according to their arrival times to obtain a to-be-executed batch task sequence; and to start a thread, execute a first task in the to-be-executed batch task sequence, detect the state of the first task when the execution time corresponding to the first task ends, and determine the execution mode of the subsequent tasks according to the running state of the first task.
9. A computer device, the device comprising:
at least one processor, memory, and transceiver;
wherein the memory is configured to store program code and the processor is configured to call the program code stored in the memory to perform the batch task processing method according to any one of claims 1 to 7.
10. A computer storage medium characterized in that it comprises instructions which, when run on a computer, cause the computer to perform the steps of the batch task processing method according to any one of claims 1 to 7.
CN201910847800.XA 2019-09-09 2019-09-09 Batch task processing method and device, computer equipment and storage medium Pending CN110704170A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910847800.XA CN110704170A (en) 2019-09-09 2019-09-09 Batch task processing method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN110704170A true CN110704170A (en) 2020-01-17

Family

ID=69194808

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910847800.XA Pending CN110704170A (en) 2019-09-09 2019-09-09 Batch task processing method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110704170A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7949575B1 (en) * 2003-12-30 2011-05-24 Microsoft Corporation Account related task processing
CN101038559A (en) * 2006-09-11 2007-09-19 中国工商银行股份有限公司 Batch task scheduling engine and dispatching method
CN109901926A (en) * 2019-01-25 2019-06-18 平安科技(深圳)有限公司 Method, server and storage medium based on big data behavior scheduling application task

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SUN DEYANG; LOU JIAPENG; LI JIANPENG; ZHANG HANBING; CHENG WENWEN: "Cryptographic service scheduling method in a heterogeneous cloud environment" (异构云环境下的密码服务调度方法), Computer Applications and Software (计算机应用与软件), no. 06, 12 June 2019 (2019-06-12) *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112040082A (en) * 2020-09-10 2020-12-04 广东新禾道信息科技有限公司 Image picture batch processing method and device, server and storage medium
CN112040082B (en) * 2020-09-10 2021-05-14 广东新禾道信息科技有限公司 Image picture batch processing method and device, server and storage medium
CN112308443A (en) * 2020-11-09 2021-02-02 中国科学院空天信息创新研究院 Batch scheduling method and device for remote sensing information product generation workflow
CN112506991A (en) * 2020-12-03 2021-03-16 杭州小电科技股份有限公司 Method, system, electronic device and storage medium for parallel processing
CN112925624A (en) * 2021-03-17 2021-06-08 中国电子系统技术有限公司 Configuration method and device of data processing task


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination