CN108228326A - Batch tasks processing method and distributed system - Google Patents

Batch tasks processing method and distributed system

Info

Publication number
CN108228326A
Authority
CN
China
Prior art keywords
processing
task
piece
task piece
tasks
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711487115.8A
Other languages
Chinese (zh)
Inventor
黄进
王明远
井宏博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Lexin Software Technology Co Ltd
Original Assignee
Shenzhen Lexin Software Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Lexin Software Technology Co Ltd filed Critical Shenzhen Lexin Software Technology Co Ltd
Priority to CN201711487115.8A priority Critical patent/CN108228326A/en
Publication of CN108228326A publication Critical patent/CN108228326A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/48 Program initiating; Program switching, e.g. by interrupt
    • G06F 9/4806 Task transfer initiation or dispatching
    • G06F 9/4843 Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
    • G06F 9/4881 Scheduling strategies for dispatcher, e.g. round robin, multi-level priority queues
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 Network services
    • H04L 67/51 Discovery or management thereof, e.g. service location protocol [SLP] or web services

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The embodiment of the invention discloses a batch task processing method and a distributed system. The batch task processing method is based on a distributed system that includes multiple processing ends. In the method, the processing end that receives a batch task divides the batch task into multiple task pieces and stores the multiple task pieces in a database; each processing end periodically scans the database for pending task pieces, and if a pending task piece is found, preempts the scanned pending task piece and processes it. This solves the problem that an existing distributed system can only process tasks under the control of a management end, so that the processing ends can process batch tasks without relying on a management end, achieving the effect of decentralization.

Description

Batch tasks processing method and distributed system
Technical field
The embodiments of the present invention relate to the field of computer application technology, and in particular to a batch task processing method and a distributed system.
Background technology
With the development of microprocessor technology, distributed systems have a better cost-performance ratio than single large centralized systems: they can provide functionality similar to that of a single large centralized system at a lower price. Distributed systems have therefore become an increasingly common system architecture.
The system architecture of an existing distributed system generally includes a resource management layer and a task execution layer, where the resource management layer, as the controller of the whole system, allocates tasks and resources and controls the task execution layer, and the task processing devices of the task execution layer implement the specific task execution logic and can be scaled horizontally.
However, in an existing distributed system, a task processing device can only perform the corresponding action after the resource management layer issues an instruction; once the resource management layer fails, the whole system is paralyzed.
Summary of the invention
The present invention provides a batch task processing method and a distributed system, so that during batch task processing the processing ends can process batch tasks without relying on a management end.
In a first aspect, an embodiment of the present invention provides a batch task processing method. The method is based on a distributed system, wherein the distributed system includes multiple processing ends. The method includes:
the processing end that receives a batch task dividing the batch task into multiple task pieces and storing the multiple task pieces in a database;
each processing end periodically scanning the database for pending task pieces and, if a pending task piece is found, preempting the scanned pending task piece and processing it.
In a second aspect, an embodiment of the present invention further provides a distributed system, which includes a database and multiple processing ends;
the processing end includes:
a division module, configured to divide a received batch task into multiple task pieces and store the multiple task pieces in the database;
a scanning module, configured to periodically scan the database for pending task pieces;
a preemption processing module, configured to, if a pending task piece is found, preempt the scanned pending task piece and process it;
the database is configured to store the task pieces into which the received batch task is divided.
In the embodiment of the present invention, based on a distributed system including multiple processing ends, the processing end that receives a batch task divides the batch task into multiple task pieces and stores them in a database; each processing end periodically scans the database for pending task pieces and, if a pending task piece is found, preempts the scanned pending task piece and processes it. This solves the problem that an existing distributed system can only process tasks under the control of a management end, so that the processing ends can process batch tasks without relying on a management end, achieving decentralization. In addition, each processing end does not need a fixed task execution time and can receive batch tasks at any time, which makes it convenient for users to process batch tasks while improving the user experience.
Description of the drawings
Exemplary embodiments of the present invention are described in detail below with reference to the accompanying drawings, so that the above and other features and advantages of the present invention become clearer to those of ordinary skill in the art. In the drawings:
Fig. 1 is a flowchart of the batch task processing method in embodiment one of the present invention;
Fig. 2 is a flowchart of the batch task processing method in embodiment two of the present invention;
Fig. 3 is a flowchart of the batch task processing method in embodiment three of the present invention;
Fig. 4 is a structural diagram of the distributed system in embodiment four of the present invention.
Specific embodiments
The present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention and do not limit it. It should also be noted that, for ease of description, the drawings show only the parts related to the present invention rather than the entire structure.
Embodiment one
Fig. 1 is a flowchart of the batch task processing method provided by embodiment one of the present invention. This embodiment is applicable to situations where there are batch tasks to be processed. The method can be executed by a distributed system, where the distributed system includes multiple processing ends. As shown in Fig. 1, the method specifically includes:
S110, the processing end that receives a batch task divides the batch task into multiple task pieces and stores the multiple task pieces in a database.
In this embodiment, the distributed system includes multiple mutually independent processing ends, where each processing end can receive externally input batch tasks at any time and can divide a batch task into multiple task pieces according to the characteristics of the batch task, each task piece being a part that can be split out of the batch task. After a user inputs a batch task to be processed, preferably the processing end that receives the batch task directly divides the batch task into multiple task pieces and stores the multiple task pieces in the database.
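As an illustration only (the patent itself contains no source code), the division step could look roughly like the following minimal Java sketch, assuming a hypothetical task_piece table with batch_id, payload, and status columns reachable over JDBC and a fixed piece size; all names and values here are assumptions.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.List;

    public class TaskDivider {
        private static final int PIECE_SIZE = 100; // assumed number of items per task piece

        // Split the batch task into fixed-size pieces and store each piece as PENDING.
        public void divideAndStore(Connection db, String batchId, List<String> items)
                throws SQLException {
            String sql = "INSERT INTO task_piece (batch_id, payload, status) VALUES (?, ?, 'PENDING')";
            try (PreparedStatement ps = db.prepareStatement(sql)) {
                for (int start = 0; start < items.size(); start += PIECE_SIZE) {
                    int end = Math.min(start + PIECE_SIZE, items.size());
                    ps.setString(1, batchId);
                    ps.setString(2, String.join("\n", items.subList(start, end)));
                    ps.addBatch();
                }
                ps.executeBatch();
            }
        }
    }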
It should be noted that, when the current processing load of the processing end that receives the batch task is relatively high, the processing end may also simply upload the batch task to the database after receiving it, and a processing end with a lower current load may obtain the batch task from the database and divide it into the corresponding multiple task pieces; this relieves the pressure on the receiving processing end while also improving the processing speed of the batch task.
S120, each processing end periodically scans the database for pending task pieces, and if a pending task piece is found, preempts the scanned pending task piece and processes it.
In this embodiment, each task piece corresponding to the batch task is saved in the database. Every processing end, including the processing end that received the batch task, scans the task pieces in the database once every preset time to determine whether there is a pending task piece in the database; if a pending task piece is found, it preempts the scanned pending task piece and processes it accordingly. The preset time may be on the order of seconds, milliseconds, or microseconds.
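A hedged sketch of what the periodic scan of one processing end could look like in Java: the scan runs at the preset interval, looks up one pending piece, tries to preempt it, and hands it to the task processing queue. The helper operations passed in as parameters are assumptions for illustration, not part of the patent.

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;
    import java.util.function.LongConsumer;
    import java.util.function.LongPredicate;
    import java.util.function.Supplier;

    public class PieceScanner {
        private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        // Scan every presetMillis: look up one pending piece, try to preempt it, and queue it.
        public void start(long presetMillis,
                          Supplier<Long> findOnePendingPieceId, // reads the database (assumed helper)
                          LongPredicate tryPreempt,             // atomically marks PENDING -> PROCESSING
                          LongConsumer enqueueForProcessing) {  // hands the piece to the task processing queue
            scheduler.scheduleAtFixedRate(() -> {
                Long pieceId = findOnePendingPieceId.get();
                if (pieceId != null && tryPreempt.test(pieceId)) {
                    enqueueForProcessing.accept(pieceId);
                }
            }, 0, presetMillis, TimeUnit.MILLISECONDS);
        }
    }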
In the batch task processing method provided by this embodiment, based on a distributed system including multiple processing ends, the processing end that receives a batch task divides the batch task into multiple task pieces and stores them in a database; each processing end periodically scans the database for pending task pieces and, if a pending task piece is found, preempts the scanned pending task piece and processes it. This solves the problem that an existing distributed system can only process tasks under the control of a management end, so that the processing ends can process batch tasks without relying on a management end, achieving decentralization. At the same time, the distributed system is easy to scale: no additional configuration is needed when scaling, and it is enough to simply add processing ends. In addition, each processing end does not need a fixed task execution time and can receive batch tasks at any time, which makes it convenient for users to process batch tasks while improving the user experience.
On the basis of the above technical solution, further, after the processing end that receives the batch task divides the batch task into multiple task pieces, the method includes: the processing end that receives the batch task marks the processing state of each of the multiple task pieces as pending.
In this embodiment, after the processing end that receives the batch task divides the batch task into multiple task pieces, it may also mark the processing state of each task piece as pending, where the processing state may be an attribute carried by each task piece itself. In addition, if the processing state is not an attribute carried by the task piece itself, then after the task pieces are uploaded to the database, the correspondence between each task piece and its processing state may also be stored in the form of a table, in which a key field marks the processing state of the task piece; the processing state of a task piece may be one of three values: pending, processing, and completed.
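Purely as an assumed example of such a table, the layout could be created as follows over JDBC; the table and column names and the MySQL-style auto-increment key are illustrative choices rather than anything specified by the patent.

    import java.sql.Connection;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class TaskPieceSchema {
        // Create the task_piece table; the status key field marks pending / processing / completed.
        public static void create(Connection db) throws SQLException {
            try (Statement st = db.createStatement()) {
                st.executeUpdate(
                    "CREATE TABLE task_piece (" +
                    "  id        BIGINT PRIMARY KEY AUTO_INCREMENT," +
                    "  batch_id  VARCHAR(64) NOT NULL," +
                    "  payload   TEXT," +
                    "  status    VARCHAR(16) NOT NULL DEFAULT 'PENDING'," + // PENDING | PROCESSING | COMPLETED
                    "  result    TEXT)");
            }
        }
    }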
Embodiment two
On the basis of the above embodiments, this embodiment provides a preferred implementation of the operation "each processing end periodically scans the database for pending task pieces, and if a pending task piece is found, preempts the scanned pending task piece and processes it". Fig. 2 is a flowchart of the batch task processing method provided by embodiment two of the present invention. As shown in Fig. 2, the method includes:
S210, the processing end that receives a batch task divides the batch task into multiple task pieces and stores the multiple task pieces in a database.
S220, each processing end periodically scans the task pieces in the database and determines, according to the processing state of the task pieces, whether there is a pending task piece in the database.
In this embodiment, when scanning task pieces, each processing end may determine whether there is a pending task piece in the database according to the processing state carried by the task piece itself, or according to the processing state of the task piece stored in the database in table form.
S230, if a pending task piece is found, the scanned pending task piece is preempted, and the processing end marks the processing state of the preempted task piece as processing.
In this embodiment, if the processing state of a scanned task piece is pending, the processing end that scanned the pending task piece preempts the task piece and at the same time marks the processing state of the task piece as processing. Specifically, the processing state carried by the task piece itself may be marked as processing, or the status key field in the database table may be marked as processing.
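One common way to make the preempt-and-mark step safe when several processing ends scan the same piece is a conditional update that succeeds for only one caller. This is a sketch under the assumed schema above, not the patent's own implementation, and the class and method names are hypothetical.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class PiecePreemptor {
        // Atomically flip PENDING -> PROCESSING; returns true only for the processing end that wins.
        public boolean tryPreempt(Connection db, long pieceId) throws SQLException {
            String sql = "UPDATE task_piece SET status = 'PROCESSING' WHERE id = ? AND status = 'PENDING'";
            try (PreparedStatement ps = db.prepareStatement(sql)) {
                ps.setLong(1, pieceId);
                return ps.executeUpdate() == 1; // 0 rows means another processing end already preempted it
            }
        }
    }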
S240, the processing end puts the preempted task piece into the task processing queue of the processing end and processes the task pieces in the task processing queue one by one.
In this embodiment, every time a processing end preempts a task piece, it puts the task piece into the task processing queue of the processing end to wait for processing; the task processing queue of each processing end can buffer multiple task pieces preempted by that processing end. Each processing end may also include a task processing thread for processing the received task pieces one by one, and the processing end may submit the task pieces in the task processing queue to the task processing thread one by one at a preset flow rate.
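A hedged sketch of the per-processing-end queue and task processing thread, using java.util.concurrent's LinkedBlockingQueue as the buffer; the queue capacity and the handle method are illustrative assumptions, and rate limiting is left to the separate sketch further below.

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class PieceWorker {
        private final BlockingQueue<Long> taskQueue = new LinkedBlockingQueue<>(200); // assumed capacity

        // Preempted pieces wait here; the scan threshold (see below) keeps the queue from overflowing.
        public void enqueue(long pieceId) {
            taskQueue.offer(pieceId);
        }

        // Used by the scanner to check the queue size against the preset threshold.
        public int queuedCount() {
            return taskQueue.size();
        }

        // The task processing thread takes queued pieces and processes them one by one.
        public void startProcessingThread() {
            Thread t = new Thread(() -> {
                while (!Thread.currentThread().isInterrupted()) {
                    try {
                        handle(taskQueue.take());
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            }, "task-processing-thread");
            t.setDaemon(true);
            t.start();
        }

        private void handle(long pieceId) {
            // the actual task-piece logic of the application would go here
        }
    }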
In the batch task processing method provided by this embodiment, on the basis of the above embodiments, whether there is a pending task piece in the database is determined according to the processing state of the task pieces; if a pending task piece is found, the scanned pending task piece is preempted, the processing end marks the processing state of the preempted task piece as processing, and the processing end puts the preempted task piece into the task processing queue of the processing end so that the task pieces in the queue are submitted one by one to the thread for processing. While achieving decentralization, making batch task processing convenient for users, and improving the user experience, this allows the processing state of a task piece to be determined accurately and prevents the same task piece from being scanned and processed more than once.
On the basis of the above embodiments, further, after the processing end puts the preempted task piece into the task processing queue of the processing end, the method further includes:
if the processing end detects that the number of task pieces in the task processing queue reaches a preset threshold, the processing end stops scanning.
In this embodiment, the task processing queue of each processing end has a buffering upper limit; if the number of buffered task pieces exceeds this limit, the task processing performance of the processing end is likely to be affected. Based on this, a preset threshold can be set for each task processing queue; if the processing end detects that the number of task pieces in the task processing queue has reached the preset threshold, it stops scanning for pending task pieces. After the number of task pieces in the task processing queue decreases (for example, drops to a preset value), the processing end can resume periodically scanning the database for pending task pieces.
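As a sketch of how the scanner could honour this threshold, with both the pause threshold and the resume value being assumed numbers rather than values given by the patent:

    public class ScanThrottle {
        private static final int PAUSE_THRESHOLD = 200; // assumed: stop scanning at this queue size
        private static final int RESUME_THRESHOLD = 50; // assumed: resume once the queue drains to this

        private volatile boolean paused = false;

        // Called before each scan round; returns true when this round should be skipped.
        public boolean shouldSkipScan(int queuedCount) {
            if (queuedCount >= PAUSE_THRESHOLD) {
                paused = true;
            } else if (queuedCount <= RESUME_THRESHOLD) {
                paused = false;
            }
            return paused;
        }
    }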
On the basis of the above embodiments, further, processing the task pieces in the task processing queue one by one includes:
submitting the task pieces in the task processing queue one by one to the thread at a preset flow rate for processing.
In this embodiment, in order to limit the flow rate at which the task processing queue submits task pieces to the thread and to reduce the pressure of processing task pieces on the thread, rate-limiting tools such as LinkedBlockingQueue and RateLimiter can be used to set the preset flow rate, so that the task pieces in the task processing queue are submitted to the thread one by one at the preset flow rate. This approach is simple and intuitive, allows the processing speed of task pieces to be observed and controlled in real time, and prevents the load from becoming too high.
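The patent names LinkedBlockingQueue and Guava's RateLimiter as possible rate-limiting tools; a hedged sketch of combining them could look like the following, where the rate of 50 pieces per second, the Guava dependency, and the PieceHandler callback are all assumptions for illustration.

    import com.google.common.util.concurrent.RateLimiter;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class RateLimitedSubmitter {
        public interface PieceHandler {
            void process(long pieceId);
        }

        private final BlockingQueue<Long> taskQueue = new LinkedBlockingQueue<>();
        private final RateLimiter limiter = RateLimiter.create(50.0); // assumed: 50 pieces per second

        public void enqueue(long pieceId) throws InterruptedException {
            taskQueue.put(pieceId);
        }

        // Drain the queue to the processing thread no faster than the preset flow rate.
        public void submitLoop(PieceHandler handler) throws InterruptedException {
            while (!Thread.currentThread().isInterrupted()) {
                long pieceId = taskQueue.take();
                limiter.acquire(); // blocks until a permit is available
                handler.process(pieceId);
            }
        }
    }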
Embodiment three
This embodiment is an optimization made on the basis of the above embodiments. Fig. 3 is a flowchart of the batch task processing method provided by embodiment three of the present invention. As shown in Fig. 3, the method includes:
S310, the processing end that receives a batch task divides the batch task into multiple task pieces and stores the multiple task pieces in a database.
S320, each processing end periodically scans the task pieces in the database and determines, according to the processing state of the task pieces, whether there is a pending task piece in the database.
S330, if a pending task piece is found, the scanned pending task piece is preempted, and the processing end marks the processing state of the preempted task piece as processing.
S340, the processing end puts the preempted task piece into the task processing queue of the processing end and processes the task pieces in the task processing queue one by one.
S350, after the processing end has finished processing a task piece, it marks the processing state of the task piece as completed and sends the processing result of the task piece to the database.
In this embodiment, every time the processing end finishes processing a task piece through the thread, it can mark the processing state of the completed task piece as completed; specifically, the processing state carried by the task piece itself may be marked as completed, or the status key field in the database table may be marked as completed. Meanwhile, the processing end sends the processing result of the completed task piece to the database.
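A minimal sketch of this completion step under the assumed schema above: the status flip and the upload of the processing result are combined into a single update here, which is an illustrative choice, not something the patent prescribes.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class PieceCompleter {
        // Mark the piece as COMPLETED and store its processing result in the database.
        public void complete(Connection db, long pieceId, String result) throws SQLException {
            String sql = "UPDATE task_piece SET status = 'COMPLETED', result = ? WHERE id = ?";
            try (PreparedStatement ps = db.prepareStatement(sql)) {
                ps.setString(1, result);
                ps.setLong(2, pieceId);
                ps.executeUpdate();
            }
        }
    }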
In the batch task processing method provided by this embodiment, on the basis of the above embodiments, the processing state of a completed task piece is marked as completed and the processing result is sent to the database. While achieving decentralization, making batch task processing convenient for users, and improving the user experience, this allows the processing state of a task piece to be determined accurately so that the processing result of the batch task can subsequently be output.
On the basis of the above embodiments, further, after the processing end that receives the batch task divides the batch task into multiple task pieces, the method further includes:
each processing end periodically scanning the database to check whether the processing state of each task piece corresponding to the batch task is completed;
if the processing state of each scanned task piece is completed, preempting the processing results of the task pieces, and merging and outputting the processing results of the task pieces.
In this embodiment, each processing end scans the task pieces in the database once every preset time to determine whether the processing state of each task piece corresponding to the batch task is completed; if the processing state of each scanned task piece is completed, the processing results of the task pieces are preempted, and the processing results of the task pieces are merged and output. The preset time may be on the order of seconds, milliseconds, or microseconds. It should be noted that the preset time for scanning the processing state of the task pieces corresponding to the batch task may be the same as or different from the preset time for scanning pending task pieces; in addition, the two scans correspond to two different operations, which may or may not be performed at the same time.
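A hedged sketch of this check-and-merge step: count the unfinished pieces of the batch, and once none remain, read all piece results and combine them. Merging by simple concatenation is an assumption, as is the expectation that, in practice, the merge itself would be preempted with a conditional update similar to the piece preemption above so that only one processing end outputs the batch result.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.StringJoiner;

    public class BatchResultMerger {
        // Returns the merged result once every piece of the batch is COMPLETED, otherwise null.
        public String mergeIfFinished(Connection db, String batchId) throws SQLException {
            String countSql = "SELECT COUNT(*) FROM task_piece WHERE batch_id = ? AND status <> 'COMPLETED'";
            try (PreparedStatement ps = db.prepareStatement(countSql)) {
                ps.setString(1, batchId);
                try (ResultSet rs = ps.executeQuery()) {
                    rs.next();
                    if (rs.getLong(1) > 0) {
                        return null; // some pieces are still pending or processing
                    }
                }
            }
            StringJoiner merged = new StringJoiner("\n");
            String resultSql = "SELECT result FROM task_piece WHERE batch_id = ? ORDER BY id";
            try (PreparedStatement ps = db.prepareStatement(resultSql)) {
                ps.setString(1, batchId);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        merged.add(rs.getString(1));
                    }
                }
            }
            return merged.toString();
        }
    }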
It should be noted that each processing end may include a sub-task execution thread and a main-task execution thread, where the sub-task execution thread performs the operations of scanning and preempting pending task pieces, putting task pieces into the task processing queue, processing the task pieces in the task processing queue one by one, and sending the processing results to the database, and the main-task execution thread performs operations such as scanning and preempting batch tasks and merging and outputting the processing results of the task pieces of a batch task whose processing is completed.
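Structurally, the two threads could be arranged as two independently scheduled loops inside one processing end; this is only a sketch with assumed intervals, reusing the hypothetical collaborators from the earlier sketches.

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    public class ProcessingEnd {
        private final ScheduledExecutorService subTaskThread = Executors.newSingleThreadScheduledExecutor();
        private final ScheduledExecutorService mainTaskThread = Executors.newSingleThreadScheduledExecutor();

        // Sub-task thread: scan, preempt and queue pending task pieces (see the scanner sketch above).
        // Main-task thread: watch for finished batches and merge and output their results.
        public void start(Runnable scanAndPreemptPieces, Runnable mergeFinishedBatches) {
            subTaskThread.scheduleAtFixedRate(scanAndPreemptPieces, 0, 500, TimeUnit.MILLISECONDS);   // assumed interval
            mainTaskThread.scheduleAtFixedRate(mergeFinishedBatches, 0, 1000, TimeUnit.MILLISECONDS); // assumed interval
        }
    }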
In the batch task processing method provided by this embodiment, by dividing the execution threads of a processing end into a sub-task execution thread and a main-task execution thread, the processing end can perform the task piece processing operation and the batch task result output operation at the same time, which improves the processing speed of batch tasks.
Embodiment four
Fig. 4 is a structural diagram of the distributed system in embodiment four of the present invention. As shown in Fig. 4, the distributed system 400 includes a database 410 and multiple processing ends 420;
where the processing end 420 includes:
a division module 421, configured to divide a received batch task into multiple task pieces and store the multiple task pieces in the database;
a scanning module 422, configured to periodically scan the database for pending task pieces;
a preemption processing module 423, configured to, if a pending task piece is found, preempt the scanned pending task piece and process it;
the database 410 is configured to store the task pieces into which the received batch task is divided.
Further, the division module 421 includes a first marking unit, configured to mark the processing state of each of the multiple task pieces as pending after the batch task is divided into multiple task pieces.
Further, the scanning module 422 is specifically configured to periodically scan the task pieces in the database and determine, according to the processing state of the task pieces, whether there is a pending task piece in the database.
Further, the preemption processing module 423 includes:
a second marking unit, configured to mark the processing state of the preempted task piece as processing;
a preemption processing unit, configured to put the preempted task piece into the task processing queue of the processing end and process the task pieces in the task processing queue one by one.
Further, the preemption processing module 423 further includes:
a third marking unit, configured to mark the processing state of a task piece as completed after the processing end has finished processing the task piece;
a processing result uploading unit, configured to send the processing result to the database.
Further, the processing end 420 further includes a main-task processing module, where the main-task processing module includes:
a batch task scanning unit, configured to, after the processing end that receives the batch task divides the batch task into multiple task pieces, periodically scan the database to check whether the processing state of each task piece corresponding to the batch task is completed;
a processing result preemption unit, configured to preempt the processing results of the task pieces if the processing state of each scanned task piece is completed;
a processing result merging and output unit, configured to merge and output the processing results of the task pieces.
Further, the processing end 420 further includes:
a scanning monitoring module, configured to stop scanning if, after the processing end puts the preempted task piece into the task processing queue of the processing end, it detects that the number of task pieces in the task processing queue reaches a preset threshold.
Further, the preemption processing unit is specifically configured to:
submit the task pieces in the task processing queue one by one to the thread at a preset flow rate for processing.
The distributed system provided by the embodiment of the present invention can execute the batch task processing method provided by any embodiment of the present invention, and has the corresponding functional modules for executing the method as well as the corresponding beneficial effects.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will understand that the present invention is not limited to the specific embodiments described here; various obvious changes, readjustments, and substitutions can be made by those skilled in the art without departing from the protection scope of the present invention. Therefore, although the present invention has been described in further detail through the above embodiments, the present invention is not limited to the above embodiments; without departing from the concept of the invention, it may also include other equivalent embodiments, and the scope of the present invention is determined by the scope of the appended claims.

Claims (16)

1. A batch task processing method, characterized in that the method is based on a distributed system, wherein the distributed system includes multiple processing ends; the method comprises:
the processing end that receives a batch task dividing the batch task into multiple task pieces and storing the multiple task pieces in a database;
each processing end periodically scanning the database for pending task pieces and, if a pending task piece is found, preempting the scanned pending task piece and processing it.
2. The method according to claim 1, characterized in that, after the processing end that receives the batch task divides the batch task into multiple task pieces, the method comprises:
the processing end that receives the batch task marking the processing state of each of the multiple task pieces as pending.
3. The method according to claim 1, characterized in that each processing end periodically scanning the database for pending task pieces comprises:
each processing end periodically scanning the task pieces in the database and determining, according to the processing state of the task pieces, whether there is a pending task piece in the database.
4. The method according to claim 1, characterized in that, if a pending task piece is found, preempting the scanned pending task piece and processing it comprises:
the processing end marking the processing state of the preempted task piece as processing;
the processing end putting the preempted task piece into the task processing queue of the processing end and processing the task pieces in the task processing queue one by one.
5. The method according to claim 4, characterized in that, after the processing end puts the preempted task piece into the task processing queue of the processing end and processes the task pieces in the task processing queue one by one, the method further comprises:
the processing end, after having finished processing a task piece, marking the processing state of the task piece as completed;
the processing end sending the processing result to the database.
6. The method according to any one of claims 1 to 5, characterized in that, after the processing end that receives the batch task divides the batch task into multiple task pieces, the method further comprises:
each processing end periodically scanning the database to check whether the processing state of each task piece corresponding to the batch task is completed;
if the processing state of each scanned task piece is completed, preempting the processing results of the task pieces, and merging and outputting the processing results of the task pieces.
7. The method according to claim 4, characterized in that, after the processing end puts the preempted task piece into the task processing queue of the processing end, the method further comprises:
if the processing end detects that the number of task pieces in the task processing queue reaches a preset threshold, stopping scanning.
8. The method according to claim 4, characterized in that processing the task pieces in the task processing queue one by one comprises:
submitting the task pieces in the task processing queue one by one to a thread at a preset flow rate for processing.
9. A distributed system, characterized by comprising a database and multiple processing ends;
the processing end comprising:
a division module, configured to divide a received batch task into multiple task pieces and store the multiple task pieces in the database;
a scanning module, configured to periodically scan the database for pending task pieces;
a preemption processing module, configured to, if a pending task piece is found, preempt the scanned pending task piece and process it;
the database being configured to store the task pieces into which the received batch task is divided.
10. The system according to claim 9, characterized in that the division module comprises:
a first marking unit, configured to mark the processing state of each of the multiple task pieces as pending after the batch task is divided into multiple task pieces.
11. The system according to claim 9, characterized in that the scanning module is specifically configured to:
periodically scan the task pieces in the database and determine, according to the processing state of the task pieces, whether there is a pending task piece in the database.
12. The system according to claim 9, characterized in that the preemption processing module comprises:
a second marking unit, configured to mark the processing state of the preempted task piece as processing;
a preemption processing unit, configured to put the preempted task piece into the task processing queue of the processing end and process the task pieces in the task processing queue one by one.
13. The system according to claim 12, characterized in that the preemption processing module further comprises:
a third marking unit, configured to mark the processing state of a task piece as completed after the processing end has finished processing the task piece;
a processing result uploading unit, configured to send the processing result to the database.
14. The system according to any one of claims 9 to 13, characterized in that the processing end further comprises a main-task processing module, wherein the main-task processing module comprises:
a batch task scanning unit, configured to, after the processing end that receives the batch task divides the batch task into multiple task pieces, periodically scan the database to check whether the processing state of each task piece corresponding to the batch task is completed;
a processing result preemption unit, configured to preempt the processing results of the task pieces if the processing state of each scanned task piece is completed;
a processing result merging and output unit, configured to merge and output the processing results of the task pieces.
15. The system according to claim 12, characterized in that the processing end further comprises:
a scanning monitoring module, configured to stop scanning if, after the processing end puts the preempted task piece into the task processing queue of the processing end, it detects that the number of task pieces in the task processing queue reaches a preset threshold.
16. The system according to claim 12, characterized in that the preemption processing unit is specifically configured to:
submit the task pieces in the task processing queue one by one to the thread at a preset flow rate for processing.
CN201711487115.8A 2017-12-29 2017-12-29 Batch tasks processing method and distributed system Pending CN108228326A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711487115.8A CN108228326A (en) 2017-12-29 2017-12-29 Batch tasks processing method and distributed system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711487115.8A CN108228326A (en) 2017-12-29 2017-12-29 Batch tasks processing method and distributed system

Publications (1)

Publication Number Publication Date
CN108228326A true CN108228326A (en) 2018-06-29

Family

ID=62646484

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711487115.8A Pending CN108228326A (en) 2017-12-29 2017-12-29 Batch tasks processing method and distributed system

Country Status (1)

Country Link
CN (1) CN108228326A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070179931A1 (en) * 2006-01-31 2007-08-02 International Business Machines Corporation Method and program product for automating the submission of multiple server tasks for updating a database
CN101359295A (en) * 2007-08-01 2009-02-04 阿里巴巴集团控股有限公司 Batch task scheduling and allocating method and system
CN102622426A (en) * 2012-02-27 2012-08-01 杭州闪亮科技有限公司 Database writing system and database writing method
CN105339940A (en) * 2013-06-28 2016-02-17 甲骨文国际公司 Naive, client-side sharding with online addition of shards
CN104598320A (en) * 2015-01-30 2015-05-06 北京正奇联讯科技有限公司 Task execution method and system based on distributed system
CN107430528A (en) * 2015-03-09 2017-12-01 亚马逊科技公司 Opportunistic resource migration is placed with optimizing resource
US20170228422A1 (en) * 2016-02-10 2017-08-10 Futurewei Technologies, Inc. Flexible task scheduler for multiple parallel processing of database data
CN106095585A (en) * 2016-06-22 2016-11-09 中国建设银行股份有限公司 Task requests processing method, device and enterprise information system

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109299119A (en) * 2018-08-30 2019-02-01 上海艾融软件股份有限公司 A kind of control system and method for mass data distribution batch processing
CN110119305A (en) * 2019-05-13 2019-08-13 北京达佳互联信息技术有限公司 Task executing method, device, computer equipment and storage medium
CN110119305B (en) * 2019-05-13 2022-01-21 北京达佳互联信息技术有限公司 Task execution method and device, computer equipment and storage medium
CN110532077A (en) * 2019-08-22 2019-12-03 腾讯科技(深圳)有限公司 Task processing method, device and storage medium
CN110532077B (en) * 2019-08-22 2021-12-07 腾讯科技(深圳)有限公司 Task processing method and device and storage medium
CN110515718A (en) * 2019-08-30 2019-11-29 深圳前海微众银行股份有限公司 Batch tasks breakpoint is continuous to make method, apparatus, equipment and medium
CN112148505A (en) * 2020-09-18 2020-12-29 京东数字科技控股股份有限公司 Data batching system, method, electronic device and storage medium
CN113806052A (en) * 2021-09-24 2021-12-17 四川新网银行股份有限公司 Decentralized distributed timing task processing method
CN113806052B (en) * 2021-09-24 2023-06-06 四川新网银行股份有限公司 Decentralized distributed timing task processing method

Similar Documents

Publication Publication Date Title
CN108228326A (en) Batch tasks processing method and distributed system
CN104268738B (en) Material intelligent is stored in a warehouse and access method
CN106095590B (en) A kind of method for allocating tasks and device based on thread pool
CN109815011A (en) A kind of method and apparatus of data processing
US9495206B2 (en) Scheduling and execution of tasks based on resource availability
CN105391649B (en) A kind of array dispatching method and device
CN103309738A (en) User job scheduling method and device
CN106209682A (en) Business scheduling method, device and system
CN106502918B (en) A kind of scheduling memory method and device
He et al. Achieving a fairer future by changing the past
CN105589748A (en) Service request processing method and apparatus
WO2016105419A1 (en) Apparatus and method for routing data in a switch
CN106790332A (en) A kind of resource regulating method, system and host node
CN110334993A (en) The method, apparatus and computer equipment that a kind of pair of sowing goods yard is managed
CN104320382B (en) Distributed current processing device, method and unit in real time
CN109800074A (en) Task data concurrently executes method, apparatus and electronic equipment
CN112860401A (en) Task scheduling method and device, electronic equipment and storage medium
CN109242179A (en) A kind of intelligent dispatching algorithm based on flow control
CN105242915B (en) A kind of processing method and processing device of data manipulation
CN111451155B (en) Sorting task processing method and device, warehouse control system and storage medium
CN109189581A (en) A kind of job scheduling method and device
CN106484689B (en) Data processing method and device
CN105162725B (en) A kind of method and device pre-processed to protocol processes assembly line message address
CN114792125A (en) Data processing method and device based on distributed training, electronic equipment and medium
CN108595270A (en) A kind of recovery method and device of memory source

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20180629)