CN114519636A - Batch service processing method, device, equipment and storage medium - Google Patents

Batch service processing method, device, equipment and storage medium

Info

Publication number
CN114519636A
CN114519636A (application CN202210136501.7A)
Authority
CN
China
Prior art keywords
time
batch
batch service
duration
real
Prior art date
Legal status
Pending
Application number
CN202210136501.7A
Other languages
Chinese (zh)
Inventor
张梓聪
穆琼
Current Assignee
Agricultural Bank of China
Original Assignee
Agricultural Bank of China
Priority date
Filing date
Publication date
Application filed by Agricultural Bank of China filed Critical Agricultural Bank of China
Priority to CN202210136501.7A
Publication of CN114519636A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 40/00 Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q 40/02 Banking, e.g. interest calculation or account maintenance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/30 Monitoring
    • G06F 11/3003 Monitoring arrangements specially adapted to the computing system or computing system component being monitored
    • G06F 11/302 Monitoring arrangements specially adapted to the computing system or computing system component being monitored where the computing system component is a software system
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/30 Monitoring
    • G06F 11/3055 Monitoring arrangements for monitoring the status of the computing system or of the computing system component, e.g. monitoring if the computing system is on, off, available, not available
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/30 Monitoring
    • G06F 11/34 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3409 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
    • G06F 11/3419 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment by assessing time
    • G06F 11/3423 Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment by assessing time where the assessed time is active or idle time

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Computing Systems (AREA)
  • Quality & Reliability (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • Computer Hardware Design (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • Technology Law (AREA)
  • General Business, Economics & Management (AREA)
  • Mathematical Physics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a batch service processing method, apparatus, device and storage medium. The batch service processing method comprises: predicting, with a time series model, the predicted operation duration of the current batch service node upon completion of its operation; acquiring the real-time operation duration of the current batch service node; and judging, according to the predicted operation duration and the real-time operation duration, whether the current batch service node is operating abnormally; if not, operating the batch service associated node or the next batch service node corresponding to the current batch service node according to the specified flow; if so, generating batch operation alarm information. The batch service processing method provided by the invention can accurately judge whether a batch service node is operating abnormally, and avoids the problem that an abnormal batch service node goes undetected because the operation threshold time is set improperly, which would affect the whole banking system.

Description

Batch service processing method, device, equipment and storage medium
Technical Field
The invention relates to the technical field of banking service systems, and in particular to a batch service processing method, apparatus, device and storage medium.
Background
With the rapid growth of commercial banking, the number, logical complexity and interdependence of the background batch processing programs of a banking system keep increasing, and monitoring the running condition of each batch becomes more and more complex. The batches of a service system are numerous; each batch completes one type of service data processing and comprises a plurality of operation nodes, each of which completes one step of the batch. The nodes have a sequential relationship: a later node starts to execute only after the preceding node has finished executing. If an intermediate node runs over time, ends prematurely or fails, the batch operation of the whole service system is affected.
In the prior art, the run-time monitoring of each batch mainly relies on manually configured thresholds. Because a commercial bank has numerous systems and the batches of each system have complex association relationships, manual threshold configuration is difficult to operate and lacks both pertinence and timeliness.
Disclosure of Invention
The invention provides a batch service processing method, apparatus, device and storage medium, so as to automatically set the operation threshold time of batch service nodes and thereby improve the accuracy of detecting abnormal operation of the batch service nodes.
According to one aspect of the invention, a batch service processing method is provided, comprising:
predicting, by means of a time series model, the predicted operation duration of the current batch service node upon completion of its operation;
acquiring the real-time operation duration of the current batch service node;
and judging, according to the predicted operation duration and the real-time operation duration, whether the current batch service node is operating abnormally; if not, operating a batch service associated node or a next batch service node corresponding to the current batch service node according to a specified flow; and if so, generating batch operation alarm information.
Optionally, the predicted operation duration includes an operation upper-limit duration and an operation lower-limit duration;
judging whether the current batch service node is operating abnormally according to the predicted operation duration and the real-time operation duration comprises:
if the real-time operation duration is greater than the operation upper-limit duration or less than the operation lower-limit duration, judging that the current batch service node is operating abnormally.
Optionally, acquiring the real-time operation duration of the current batch service node comprises:
acquiring the real-time operation duration of the current batch service node at regular intervals;
and judging whether the current batch service node is operating abnormally according to the predicted operation duration and the real-time operation duration comprises:
judging whether the current batch service node is operating abnormally according to the predicted operation duration and the real-time operation duration.
Optionally, acquiring the real-time operation duration of the current batch service node at regular intervals comprises:
if the current batch service node has finished operating, acquiring the operation start time, the operation end time and the manual-intervention duration of the current batch service node, and determining the real-time operation duration according to the operation start time, the operation end time and the manual-intervention duration.
Optionally, the training sample data of the time series model comprises a historical operation duration data set of the current batch service node;
the historical operation duration data set comprises a plurality of historical operation durations, each historical operation duration corresponding to the real-time operation duration of the current batch service node on one day after its operation finished.
Optionally, the time series model includes a trend term, a period term, a holiday term and an error term;
training the time series model comprises: training the trend term, the period term and the holiday term according to the historical operation duration data set.
Optionally, training the time series model further includes:
and adding the real-time running duration of the current batch service nodes in the current day after the running is finished into the historical running long data set, and retraining the trend item, the period item and the holiday item through the updated historical running long data set.
Optionally, the method further includes updating the operation prediction duration to a prediction result database;
and when judging whether the current batch service nodes operate abnormally, acquiring the operation prediction duration from the prediction result database.
Optionally, the batch service node and the batch service association node are used to implement a batch processing task in a banking system.
According to another aspect of the invention, a batch service processing apparatus is provided, comprising: a prediction module, a real-time operation duration acquisition module and an operation abnormality judgment module;
the prediction module is configured to predict the predicted operation duration of the current batch service node according to the time series model;
the real-time operation duration acquisition module is configured to acquire the real-time operation duration of the current batch service node;
and the operation abnormality judgment module is configured to judge whether the current batch service node is operating abnormally according to the predicted operation duration and the real-time operation duration.
According to another aspect of the present invention, there is provided an electronic apparatus including: at least one processor; and a memory communicatively coupled to the at least one processor;
wherein the memory stores a computer program executable by the at least one processor, the computer program being executable by the at least one processor to enable the at least one processor to perform any of the batch service processing methods described in embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer-readable storage medium storing computer instructions for causing a processor to implement any of the batch service processing methods described in the embodiments of the present invention when executed.
The batch service processing method of the invention uses a time series model to predict the operation duration of a batch service node upon completion of its operation, and judges, based on the predicted operation duration, whether the batch service node is operating abnormally. Compared with setting the operation threshold time of batch service nodes based on experience, this can determine the operation threshold time accurately and thus judge accurately whether a batch service node is operating abnormally, avoiding the problem that an abnormal batch service node cannot be detected because the operation threshold time is set improperly, which would affect the whole banking service system.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present invention, nor do they necessarily limit the scope of the invention. Other features of the present invention will become apparent from the following description.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a flow diagram of a batch business process method in an embodiment;
FIG. 2 is a flow diagram of another batch business processing method in an embodiment;
FIG. 3 is a block diagram showing the structure of a batch service processing apparatus according to an embodiment;
fig. 4 is a schematic structural diagram of an electronic device in the embodiment.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example one
Fig. 1 is a flowchart of a batch service processing method in an embodiment, where the method is applicable to a banking system for batch task processing, and the method may be executed by a batch service processing apparatus, which may be implemented in a form of software, and the batch service processing apparatus may be configured in the banking system. As shown in fig. 1, the method includes:
s101, a time sequence model is adopted to predict the operation prediction duration of the current batch service nodes after the operation is finished.
For example, in the present embodiment, the time series model may adopt an autoregressive model (AR model), an autoregressive integrated moving average model (ARIMA model), a logistic regression model, or the like.
Illustratively, in this embodiment, the batch service node is configured to implement batch processing tasks in the banking system, where the batch processing tasks may include statistical analysis processing tasks, transaction accounting processing tasks, service processing tasks, and the like.
In this embodiment, the statistical analysis processing task may specifically be a statistical analysis of transaction data at the end of the day, the end of the month, and the end of the year, a statistical analysis of account data at the end of the day, the end of the month, and the end of the year, and the like;
the transaction accounting processing tasks may specifically be tasks such as day-end accounting, closing of foreign exchange trading positions, closing of gold trading positions, and year-end carry-forward of foreign exchange gains and losses;
the service processing tasks may specifically be tasks such as batch collection and payment, batch repayment of personal loans falling due, batch automatic repayment of credit cards, batch scheduled automatic transfers, and batch fund settlement.
For example, in this embodiment, one batch service node may include multiple threads, and different threads may run in parallel, so as to implement parallel processing of data to be processed or statistical analysis tasks to be completed in a corresponding batch processing task.
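As an illustrative sketch of this parallel arrangement (Python; the thread pool, the record list and the per-record handler are assumptions for illustration and are not part of the disclosed system), a batch service node could distribute its pending records across worker threads as follows:

    from concurrent.futures import ThreadPoolExecutor

    def run_batch_node(pending_records, process_record, max_workers=8):
        # Process the node's pending records in parallel worker threads;
        # process_record is the per-record handler of this batch processing task.
        with ThreadPoolExecutor(max_workers=max_workers) as pool:
            return list(pool.map(process_record, pending_records))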
And S102, acquiring the real-time operation time length of the current batch service nodes.
In this embodiment, the real-time operation duration is the duration for which a batch service node has been running its corresponding batch processing task on the current day.
Specifically, if, when the real-time operation duration of the current batch service node is acquired, the batch task node has not yet completed its corresponding batch processing task, the real-time operation duration is the duration from the start time of executing the corresponding batch processing task to the current sampling time;
if, when the real-time operation duration of the current batch service node is acquired, the batch task node has completed its corresponding batch processing task, the real-time operation duration is the duration from the start time of executing the corresponding batch processing task to the time at which the corresponding batch processing task finished executing.
And S103, judging whether the current batch service nodes operate abnormally or not according to the operation prediction time length and the real-time operation time length.
Illustratively, in this step, the predicted operation duration is used as a reference to determine whether the real-time operation duration is too long or too short; if it is, it is determined that the current batch service node is operating abnormally.
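A minimal sketch of this judgment (Python), assuming, as detailed later in this embodiment, that the predicted operation duration takes the form of an operation upper-limit duration and an operation lower-limit duration; the function and variable names are illustrative:

    def is_operation_abnormal(real_time_duration, lower_limit, upper_limit):
        # Too long (above the predicted upper limit) or too short (below the
        # predicted lower limit) is treated as abnormal operation.
        return real_time_duration > upper_limit or real_time_duration < lower_limit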
And S104, if the current batch service node is not abnormal, operating a batch service associated node or a next batch service node corresponding to the current batch service node according to the specified flow, and if the current batch service node is abnormal, generating batch operation alarm information.
Illustratively, in the banking system, each batch service node may be relatively independent, that is, each batch service node may independently complete its corresponding batch processing task according to a set sequence.
Batch service nodes may also have association relationships; in that case, a designated batch service node serves as a batch service association node, and other designated batch service nodes are associated with it through that batch service association node;
for example, assume that the batch service nodes having an association relationship are: a batch service node for the statistical analysis of day-end transaction data (denoted the first batch service node), a batch service node for day-end gold-trading position closing (denoted the second batch service node), and a batch service node for batch scheduled fund investment (denoted the third batch service node); the first batch service node may then serve as the batch service association node, and the second batch service node and the third batch service node are each associated with the first batch service node;
at this time, the first batch service node has a job-waiting relationship with the second batch service node and the third batch service node: the first batch service node executes its corresponding batch processing task only after the second batch service node and the third batch service node have executed theirs; or, after the second batch service node has executed a designated step of the day-end gold-trading position closing, the first batch service node executes a designated step of the day-end transaction data statistical analysis, and after the third batch service node has executed a designated step of the batch scheduled fund investment, the first batch service node executes a designated step of the day-end transaction data statistical analysis.
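The job-waiting relationship can be pictured with the following sketch (Python; the event objects and the task stub are illustrative assumptions): the first batch service node blocks until both associated nodes signal completion.

    import threading

    second_node_done = threading.Event()   # set when the gold-trading position-closing node finishes
    third_node_done = threading.Event()    # set when the scheduled fund-investment node finishes

    def day_end_transaction_statistics():
        pass  # placeholder for the first batch service node's processing task

    def run_first_node():
        # The batch service association node waits for its associated nodes
        # (or their designated steps) before executing its own task.
        second_node_done.wait()
        third_node_done.wait()
        day_end_transaction_statistics()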
In this step, if no batch service association node is involved, the current batch service node is the batch service node situated between the batch service node that most recently finished executing its corresponding batch processing task and the next batch service node that has not yet started executing its corresponding batch processing task;
if a batch service association node is involved, the current batch service node is the batch service node or batch service association node that is executing its corresponding batch processing task, or the batch service node or batch service association node that has finished executing its corresponding batch processing task but whose subsequent batch service node has not yet started executing its own.
In this step, if the current batch service node is not abnormal, the banking service system continues to operate according to the normal working flow;
if the current batch service node is abnormal, the banking service system generates batch operation warning information, and then the banking service system can continue to operate according to a normal working flow or stop operating to wait for manual intervention.
The batch service processing method provided by this embodiment uses a time series model to predict the operation duration of a batch service node upon completion of its operation, and judges, based on the predicted operation duration, whether the batch service node is operating abnormally. Compared with setting the operation threshold time of batch service nodes based on experience, this can accurately determine the operation threshold time and thus accurately judge whether a batch service node is operating abnormally, avoiding the problem that an abnormal batch service node cannot be detected because the operation threshold time is set improperly, which would affect the whole banking system.
Based on the content recorded in step S101, in an implementation, the same time series model is used to predict the operation prediction duration of each batch service node after the operation is completed.
In this case, the historical operation duration data set of any batch service node can be used as the training sample data for training the time series model, where the historical operation duration data set comprises a plurality of historical operation durations, each corresponding to the real-time operation duration of that batch service node when its operation finished on one day (i.e., the duration from the start time of executing the batch processing task corresponding to that historical operation duration to the time at which that batch processing task finished executing).
Illustratively, when the historical operation durations of all the batch service nodes are similar, using the same time series model to predict the operation duration of each batch service node upon completion of operation saves system resources and avoids training redundant time series models.
Based on the content recorded in step S101, in another implementation, different time series models are used to predict the operation duration of the corresponding batch service nodes upon completion of operation, where the different time series models may be time series models of different types (for example, an AR model and a logistic regression model) or time series models of the same type but with different model coefficients.
In this case, the historical operation duration data sets of the different batch service nodes are used as the training sample data for training the corresponding time series models.
Illustratively, using different time series models to predict the operation duration of the corresponding batch service nodes upon completion of operation improves the specificity of the prediction for each batch service node and thus ensures the accuracy of the predicted operation duration of each batch service node.
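A sketch of keeping one independently fitted model per node (Python; the dictionary layout and the fitting callable are assumptions):

    def build_node_models(node_histories, fit_duration_model):
        # node_histories: dict mapping node id -> that node's historical
        # operation duration data set; fit_duration_model: callable that fits
        # and returns a time series model for one such data set.
        return {node_id: fit_duration_model(history)
                for node_id, history in node_histories.items()}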
In a preferred scheme, a time series decomposition model is used as the time series model, and time series decomposition models with different model coefficients are used to predict the operation duration of the batch service nodes upon completion of operation, where the time series decomposition model has the form:
y(t) = g(t) + s(t) + h(t) + ε(t)
In the above formula, g(t) represents the trend term, s(t) the period term, h(t) the holiday term and ε(t) the error term, where the trend term, the period term and the holiday term need to be trained on the historical operation duration data set, and the error term is a set value.
In this embodiment, the specific form of the trend term g(t) is as follows:
g(t) = (k + a(t)^T δ)·t + (m + a(t)^T γ)
where a(t), δ and γ are respectively given by:
a(t) = (a_1(t), ..., a_S(t))^T
δ = (δ_1, ..., δ_S)^T
γ = (γ_1, ..., γ_S)^T, with γ_j = -s_j δ_j
In the above formulas, k represents the growth rate, m represents the offset parameter, S represents the number of time terms included in the time series input to g(t), δ_j represents the change rate of the growth at the j-th time term, s_j represents the timestamp of the j-th time term, and a_j(t) takes the value 0 or 1: if the growth change rate changes at the j-th time term, a_j(t) is 1; otherwise a_j(t) is 0.
In the above formulas, k and m, as the coefficients of the trend term, are the values to be trained; the other parameters are set values.
The specific form of the period term s(t) is as follows:
s(t) = X(t)β
where X(t) and β are respectively given by:
X(t) = (cos(2π·1·t/P), sin(2π·1·t/P), ..., cos(2π·N·t/P), sin(2π·N·t/P))
β = (a_1, b_1, ..., a_N, b_N)^T
In the above formulas, β is the coefficient term of the period term and is the value to be trained, P represents the period length (for example, P is 365 if the period is a year and 7 if the period is a week), and N is a set value.
The specific form of the holiday term h(t) is as follows:
h(t) = Z(t)θ
where Z(t) and θ are respectively given by:
Z(t) = (1(t ∈ D_1), ..., 1(t ∈ D_L))
θ = (θ_1, ..., θ_L)^T
In the above formulas, θ is the coefficient term of the holiday term and is the value to be trained, θ_i represents the influence factor of the i-th holiday, L represents the number of set holidays, and D_i represents the dates corresponding to a period of time before and after the i-th set holiday; 1(t ∈ D_i) is 1 if t falls within D_i and 0 otherwise.
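Putting the three trained components together, the following sketch (Python with NumPy) evaluates the decomposition y(t) ≈ g(t) + s(t) + h(t) for given coefficient values; the coefficients themselves would come from fitting as described in the next paragraph, and the changepoint convention a_j(t) = 1 for t ≥ s_j follows the Prophet model and is an assumption here.

    import numpy as np

    def trend_term(t, k, m, s, delta):
        # g(t) = (k + a(t)^T delta)*t + (m + a(t)^T gamma), gamma_j = -s_j*delta_j;
        # t is a 1-D array of time values, s the timestamps of the S time terms.
        a = (t[:, None] >= s[None, :]).astype(float)   # a_j(t) = 1 once t passes s_j
        gamma = -s * delta
        return (k + a @ delta) * t + (m + a @ gamma)

    def period_term(t, P, beta):
        # s(t) = X(t) beta, with X(t) the Fourier features interleaved to match
        # beta = (a_1, b_1, ..., a_N, b_N)^T.
        N = len(beta) // 2
        cols = []
        for n in range(1, N + 1):
            cols.append(np.cos(2 * np.pi * n * t / P))
            cols.append(np.sin(2 * np.pi * n * t / P))
        return np.stack(cols, axis=1) @ beta

    def holiday_term(dates, holiday_windows, theta):
        # h(t) = Z(t) theta, where Z_i(t) = 1 if date t falls in the window D_i.
        Z = np.array([[1.0 if d in D else 0.0 for D in holiday_windows] for d in dates])
        return Z @ theta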
Illustratively, in this scheme, the corresponding time series decomposition model is fitted on the specified historical operation duration data set, and the fitting method is not particularly limited; for example, the BFGS algorithm or the L-BFGS algorithm may be used to fit each value to be trained in the time series decomposition model.
Illustratively, predicting the operation duration of the batch service nodes based on the time series decomposition model avoids the reduction in accuracy of the predicted operation duration that results from ignoring the influence of periodicity or holidays on the trend of the operation duration.
In this scheme, as an optional step, the time series decomposition model may be retrained once a day, which specifically includes:
adding the real-time operation duration of the batch service node on the current day, after its operation has finished, to the corresponding historical operation duration data set, and retraining the time series decomposition model with the updated historical operation duration data set, that is, retraining the trend term, the period term and the holiday term of the time series decomposition model.
In an alternative, the time series decomposition model may employ a Prophet model (comprising a trend term, at least one of a period term and a holiday term, and an error term), and training of the Prophet model may be implemented based on the Prophet open-source tool (framework).
Illustratively, the Prophet open-source tool takes as input a known time series and the length of the time series to be predicted (for example, a one-day prediction in units of days), and outputs the predicted value together with its upper bound and lower bound.
In the batch service processing method of this embodiment, the input of the Prophet open-source tool is the historical operation duration data set (containing the timestamps and the corresponding historical operation durations), and the length of the time series to be predicted is one day; the output includes the operation upper-limit duration and the operation lower-limit duration.
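A minimal sketch of that usage (Python) with the open-source Prophet package; the synthetic history and the 95% interval width are illustrative assumptions, while the 'ds'/'y' columns and the yhat_lower/yhat_upper outputs are what the package itself expects and returns:

    import numpy as np
    import pandas as pd
    from prophet import Prophet

    # Synthetic stand-in for one node's historical operation duration data set:
    # one row per day, duration in seconds.
    rng = np.random.default_rng(0)
    history = pd.DataFrame({
        "ds": pd.date_range("2021-01-01", periods=365, freq="D"),
        "y": rng.normal(3600, 300, size=365),
    })

    model = Prophet(interval_width=0.95)      # width of the prediction band
    model.fit(history)

    future = model.make_future_dataframe(periods=1, freq="D")
    next_day = model.predict(future).iloc[-1]

    operation_upper_limit = next_day["yhat_upper"]   # operation upper-limit duration
    operation_lower_limit = next_day["yhat_lower"]   # operation lower-limit duration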
Illustratively, when the predicted operation duration includes an operation upper-limit duration and an operation lower-limit duration, on the basis of the content recorded in step S103, judging whether the current batch service node is operating abnormally according to the predicted operation duration and the real-time operation duration comprises:
if the real-time operation duration is greater than the operation upper-limit duration or less than the operation lower-limit duration, judging that the current batch service node is operating abnormally.
Fig. 2 is a flow chart of another batch service processing method in the embodiment, and referring to fig. 2, as an implementable embodiment, the batch service processing method may be:
s201, a Prophet model is adopted to predict the operation upper limit duration and the operation lower limit duration when the operation of the current batch of service nodes is finished.
In this scheme, Prophet models with different model coefficients are used to predict the operation upper-limit duration and the operation lower-limit duration of the corresponding batch service nodes upon completion of operation on the current day.
Optionally, in this step, after the operation upper limit duration and the operation lower limit duration of the current batch of service nodes after operation is finished are predicted, the operation upper limit duration and the operation lower limit duration may be updated to the prediction result database.
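A minimal sketch of such a prediction result database (Python), assuming a simple SQLite table keyed by node and run date; the schema is an assumption, not part of the disclosure:

    import sqlite3

    conn = sqlite3.connect("prediction_results.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS predicted_duration (
                        node_id TEXT, run_date TEXT,
                        lower_limit REAL, upper_limit REAL,
                        PRIMARY KEY (node_id, run_date))""")

    def save_prediction(node_id, run_date, lower_limit, upper_limit):
        conn.execute("INSERT OR REPLACE INTO predicted_duration VALUES (?, ?, ?, ?)",
                     (node_id, run_date, lower_limit, upper_limit))
        conn.commit()

    def load_prediction(node_id, run_date):
        # Returns (lower_limit, upper_limit) or None if nothing was stored.
        return conn.execute("SELECT lower_limit, upper_limit FROM predicted_duration "
                            "WHERE node_id = ? AND run_date = ?",
                            (node_id, run_date)).fetchone()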
S202, the real-time operation duration of the current batch service nodes is obtained regularly.
Based on the content recorded in step S102, in this step, the real-time running duration of the current batch service node is periodically obtained.
If, when the real-time operation duration of the current batch service node is acquired, the batch task node has not completed its corresponding batch processing task, the real-time operation duration is the duration from the start time of executing the corresponding batch processing task to the current sampling time;
if the current batch service node has finished operating (that is, the batch task node has executed its corresponding batch processing task), the real-time operation duration is determined according to the following formula:
t = t_e - t_s + t_r
In the above formula, t_e is the acquired operation end time of the current batch service node, t_s is the acquired operation start time of the current batch service node, and t_r is the acquired manual-intervention duration of the current batch service node.
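The two cases can be written as a single helper (Python sketch; the timestamps are assumed to be comparable numeric values such as epoch seconds):

    def real_time_operation_duration(t_start, t_now, t_end=None, manual_intervention=0.0):
        # Node still running: duration up to the current sampling time.
        if t_end is None:
            return t_now - t_start
        # Node finished: t = t_e - t_s + t_r, as in the formula above.
        return t_end - t_start + manual_intervention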
S203, judging whether the current batch service nodes operate abnormally or not according to the operation predicted time length and the real-time operation time length.
On the basis of step S202, in this step, if the current batch service node has not finished operating, whether it is operating abnormally is judged immediately after its real-time operation duration is acquired;
if the current batch service node has finished operating, whether it is operating abnormally is judged after the real-time operation duration upon completion of operation is acquired for the first time; once this judgment has been made, no further judgment is made on the same day even if the real-time operation duration of the current batch service node continues to be acquired.
On the basis of step S201, in this step, if the real-time operation duration is greater than the operation upper limit duration or less than the operation lower limit duration, it is determined that the current batch service node is abnormally operated.
On the basis of step S201, if the prediction result database is configured, when it is determined whether the current batch service nodes operate abnormally, the required operation upper limit duration and operation lower limit duration are obtained from the prediction result database.
For example, by combining step S202 and step S203, whether the batch service nodes are operating abnormally is determined at regular intervals, which improves the timeliness of abnormality detection; when a batch service node is abnormal, alarm information can be generated at the earliest opportunity, minimizing the influence on the operation of subsequent batch service nodes.
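A sketch of this timed check (Python; the node interface, the alarm hook and the polling interval are assumptions, and treating a short duration as conclusive only after the node has finished is one reasonable reading of the steps above):

    import time

    def raise_batch_alarm(node_id, duration):
        print(f"batch operation alarm: node {node_id}, real-time duration {duration:.0f}s")

    def monitor_node(node, lower_limit, upper_limit, poll_seconds=60):
        while True:
            duration = node.real_time_operation_duration()
            finished = node.is_finished()
            # While running, only an overly long duration is conclusive; once the
            # node has finished, an overly short duration is also abnormal.
            if duration > upper_limit or (finished and duration < lower_limit):
                raise_batch_alarm(node.node_id, duration)
                return
            if finished:
                return        # judged once after completion, nothing more to do today
            time.sleep(poll_seconds)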
And S204, if the current batch service node is not abnormal, operating the batch service associated node or the next batch service node corresponding to the current batch service node according to the specified flow, and if the current batch service node is abnormal, generating batch operation alarm information.
The embodiment of this step is the same as that described in step S104.
And S205, on the same day, after all the batch service nodes have finished operating, the Prophet models are retrained.
In this step, the real-time operation duration of each batch service node on the current day, after its operation has finished, is added to the corresponding historical operation duration data set, and the corresponding Prophet model is retrained with the updated historical operation duration data set.
The retrained Prophet model is used to predict the operation upper-limit duration and the operation lower-limit duration of the corresponding batch service node upon completion of its operation on the next day.
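A sketch of this daily retraining step for one node (Python; the model is refit from scratch on the updated history, and the data frame layout follows the earlier Prophet sketch):

    import pandas as pd
    from prophet import Prophet

    def retrain_daily(history, run_date, todays_duration):
        # Append today's observed operation duration to the historical
        # operation duration data set and refit the model on the update.
        history = pd.concat(
            [history, pd.DataFrame({"ds": [run_date], "y": [todays_duration]})],
            ignore_index=True)
        model = Prophet(interval_width=0.95)
        model.fit(history)
        return model, history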
Example two
Fig. 3 is a block diagram of a batch service processing apparatus in an embodiment. Referring to fig. 3, this embodiment provides a batch service processing apparatus, comprising: a prediction module 100, a real-time operation duration acquisition module 200 and an operation abnormality judgment module 300.
The prediction module 100 is configured to predict an operation prediction duration of the current batch service node according to the time series model.
The real-time operation duration obtaining module 200 is configured to obtain a real-time operation duration of a current batch of service nodes.
The operation abnormality judgment module 300 is configured to judge whether the current batch service node is operating abnormally according to the predicted operation duration and the real-time operation duration.
Optionally, the prediction module 100 may be configured to predict the operation duration of each batch service node upon completion of operation by using the same time series model, or to predict the operation duration of the corresponding batch service nodes upon completion of operation by using different time series models.
Optionally, the real-time operation duration obtaining module 200 may be configured to: and if the operation of the current batch service nodes is finished, acquiring the operation starting time, the operation ending time and the manual intervention duration of the current batch service nodes, and determining the real-time operation duration according to the operation starting time, the operation ending time and the manual intervention duration.
Optionally, the operation abnormality determining module 300 may be configured to: and if the real-time operation time length is longer than the operation upper limit time length or shorter than the operation lower limit time length, judging that the operation of the current batch service nodes is abnormal.
Optionally, the batch service processing apparatus may further include a prediction result database, where the prediction result database is used to store the operation prediction duration output by the prediction module 100.
Optionally, the batch service processing apparatus may further include a model training module, configured to: train the time series model on the historical operation duration data set; and retrain the time series model with the updated historical operation duration data set.
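A sketch of how the three modules could be wired together (Python; the class and method names are illustrative and are not the claimed apparatus):

    class BatchServiceProcessor:
        def __init__(self, prediction_module, duration_module, judgment_module):
            self.predict = prediction_module     # node_id -> (lower_limit, upper_limit)
            self.get_duration = duration_module  # node_id -> real-time operation duration
            self.judge = judgment_module         # (duration, lower, upper) -> bool

        def check(self, node_id):
            lower_limit, upper_limit = self.predict(node_id)
            duration = self.get_duration(node_id)
            return self.judge(duration, lower_limit, upper_limit)   # True if abnormal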
In this embodiment, the batch service processing apparatus may be configured to implement any one of the batch service processing methods described in the first embodiment, and the specific implementation process and the beneficial effects thereof are the same as the corresponding contents described in the first embodiment, and are not described herein again.
EXAMPLE III
FIG. 4 shows a schematic block diagram of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 4, the electronic device 10 includes at least one processor 11, and a memory communicatively connected to the at least one processor 11, such as a Read Only Memory (ROM)12, a Random Access Memory (RAM)13, and the like, wherein the memory stores a computer program executable by the at least one processor, and the processor 11 can perform various suitable actions and processes according to the computer program stored in the Read Only Memory (ROM)12 or the computer program loaded from a storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data necessary for the operation of the electronic apparatus 10 can also be stored. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to the bus 14.
A number of components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, or the like; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, or the like. The processor 11 performs the batch traffic processing method described above.
In some embodiments, the batch traffic processing method may be implemented as a computer program tangibly embodied in a computer-readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the batch traffic processing method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the batch traffic processing method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be realized in digital electronic circuitry, integrated circuitry, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chips (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for implementing the methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be performed. A computer program can execute entirely on a machine, partly on a machine, as a stand-alone software package partly on a machine and partly on a remote machine or entirely on a remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), blockchain networks, and the Internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, and is a host product in a cloud computing service system, so that the defects of high management difficulty and weak service expansibility in the traditional physical host and VPS service are overcome.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present invention may be executed in parallel, sequentially, or in different orders, and are not limited herein as long as the desired results of the technical solution of the present invention can be achieved.
The above-described embodiments should not be construed as limiting the scope of the invention. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (11)

1. A batch service processing method, comprising:
predicting, by means of a time series model, the predicted operation duration of the current batch service node upon completion of its operation;
acquiring the real-time operation duration of the current batch service node;
and judging, according to the predicted operation duration and the real-time operation duration, whether the current batch service node is operating abnormally; if not, operating a batch service associated node or a next batch service node corresponding to the current batch service node according to a specified flow; and if so, generating batch operation alarm information.
2. The batch service processing method according to claim 1, wherein the predicted operation duration comprises an operation upper-limit duration and an operation lower-limit duration;
judging whether the current batch service node is operating abnormally according to the predicted operation duration and the real-time operation duration comprises:
if the real-time operation duration is greater than the operation upper-limit duration or less than the operation lower-limit duration, judging that the current batch service node is operating abnormally.
3. The batch service processing method according to claim 1, wherein acquiring the real-time operation duration of the current batch service node comprises:
acquiring the real-time operation duration of the current batch service node at regular intervals;
and judging whether the current batch service node is operating abnormally according to the predicted operation duration and the real-time operation duration comprises:
judging whether the current batch service node is operating abnormally according to the predicted operation duration and the real-time operation duration.
4. The batch service processing method according to claim 3, wherein acquiring the real-time operation duration of the current batch service node at regular intervals comprises:
if the current batch service node has finished operating, acquiring the operation start time, the operation end time and the manual-intervention duration of the current batch service node, and determining the real-time operation duration according to the operation start time, the operation end time and the manual-intervention duration.
5. The batch service processing method according to claim 1, wherein training sample data of the time series model comprises a historical operation duration data set of the current batch service node;
the historical operation duration data set comprises a plurality of historical operation durations, each historical operation duration corresponding to the real-time operation duration of the current batch service node on one day after its operation finished.
6. The batch service processing method according to claim 5, wherein the time series model comprises a trend term, a period term, a holiday term and an error term;
training the time series model comprises: training the trend term, the period term and the holiday term according to the historical operation duration data set.
7. The batch service processing method according to claim 6, wherein training the time series model further comprises:
adding the real-time operation duration of the current batch service node on the current day, after its operation has finished, to the historical operation duration data set, and retraining the trend term, the period term and the holiday term with the updated historical operation duration data set.
8. The batch service processing method according to any one of claims 1 to 7, wherein the batch service node and the batch service associated node are used to implement batch processing tasks in a banking system.
9. A batch service processing apparatus, comprising: a prediction module, a real-time operation duration acquisition module and an operation abnormality judgment module;
the prediction module is configured to predict the predicted operation duration of the current batch service node according to the time series model;
the real-time operation duration acquisition module is configured to acquire the real-time operation duration of the current batch service node;
and the operation abnormality judgment module is configured to judge whether the current batch service node is operating abnormally according to the predicted operation duration and the real-time operation duration.
10. An electronic device, characterized in that the electronic device comprises:
at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the batch service processing method of any one of claims 1 to 8.
11. A computer-readable storage medium storing computer instructions for causing a processor to implement the batch service processing method of any one of claims 1 to 8 when executed.
CN202210136501.7A 2022-02-15 2022-02-15 Batch service processing method, device, equipment and storage medium Pending CN114519636A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210136501.7A CN114519636A (en) 2022-02-15 2022-02-15 Batch service processing method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210136501.7A CN114519636A (en) 2022-02-15 2022-02-15 Batch service processing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114519636A true CN114519636A (en) 2022-05-20

Family

ID=81596466

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210136501.7A Pending CN114519636A (en) 2022-02-15 2022-02-15 Batch service processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114519636A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115967641A (en) * 2023-03-16 2023-04-14 浙江正泰仪器仪表有限责任公司 Method and device for batch equipment parameter operation, computer equipment and medium
CN115967641B (en) * 2023-03-16 2023-05-30 浙江正泰仪器仪表有限责任公司 Method, device, computer equipment and medium for operating batch equipment parameters
CN117455643A (en) * 2023-10-13 2024-01-26 厦门国际银行股份有限公司 Intelligent monitoring method, device and equipment for batch processing operation

Similar Documents

Publication Publication Date Title
CN114519636A (en) Batch service processing method, device, equipment and storage medium
CN106408341A (en) Goods sales volume prediction method and device, and electronic equipment
CN114662953A (en) Internet of things equipment operation and maintenance method, device, equipment and medium
CN112785057A (en) Component prediction method, device, equipment and storage medium based on exponential smoothing
CN115907616A (en) Material purchasing system, method, electronic equipment and storage medium
CN115759751A (en) Enterprise risk prediction method and device, storage medium, electronic equipment and product
CN115630078A (en) User data processing method, device, equipment and medium based on digital twin
CN115545481A (en) Risk level determination method and device, electronic equipment and storage medium
CN114999665A (en) Data processing method and device, electronic equipment and storage medium
CN109783217B (en) Shut down determination method, apparatus, electronic equipment and the storage medium of period
CN114638514A (en) Monitoring method, device, equipment and medium for vehicle insurance operation
CN115858621A (en) User behavior prediction method and device, electronic equipment and storage medium
CN112633683B (en) Resource usage statistics method, device, system, electronic equipment and storage medium
CN115858291A (en) System index detection method and device, electronic equipment and storage medium thereof
CN114897381A (en) Accounting evaluation method, device, equipment, medium and product
CN115641216A (en) Fund net worth estimation method, device, electronic device and storage medium
CN114997720A (en) Software research and development project risk monitoring method, device, equipment and storage medium
CN115077906A (en) Engine high-occurrence fault cause determination method, engine high-occurrence fault cause determination device, electronic equipment and medium
CN114820193A (en) Chip change curve generation method, device, equipment and storage medium
CN116128651A (en) Transaction amount abnormality detection method, device, equipment and storage medium
CN116307668A (en) Batch operation deduction method, related device and computer storage medium
CN115292384A (en) Electricity consumption data generation method and electricity consumption data generation device
CN117611323A (en) Financing client credit prediction method and device, electronic equipment and medium
CN117573412A (en) System fault early warning method and device, electronic equipment and storage medium
CN114638636A (en) Inward rotation price determining method, device, equipment, medium and product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination