CN110445939B - Capacity resource prediction method and device - Google Patents


Info

Publication number
CN110445939B
CN110445939B (application CN201910729638.1A)
Authority
CN
China
Prior art keywords
target
host
processing efficiency
prediction
service
Prior art date
Legal status
Active
Application number
CN201910729638.1A
Other languages
Chinese (zh)
Other versions
CN110445939A (en)
Inventor
叶浩
丛新法
侯青军
王晓明
崔涛
刘亚瑞
李团结
刘双
张婷
胡海波
张忠龙
邱斌
高晓兵
孙莉华
徐士方
Current Assignee
China United Network Communications Group Co Ltd
Original Assignee
China United Network Communications Group Co Ltd
Priority date
Filing date
Publication date
Application filed by China United Network Communications Group Co Ltd
Priority to CN201910729638.1A
Publication of CN110445939A
Application granted
Publication of CN110445939B
Status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00: Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/14: Network analysis or design
    • H04L41/147: Network analysis or design for predicting network behaviour
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M3/00: Automatic or semi-automatic exchanges
    • H04M3/22: Arrangements for supervision, monitoring or testing
    • H04M3/36: Statistical metering, e.g. recording occasions when traffic exceeds capacity of trunks
    • H04M3/362: Traffic simulation

Abstract

An embodiment of the invention provides a capacity resource prediction method and apparatus. A target service process count and target service parameters for the services running on a host are obtained according to a prediction policy. The target process count is fed into a pre-trained reference-capacity prediction model to obtain the host's reference processing efficiency. That reference efficiency is then used as a factor and, together with the target service parameters, fed into a pre-trained actual-capacity prediction model to obtain the host's actual processing efficiency, from which the capacity resources required over a future preset time period can be predicted.

Description

Capacity resource prediction method and device
Technical Field
The present invention relates to the field of electronic technologies, and in particular, to a method and an apparatus for predicting capacity resources.
Background
As the data volumes of communication operators expand and the variety of telecom services grows, the processing load on each service host keeps increasing, so capacity resources such as a host's processing ceiling, surge capability, and system stability must be estimated accurately during production.
At present, traditional capacity estimation relies mostly on daily statistics and empirical formulas. In a typical process, an operator judges from experience and, using scripts and database queries, estimates current or recent capacity with simple methods such as average-trend analysis.
This manual approach is inefficient, and its prediction results are inaccurate.
Disclosure of Invention
The embodiment of the invention provides a capacity resource prediction method and device, which are used for improving the prediction efficiency of capacity resources.
A first aspect of an embodiment of the present invention provides a method for predicting capacity resources, including:
obtaining a target service process count and target service parameters of the services running on a host according to a prediction policy;
inputting the target service process count into a pre-trained prediction model of reference capacity resources to obtain a reference processing efficiency of the host;
inputting the target service parameters and the reference processing efficiency into a pre-trained prediction model of actual capacity resources to obtain an actual processing efficiency of the host;
and predicting the capacity resources required in a future preset time period according to the actual processing efficiency of the host.
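The four steps above can be sketched end to end. The sketch below is a hypothetical illustration, not the patent's implementation: `predict_reference` stands in for the reference-capacity model (M1), `predict_actual` for the actual-capacity model (M2), and the weights, formulas, and parameter names are invented placeholders for trained models.

```python
def predict_reference(process_counts):
    """Stand-in for M1: map (core, important, total) service process
    counts to a reference processing efficiency."""
    core, important, total = process_counts
    return 0.5 * core + 0.3 * important + 0.2 * total  # placeholder weights

def predict_actual(reference_efficiency, service_params):
    """Stand-in for M2: combine M1's output, as a factor, with
    production-side service parameters."""
    load_rate = service_params["load_rate"]  # hypothetical parameter name
    return reference_efficiency * (1.0 - load_rate)  # placeholder formula

def predict_capacity(process_counts, service_params):
    """Two-stage flow: M1's reference efficiency feeds M2."""
    reference = predict_reference(process_counts)
    return predict_actual(reference, service_params)

# target process counts and parameters as obtained via the prediction policy
actual_efficiency = predict_capacity((10, 5, 20), {"load_rate": 0.4})
print(actual_efficiency)
```

The actual efficiency returned here is what the final step would translate into required hosts, virtual machines, CPU, or memory for the future period.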
Optionally, the target service parameters include at least one of the following: a target load rate, a target call-ticket volume, a target user count, and a target peripheral-resource ratio; and/or
the capacity resources include at least one of the following: a number of hosts, a number of virtual machines, a central processing unit (CPU) processing speed, and a memory size.
Optionally, when the service parameters include the target call-ticket volume and the target user count, obtaining the target service parameters of the services running on the host according to the prediction policy includes:
computing the target call-ticket volume and the target user count from the historical call-ticket volume and historical user count using an autoregressive moving average (ARMA) algorithm.
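The ARMA step can be illustrated with a minimal stand-in. The patent specifies an autoregressive moving average model; the sketch below fits only the simpler AR(1) case by ordinary least squares and forecasts one step ahead (a real implementation would fit a full ARMA model, e.g. via statsmodels; the ticket figures are invented):

```python
def ar1_forecast(history):
    """Fit y[t] = a + b*y[t-1] by ordinary least squares, then
    predict the next value. A simplified stand-in for ARMA."""
    x = history[:-1]
    y = history[1:]
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a + b * history[-1]

# e.g. forecasting next period's call-ticket volume from past counts
tickets = [100, 110, 121, 133, 146]
print(round(ar1_forecast(tickets), 1))
```

The same call with the historical user count in place of the ticket series would yield the target user count.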
Optionally, obtaining the target service process count of the services running on the host according to the prediction policy includes:
inputting an initial value of the service process count into the prediction model of reference capacity resources to obtain a test processing efficiency of the host;
increasing the service process count by a preset step and feeding each value into the prediction model of reference capacity resources to obtain the test processing efficiency of the host for each candidate count;
and determining, from the test processing efficiencies obtained for the candidate counts, the count at which the test processing efficiency is maximal as the target service process count.
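The step-wise search just described can be sketched as follows; `reference_model` is a hypothetical stand-in with a single efficiency peak, since the real model is a trained network:

```python
def reference_model(n_processes):
    """Hypothetical stand-in for the reference-capacity model:
    efficiency rises with process count, then falls from contention."""
    return n_processes * (64 - n_processes)

def find_target_processes(initial=4, step=4, limit=64):
    """Start from an initial count, grow by a fixed step, query the
    model for each value, and keep the count maximizing efficiency."""
    best_n, best_eff = initial, reference_model(initial)
    n = initial + step
    while n <= limit:
        eff = reference_model(n)
        if eff > best_eff:
            best_n, best_eff = n, eff
        n += step
    return best_n

print(find_target_processes())  # count at the sampled efficiency peak
```

The initial value, step, and upper limit are placeholders; the patent leaves them to the application scenario.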
Optionally, the service process count includes at least one of the following: an average core service process count, an average important service process count, and an average total service process count.
A second aspect of the embodiments of the present invention provides a capacity resource prediction apparatus, including:
an obtaining module, configured to obtain a target service process count and target service parameters of the services running on a host according to a prediction policy;
a first input module, configured to input the target service process count into a pre-trained prediction model of reference capacity resources to obtain a reference processing efficiency of the host;
a second input module, configured to input the target service parameters and the reference processing efficiency into a pre-trained prediction model of actual capacity resources to obtain an actual processing efficiency of the host;
and a prediction module, configured to predict the capacity resources required in a future preset time period according to the actual processing efficiency of the host.
Optionally, the target service parameters include at least one of the following: a target load rate, a target call-ticket volume, a target user count, and a target peripheral-resource ratio; and/or
the capacity resources include at least one of the following: a number of hosts, a number of virtual machines, a central processing unit (CPU) processing speed, and a memory size.
Optionally, when the service parameters include the target call-ticket volume and the target user count, the obtaining module is further configured to:
compute the target call-ticket volume and the target user count from the historical call-ticket volume and historical user count using an autoregressive moving average (ARMA) algorithm.
Optionally, the obtaining module is further configured to:
input an initial value of the service process count into the prediction model of reference capacity resources to obtain a test processing efficiency of the host; increase the service process count by a preset step and feed each value into the model to obtain the test processing efficiency of the host for each candidate count; and determine the count at which the test processing efficiency is maximal as the target service process count.
Optionally, the service process count includes at least one of the following: an average core service process count, an average important service process count, and an average total service process count.
A third aspect of the embodiments of the present invention provides an electronic device, including a processor, a memory, and a computer program, wherein the computer program is stored in the memory and configured to be executed by the processor, the computer program comprising instructions for performing the method of any one of the foregoing first aspects.
A fourth aspect provides a computer-readable storage medium storing a computer program which, when executed, implements the method of any one of the first aspects.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
In the capacity resource prediction method and apparatus of the embodiments, model training normally requires large amounts of data, yet a production environment rarely yields enough samples to train an accurate prediction model on production data alone. The embodiments therefore adopt a two-stage prediction model. The reference-capacity prediction model may be trained in a test environment, where test data covering many conditions can be collected, so that it can output a reference processing efficiency for a wide range of situations. When the actual-capacity prediction model makes a prediction, the reference processing efficiency and production data are both used as inputs, so that both the production and test environments are taken into account: the accuracy of capacity prediction improves substantially, no manual intervention is needed, and prediction is more efficient.
Concretely, after the target service process count and target service parameters of the services running on the host are obtained according to the prediction policy, the process count is fed into the pre-trained reference-capacity model to obtain the host's reference processing efficiency; that efficiency is then used as a factor and, together with the target service parameters, fed into the pre-trained actual-capacity model to obtain the host's actual processing efficiency, from which the capacity resources needed over a future preset time period are predicted.
Drawings
Fig. 1 is a schematic flow chart illustrating a method for predicting capacity resources according to an embodiment of the present invention;
fig. 2 is a schematic diagram illustrating a first test result of a method for predicting capacity resources according to an embodiment of the present invention;
fig. 3 is a diagram illustrating a second test result of the capacity resource prediction method according to the embodiment of the present invention;
fig. 4 is a schematic diagram illustrating a third test result of the capacity resource prediction method according to the embodiment of the present invention;
fig. 5 is a diagram illustrating a fourth test result of the capacity resource prediction method according to the embodiment of the present invention;
FIG. 6 is a diagram illustrating a predicted effect of a reference processing efficiency according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating a prediction effect of a prediction model of capacity resources according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a model architecture of a capacity resource prediction method according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a prediction apparatus of capacity resources according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the examples of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" as used herein merely describes an association between objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates an "or" relationship between the preceding and following objects.
It is to be understood that the terms "first," "second," "third," "fourth," and the like (if any) in the description and claims of this invention and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
Depending on context, the word "if" as used herein may be interpreted as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrases "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "when it is determined", "in response to determining", "when (a stated condition or event) is detected", or "in response to detecting (a stated condition or event)".
It is also noted that the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that an article or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such article or system. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of additional identical elements in the article or system that comprises it.
Embodiments of the invention may be applied on a terminal, which may be any electronic device capable of running the capacity resource prediction method, such as a mobile phone, tablet computer, notebook computer, desktop computer, or server.
The processing efficiency described in the embodiments is the traffic processed by the host per unit time; the traffic may specifically include a call-ticket count, a data-request count, and/or a data volume, among others.
The prediction model of reference capacity resources described in the embodiments may be a neural network model trained with deep-learning techniques on test data from a test environment. Deep learning here is a machine-learning approach built from large-scale neural networks; it mainly involves collecting a large amount of base data, discovering regularities in the data to form a statistical model, and constructing the model using algorithms such as regression analysis, time series, and neural networks.
The prediction model of the actual capacity resource described in the embodiment of the present invention may be a neural network model obtained based on a deep learning technique and production data training in a production environment.
In practice, when training a neural-network-based capacity prediction model, the obvious approaches are to train on test samples alone or on production samples alone.
However, a model trained only on test samples generally cannot predict capacity accurately, because test samples differ from real ones; and a model trained only on production samples is infeasible, because large numbers of production samples are hard to obtain. For this reason, capacity prediction in this scenario has generally relied on inefficient manual estimation, and no scheme for capacity prediction based on a neural network model has been available.
Based on this, the embodiments of the present invention adopt a two-stage prediction model. The prediction model of reference capacity resources may be trained in a test environment, where test data covering many conditions can be collected, so that the model can output a reference processing efficiency for a wide range of situations. When the prediction model of actual capacity resources makes a prediction, the reference processing efficiency and production data are both used as inputs, so that both the production and test environments are taken into account: the accuracy of capacity prediction improves greatly, no manual intervention is required, and prediction is more efficient.
Fig. 1 is a schematic flowchart of the capacity resource prediction method according to an embodiment of the present invention. The method includes the following steps:
Step S101: obtain the target service process count and target service parameters of the services running on the host according to the prediction policy.
In the embodiment, the prediction policy may be determined by the application scenario. For example, data at a given time may be predicted from historical data, such as predicting the target service process count and target service parameters for a time next year from those of the services the host ran at the same time this year.
Optionally, the service process count includes at least one of the following: an average core service process count, an average important service process count, and an average total service process count.
Analysis of the host environment in the embodiments shows that, with the telecom service at the center and system resources partitioned by service dimension, test data collected through continuous testing of the test environment can be processed with methods such as regression analysis, principal component analysis, and dimension reduction. Considering factors such as service complexity, the actual production bottleneck, and the CPU and memory occupancy of the service processes, three main factors are extracted: the average core service process count, the average important service process count, and the average total service process count. From these three factors, the prediction model of reference capacity resources can be built, together with preprocessing such as missing-value completion and data normalization.
Therefore, a better prediction effect is obtained when the process counts used by the prediction policy include at least one of the average core, average important, and average total service process counts.
It can be understood that the specific contents of these three counts may be determined by the application scenario; the embodiment does not limit them specifically.
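The factor-extraction step (principal component analysis and dimension reduction) can be illustrated on synthetic data: PCA ranks how much variance each component explains, supporting the selection of a few dominant factors. Everything below is illustrative, not production data.

```python
import numpy as np

rng = np.random.default_rng(0)
# 100 samples x 5 candidate process-count features; by construction
# the first three directions carry nearly all of the variance
X = rng.normal(size=(100, 5)) * np.array([10.0, 8.0, 6.0, 0.5, 0.1])

Xc = X - X.mean(axis=0)                 # center before PCA
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)     # variance ratio per component
print(explained.round(3))               # first three components dominate
```

Keeping only the components that explain most of the variance mirrors reducing many candidate features to the three main process-count factors.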
Optionally, obtaining the target service process count of the services running on the host according to the prediction policy includes:
inputting an initial value of the service process count into the prediction model of reference capacity resources to obtain a test processing efficiency of the host; increasing the service process count by a preset step and feeding each value into the model to obtain the test processing efficiency of the host for each candidate count; and determining the count at which the test processing efficiency is maximal as the target service process count.
In the embodiment, both the initial value of the service process count and the preset step may be chosen for the application scenario; the embodiment does not limit them specifically.
Because the maximum test processing efficiency is the desired processing efficiency, selecting the process count at which the model predicts maximum efficiency yields the desired reference processing efficiency.
Optionally, the target service parameters include at least one of the following: a target load rate, a target call-ticket volume, a target user count, and a target peripheral-resource ratio.
In the embodiments of the present application, as operators' data scales expand and service types multiply, the processing pressure on each service host in daily production keeps rising, and capacity resources such as a host's processing ceiling, surge capability, and system stability must be estimated accurately during production. The capacity resource prediction method of the embodiments can therefore be applied in a service-oriented, distributed production environment based on a telecom system, and is implemented following the Cross-Industry Standard Process for Data Mining (CRISP-DM).
In such an environment, the load rate, call-ticket volume, user count, and peripheral-resource ratio are relatively common and important service parameters; hence, in the embodiments, the target service parameters include at least one of: a target load rate, a target call-ticket volume, a target user count, and a target peripheral-resource ratio.
Optionally, when the service parameters include the target call-ticket volume and the target user count, obtaining the target service parameters of the services running on the host according to the prediction policy includes: computing the target call-ticket volume and the target user count from the historical call-ticket volume and historical user count using an auto-regressive moving average (ARMA) algorithm.
Step S102: input the target service process count into the pre-trained prediction model of reference capacity resources to obtain the reference processing efficiency of the host.
In the embodiment, the prediction model of reference capacity resources can first obtain test data samples for each scenario using methods such as convolutional neural networks. Because the test environment is highly flexible, samples covering unusual conditions and various risk-prediction scenarios can be included; subsequently, when an emergency or complex scenario must be predicted, an accurate result can be obtained simply by changing the factor inputs for the current scenario, which improves risk-response capability in the production environment.
Specifically, the prediction model of reference capacity resources M1 may be built as follows:
M1 predicts the reference processing efficiency (Z1) from the average core service process count (x1), the average important service process count (x2), and the average total service process count (x3).
In functional form: Z1 = M1(x1, x2, x3)
In experiments, the following four methods were used to test the pre-trained prediction model of reference capacity resources.
Method 1: as shown in fig. 2, increase the core service process count and the total service process count in a fixed proportion, and observe the change in core service efficiency.
Method 2: as shown in fig. 3, fix the core service process count and the reference service count, and observe the relationship between the core service process count and efficiency.
Method 3: as shown in fig. 4, observe only the relationship between the core service process count and efficiency, ignoring other factors.
Method 4: as shown in fig. 5, hold the total service process count constant, and observe the relationship between the core service process count and efficiency.
Experiments show, as in fig. 6, that the accuracy of the reference processing efficiency obtained in the embodiment of the present application can reach more than 95%.
Step S103: and inputting the target service parameters and the reference processing efficiency into a pre-trained prediction model of the actual capacity resource to obtain the actual processing efficiency of the host.
In an embodiment of the present invention, the prediction model M2 of the actual capacity resource may be trained based on production environment data. Specifically, information such as production environment logs, per-transaction flow details, and host configurations is collected, and methods such as correlation analysis and data dimensionality reduction are used to extract the main factors of the model: the load rate (describing the load condition of a single process), the predicted efficiency (the prediction result of M1, incorporated into M2 as a factor), the call ticket quantity, the user quantity, and the peripheral resource allocation (the peripheral resources used during service process operation; when the resource types are consistent, the actual efficiency is affected by the allocation size), from which a saturation efficiency model is established. After completion (filling of missing values), normalization, and similar preprocessing, the final model is trained using a feedforward neural network algorithm.
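The completion and normalization preprocessing mentioned above can be sketched as follows. This is a minimal illustration, assuming missing measurements are marked `None`, using column-mean imputation for completion and min-max scaling for normalization; the sample factor rows are hypothetical.

```python
def preprocess(rows):
    """Fill missing values with the column mean (completion) and min-max
    normalise each column to [0, 1] before training, as described for
    the M2 model. `None` marks a missing measurement."""
    cols = list(zip(*rows))
    normalised = []
    for col in cols:
        present = [v for v in col if v is not None]
        mean = sum(present) / len(present)
        col = [mean if v is None else v for v in col]  # completion
        lo, hi = min(col), max(col)
        span = (hi - lo) or 1.0  # avoid division by zero for flat columns
        normalised.append([(v - lo) / span for v in col])
    # Transpose back to row-major samples.
    return [list(r) for r in zip(*normalised)]

# Hypothetical samples: [load rate, call ticket quantity, user quantity].
samples = [[0.8, 1200.0, 300.0],
           [0.6, None, 200.0],
           [1.0, 1600.0, 400.0]]
print(preprocess(samples))
```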
For example, the model of M2 is constructed as follows:
The pricing efficiency (Z2) is a function of the saturation (Y1), the reference efficiency (Z1), the call ticket quantity (Y3), the user quantity (Y4), and the peripheral resource allocation (Y5).
Expressed as a functional formula: Z2 = M2(Y1, Z1, Y3, Y4, Y5).
Integrating the two models gives: Z2 = M2(Y1, M1(x1, x2, x3), Y3, Y4, Y5).
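The two-stage composition can be sketched as below. The linear form of `m1` and the damping form of `m2` are purely hypothetical stand-ins for the trained models; only the composition Z2 = M2(Y1, M1(x1, x2, x3), Y3, Y4, Y5) reflects the text.

```python
def m1(x1, x2, x3):
    """Hypothetical stand-in for the test-environment model M1: predicts
    a reference processing efficiency from the average core (x1),
    important (x2), and total (x3) business process numbers. The real M1
    is a trained model; this linear form is purely illustrative."""
    return 50.0 + 2.0 * x1 + 1.0 * x2 - 0.1 * x3

def m2(y1, z1, y3, y4, y5):
    """Hypothetical stand-in for the production model M2: scales the
    reference efficiency z1 by the saturation y1 and damps it by the
    call ticket quantity y3 and user quantity y4, weighted by the
    peripheral resource allocation y5."""
    return z1 * y1 / (1.0 + 0.0001 * y3 + 0.0001 * y4) * y5

def predict_actual_efficiency(x1, x2, x3, y1, y3, y4, y5):
    # Two-stage composition: Z2 = M2(Y1, M1(x1, x2, x3), Y3, Y4, Y5).
    return m2(y1, m1(x1, x2, x3), y3, y4, y5)

print(predict_actual_efficiency(10, 5, 40, 0.9, 1000.0, 500.0, 1.0))
```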
It can be understood that, because the production environment cannot be adjusted at will, if the result factor of the M1 model were not added, the M2 model could only be based on the limited production data actually available, and its flexibility, range of prediction scenarios, and accuracy would be poor. In the embodiment of the invention, the efficiency under the ideal, fully loaded conditions of M1 is incorporated into the M2 model as a relevant factor, so that the combined model both depends closely on production data and makes up for the shortcomings of each individual model.
For example, in the experiments, the effect of obtaining the actual processing efficiency of the host through the prediction model of the actual capacity resource is shown in fig. 7; it can be seen that the overall error is controlled within 0.03.
Step S104: and predicting the capacity resource required in the future preset time period according to the actual processing efficiency of the host.
In the embodiment of the invention, the actual processing efficiency of the host is the traffic processed by the host per unit time, so the capacity resource to be configured for a future preset time period can be obtained by dividing the total traffic to be processed in that period by the actual processing efficiency of the host, thereby achieving the effects of preparing in advance and ensuring smooth handling of the traffic in the future preset time period.
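The division described above can be sketched as follows; the traffic volume, window length, and per-host efficiency used in the example are hypothetical.

```python
import math

def required_hosts(total_traffic, actual_efficiency, seconds):
    """Estimate how many hosts are needed in a future window: one host
    processes `actual_efficiency` traffic units per second, so divide
    the expected total traffic by what a single host can handle in the
    window and round up. A sketch of the division described in the
    text."""
    per_host = actual_efficiency * seconds
    return math.ceil(total_traffic / per_host)

# Hypothetical: 10 million call tickets over a 4-hour busy window,
# with each host pricing 60 tickets per second.
print(required_hosts(10_000_000, 60.0, 4 * 3600))  # → 12
```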
Optionally, the capacity resource includes at least one of the following resources: the number of the hosts, the number of the virtual machines, the processing speed of a Central Processing Unit (CPU) and the size of a memory.
In the embodiment of the invention, the number of hosts, the number of virtual machines, the CPU processing speed, and the memory size are considered the main factors influencing the computing capacity of the host, and can therefore be used as the capacity resources to be predicted.
In summary, in the capacity resource prediction method and apparatus provided in the embodiments of the present invention, it is considered that model training usually requires a large amount of training data, that a large amount of production data usually cannot be obtained from a production environment, and that model training therefore cannot normally be performed, or an accurate prediction model obtained, from production environment data alone. The embodiments of the present invention therefore use a two-stage prediction model. Specifically, the prediction model of the reference capacity resource may be a model trained in a test environment, where test data under various conditions can be collected for training, so that the prediction model of the reference capacity resource can output the reference processing efficiency under many conditions. When the prediction model of the actual capacity resource is used for prediction, the reference processing efficiency and the production data are both used as its inputs, so that both the production environment and the test environment are taken into account, the accuracy of capacity prediction is greatly improved, no manual participation is needed, and the prediction efficiency is higher.
Specifically, in the embodiment of the present invention, after the target business process number and the target business parameter of the business running on the host are obtained according to the prediction policy, the target business process number is input into the pre-trained prediction model of the reference capacity resource to obtain the reference processing efficiency of the host. The reference processing efficiency is then used as a factor and input, together with the target business parameter, into the pre-trained prediction model of the actual capacity resource to obtain the actual processing efficiency of the host, from which the capacity resource required in the future preset time period can be predicted. Both the production environment and the test environment are thus taken into account, the accuracy of capacity prediction is greatly improved, no manual participation is needed, and the prediction efficiency is higher.
Fig. 8 is a schematic diagram of a model architecture of a capacity resource prediction method according to an embodiment of the present invention. As shown in fig. 8, the capacity resource prediction method according to the present invention includes:
Two data models are included: the prediction model of the reference capacity resource (M1) and the prediction model of the actual capacity resource (M2). M1 is obtained by training on the core business process number, the basic business process number, and the total business process number. After the target business process number is input into M1, the reference processing efficiency predicted by the M1 model is used as a factor and combined with the factor parameters collected from the production environment: the load rate, the call ticket quantity, the user quantity, and the peripheral resource ratio. Training again on these forms the model M2, and the prediction result is finally obtained according to a given prediction strategy.
On the one hand, the embodiment of the invention is data-based, eliminating the interference of human experience and ensuring the stability of prediction, so that the production configuration can rely entirely on the prediction result of the model for the allocation of capacity resources, reducing unnecessary resource waste. On the other hand, various influence factors are added to the model, so that various exceptional conditions and risk prediction scenarios can be covered; when facing emergency prediction or complex scenarios, only the factor input configuration of the current scenario needs to be changed to obtain a more accurate prediction result, improving the risk coping capability in the production environment.
Fig. 9 is a schematic structural diagram of a prediction apparatus of capacity resources according to an embodiment of the present invention. As shown in fig. 9, the capacity resource prediction apparatus provided in the present embodiment includes:
an obtaining module 210, configured to obtain a target service process number and a target service parameter of a service running on a host according to a prediction policy;
a first input module 220, configured to input the target service process number into a pre-trained prediction model of a reference capacity resource, so as to obtain a reference processing efficiency of the host;
a second input module 230, configured to input the target service parameter and the reference processing efficiency into a pre-trained prediction model of an actual capacity resource, so as to obtain an actual processing efficiency of the host;
and the predicting module 240 is used for predicting the capacity resource required in the future preset time period according to the actual processing efficiency of the host.
Optionally, the target service parameter includes at least one of the following parameters: a target load rate, a target call ticket quantity, a target user quantity, and a target peripheral resource ratio; and/or,
the capacity resource comprises at least one of the following resources: the number of the hosts, the number of the virtual machines, the processing speed of a Central Processing Unit (CPU) and the size of a memory.
Optionally, when the service parameter includes the target call ticket quantity and the target user quantity, the obtaining module is further configured to:
calculate the historical call ticket quantity and the historical user quantity using an autoregressive moving average (ARMA) algorithm to obtain the target call ticket quantity and the target user quantity.
Optionally, the obtaining module is further configured to:
inputting an initial value of the number of service processes into the prediction model of the reference capacity resource to obtain the test processing efficiency of the host; sequentially increasing the value of the number of service processes by a preset step size and inputting each value into the prediction model of the reference capacity resource to obtain the test processing efficiency of the host corresponding to the different values of the number of service processes; and determining, from these test processing efficiencies, the value of the number of service processes at which the test processing efficiency of the host is maximal as the target service process number.
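The sweep performed by the obtaining module can be sketched as follows; the efficiency curve standing in for the trained reference-capacity model is hypothetical.

```python
def find_target_process_number(model, start, step, limit):
    """Sweep the service process number from `start` upward in `step`
    increments, query the reference-capacity prediction model for each
    value, and return the process number with the highest test
    processing efficiency, as described above. `model` stands in for
    the trained prediction model."""
    best_n, best_eff = start, model(start)
    n = start + step
    while n <= limit:
        eff = model(n)
        if eff > best_eff:
            best_n, best_eff = n, eff
        n += step
    return best_n, best_eff

# Hypothetical efficiency curve: rises, then degrades past ~48
# processes as contention dominates.
curve = lambda n: -0.5 * (n - 48) ** 2 + 1000.0
print(find_target_process_number(curve, 8, 8, 96))  # → (48, 1000.0)
```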
Optionally, the number of service processes includes at least one of the following process numbers: average core service process number, average important service process number and average total service process number.
A third aspect of embodiments of the present invention provides an electronic device, including: a processor, a memory, and a computer program; wherein the computer program is stored in the memory and configured to be executed by the processor, the computer program comprising instructions for performing the method of any of the preceding first aspects.
The prediction apparatus of capacity resources provided in the embodiments of the present invention can be used to execute the methods shown in the corresponding embodiments, and the implementation manner and principle thereof are the same, and are not described again.
An embodiment of the present invention further provides an electronic device, including: a processor, a memory, and a computer program; wherein the computer program is stored in the memory and configured to be executed by the processor, the computer program comprising instructions for performing the method of any of the preceding embodiments.
Embodiments of the present invention also provide a computer-readable storage medium, which stores a computer program, and when the computer program is executed, the computer program implements the method according to any one of the foregoing embodiments.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by program instructions running on relevant hardware. The program may be stored in a computer-readable storage medium; when executed, it performs the steps of the above method embodiments. The aforementioned storage media include various media that can store program code, such as ROM, RAM, magnetic disks, or optical disks.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features equivalently replaced, without the essence of the corresponding technical solutions departing from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A method for predicting capacity resources, the method comprising:
obtaining a target service process number and a target service parameter of a service running on a host according to a prediction policy;
inputting the target business process number into a prediction model of a reference capacity resource obtained by pre-training according to test data in a test environment to obtain the reference processing efficiency of the host;
inputting the target service parameters and the reference processing efficiency into a prediction model of actual capacity resources obtained by pre-training according to production data in a production environment to obtain the actual processing efficiency of the host;
and predicting the capacity resource required in the future preset time period according to the actual processing efficiency of the host.
2. The method of claim 1, wherein the target traffic parameter comprises at least one of the following parameters: a target load rate, a target call ticket quantity, a target user quantity, and a target peripheral resource ratio; and/or,
the capacity resource comprises at least one of the following resources: the number of the hosts, the number of the virtual machines, the processing speed of a Central Processing Unit (CPU) and the size of a memory.
3. The method of claim 2, wherein, when the service parameters include the target call ticket quantity and the target user quantity, obtaining the target service parameters of the service running on the host according to the prediction policy comprises:
calculating the historical call ticket quantity and the historical user quantity using an autoregressive moving average (ARMA) algorithm to obtain the target call ticket quantity and the target user quantity.
4. The method according to any one of claims 1 to 3, wherein obtaining the target service process number of the service running on the host according to the prediction policy comprises:
inputting an initial value of the number of service processes into the prediction model of the reference capacity resource to obtain the test processing efficiency of the host;
sequentially increasing the value of the number of service processes by a preset step size and inputting each value into the prediction model of the reference capacity resource to obtain the test processing efficiency of the host corresponding to the different values of the number of service processes;
and determining, from these test processing efficiencies, the value of the number of service processes at which the test processing efficiency of the host is maximal as the target service process number.
5. The method of claim 4, wherein the number of business processes comprises at least one of the following: average core service process number, average important service process number and average total service process number.
6. An apparatus for predicting capacity resources, comprising:
the obtaining module is used for obtaining the target service process number and the target service parameters of the service running on the host according to the prediction strategy;
the first input module is used for inputting the target business process number into a prediction model of a reference capacity resource obtained by pre-training according to test data in a test environment to obtain the reference processing efficiency of the host;
the second input module is used for inputting the target service parameters and the reference processing efficiency into a prediction model of actual capacity resources obtained by pre-training according to production data in a production environment to obtain the actual processing efficiency of the host;
and the prediction module is used for predicting the capacity resource required in the future preset time period according to the actual processing efficiency of the host.
7. The apparatus of claim 6, wherein the target traffic parameter comprises at least one of: a target load rate, a target call ticket quantity, a target user quantity, and a target peripheral resource ratio; and/or,
the capacity resource comprises at least one of the following resources: the number of the hosts, the number of the virtual machines, the processing speed of a Central Processing Unit (CPU) and the size of a memory.
8. The apparatus of claim 7, wherein, when the service parameter comprises the target call ticket quantity and the target user quantity, the obtaining module is further configured to:
calculate the historical call ticket quantity and the historical user quantity using an autoregressive moving average (ARMA) algorithm to obtain the target call ticket quantity and the target user quantity.
9. The apparatus of any of claims 6-8, wherein the means for obtaining is further configured to:
input an initial value of the number of service processes into the prediction model of the reference capacity resource to obtain the test processing efficiency of the host; sequentially increase the value of the number of service processes by a preset step size and input each value into the prediction model of the reference capacity resource to obtain the test processing efficiency of the host corresponding to the different values of the number of service processes; and determine, from these test processing efficiencies, the value of the number of service processes at which the test processing efficiency of the host is maximal as the target service process number.
10. The apparatus of claim 9, wherein the number of business processes comprises at least one of: average core service process number, average important service process number and average total service process number.
CN201910729638.1A 2019-08-08 2019-08-08 Capacity resource prediction method and device Active CN110445939B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910729638.1A CN110445939B (en) 2019-08-08 2019-08-08 Capacity resource prediction method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910729638.1A CN110445939B (en) 2019-08-08 2019-08-08 Capacity resource prediction method and device

Publications (2)

Publication Number Publication Date
CN110445939A CN110445939A (en) 2019-11-12
CN110445939B true CN110445939B (en) 2021-03-30

Family

ID=68433984

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910729638.1A Active CN110445939B (en) 2019-08-08 2019-08-08 Capacity resource prediction method and device

Country Status (1)

Country Link
CN (1) CN110445939B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111275268A (en) * 2020-02-27 2020-06-12 中国联合网络通信集团有限公司 Pricing process efficiency prediction method, device, equipment and storage medium
CN111581070B (en) * 2020-05-07 2023-08-18 拉扎斯网络科技(上海)有限公司 Capacity determination method, device, electronic equipment and computer readable storage medium
CN113742187A (en) * 2021-08-05 2021-12-03 中移(杭州)信息技术有限公司 Capacity prediction method, device, equipment and storage medium of application system
CN114518988B (en) * 2022-02-10 2023-03-24 中国光大银行股份有限公司 Resource capacity system, control method thereof, and computer-readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101304590A (en) * 2008-04-14 2008-11-12 中国网络通信集团公司 Apparatus and method for determining wireless network capacitance of mobile communication network
CN103257896A (en) * 2013-01-31 2013-08-21 南京理工大学连云港研究院 Max-D job scheduling method under cloud environment
CN105471655A (en) * 2015-12-09 2016-04-06 中国联合网络通信集团有限公司 Method and device for determining power-on or power-off state of physical device in virtual cluster
CN109714395A (en) * 2018-12-10 2019-05-03 平安科技(深圳)有限公司 Cloud platform resource uses prediction technique and terminal device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10162397B2 (en) * 2016-03-03 2018-12-25 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. Energy efficient workload placement management based on observed server efficiency measurements
CN107294764A (en) * 2017-04-26 2017-10-24 中国科学院信息工程研究所 Intelligent supervision method and intelligent monitoring system
CN108932182B (en) * 2018-07-06 2021-11-23 许继集团有限公司 Message bus performance test method and system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101304590A (en) * 2008-04-14 2008-11-12 中国网络通信集团公司 Apparatus and method for determining wireless network capacitance of mobile communication network
CN103257896A (en) * 2013-01-31 2013-08-21 南京理工大学连云港研究院 Max-D job scheduling method under cloud environment
CN105471655A (en) * 2015-12-09 2016-04-06 中国联合网络通信集团有限公司 Method and device for determining power-on or power-off state of physical device in virtual cluster
CN109714395A (en) * 2018-12-10 2019-05-03 平安科技(深圳)有限公司 Cloud platform resource uses prediction technique and terminal device

Also Published As

Publication number Publication date
CN110445939A (en) 2019-11-12

Similar Documents

Publication Publication Date Title
CN110445939B (en) Capacity resource prediction method and device
CN111614690B (en) Abnormal behavior detection method and device
CN109583904A (en) Training method, impaired operation detection method and the device of abnormal operation detection model
CN109711746A (en) A kind of credit estimation method and system based on complex network
CN111970400B (en) Crank call identification method and device
CN114721833A (en) Intelligent cloud coordination method and device based on platform service type
CN107832291A (en) Client service method, electronic installation and the storage medium of man-machine collaboration
CN109873790A (en) Network security detection method, device and computer readable storage medium
CN111626767B (en) Resource data issuing method, device and equipment
CN113205403A (en) Method and device for calculating enterprise credit level, storage medium and terminal
CN110910241B (en) Cash flow evaluation method, apparatus, server device and storage medium
CN117193992A (en) Model training method, task scheduling device and computer storage medium
CN113704389A (en) Data evaluation method and device, computer equipment and storage medium
CN117196630A (en) Transaction risk prediction method, device, terminal equipment and storage medium
CN113095515A (en) Service fault information processing method and device
CN109872183A (en) Intelligent Service evaluation method, computer readable storage medium and terminal device
CN115375965A (en) Preprocessing method for target scene recognition and target scene recognition method
CN113890948A (en) Resource allocation method based on voice outbound robot dialogue data and related equipment
KR20230075678A (en) Method and apparatus for predicting price change based on object recognition
CN109285559B (en) Role transition point detection method and device, storage medium and electronic equipment
CN113298641A (en) Integrity degree cognition method and device
CN113723663A (en) Power work order data processing method and device, electronic equipment and storage medium
CN111008078A (en) Batch processing method, device and equipment of data and computer storage medium
CN112905987B (en) Account identification method, device, server and storage medium
CN111178647A (en) Method, system and computer storage medium for pushing work order

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant