CN112508304A - Transaction object liquidity prediction method, device, equipment and storage medium - Google Patents


Info

Publication number
CN112508304A
CN112508304A (application CN202011531569.2A)
Authority
CN
China
Prior art keywords: data, transaction object, prediction, prediction model, liquidity
Prior art date
Legal status
Pending
Application number
CN202011531569.2A
Other languages
Chinese (zh)
Inventor
李伟石
勾朝臣
朱凯
黄炜
谢华雯
李嘉欣
朱沁心
Current Assignee
Shanghai Pudong Development Bank Co Ltd
Original Assignee
Shanghai Pudong Development Bank Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Pudong Development Bank Co Ltd
Priority to CN202011531569.2A
Publication of CN112508304A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q 10/067 Enterprise or organisation modelling
    • G06Q 40/04 Trading; Exchange, e.g. stocks, commodities, derivatives or currency exchange


Abstract

The embodiments of the specification provide a transaction object liquidity prediction method, apparatus, device, and storage medium, wherein the method comprises: acquiring to-be-processed data of a target transaction object; respectively inputting the to-be-processed data into each liquidity prediction model of a liquidity prediction model set to obtain a plurality of liquidity prediction sub-results, the liquidity prediction model set being generated through a pre-trained Bayesian neural network model; and performing a weighted average of the plurality of liquidity prediction sub-results according to the prediction weight of each liquidity prediction model to obtain a liquidity prediction result of the target transaction object. The embodiments of the specification can improve the accuracy of transaction object liquidity prediction.

Description

Transaction object liquidity prediction method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of big data processing technologies, and in particular, to a method, an apparatus, a device, and a storage medium for predicting transaction object liquidity.
Background
In many fields, such as engineering, meteorology, and finance, data are often presented as time series (i.e., a set of random variables ordered in time). Time-series data essentially reflect the tendency of one or more random variables to change over time. Because of the statistical dependency between data points, a time series is statistically regarded as one realization of some random process. Both the ordering and the magnitudes of the data in a time series carry information about the objective world and its changes, and represent a dynamic process. The point of time-series analysis is to study the statistical regularity (i.e., the trend of variation) of a series over its long-term evolution, so that future development can be predicted from that trend.
Taking liquidity prediction in the financial field as an example, Financial Time Series (FTS) analysis predicts the future development of a transaction object (e.g., a financial product) from its past liquidity trend. Currently, FTS analysis generally models the liquidity of the transaction object with a traditional linear or non-linear machine learning model (e.g., LSTM, XGBoost), and then predicts the liquidity of the transaction object with the resulting prediction model. However, in implementing the present application, the inventors found that because financial time series are full of noise, a prediction model obtained with the traditional modeling approach can hardly predict the liquidity of a transaction object accurately.
Disclosure of Invention
An object of the embodiments of the present specification is to provide a method, an apparatus, a device, and a storage medium for predicting liquidity of a transaction object, so as to improve accuracy of liquidity prediction of the transaction object.
To achieve the above object, in one aspect, an embodiment of the present specification provides a method for predicting liquidity of a transaction object, including:
acquiring data to be processed of a target transaction object;
respectively inputting the to-be-processed data into each liquidity prediction model of a liquidity prediction model set to obtain a plurality of liquidity prediction sub-results; the liquidity prediction model set is generated through a pre-trained Bayesian neural network model;
and performing a weighted average of the plurality of liquidity prediction sub-results according to the prediction weight of each liquidity prediction model to obtain a liquidity prediction result of the target transaction object.
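The weighted-average fusion in the last step can be sketched as follows. This is a minimal illustration only; the function name and the use of plain Python lists are assumptions for the sketch, not part of the claimed method.

```python
def fuse_predictions(sub_results, weights):
    """Weighted average of liquidity prediction sub-results.

    sub_results: per-model liquidity predictions for one transaction object.
    weights: per-model prediction weights (assumed non-negative).
    """
    total_weight = sum(weights)
    if total_weight == 0:
        raise ValueError("prediction weights must not sum to zero")
    return sum(r * w for r, w in zip(sub_results, weights)) / total_weight

# Example: three liquidity prediction models, the middle one weighted highest.
result = fuse_predictions([0.2, 0.4, 0.3], [1.0, 2.0, 1.0])
```

With the weights above, the fused prediction is (0.2·1 + 0.4·2 + 0.3·1) / 4 = 0.325.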
In an embodiment of the present specification, the pre-training step of the liquidity prediction model set includes:
acquiring a data set of a target transaction object; the data set is constructed on the basis of transaction data and external associated data in a designated historical period of a target transaction object;
calculating posterior distribution by using a training set in the data set; the posterior distribution is used for representing a Bayesian neural network model;
extracting a parameter vector from the posterior distribution, and generating a liquidity prediction model according to the parameter vector;
testing, with a test set in the data set, whether the liquidity prediction model meets a preset condition;
and when the liquidity prediction model does not meet the preset condition, updating the parameter vector in the posterior distribution according to a preset Bayesian inference method and iterating the computation until the currently generated liquidity prediction model meets the preset condition.
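The pre-training loop described by the steps above can be sketched as follows. All component functions here (the posterior sampler, the model builder, and the test predicate) are hypothetical stand-ins: the embodiments use a Bayesian neural network posterior and a real test set, whereas this sketch plugs in toy one-parameter substitutes.

```python
import random

def pretrain_liquidity_models(sample_posterior, build_model, passes_test,
                              n_models=3, max_iters=1000):
    """Illustrative skeleton of the pre-training loop: draw a parameter
    vector from the (approximate) posterior, generate a prediction model
    from it, test it, and iterate until the model meets the condition."""
    models = []
    for _ in range(n_models):
        for _ in range(max_iters):
            theta = sample_posterior()      # parameter vector from posterior
            model = build_model(theta)      # candidate liquidity model
            if passes_test(model):          # preset condition on test set
                models.append(model)
                break
        else:
            raise RuntimeError("no qualifying model within max_iters")
    return models

# Toy stand-ins (assumptions, not the patent's actual components):
random.seed(0)
models = pretrain_liquidity_models(
    sample_posterior=lambda: random.gauss(0.0, 1.0),
    build_model=lambda theta: (lambda x: theta * x),
    passes_test=lambda m: abs(m(1.0)) < 2.0,
)
```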
In an embodiment of the present specification, the Bayesian inference method includes a function-space Stein variational gradient descent (f-SVGD) algorithm.
In an embodiment of the present specification, the function-space Stein variational gradient descent algorithm includes:

$$\theta^{(s+1)} = \theta^{(s)} + \delta \left(\frac{\partial f_{\theta^{(s)}}(X)}{\partial \theta^{(s)}}\right)^{\!T} v\big(f_{\theta^{(s)}}(X)\big)$$

wherein $\theta^{(s+1)}$ denotes the parameters after the (s+1)-th sampling iteration, $\theta^{(s)}$ denotes the parameters after the s-th sampling iteration, $s$ denotes the index in the sampling sequence, $X$ denotes a training subset of the data, $\delta$ denotes the step size, $f_{\theta^{(s)}}(X)$ denotes the output function of the Bayesian neural network, $\partial f_{\theta^{(s)}}(X)/\partial \theta^{(s)}$ denotes the partial derivative of the output function of the Bayesian neural network evaluated at the extracted parameters, $T$ denotes matrix transposition, and $v\big(f_{\theta^{(s)}}(X)\big)$ denotes SVGD processing of the output function of the Bayesian neural network.
In an embodiment of this specification, the acquiring data to be processed of the target transaction object includes:
acquiring transaction data to be processed of a target transaction object and external associated data;
and extracting, from the transaction data and the external associated data, feature data matched with an input layer of the liquidity prediction model set to serve as the to-be-processed data of the target transaction object.
In an embodiment of the present specification, the Bayesian neural network model is a Bayesian deep learning model.
In an embodiment of the present specification, the target transaction object includes a fixed-term monetary wealth-management transaction object.
In another aspect, an embodiment of the present specification further provides a device for predicting liquidity of a transaction object, including:
the data acquisition module is used for acquiring data to be processed of the target transaction object;
the result obtaining module is used for respectively inputting the to-be-processed data into each liquidity prediction model of the liquidity prediction model set to obtain a plurality of liquidity prediction sub-results, the liquidity prediction model set being generated by a pre-trained Bayesian neural network model represented by a posterior distribution;
and the result fusion module is used for performing a weighted average of the plurality of liquidity prediction sub-results according to the prediction weight of each liquidity prediction model to obtain the liquidity prediction result of the target transaction object.
In another aspect, the embodiments of the present specification further provide a computer device including a memory, a processor, and a computer program stored in the memory; when executed by the processor, the computer program performs the instructions of the above method.
In another aspect, the embodiments of the present specification further provide a computer storage medium storing a computer program which, when executed by a processor of a computer device, performs the instructions of the above method.
As can be seen from the technical solutions provided in the embodiments of the present specification, because the liquidity prediction model set is generated by pre-training a Bayesian neural network model, the parameters of each liquidity prediction model in the set are no longer fixed values (e.g., an optimal value) but a conditional distribution of continuous random variables; that is, during prediction a more probable parameter is given a greater prediction weight and a less probable parameter a smaller one (rather than being ignored). By accounting for the uncertainty of the parameters in this way, the uncertainty of the prediction output can be estimated better, improving the accuracy of transaction object liquidity prediction.
Drawings
In order to more clearly illustrate the embodiments of the present specification or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, it is obvious that the drawings in the following description are only some embodiments described in the present specification, and for those skilled in the art, other drawings can be obtained according to the drawings without any creative effort. In the drawings:
FIG. 1 illustrates a prior-art training-output diagram based on conventional financial time-series modeling;
FIG. 2 is a schematic diagram of a training output based on Bayesian neural network modeling in an embodiment of the present description;
FIG. 3 illustrates a Bayesian neural network based model pre-training flow diagram in some embodiments of the present description;
FIG. 4 illustrates a flow diagram of a method for transaction object liquidity prediction in some embodiments of the present description;
FIG. 5 illustrates a liquidity prediction diagram based on a liquidity prediction model set in some embodiments of the present description;
FIG. 6 is a block diagram illustrating the structure of a transaction object liquidity prediction apparatus in some embodiments of the present disclosure;
FIG. 7 shows a block diagram of a computer device in accordance with some embodiments of the present disclosure.
[ description of reference ]
61. A data acquisition module;
62. a result obtaining module;
63. a result fusion module;
702. a computer device;
704. a processor;
706. a memory;
708. a drive mechanism;
710. an input/output module;
712. an input device;
714. an output device;
716. a presentation device;
718. a graphical user interface;
720. a network interface;
722. a communication link;
724. a communication bus.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present specification, the technical solutions in the embodiments of the present specification will be clearly and completely described below with reference to the drawings in the embodiments of the present specification, and it is obvious that the described embodiments are only a part of the embodiments of the present specification, and not all of the embodiments. All other embodiments obtained by a person skilled in the art based on the embodiments in the present specification without any inventive step should fall within the scope of protection of the present specification.
The transaction object in the embodiments of the present specification may refer to fixed-term monetary wealth-management products in a financial market (for example, money funds, bank money-market wealth-management products, deposit-type products, and the like). The transaction data corresponding to these transaction objects is typically a financial time series. Research shows that such financial time series are non-stationary, non-linear, and highly noisy; they may fluctuate with seasonality, noise, and self-correction, and the useful signal overlaps with the noise, i.e., the series carries a certain degree of uncertainty, so a prediction model obtained with a traditional modeling scheme can hardly predict the liquidity of the transaction object accurately. Moreover, because the terms of these transaction objects are fixed and relatively short (e.g., three months, six months, one year, three years, five years), the corresponding financial time series contain relatively few samples; a prediction model obtained with a traditional modeling scheme is then prone to overfitting, making accurate liquidity prediction even harder.
It should be understood that the transaction object in the embodiment of the present specification is not limited to the above definition and explanation, and in other embodiments of the present specification, the transaction object may be any other suitable goods or services, etc. as needed.
In view of this, to improve the accuracy of transaction object liquidity prediction, the embodiments of the present specification propose a new modeling scheme for transaction object liquidity, together with a new scheme for predicting the liquidity of a transaction object using the prediction models obtained by the new modeling scheme. For ease of understanding, the new modeling scheme is described first, before the embodiments that predict liquidity with the resulting prediction models.
A Bayesian Neural Network (BNN) technique is used in the transaction-object liquidity modeling process of the embodiments of the present specification. In some embodiments, a Bayesian Deep Learning (BDL) technique, which builds on Bayesian neural networks, is used for the liquidity modeling. For ease of understanding, the related techniques are described below. Those skilled in the art will appreciate, however, that liquidity modeling based on Bayesian deep learning is merely an illustrative example in this specification; in other embodiments, the liquidity modeling of the transaction object may also be implemented with other Bayesian neural networks, which is not limited herein and may be selected as needed.
The parameters or particles mentioned in the following embodiments of the present specification generally refer to the network parameters (e.g., weights and biases) of a neural network (e.g., a Bayesian neural network). Moreover, in modeling and prediction scenarios based on Bayesian neural networks, both parameters and particles refer to distributions (i.e., conditional distributions of continuous random variables) rather than fixed values.
Compared with traditional deep learning, Bayesian deep learning is closer to the human reasoning process: it explicitly represents prior knowledge and the uncertainty in empirical observations, computational models, and data. Liquidity prediction can be cast as a supervised learning problem: the training set is denoted $\{(x_1,y_1),(x_2,y_2),\dots,(x_N,y_N)\}$, where $x_n$ is an input and $y_n$ the corresponding output. Let $X=\{x_1,\dots,x_N\}$ and $y=\{y_1,\dots,y_N\}$. The goal is to learn a neural network $f_\theta$ with parameters $\theta$ such that, for new test data $x$, $f_\theta(x)$ predicts its label well.
The traditional machine learning method is equivalent to the following probabilistic modeling (written here with a Gaussian observation model, consistent with the least-squares objective below):

$$p(y_n \mid \theta; x_n) = \mathcal{N}\big(y_n \mid f_\theta(x_n), \sigma^2\big)$$

so that

$$\log p(y \mid \theta; X) = -\frac{1}{2\sigma^2}\sum_{n=1}^{N}\big(y_n - f_\theta(x_n)\big)^2 + C$$

where $C$ is a constant independent of $\theta$. A common training objective function is therefore

$$\min_\theta \sum_{n=1}^{N}\big(y_n - f_\theta(x_n)\big)^2$$

which, under the above modeling, is equivalent to $\max_\theta p(y \mid \theta; X)$. Here $p(y \mid \theta; X)$ is called the likelihood function of $\theta$, and this way of estimating the parameter $\theta$ is called maximum-likelihood estimation. The model training and prediction process is: first solve $\theta^* = \arg\max_\theta p(y \mid \theta; X)$, then take $p(y \mid \theta^*; x_{test})$ as the predictive distribution of the test data $x_{test}$ and $f_{\theta^*}(x_{test})$ as the predicted value.
The above method belongs to the frequentist school of machine learning. That school assumes that although the parameter values in the probabilistic model are unknown, there exists a true parameter $\theta$ that generated the data set, and therefore a specific value of this parameter should be inferred (e.g., as shown in FIG. 1). The Bayesian school holds instead that the true value of $\theta$ cannot be determined; one can only hold a prior belief about $\theta$ and update that belief with the observed data. The result of Bayesian inference is therefore not an optimal value of $\theta$ but a posterior distribution over $\theta$ (e.g., as shown in FIG. 2). For this reason, the Bayesian approach performs better at estimating predictive uncertainty.
Specifically, a prior distribution $p_0(\theta)$ may be defined for $\theta$. The model training process then computes the posterior distribution according to Bayes' rule:

$$p(\theta \mid y; X) = \frac{p_0(\theta)\, p(y \mid \theta; X)}{\int p_0(\theta)\, p(y \mid \theta; X)\, d\theta} \tag{1}$$

The prediction process is: given test data $x_{test}$, average the predictive distribution $p(y \mid \theta; x_{test})$ over the parameter values, weighted by the posterior $p(\theta \mid y; X)$:

$$p(y \mid y; x_{test}, X) = \int p(y \mid \theta; x_{test})\, p(\theta \mid y; X)\, d\theta \tag{2}$$

The mean $\mathbb{E}[y \mid y; x_{test}, X]$ of this distribution can then be taken as the predicted value.
Although exact Bayesian inference brings the above benefits, the integrals in equations (1) and (2) are intractable, so approximate Bayesian inference methods must be developed to make the computation feasible. A better Bayesian inference method approximates the true posterior more accurately, and a more accurate approximation in turn yields a more accurate estimate of the Bayesian predictive distribution (i.e., equation (2)). In the transaction-object application scenario of the embodiments of the present description, a more reasonable approximate Bayesian inference method is adopted, so the advantages of Bayesian deep learning can be realized more fully.
As described above, the Bayesian inference process amounts to computing equations (1) and (2). They can be approximated by sampling:

$$\theta^{(1)}, \theta^{(2)}, \dots, \theta^{(S)} \sim p(\theta \mid y; X) \tag{3}$$

$$p(y \mid y; x_{test}, X) \approx \frac{1}{S}\sum_{s=1}^{S} p\big(y \mid \theta^{(s)}; x_{test}\big) \tag{4}$$

The key is therefore equation (3): how to obtain approximate samples from the posterior distribution $p(\theta \mid y; X)$. This posterior is abbreviated as $p(\theta)$ below.
In some embodiments of the present description, conventional Bayesian inference methods may be selected for the Bayesian inference in transaction-object liquidity modeling, for example Hamiltonian Monte Carlo (HMC) or Stein variational gradient descent (SVGD).
(I) HMC algorithm
The HMC algorithm is one of the most commonly used traditional gradient-based Markov chain Monte Carlo (MCMC) algorithms. Its principle is to extend the sampled variable $\theta$ (also called a particle) into a pair $(\theta, r)$ with target distribution $p(\theta, r) = p(\theta)\,p(r)$, where $p(r) = \mathcal{N}(r \mid 0, M)$ is an auxiliary momentum distribution. The following differential equations are then simulated:

$$\frac{d\theta}{dt} = M^{-1} r, \qquad \frac{dr}{dt} = \nabla_\theta \log p(\theta)$$

and $r$ must periodically be re-sampled from $\mathcal{N}(0, M)$. Compared with general random-walk MCMC algorithms, HMC exploits gradient information, so it can traverse the whole space efficiently and converge to the stationary distribution efficiently.
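A minimal runnable sketch of the HMC idea above, under simplifying assumptions: the target is $p(\theta)=\mathcal{N}(0,1)$, the mass matrix $M$ is the identity, the differential equations are discretized with the leapfrog scheme, and the Metropolis accept/reject correction is omitted. This is an illustration of the dynamics, not a production sampler.

```python
import random

def hmc_samples(n_samples, n_leapfrog=20, step=0.1, seed=0):
    """Leapfrog-discretized Hamiltonian dynamics for target p(theta)=N(0,1),
    so grad log p(theta) = -theta; the momentum r is re-sampled at the
    start of every trajectory. (Metropolis correction omitted for brevity.)"""
    rng = random.Random(seed)
    theta, samples = 0.0, []
    for _ in range(n_samples):
        r = rng.gauss(0.0, 1.0)              # periodic momentum re-sampling
        for _ in range(n_leapfrog):          # simulate the dynamics
            r += 0.5 * step * (-theta)       # half-step on dr/dt = grad log p
            theta += step * r                # full step on dtheta/dt = r
            r += 0.5 * step * (-theta)
        samples.append(theta)
    return samples

samples = hmc_samples(2000)
mean = sum(samples) / len(samples)
```

With a standard-normal target, the empirical mean of the chain should sit near 0 and the empirical variance near 1.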
(II) SVGD algorithm
SVGD and HMC are essentially two different kinds of Bayesian inference algorithms, and the idea of SVGD is more involved than that of HMC. Specifically, HMC evolves each particle in a set independently according to the same differential equation, and the particle values are independent of one another; SVGD instead uses a group of particles to represent a distribution, and the aim of the algorithm is to bring this distribution ever closer to the true posterior, so that the particle set tends to become a good representation of the posterior distribution. The SVGD update is:

$$\theta^{(s)} \leftarrow \theta^{(s)} + \epsilon\, v\big(\theta^{(s)}\big)$$

where

$$v(\theta) = \frac{1}{S}\sum_{t=1}^{S}\Big[k\big(\theta^{(t)},\theta\big)\,\nabla_{\theta^{(t)}}\log p\big(\theta^{(t)}\big) + \nabla_{\theta^{(t)}} k\big(\theta^{(t)},\theta\big)\Big]$$

and $k$ is a kernel function. As can be seen, the iteration direction of particle $\theta^{(s)}$ contains the term $\nabla_{\theta^{(t)}} k\big(\theta^{(t)},\theta^{(s)}\big)$, whose role is to provide a repulsive force that keeps each particle as far as possible from the others. Whereas the particles in HMC are independent of each other, the particles in SVGD maintain diversity across iterations, so SVGD can represent the posterior distribution more accurately and is expected to reduce the variance of the estimate in equation (4).
In some embodiments of the present description, an originally created Bayesian inference method may also be used for the Bayesian inference in transaction-object liquidity modeling. This inventive method is referred to in the present specification as function-space Stein variational gradient descent (f-SVGD).
In carrying out the present application, the inventors found that when the SVGD algorithm is applied to a Bayesian neural network (including BDL), the over-parameterization of neural networks exposes a clear shortcoming. Over-parameterization means that neural networks with different parameters may express the same or similar functions. Thus, although SVGD enforces diversity among the parameter particles, diversity in the space of network functions is not guaranteed, making it hard to fully realize the gains in prediction accuracy and uncertainty estimation that Bayesian neural networks promise. f-SVGD departs from the traditional approach of seeking the posterior distribution in parameter space: the SVGD approximation of the posterior is instead carried out in function space, i.e., each particle is actually a function. Since function space is infinite-dimensional, f-SVGD still represents each particle with a neural network, but the update of the network parameters differs markedly from SVGD, as shown in the following equation:

$$\theta^{(s+1)} = \theta^{(s)} + \delta \left(\frac{\partial f_{\theta^{(s)}}(X)}{\partial \theta^{(s)}}\right)^{\!T} v\big(f_{\theta^{(s)}}(X)\big)$$

wherein $\theta^{(s+1)}$ denotes the parameters after the (s+1)-th sampling iteration, $\theta^{(s)}$ denotes the parameters after the s-th sampling iteration, $s$ denotes the index in the sampling sequence, $X$ denotes a training subset of the data, $\delta$ denotes the step size, $f_{\theta^{(s)}}(X)$ denotes the output function of the Bayesian neural network, $\partial f_{\theta^{(s)}}(X)/\partial \theta^{(s)}$ denotes the partial derivative of the output function of the Bayesian neural network evaluated at the extracted parameters, $T$ denotes matrix transposition, and $v\big(f_{\theta^{(s)}}(X)\big)$ denotes SVGD processing of the output function of the Bayesian neural network. Specifically, $v\big(f_{\theta^{(s)}}(X)\big)$ can be obtained from the function $v\big(\theta^{(s)}\big)$ in the SVGD description above by replacing $\theta^{(s)}$ with $f_{\theta^{(s)}}(X)$ and $\theta^{(t)}$ with $f_{\theta^{(t)}}(X)$.
The following describes the transaction-object liquidity modeling (i.e., the model pre-training process for the liquidity prediction model set) of the embodiments of the present disclosure with reference to the drawings. A model pre-training process based on a Bayesian neural network in some embodiments of the present description is shown in FIG. 3.
In the embodiments of the present specification, the target transaction object is the target of liquidity prediction. After the target transaction object is determined, a data set can be constructed based on its data within a specified historical period, which include transaction data and external associated data. The transaction data may include the financial time series of the target transaction object, data related to the transaction subjects (i.e., customers), and the like. The external associated data are external data that may affect the liquidity of the transaction object; for example, in an embodiment of the present disclosure, they may include, but are not limited to, a securities index (e.g., the Shanghai 300 index) and interest rates (e.g., the Shibor rate, the Bingo rate).
By performing feature engineering on the transaction data and the external associated data, various feature data related to the transaction object can be extracted. For example, in an embodiment of the present specification, the extracted features may include time features (e.g., transaction time), customer behavior features (e.g., investment preference, consumption behavior, savings behavior), and redemption features (i.e., redemption habits). The feature data cover not only the attribute features of the transaction objects (such as income, volume, and risk level) and of the transaction subjects (trading behavior, asset characteristics, investment preference), but also the attribute features of the external associated data that may affect the liquidity of the transaction objects, and can therefore provide comprehensive data support for subsequent modeling.
From the feature data, a feature matrix comprising a plurality of feature dimensions can be constructed, forming a data set. The data set is then divided according to a set splitting strategy to obtain the training set and the test set.
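The data-set split in the last step can be sketched as follows. One assumption is made for the sketch: since the data form a financial time series, a chronological split is used, so the test rows come strictly after the training rows in time; the specification itself does not prescribe a particular splitting strategy.

```python
def chronological_split(feature_rows, train_fraction=0.8):
    """Split a time-ordered feature matrix into training and test sets,
    keeping the test rows strictly after the training rows in time."""
    if not 0.0 < train_fraction < 1.0:
        raise ValueError("train_fraction must lie in (0, 1)")
    cut = int(len(feature_rows) * train_fraction)
    return feature_rows[:cut], feature_rows[cut:]

rows = [[day, day * 0.1] for day in range(10)]  # toy feature matrix
train, test = chronological_split(rows)
```

A chronological split avoids leaking future information into training, which a random shuffle would do for time-series data.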
Of course, to improve training efficiency and help reduce overfitting during training, a prior distribution for the transaction object can be constructed from common sense and expert knowledge related to the transaction object. This prior distribution refers to the prior distribution of the transaction object's liquidity. A prior distribution is the distribution form $p(x)$ representing the prior information about an unknown parameter $x$; it can be understood as an empirical inference about a cause, made from known information (relevant common sense and expertise) before the sample is drawn.
For example, suppose Zhang San can walk, cycle, or drive to a place 10 km away, and everyone knows that Zhang San is a health-conscious person. Then Zhang San is less likely to drive there and more likely to walk, which is a prior distribution obtained from our common knowledge.
On the basis of obtaining the prior distribution, the posterior distribution can be calculated accordingly. The posterior distribution refers to the posterior distribution of liquidity of the transaction object. It has been elucidated above that prior to taking a sample, one knows about the unknown parameter as an a priori distribution. Then after the sample is taken, a priori distribution is obtained because the sample will typically contain new information about the unknown parameters, which may help one to correct the a priori distribution before sampling. Specifically, the conditional probability distribution of the unknown parameter can be calculated according to the extracted known sample information and the prior distribution of the unknown parameter. This distribution is called a posterior distribution because it is obtained after sampling.
Continuing the example in which Zhang San exercises by walking (including running), cycling, or driving to a place 10 km away, the prior distribution is: driving there is unlikely, and walking there is likely. In this setting, the mode of transport (walking, cycling, or driving) can be regarded as the cause, and the time spent as the result. If Zhang San takes an hour to arrive, he most likely cycled; he is less likely to have run, and even less likely to have driven through severe traffic congestion. If he takes two hours to arrive, he presumably walked. If he takes fifteen minutes to arrive, he probably drove.
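The travel example can be made concrete with a one-line Bayesian update. The numbers below are illustrative assumptions only (they do not come from the specification): a prior over the transport mode from common sense, and an assumed likelihood of a one-hour arrival under each mode.

```python
# Illustrative Bayesian update for the travel example above.
# All probabilities are assumed values, chosen only for illustration.

# Prior over the "cause" (mode of transport), from common sense:
prior = {"walk": 0.5, "bicycle": 0.4, "drive": 0.1}

# Likelihood of the "result" (arriving in about one hour) given each mode,
# for a 10 km trip -- assumed values:
likelihood_one_hour = {"walk": 0.05, "bicycle": 0.8, "drive": 0.1}

# Posterior is proportional to prior times likelihood (Bayes' rule):
unnorm = {m: prior[m] * likelihood_one_hour[m] for m in prior}
total = sum(unnorm.values())
posterior = {m: p / total for m, p in unnorm.items()}
```

The result matches the intuition in the text: observing a one-hour arrival shifts almost all of the probability onto cycling.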
As shown in fig. 3, in the application scenario of the transaction object in the embodiment of the present specification, it is assumed that the initial Bayesian neural network model is represented by a posterior distribution (i.e., the prior distribution) over the parameter θ^(s). Here, θ^(s) denotes a parameter vector θ^(s) = (θ_1^(s), θ_2^(s), ..., θ_d^(s)), and θ^(s) is a distribution rather than a deterministic single value. In each round of training, a portion of the samples may be randomly drawn (e.g., by hierarchical random sampling) from the training set as a training subset and input into the initial Bayesian neural network model (in fig. 3, the posterior distribution over the parameter vector θ^(s)), and a posterior distribution is computed from the training subset. That is, taking the sample information of the training subset as the known condition, the posterior distribution can be calculated using equation (1) above.
A parameter vector θ^(s) is then extracted from the computed posterior distribution, and a liquidity prediction model is generated from it; that is, each component θ_1^(s), θ_2^(s), ..., θ_d^(s) of the generated model's parameter vector is drawn from the posterior distribution of the parameters (as shown in the dashed-box portion of fig. 3).
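The hierarchical (stratified) random sampling mentioned above, for drawing a training subset that preserves the label mix of the full training set, can be sketched as follows. The sample layout, field names, and the 20% fraction are hypothetical.

```python
import random
from collections import defaultdict

def stratified_subset(samples, label_key, fraction, seed=None):
    """Draw a stratified random subset: sample the same fraction from
    each label group so the subset keeps the overall label proportions."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    for s in samples:
        groups[s[label_key]].append(s)
    subset = []
    for group in groups.values():
        k = max(1, round(len(group) * fraction))
        subset.extend(rng.sample(group, k))
    return subset

# Hypothetical training set: each sample has features and a liquidity label.
training_set = [{"features": [i], "label": i % 2} for i in range(100)]
subset = stratified_subset(training_set, "label", 0.2, seed=42)
```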
On this basis, samples in the test set can be used to test whether the liquidity prediction model (shown in the dashed-box portion of fig. 3) satisfies a preset condition. Specifically, the liquidity prediction result may be obtained by taking a weighted average of the liquidity prediction sub-results predicted by the liquidity prediction models (e.g., liquidity prediction sub-result 1 to liquidity prediction sub-result n in fig. 3). Of course, in the embodiments of the present description, the liquidity prediction result and each liquidity prediction sub-result are also distributions rather than single values. The liquidity prediction result can then be evaluated to determine, according to the evaluation result, whether it satisfies the preset condition. The preset condition may include index values of one or more evaluation indexes for evaluating model quality. The evaluation indexes may include any one or more of Accuracy, Confusion Matrix, Precision, Recall, and the like.
In the embodiment of the present specification, when the liquidity prediction model satisfies the preset condition, the currently obtained liquidity prediction models (as shown by the dashed box in fig. 3) may be output as a liquidity prediction model set for subsequent online application. When the liquidity prediction model does not satisfy the preset condition, the parameter vector of the posterior distribution is updated according to a preset Bayesian inference method and the calculation is iterated until the currently generated liquidity prediction model satisfies the preset condition.
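As a rough sketch of how the preset condition could be checked against the evaluation indexes named above (Accuracy, Precision, Recall), assuming a binary labelling of the test samples; the labels and threshold values here are illustrative.

```python
def evaluate(y_true, y_pred):
    """Accuracy, precision, and recall for a binary prediction task."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return {"accuracy": accuracy, "precision": precision, "recall": recall}

def meets_preset_condition(metrics, thresholds):
    """The preset condition is modelled here as minimum index values;
    the actual thresholds are application-specific assumptions."""
    return all(metrics[name] >= value for name, value in thresholds.items())

metrics = evaluate([1, 0, 1, 1, 0], [1, 0, 0, 1, 0])
ok = meets_preset_condition(metrics, {"accuracy": 0.7, "recall": 0.6})
```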
It has been stated hereinabove that in the pre-training of the embodiments of the present specification, either a traditional Bayesian inference method such as HMC or SVGD may be used, or the proposed function-space f-SVGD Bayesian inference method may be used. The f-SVGD method addresses the difficulty SVGD has in ensuring diversity in the function space of the resulting Bayesian neural networks, and can therefore better deliver the prediction-accuracy improvement and accurate uncertainty estimation that Bayesian neural networks offer.
Based on the model obtained by the pre-training, the present specification provides embodiments of a transaction object liquidity prediction method, which can be applied to any suitable computer device. Referring to fig. 4, in some embodiments of the present description, the method for predicting liquidity of transaction objects may include the following steps:
S401, acquiring data to be processed of the target transaction object.
S402, respectively inputting the data to be processed into each liquidity prediction model of the liquidity prediction model set to obtain a plurality of liquidity prediction sub-results; the liquidity prediction model set is generated by pre-training a Bayesian neural network model.
S403, carrying out a weighted average of the plurality of liquidity prediction sub-results according to the prediction weight of each liquidity prediction model to obtain a liquidity prediction result of the target transaction object.
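Steps S401 to S403 can be sketched as follows. The three stand-in models and their prediction weights are hypothetical; in the described method, each model would be drawn from the posterior distribution during pre-training and the weights generated automatically.

```python
def predict_liquidity(data, model_set, weights):
    """S402: query every liquidity prediction model in the set;
    S403: fuse the sub-results by a weighted average."""
    sub_results = [model(data) for model in model_set]          # S402
    total_weight = sum(weights)
    return sum(w * r for w, r in zip(weights, sub_results)) / total_weight  # S403

# Hypothetical stand-ins for models drawn from the posterior distribution:
model_set = [lambda x: 0.8 * x, lambda x: 1.0 * x, lambda x: 1.2 * x]
weights = [0.2, 0.5, 0.3]  # prediction weights produced by pre-training

result = predict_liquidity(10.0, model_set, weights)  # -> 10.2
```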
In the embodiment of the present specification, since the liquidity prediction model set is generated by pre-training the Bayesian neural network model, the parameters of each liquidity prediction model in the set are no longer determined values (for example, optimal point estimates) but conditional distributions of continuous random variables; that is, during prediction a more likely parameter is given a greater prediction weight, and a less likely parameter is given a smaller prediction weight (rather than being ignored). Thus, by accounting for parameter uncertainty, the uncertainty of the prediction output can be better estimated, and the accuracy of the liquidity prediction of the trading object is improved.
In some embodiments of the present specification, the acquiring the to-be-processed data of the target transaction object may include:
(1) Acquiring the to-be-processed transaction data of the target transaction object and external associated data.
The target transaction object, the transaction data, and the external associated data may refer to the description in the relevant parts above and are not described again here.
(2) Extracting, from the transaction data and the external associated data, feature data matched with an input layer of the liquidity prediction model set to serve as the data to be processed of the target transaction object.
In some embodiments of the present description, extracting feature data matched with an input layer of the liquidity prediction model set means that the extracted feature data matches the feature data used during pre-training. For example, if the feature data in pre-training includes feature values of five dimensions a, b, c, d, and e, then in actual prediction the data extracted from the transaction data and the external associated data should also include the feature values of those five dimensions.
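A minimal sketch of this matching step, using the five dimensions a to e from the example; the record contents and the strict missing-dimension policy are assumptions.

```python
# Keep only the feature dimensions that the model set's input layer was
# trained on. Dimension names a-e follow the example in the text; the
# record contents below are hypothetical.

INPUT_DIMENSIONS = ["a", "b", "c", "d", "e"]  # fixed at pre-training time

def extract_features(transaction_data, external_data, dims=INPUT_DIMENSIONS):
    """Merge the two sources and project onto the trained dimensions.
    Raises if a required dimension is missing rather than guessing."""
    merged = {**transaction_data, **external_data}
    missing = [d for d in dims if d not in merged]
    if missing:
        raise KeyError(f"missing feature dimensions: {missing}")
    return [merged[d] for d in dims]

features = extract_features(
    {"a": 1.2, "b": 0.4, "c": 7.0, "f": 99.0},  # extra dimension f is dropped
    {"d": 3.3, "e": 0.9},
)
```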
In the embodiment of the present specification, the prediction weight of each liquidity prediction model in the liquidity prediction model set may represent the importance (or contribution) of that model to the prediction within the entire set. As stated above, a more likely parameter is given a greater prediction weight, while a less likely parameter is given a smaller prediction weight (rather than being ignored). It should also be noted that, in the embodiment of the present specification, the prediction weight of each liquidity prediction model is generally generated automatically during pre-training rather than being manually specified, so that the weight setting better matches the real situation, which is beneficial to further improving prediction accuracy.
For example, in the embodiment shown in fig. 5, the liquidity prediction model set includes m liquidity prediction models corresponding to posterior distributions 1 to m. When the same batch of feature data is input into the m liquidity prediction models, m liquidity prediction sub-results are obtained (e.g., liquidity prediction sub-result 1 to liquidity prediction sub-result m in fig. 5). On this basis, the m liquidity prediction sub-results can be weighted and averaged according to the prediction weight of each liquidity prediction model to obtain a fused liquidity prediction result.
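Since each liquidity prediction sub-result is itself a distribution rather than a single value, a fusion step must combine uncertainties as well as means. The sketch below assumes each sub-result is summarized as a (mean, std) pair and fuses them as a weighted mixture via the law of total variance; the specification does not prescribe this particular summarization.

```python
import math

def fuse_sub_results(sub_results, weights):
    """Each sub-result is a (mean, std) pair rather than a point value.
    The fused mean is the weighted average; the fused variance combines
    each model's own variance with the spread between the model means
    (law of total variance over the model mixture)."""
    total = sum(weights)
    w = [x / total for x in weights]
    mean = sum(wi * m for wi, (m, _) in zip(w, sub_results))
    var = sum(wi * (s ** 2 + (m - mean) ** 2) for wi, (m, s) in zip(w, sub_results))
    return mean, math.sqrt(var)

# Hypothetical sub-results from m = 3 models: (predicted mean, predicted std)
sub_results = [(10.0, 1.0), (11.0, 1.5), (9.5, 0.8)]
weights = [0.5, 0.3, 0.2]
fused_mean, fused_std = fuse_sub_results(sub_results, weights)
```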
In some embodiments of the present description, the pre-training step of the liquidity prediction model set includes:
acquiring a data set of a target transaction object; the data set is constructed on the basis of transaction data and external associated data within a designated historical period of the target transaction object;
calculating a posterior distribution by using a training set in the data set; the posterior distribution is used for representing a Bayesian neural network model;
extracting a parameter vector from the posterior distribution, and generating a liquidity prediction model according to the parameter vector;
testing whether the liquidity prediction model satisfies a preset condition by using a test set in the data set;
and when the liquidity prediction model does not satisfy the preset condition, updating the parameter vector of the posterior distribution according to a preset Bayesian inference method and iterating the calculation until the currently generated liquidity prediction model satisfies the preset condition.
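The pre-training loop above can be sketched end-to-end on a toy problem. Everything here is a simplification and an assumption: the "posterior" is a Gaussian over a single slope parameter, the "preset condition" is a test-loss threshold, and the gradient nudge on the posterior mean stands in for the Bayesian inference update.

```python
import random

def pretrain(train_set, test_set, threshold, steps=200, seed=0):
    """Toy version of the loop: sample a parameter from the posterior,
    build a model y = theta * x, test it against the preset condition,
    and otherwise refine the posterior before the next iteration."""
    rng = random.Random(seed)
    mean, std = 0.0, 1.0  # initial (prior) posterior over theta

    def loss(theta, data):
        return sum((theta * x - y) ** 2 for x, y in data) / len(data)

    for _ in range(steps):
        theta = rng.gauss(mean, std)             # extract a parameter vector

        def model(x, t=theta):                   # generate a prediction model
            return t * x

        if loss(theta, test_set) < threshold:    # preset condition on test set
            return model, theta
        # Stand-in for the Bayesian inference update: move the posterior
        # mean toward lower training loss and shrink its uncertainty.
        grad = sum(2 * x * (mean * x - y) for x, y in train_set) / len(train_set)
        mean -= 0.05 * grad
        std = max(0.05, std * 0.9)
    raise RuntimeError("preset condition not met within the step budget")

train = [(x, 2.0 * x) for x in range(1, 6)]   # noiseless toy data, slope 2
test = [(x, 2.0 * x) for x in range(6, 9)]
model, theta = pretrain(train, test, threshold=0.5)
```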
In some embodiments of the present description, the Bayesian inference method comprises a function-space Stein variational gradient descent (f-SVGD) algorithm.
In some embodiments of the present description, the function-space Stein variational gradient descent algorithm includes:
θ^(s+1) = θ^(s) + δ · [∂f(X; θ^(s)) / ∂θ^(s)]^T · v(f(X; θ^(s)))
wherein θ^(s+1) denotes the parameter after the (s+1)-th sampling iteration; θ^(s) denotes the parameter after the s-th sampling iteration; s denotes the index in the sampling sequence; X denotes a training subset of the data; δ denotes the step length; f(X; θ^(s)) denotes the output function of the Bayesian neural network; ∂f(X; θ^(s))/∂θ^(s) denotes the partial derivative of the Bayesian neural network output function evaluated at the extracted parameter; T denotes matrix transposition; and v(f(X; θ^(s))) denotes the SVGD update of the Bayesian neural network output function.
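The v(·) term above is the SVGD update direction. As an illustration of the underlying particle update — here in parameter space on a toy one-dimensional target; f-SVGD applies the same kind of update to the network outputs f(X; θ) and maps it back through the Jacobian transpose — a minimal sketch, with an assumed RBF kernel and median-heuristic bandwidth:

```python
import math

def rbf_kernel(a, b, h):
    return math.exp(-((a - b) ** 2) / h)

def svgd_step(particles, grad_log_p, step=0.1):
    """One SVGD update: each particle moves along the kernel-weighted
    gradient of log p (attraction to high density) plus a kernel-gradient
    term (repulsion that keeps the particle set diverse)."""
    n = len(particles)
    # Median heuristic for the kernel bandwidth:
    dists = sorted(abs(a - b) ** 2 for a in particles for b in particles)
    med = dists[len(dists) // 2]
    h = med / math.log(n + 1) if med > 0 else 1.0
    updated = []
    for xi in particles:
        phi = 0.0
        for xj in particles:
            k = rbf_kernel(xj, xi, h)
            phi += k * grad_log_p(xj)        # driving term toward high density
            phi += k * 2.0 * (xi - xj) / h   # repulsive term for diversity
        updated.append(xi + step * phi / n)
    return updated

# Toy target: a standard normal posterior, so grad log p(x) = -x.
particles = [-3.0, -1.0, 0.5, 2.0, 4.0]
for _ in range(200):
    particles = svgd_step(particles, lambda x: -x)
```

After the loop the particles approximate a spread of samples from the target rather than collapsing to its mode — the diversity property that f-SVGD carries over to function space.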
In some embodiments of the present specification, the acquiring data to be processed of the target transaction object includes:
acquiring transaction data to be processed of a target transaction object and external associated data;
and extracting feature data matched with an input layer of the liquidity prediction model set from the transaction data and the external associated data to serve as the data to be processed of the target transaction object.
In some embodiments of the present description, the bayesian neural network model is a bayesian deep learning model.
In some embodiments of the present description, the target transaction object comprises a fixed-term money-class financing transaction object.
While the process flows described above include operations that occur in a particular order, it should be appreciated that the processes may include more or fewer operations, performed sequentially or in parallel (e.g., using parallel processors or a multi-threaded environment).
In correspondence with the above transaction object liquidity prediction method, the present specification provides an embodiment of a transaction object liquidity prediction apparatus. As shown in fig. 6, in some embodiments of the present specification, the transaction object liquidity prediction apparatus may include:
a data acquisition module 61, configured to acquire data to be processed of a target transaction object;
a result obtaining module 62, configured to input the data to be processed into each liquidity prediction model of the liquidity prediction model set, respectively, to obtain a plurality of liquidity prediction sub-results; the liquidity prediction model set is generated by a pre-trained Bayesian neural network model represented by a posterior distribution;
a result fusion module 63, configured to perform a weighted average of the plurality of liquidity prediction sub-results according to the prediction weight of each liquidity prediction model to obtain a liquidity prediction result of the target transaction object.
In some embodiments of the present description, the pre-training step of the liquidity prediction model set includes:
acquiring a data set of a target transaction object; the data set is constructed on the basis of transaction data and external associated data within a designated historical period of the target transaction object;
calculating a posterior distribution by using a training set in the data set; the posterior distribution is used for representing a Bayesian neural network model;
extracting a parameter vector from the posterior distribution, and generating a liquidity prediction model according to the parameter vector;
testing whether the liquidity prediction model satisfies a preset condition by using a test set in the data set;
and when the liquidity prediction model does not satisfy the preset condition, updating the parameter vector of the posterior distribution according to a preset Bayesian inference method and iterating the calculation until the currently generated liquidity prediction model satisfies the preset condition.
In some embodiments of the present description, the Bayesian inference method comprises a function-space Stein variational gradient descent (f-SVGD) algorithm.
In some embodiments of the present description, the function-space Stein variational gradient descent algorithm includes:
θ^(s+1) = θ^(s) + δ · [∂f(X; θ^(s)) / ∂θ^(s)]^T · v(f(X; θ^(s)))
wherein θ^(s+1) denotes the parameter after the (s+1)-th sampling iteration; θ^(s) denotes the parameter after the s-th sampling iteration; s denotes the index in the sampling sequence; X denotes a training subset of the data; δ denotes the step length; f(X; θ^(s)) denotes the output function of the Bayesian neural network; ∂f(X; θ^(s))/∂θ^(s) denotes the partial derivative of the Bayesian neural network output function evaluated at the extracted parameter; T denotes matrix transposition; and v(f(X; θ^(s))) denotes the SVGD update of the Bayesian neural network output function.
In some embodiments of the present specification, the acquiring data to be processed of the target transaction object includes:
acquiring transaction data to be processed of a target transaction object and external associated data;
and extracting feature data matched with an input layer of the liquidity prediction model set from the transaction data and the external associated data to serve as the data to be processed of the target transaction object.
In some embodiments of the present description, the bayesian neural network model is a bayesian deep learning model.
In some embodiments of the present description, the target transaction object comprises a fixed-term money-class financing transaction object.
In the embodiment of the present specification, since the liquidity prediction model set is generated by pre-training the Bayesian neural network model, the parameters of each liquidity prediction model in the set are no longer determined values (for example, optimal point estimates) but conditional distributions of continuous random variables; that is, during prediction a more likely parameter is given a greater prediction weight, and a less likely parameter is given a smaller prediction weight (rather than being ignored). Thus, by accounting for parameter uncertainty, the uncertainty of the prediction output can be better estimated, and the accuracy of the liquidity prediction of the trading object is improved. Moreover, in the embodiments of the present specification, a prior distribution of the transaction object may be constructed based on common sense and expert knowledge about the transaction object, to serve as the initial posterior distribution representing the initial Bayesian neural network model, thereby improving training efficiency and helping reduce overfitting during training.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functions of the various elements may be implemented in the same one or more software and/or hardware implementations of the present description.
Embodiments of the present description also provide a computer device. As shown in FIG. 7, in some embodiments of the present description, the computer device 702 may include one or more processors 704, such as one or more Central Processing Units (CPUs) or Graphics Processors (GPUs), each of which may implement one or more hardware threads. The computer device 702 may also include any memory 706 for storing any kind of information, such as code, settings, data, etc., and in a particular embodiment, a computer program on the memory 706 and executable on the processor 704, which computer program when executed by the processor 704 may perform instructions according to the above-described method. For example, and without limitation, the memory 706 can include any one or more of the following in combination: any type of RAM, any type of ROM, flash memory devices, hard disks, optical disks, etc. More generally, any memory may use any technology to store information. Further, any memory may provide volatile or non-volatile retention of information. Further, any memory may represent fixed or removable components of computer device 702. In one case, when the processor 704 executes associated instructions that are stored in any memory or combination of memories, the computer device 702 can perform any of the operations of the associated instructions. The computer device 702 also includes one or more drive mechanisms 708, such as a hard disk drive mechanism, an optical disk drive mechanism, or the like, for interacting with any memory.
Computer device 702 can also include an input/output module 710 (I/O) for receiving various inputs (via input device 712) and for providing various outputs (via output device 714). One particular output mechanism may include a presentation device 716 and an associated graphical user interface 718 (GUI). In other embodiments, the input/output module 710 (I/O), input device 712, and output device 714 may be omitted, the device then acting only as one computer device in a network. Computer device 702 can also include one or more network interfaces 720 for exchanging data with other devices via one or more communication links 722. One or more communication buses 724 couple the above-described components together.
Communication link 722 may be implemented in any manner, such as over a local area network, a wide area network (e.g., the Internet), a point-to-point connection, etc., or any combination thereof. Communication link 722 may include any combination of hardwired links, wireless links, routers, gateway functions, name servers, etc., governed by any protocol or combination of protocols.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to some embodiments of the specification. It will be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, executed via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process, such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computer device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of computer-readable media such as volatile memory, random access memory (RAM), and/or non-volatile memory, e.g., read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium which can be used to store information that can be accessed by a computer device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
The embodiments of this specification may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The described embodiments may also be practiced in distributed computing environments where tasks are performed by remote processors that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment. In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of an embodiment of the specification. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A method for predicting liquidity of a transaction object, comprising:
acquiring data to be processed of a target transaction object;
respectively inputting the data to be processed into each liquidity prediction model of a liquidity prediction model set to obtain a plurality of liquidity prediction sub-results; the liquidity prediction model set being generated by pre-training a Bayesian neural network model;
and performing a weighted average of the plurality of liquidity prediction sub-results according to the prediction weight of each liquidity prediction model to obtain a liquidity prediction result of the target transaction object.
2. The method for predicting the liquidity of a transaction object according to claim 1, wherein the pre-training step of the liquidity prediction model set comprises:
acquiring a data set of a target transaction object; the data set being constructed on the basis of transaction data and external associated data within a designated historical period of the target transaction object;
calculating a posterior distribution by using a training set in the data set; the posterior distribution being used for representing a Bayesian neural network model;
extracting a parameter vector from the posterior distribution, and generating a liquidity prediction model according to the parameter vector;
testing whether the liquidity prediction model satisfies a preset condition by using a test set in the data set;
and when the liquidity prediction model does not satisfy the preset condition, updating the parameter vector of the posterior distribution according to a preset Bayesian inference method and iterating the calculation until the currently generated liquidity prediction model satisfies the preset condition.
3. The method of predicting transaction object liquidity of claim 2, wherein the Bayesian inference method comprises a function-space Stein variational gradient descent (f-SVGD) algorithm.
4. The method of predicting transaction object liquidity of claim 3, wherein the function-space Stein variational gradient descent algorithm comprises:
θ^(s+1) = θ^(s) + δ · [∂f(X; θ^(s)) / ∂θ^(s)]^T · v(f(X; θ^(s)))
wherein θ^(s+1) denotes the parameter after the (s+1)-th sampling iteration; θ^(s) denotes the parameter after the s-th sampling iteration; s denotes the index in the sampling sequence; X denotes a training subset of the data; δ denotes the step length; f(X; θ^(s)) denotes the output function of the Bayesian neural network; ∂f(X; θ^(s))/∂θ^(s) denotes the partial derivative of the Bayesian neural network output function evaluated at the extracted parameter; T denotes matrix transposition; and v(f(X; θ^(s))) denotes the SVGD update of the Bayesian neural network output function.
5. The method for predicting transaction object liquidity of claim 1, wherein the obtaining the data to be processed of the target transaction object comprises:
acquiring transaction data to be processed of a target transaction object and external associated data;
and extracting feature data matched with an input layer of the liquidity prediction model set from the transaction data and the external associated data to serve as the data to be processed of the target transaction object.
6. The method of predicting transaction object liquidity of claim 1, wherein the bayesian neural network model is a bayesian deep learning model.
7. The transaction object liquidity prediction method of claim 1, wherein the target transaction object comprises a fixed-term money-class financing transaction object.
8. A transaction object liquidity prediction apparatus, comprising:
a data acquisition module, configured to acquire data to be processed of a target transaction object;
a result obtaining module, configured to input the data to be processed into each liquidity prediction model of a liquidity prediction model set, respectively, to obtain a plurality of liquidity prediction sub-results; the liquidity prediction model set being generated by a pre-trained Bayesian neural network model represented by a posterior distribution;
and a result fusion module, configured to perform a weighted average of the plurality of liquidity prediction sub-results according to the prediction weight of each liquidity prediction model to obtain a liquidity prediction result of the target transaction object.
9. A computer device comprising a memory, a processor, and a computer program stored on the memory, wherein the computer program, when executed by the processor, performs the instructions of the method of any one of claims 1-7.
10. A computer storage medium on which a computer program is stored, characterized in that the computer program, when being executed by a processor of a computer device, executes instructions of a method according to any one of claims 1-7.
CN202011531569.2A 2020-12-22 2020-12-22 Transaction object liquidity prediction method, device, equipment and storage medium Pending CN112508304A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011531569.2A CN112508304A (en) 2020-12-22 2020-12-22 Transaction object liquidity prediction method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN112508304A true CN112508304A (en) 2021-03-16

Family

ID=74922010

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011531569.2A Pending CN112508304A (en) 2020-12-22 2020-12-22 Transaction object liquidity prediction method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112508304A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1107157A2 (en) * 1999-12-01 2001-06-13 International Business Machines Corporation System and method for performing predictive analysis
CN111178639A (en) * 2019-12-31 2020-05-19 北京明略软件系统有限公司 Method and device for realizing prediction based on multi-model fusion
CN111199472A (en) * 2019-12-12 2020-05-26 上海淇玥信息技术有限公司 Method and device for predicting liquidity of financial resources and electronic equipment
CN111861544A (en) * 2020-06-19 2020-10-30 银清科技有限公司 Participant account liquidity prediction method and device
CN112017061A (en) * 2020-07-15 2020-12-01 北京淇瑀信息科技有限公司 Financial risk prediction method and device based on Bayesian deep learning and electronic equipment


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113705786A (en) * 2021-08-26 2021-11-26 阿里巴巴(中国)有限公司 Model-based data processing method and device and storage medium
CN113705786B (en) * 2021-08-26 2024-06-04 阿里巴巴(中国)有限公司 Model-based data processing method, device and storage medium

Similar Documents

Publication Publication Date Title
Karasu et al. Crude oil time series prediction model based on LSTM network with chaotic Henry gas solubility optimization
CN109461001B (en) Method and device for obtaining training sample of first model based on second model
US11663486B2 (en) Intelligent learning system with noisy label data
CN110097088A (en) A kind of dynamic multi-objective evolvement method based on transfer learning Yu particular point strategy
CN115661500B (en) Target detection method based on second-order distribution and uncertainty perception clustering fusion
Lytvynenko et al. Bayesian Networks' Development Based on Noisy-MAX Nodes for Modeling Investment Processes in Transport.
CN114219360A (en) Monitoring safety prediction method and system based on model optimization
Chu et al. Feature selection using approximated high-order interaction components of the Shapley value for boosted tree classifier
Burtini et al. Improving online marketing experiments with drifting multi-armed bandits
US20190139144A1 (en) System, method and computer-accessible medium for efficient simulation of financial stress testing scenarios with suppes-bayes causal networks
CN114565021A (en) Financial asset pricing method, system and storage medium based on quantum circulation neural network
CN112508304A (en) Transaction object liquidity prediction method, device, equipment and storage medium
CN111445025A (en) Method and device for determining hyper-parameters of business model
CN115358330A (en) Client user loss prediction method, device, equipment and storage medium
da Silva et al. Prior specification via prior predictive matching: Poisson matrix factorization and beyond
Zhou Explainable AI in request-for-quote
Sakieh et al. Rules versus layers: which side wins the battle of model calibration?
Jabbari et al. Obtaining accurate probabilistic causal inference by post-processing calibration
Bonet et al. Factored probabilistic belief tracking
Khairuddin et al. Parameter optimization of gradient tree boosting using dragonfly algorithm in crime forecasting and analysis
Al Ali et al. Enhancing financial distress prediction through integrated Chinese Whisper clustering and federated learning
Sukestiyarno et al. Algorithm Optimizer in GA-LSTM for Stock Price Forecasting
Sun et al. PAC-Bayesian offline Meta-reinforcement learning
BOLU et al. Comparing Regression Models: Balancing Accuracy with Computational Efficiency
BOLU et al. Accuracy vs. Efficiency: A Comparative Analysis of Regression Machine Learning Algorithms

Legal Events

Date Code Title Description
20210316 PB01 Publication (application publication date: 20210316)
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication