CN108416460A - Cyanobacterial bloom prediction method based on a multi-factor timing-random deep belief network model - Google Patents

Cyanobacterial bloom prediction method based on a multi-factor timing-random deep belief network model

Info

Publication number
CN108416460A
CN108416460A (application CN201810059108.6A)
Authority
CN
China
Prior art keywords
moment
indicate
hidden layer
random
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810059108.6A
Other languages
Chinese (zh)
Other versions
CN108416460B (en)
Inventor
王立
王小艺
许继平
于家斌
张天瑞
张慧妍
赵峙尧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Technology and Business University
Original Assignee
Beijing Technology and Business University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Technology and Business University filed Critical Beijing Technology and Business University
Publication of CN108416460A publication Critical patent/CN108416460A/en
Application granted granted Critical
Publication of CN108416460B publication Critical patent/CN108416460B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • Computational Linguistics (AREA)
  • Development Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Game Theory and Decision Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a cyanobacterial bloom prediction method based on a multi-factor timing-random deep belief network model, belonging to the technical field of water environment prediction. The invention combines an improved deep belief network method with a multi-factor time series analysis method to build a multi-factor timing-random deep belief network (MT-RDBN) model, then de-discretizes the MT-RDBN model, applies the RCRBM learning algorithms, and fine-tunes the MT-RDBN model parameters. When building the model, the invention uses an autoregression model and a multi-factor regression model on the time series and takes the influence factors into account, so the MT-RDBN model can predict the characterization factor at the future moment from the chlorophyll concentration and influence-factor data at the current and historical moments, reducing the amount of samples used and improving the prediction accuracy.

Description

Cyanobacterial bloom prediction method based on a multi-factor timing-random deep belief network model
Technical field
The present invention relates to a water bloom prediction method and belongs to the technical field of water environment prediction. Specifically, it is a cyanobacterial bloom prediction method that improves prediction accuracy by means of a multi-factor timing-random deep belief network model established after analysing the water bloom formation process.
Background technology
With social and economic development, water body eutrophication has become more common and has seriously affected people's normal life. Water body eutrophication occurs in fresh water and is a natural phenomenon in which excessively high nitrogen, phosphorus and potassium content in the water causes a sudden hyper-proliferation of algae. Its root cause is mainly that elements such as nitrogen, phosphorus and potassium are discharged into surface water bodies with slow flow velocity and long renewal cycles, making aquatic organisms such as algae grow and reproduce in large quantities; organic matter is then generated far faster than it is consumed and accumulates in the water body, destroying the aquatic ecological balance. When eutrophication occurs in a water body, planktonic algae proliferate massively and form a water bloom. Predicting and simulating water blooms in order to prevent their occurrence effectively is therefore of great significance.
At this stage, cyanobacterial bloom prediction mostly uses the chlorophyll concentration as the characterization factor of the model. Existing cyanobacterial bloom prediction methods mainly either build data-driven models, such as neural networks, linear regression and support vector machines, that predict the characterization factor at the next few moments only from its historical data, or establish mathematical models of the influence factors and the characterization factor according to the outbreak mechanism of cyanobacterial blooms, such as ecological dynamics methods. Although single-factor data-driven models do not consider the outbreak mechanism of cyanobacterial blooms, they also do not consider the interactions between the influence factors and the characterization factor. Multi-factor mathematical models can reflect the outbreak mechanism of water blooms and establish the mathematical relationship between the influence factors and the characterization factor, but such models are difficult to build, and the outbreak mechanism of water blooms is highly nonlinear and uncertain, so mathematical models alone also struggle to predict the occurrence of water blooms well.
At present, data-driven artificial neural network methods such as the BP neural network are widely used in water bloom prediction, but the BP neural network is prone to problems such as long convergence time and local minima. Mechanism-driven ecological dynamics models are also widely used in water bloom research at this stage, but they do not account for the influence that the variation of the characterization factor and the influence factors over time has on cyanobacterial bloom outbreaks.
Therefore, given the advantages and disadvantages of the above data-driven and mechanism-driven models, how to introduce a time variable between the characterization factor and the influence factors of cyanobacterial blooms, and how to find an intelligent method for optimising the parameters, are urgent problems to be solved in the field of cyanobacterial bloom research.
Summary of the invention
The purpose of the present invention is to solve the problems that the accuracy of existing water bloom prediction is not high and that the forecasting of a highly nonlinear system cannot be handled by a single factor or a single mathematical model alone. The improved deep belief network method is combined with a multi-factor time series analysis method to build a multi-factor timing-random deep belief network model; this method can improve the prediction accuracy of water blooms and provides a new approach for cyanobacterial bloom prediction.
The cyanobacterial bloom prediction method based on the multi-factor timing-random deep belief network model provided by the invention mainly includes the following four steps:
Step 1: Establish the MT-RDBN model
A time series model arranges the values of a statistical indicator into a dynamic series in chronological order. Water bloom prediction is a dynamic time-series problem, and chlorophyll is the most direct index of the cyanobacteria standing crop in a water body, so chlorophyll is taken as the characterization factor reflecting the cyanobacterial bloom phenomenon. The outbreak of a cyanobacterial bloom is related not only to the characterization factor but also to other influence factors, so the factors pH value, ammonia nitrogen and water temperature are taken as the influence factors reflecting the water bloom phenomenon. Both the characterization factor and the influence factors change over time. The timing relationship between the characterization factor at the future moment and the characterization factor and influence factors at the current and historical moments is therefore established, yielding the multi-factor timing-random deep belief network model, abbreviated as the MT-RDBN model. The MT-RDBN model consists of several random conditional restricted Boltzmann machines (RCRBM) and a BP neural network: the RCRBMs are responsible for the pre-training of the MT-RDBN model, and the BP neural network is responsible for fine-tuning the MT-RDBN model parameters.
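To make this timing structure concrete, the following minimal Python sketch (not part of the patent text; the function and variable names are illustrative assumptions) assembles a single model input from the chlorophyll autoregressive history and the influence-factor histories at the same moments.

```python
import numpy as np

def build_input_vector(chl, factors, t, p):
    """Stack the chlorophyll concentration at moments t, t-1, ..., t-p with the
    influence factors (pH value, ammonia nitrogen, water temperature) at the
    same moments, forming the multi-factor timing input of the model.

    chl     : 1-D array of chlorophyll-a concentrations over time
    factors : 2-D array of shape (num_factors, T), one row per influence factor
    t, p    : current moment index and number of historical moments
    """
    chl_window = chl[t - p:t + 1]                    # characterization-factor history
    factor_window = factors[:, t - p:t + 1].ravel()  # influence factors at the same moments
    return np.concatenate([chl_window, factor_window])

# With p = 7 and three influence factors this gives 8 + 3 * 8 = 32 inputs,
# matching the 32 input windows used in the embodiment below.
```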
Step 2: De-discretization of the MT-RDBN model
Since the required pre-training and test data are all continuous time-series data, the traditional deep belief network (DBN), which handles binary values, needs to be improved: the sigmoid function is de-discretized and a Gaussian random sequence is added, and the forward and backward computation is then carried out with the contrastive divergence algorithm, so that continuous time-series data can be handled better.
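A minimal sketch of this de-discretization idea, assuming a standard one-step Gibbs-style pass; the exact placement of the N(0,1) term follows the verbal description above, since the patent's formulas (1) to (3) are not reproduced here.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_continuous(v0, W, a, b, rng):
    """One forward/backward pass of contrastive divergence for continuous data:
    the sigmoid activation is kept as a real value (no binary sampling) and a
    Gaussian random term with mean 0 and variance 1 is added."""
    h0 = sigmoid(v0 @ W + b) + rng.standard_normal(b.shape)    # hidden layer from the data
    v1 = sigmoid(h0 @ W.T + a) + rng.standard_normal(a.shape)  # reconstructed input layer
    h1 = sigmoid(v1 @ W + b) + rng.standard_normal(b.shape)    # hidden layer from the reconstruction
    return h0, v1, h1
```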
Step 3: The RCRBM learning algorithms
(1) RCRBM weight learning algorithm
The RCRBM weight learning algorithm still uses the contrastive divergence algorithm, but a Bernoulli random term is added after the weight update, so that in each iteration the RCRBM model is only partially connected between the current-moment input-layer and current-moment hidden-layer neurons, and the connections differ from iteration to iteration.
(2) RCRBM input-layer bias algorithm
When updating the input-layer bias, the RCRBM considers not only the bias difference produced by the contrastive divergence algorithm but also the bias changes produced by the connections of the input-layer chlorophyll-concentration historical moments and the input-layer influence-factor current and historical moments to the input-layer chlorophyll-concentration current moment.
(3) RCRBM hidden-layer bias algorithm
When updating the hidden-layer bias, the RCRBM considers not only the bias difference produced by the contrastive divergence algorithm but also the bias changes produced by the connections of the input-layer chlorophyll-concentration historical moments and the input-layer influence-factor current and historical moments to the hidden-layer current moment. During model training, a Bernoulli random term is added before the bias update to close an appropriate fraction of the hidden-layer neurons, so that the neurons of the input-layer chlorophyll-concentration historical moments and the input-layer influence-factor current and historical moments connect only to the hidden-layer current-moment neurons that remain open; the RCRBM model therefore uses a different connection pattern in each iteration.
Step 4: Fine-tune the MT-RDBN model parameters
Since the MT-RDBN adds, on the basis of the traditional DBN, the connections of the chlorophyll-concentration historical moments and of the influence-factor current and historical moments, the parameters produced by all connections in the multi-factor timing-random deep belief network model need to be fine-tuned globally when the BP neural network back-propagation algorithm is used.
The advantages of the invention are:
1. When building the model, the invention uses an autoregression model and a multi-factor regression model on the time series and takes the influence factors into account. The MT-RDBN model can therefore predict the characterization factor at the future moment from the chlorophyll concentration and influence-factor data at the current and historical moments, which improves the prediction accuracy and reduces the amount of samples used.
2. The invention de-discretizes the sigmoid function and adds a Gaussian random distribution, then carries out the forward and backward computation with the contrastive divergence algorithm, so that the MT-RDBN model can handle continuous data better.
3. The invention improves the weight update algorithm of the traditional RBM by adding a Bernoulli random term after the weight update to remove part of the connections between the input-layer current-moment and hidden-layer current-moment neurons. This slightly shortens the running time of the model and appropriately improves its generalization ability.
4. The invention improves the input-layer bias update algorithm of the traditional RBM by adding the connections between the input-layer chlorophyll-concentration historical moments and input-layer influence-factor current and historical moments and the input-layer characterization-factor current moment, which makes timing problems easier to handle and improves the prediction accuracy.
5. The invention improves the hidden-layer bias update algorithm of the traditional RBM by adding the connections between the input-layer chlorophyll-concentration historical moments and input-layer influence-factor current and historical moments and the hidden-layer current moment, which makes timing problems easier to handle and improves the prediction accuracy. At the same time, part of the hidden-layer neurons are closed at random before the hidden-layer bias update, so that the neurons of the input-layer chlorophyll-concentration historical moments and input-layer influence-factor current and historical moments connect randomly to the hidden-layer current-moment neurons, which slightly shortens the running time of the MT-RDBN model and appropriately improves its generalization ability.
6. The invention uses the back-propagation algorithm of the BP neural network to fine-tune the parameters of the multivariate timing deep belief network. The multi-factor timing-random deep belief network adds four kinds of connections on the basis of the traditional DBN, so the parameters produced by all connections need to be fine-tuned during the reverse fine-tuning. Because the chlorophyll-concentration historical moments and the influence-factor current and historical moments are added, the parameters can be optimised better during the reverse fine-tuning, improving the prediction accuracy.
Description of the drawings
Fig. 1 is the flow chart of the cyanobacterial bloom prediction method based on the multi-factor timing-random deep belief network of the present invention.
Fig. 2 is the structure diagram of the lake and reservoir algal bloom prediction method based on the multi-factor timing-random deep belief network model of the present invention (there are two RCRBM layers; the two layers have the same structure, so the second RCRBM is not drawn in detail).
Fig. 3 is the change curve of the chlorophyll concentration of the selected training samples after normalization.
Fig. 4, Fig. 5 and Fig. 6 are the change curves of the influence factors pH value, ammonia nitrogen concentration and water temperature of the selected training samples after normalization.
Fig. 7 shows the predicted and actual change curves of the chlorophyll concentration at moment (t+1) predicted by the multi-factor timing-random deep belief network model.
Detailed description of the embodiments
The present invention is described in further detail below in conjunction with the drawings and Embodiment 1.
The present invention is a cyanobacterial bloom prediction method based on a multi-factor timing-random deep belief network model. The flow chart of the specific implementation is shown in Fig. 1, and the method is achieved through the following steps:
Step 1: Establish the MT-RDBN model
The relationship between the input data and the output data is shown in Fig. 2. In Fig. 2, v_t denotes the input-layer chlorophyll concentration matrix at moment t, v_{t-p} the input-layer chlorophyll concentration matrix at moment t-p, and a further matrix denotes the concentration matrix of the j-th input-layer influence factor at moment t-p; h_t denotes the hidden-layer matrix at moment t. The input layer receives the chlorophyll concentration and influence-factor values at the historical moments and the current moment, and an autoregression model and a multi-factor regression model are established between the input data at the input layer. The hidden layer extracts the features of the input-layer data, and the output layer represents the future chlorophyll concentration, finally building the MT-RDBN model.
Step 2: De-discretization of the MT-RDBN model
In order to enable the MT-RDBN model to handle continuous data better, before the contrastive divergence algorithm computes the values of the hidden layer and the input layer, the sigmoid function is de-discretized and a Gaussian random sequence with mean 0 and variance 1 is added. The specific formulas are as follows:
where W_t denotes the weight matrix at moment t, a_t the input-layer bias matrix at moment t, b_t the hidden-layer bias matrix at moment t, and N(0,1) a Gaussian random distribution with mean 0 and variance 1; the remaining symbol denotes a sigmoid function, whose expression is:
Step 3: The RCRBM learning algorithms
(1) RCRBM weight learning algorithm
The RCRBM still uses the contrastive divergence algorithm to compute the weight change, and the weights are updated after each model iteration. The specific update formula is as follows:
where the two matrices denote the weight matrix at moment t before and after the update, respectively, and ΔW_t denotes the weight change obtained with the contrastive divergence algorithm, with the specific formula:
ΔW_t = η(⟨v_t h_t⟩₀ − ⟨v_t h_t⟩₁)  (5)
where ⟨·⟩₀ denotes the mathematical expectation over the data set, ⟨·⟩₁ the mathematical expectation over the values reconstructed by one step of the contrastive divergence algorithm, and η the learning rate of the RCRBM.
After the weights are updated, a Bernoulli random term is added to randomly remove an appropriate fraction of the neuron connections between the input-layer current moment and the hidden-layer current moment, so that the connection pattern of these neurons differs in each iteration. The specific formula is as follows:
where r denotes the probability that a connection is preserved after the Bernoulli random term is added, and the operator denotes element-wise multiplication of a Bernoulli random matrix generated with probability r (on its left) with the matrix on its right.
Since each iteration randomly generates a "thin" RCRBM to iterate, after the iterations the updated weight matrix W_t at moment t is restored to obtain the final weight matrix, with the specific formula:
where the resulting matrix denotes the final weight matrix after the iterations.
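As an illustration of Eqs. (4) to (7), the sketch below applies the contrastive-divergence weight increment and then a Bernoulli random term that keeps each current-moment connection with probability r; the rescaling used to restore the final weight matrix is an assumption, since formula (7) is not reproduced here.

```python
import numpy as np

def rcrbm_weight_step(W, v0, h0, v1, h1, eta, r, rng):
    """One RCRBM weight step: contrastive-divergence increment (Eqs. (4)-(5)),
    then a Bernoulli random term that keeps each current-moment connection with
    probability r, yielding the randomly "thinned" weights used in this
    iteration (Eq. (6)). The full matrix W is kept for the next iteration."""
    dW = eta * (np.outer(v0, h0) - np.outer(v1, h1))  # eta * (<v_t h_t>_0 - <v_t h_t>_1)
    W = W + dW                                        # weight update, Eq. (4)
    mask = rng.binomial(1, r, size=W.shape)           # Bernoulli random matrix with probability r
    W_thin = W * mask                                 # element-wise product, Eq. (6)
    return W, W_thin

def restore_weights(W, r):
    """Final weight matrix after training (cf. Eq. (7)); scaling the full
    weights by the keep probability r, as in dropout, is an assumption."""
    return r * W
```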
(2) RCRBM input-layer bias learning algorithm
The RCRBM input-layer bias learning algorithm adds, on the basis of the traditional RBM algorithm, the connections of the chlorophyll-concentration historical moments to the chlorophyll-concentration current moment and of the influence-factor current and historical moments to the chlorophyll-concentration current moment. The update formula of the RCRBM input-layer bias algorithm is therefore:
where the first matrix denotes the input-layer bias matrix before the update, ΔA_{t-p} denotes the weight change produced by the connection between the input-layer chlorophyll concentration at moment t-p and the input-layer chlorophyll concentration at moment t, a further term denotes the weight change produced by the connection between the j-th input-layer influence factor at moment t-p and the input-layer chlorophyll concentration at moment t, and ΔA'_t denotes the bias difference that the input-layer chlorophyll concentration at moment t produces for itself after the contrastive divergence algorithm. The formulas for ΔA_{t-p}, ΔA'_t and the influence-factor term are as follows:
ΔA_{t-p} = v_{t-p}(⟨v_t⟩₀ − ⟨v_t⟩₁)  (9)
ΔA'_t = ⟨v_t⟩₀ − ⟨v_t⟩₁  (10)
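A rough sketch of this input-layer bias update: the contrastive-divergence bias difference of Eq. (10) plus one contribution per historical chlorophyll or influence-factor connection along the lines of Eq. (9). How each historical input vector couples to the bias is an assumption (a mean coupling is used here), since the full update formula (8) is not reproduced.

```python
import numpy as np

def rcrbm_input_bias_update(a, hist_inputs, v0_mean, v1_mean):
    """Input-layer bias update: the CD bias difference <v_t>_0 - <v_t>_1 plus
    one term per connection from a historical chlorophyll / influence-factor
    input.

    a           : current input-layer bias vector a_t
    hist_inputs : list of the chlorophyll inputs at moments t-1, ..., t-p and
                  the influence-factor inputs at moments t, ..., t-p
    v0_mean, v1_mean : data and reconstruction expectations of the input layer
    """
    diff = v0_mean - v1_mean                      # Eq. (10)
    delta = diff.copy()
    for v in hist_inputs:                         # historical / factor terms, cf. Eq. (9)
        delta = delta + float(np.mean(v)) * diff
    return a + delta
```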
(3) RCRBM hidden-layer bias learning algorithm
The RCRBM hidden-layer bias learning algorithm is divided into two steps.
The first step adds, on the basis of the traditional RBM hidden-layer learning algorithm, the connections of the input-layer chlorophyll-concentration historical moments to the hidden-layer current moment and of the input-layer influence-factor current and historical moments to the hidden-layer current moment. The update formula of the improved RBM hidden-layer bias algorithm is therefore:
where the two matrices denote the hidden-layer bias before and after the update, respectively, ΔC_{t-p} denotes the weight change produced by the connection between the input-layer chlorophyll concentration at moment t-p and the hidden layer at moment t, a further term denotes the weight change produced by the connection between the j-th input-layer influence factor at moment t-p and the hidden layer at moment t, and ΔC'_t denotes the bias difference that the hidden layer produces for itself after the contrastive divergence algorithm. The formulas for ΔC_{t-p}, ΔC'_t and the influence-factor term are as follows:
ΔC_{t-p} = v_{t-p}(⟨h_t⟩₀ − ⟨h_t⟩₁)  (13)
ΔC'_t = ⟨h_t⟩₀ − ⟨h_t⟩₁  (14)
In the second step, when the hidden-layer bias is updated, a Bernoulli random term is added to randomly close part of the hidden-layer neurons. In each iteration the hidden-layer bias is therefore updated in a different way, because the random closing of part of the hidden-layer neurons makes the connections between the neurons of the input-layer chlorophyll-concentration historical moments and influence-factor current and historical moments and the hidden-layer current-moment neurons different. The specific formulas are as follows:
where r₁ denotes the probability that a hidden-layer neuron is open, ΔC'_{t-p} denotes the weight change produced, after the Bernoulli random term is added, by the connection between the input-layer chlorophyll concentration at moment t-p and the hidden layer at moment t, the doubly primed term denotes the bias difference that the hidden layer produces for itself after the Bernoulli random term is added and the contrastive divergence algorithm is applied, and a further term denotes the weight change produced, after the Bernoulli random term is added, by the connection between the j-th input-layer influence factor at moment t-p and the hidden layer at moment t.
The final update formula for the RCRBM hidden-layer bias is therefore:
where the indicated matrix denotes the hidden-layer bias matrix before the update after the Bernoulli random term has been added.
Since each iteration randomly generates a "thin" RCRBM to iterate, after the iterations the finally updated hidden-layer bias b_t is restored, with the specific formula:
where the resulting matrix denotes the final hidden-layer bias matrix after the iterations.
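A rough sketch of this two-step hidden-layer bias update (cf. Eqs. (12) to (20)): the contrastive-divergence bias difference plus the historical and influence-factor contributions, gated by a Bernoulli term that keeps each hidden neuron open with probability r1. As above, the scalar coupling of each historical input and the final restoration step are assumptions.

```python
import numpy as np

def rcrbm_hidden_bias_update(b, hist_inputs, h0_mean, h1_mean, r1, rng):
    """Hidden-layer bias update: CD bias difference <h_t>_0 - <h_t>_1 plus the
    historical chlorophyll / influence-factor contributions (cf. Eqs. (13)-(14)),
    with a Bernoulli gate that randomly closes part of the hidden neurons so the
    connection pattern differs in every iteration."""
    diff = h0_mean - h1_mean
    delta = diff.copy()
    for v in hist_inputs:                         # chlorophyll t-1..t-p and factor inputs t..t-p
        delta = delta + float(np.mean(v)) * diff
    gate = rng.binomial(1, r1, size=b.shape)      # hidden neuron stays open with probability r1
    return b + gate * delta

def restore_hidden_bias(b, r1):
    """Restore the final hidden-layer bias after the "thinned" iterations
    (cf. Eq. (20)); scaling by the open probability r1 is an assumption."""
    return r1 * b
```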
Step 4: Fine-tune the MT-RDBN model parameters
Compared with the traditional DBN, the MT-RDBN model adds the inputs of the characterization-factor historical moments, the influence-factor current moment and the influence-factor historical moments, so when the BP neural network performs the reverse fine-tuning it fine-tunes the parameters of the entire MT-RDBN model. Taking the input-layer chlorophyll concentration at moment t-p and the first hidden layer at moment t as an example, the change formula of the weight between them is as follows:
where the indicated matrix denotes the weight change matrix between the input-layer chlorophyll concentration at moment t-p and the first hidden layer at moment t, ε₁ denotes the learning rate of the BP neural network, and e_t denotes the error matrix of the hidden layer at moment t. The weight changes between the first hidden layer at moment t and the input-layer influence factors have the same form as above and are not described further.
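A minimal sketch of this fine-tuning weight change; the outer-product gradient form is an assumption standing in for formula (21), which is not reproduced here.

```python
import numpy as np

def finetune_weight_delta(v_tp, e_t, eps1):
    """BP fine-tuning step for one of the added connections: learning rate eps1
    times the outer product of the input vector at moment t-p and the
    hidden-layer error matrix e_t at moment t."""
    return eps1 * np.outer(v_tp, e_t)
```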
Embodiment 1:
The method of the present invention is used to carry out cyanobacterial bloom prediction with chlorophyll-a concentration and influence factors pH value, ammonia nitrogen and water temperature data from the Taihu Lake basin in Jiangsu Province. Taking the observation data of Taihu Lake in 2008 as an example, after data screening and normalization, 766 chlorophyll concentration data samples and three influence-factor samples from July to September were selected, as shown in Fig. 3 to Fig. 6, where each influence factor consists of 764 samples; the data were divided into two groups. The first group consists of 508 chlorophyll concentration samples and the influence factors pH value, ammonia nitrogen and water temperature, where each influence factor consists of 507 samples. The second group consists of 258 chlorophyll concentration samples and the influence factors pH value, ammonia nitrogen and water temperature, where each influence factor consists of 257 samples. The first group is used as the training samples and the second group as the test samples.
Step 1: Establish the MT-RDBN model
The MT-RDBN algal bloom prediction model of the chlorophyll concentration data and the influence factors is established according to the structure of Fig. 1. The data in the selected training samples are formed into windows that move forward sequentially in time order and are divided into 33 windows, each containing 500 time-series data points; the last chlorophyll concentration data window serves as the output data of the training samples, and the remaining 32 windows serve as the input data. Similarly, the data in the test samples are divided into 33 moving windows of 250 time-series data points each for test verification.
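The window construction described above can be sketched as follows (an illustration consistent with the 8-moment lag structure used in Step 4 of this embodiment; the exact offsets are assumptions):

```python
import numpy as np

def make_windows(chl, factors, window_len=500, lags=8):
    """Form forward-moving windows from the training series: `lags` lagged
    chlorophyll windows plus `lags` windows per influence factor give the 32
    input windows, and the chlorophyll window shifted one step ahead is the
    output window, 33 windows in total."""
    inputs = []
    for lag in range(lags):                      # chlorophyll at moments t, t-1, ..., t-(lags-1)
        start = lags - 1 - lag
        inputs.append(chl[start:start + window_len])
    for f in factors:                            # pH value, ammonia nitrogen, water temperature
        for lag in range(lags):
            start = lags - 1 - lag
            inputs.append(f[start:start + window_len])
    output = chl[lags:lags + window_len]         # chlorophyll one step ahead of the newest window
    return np.stack(inputs), output
```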
Step 2: De-discretization of the MT-RDBN model
After the sigmoid function is de-discretized and the Gaussian random sequence is added according to formulas (1), (2) and (3), forward and backward computation is carried out according to the contrastive divergence algorithm, yielding the input-layer reconstruction value matrix and the hidden-layer actual-value and reconstruction-value matrices.
Step 3: The RCRBM learning algorithms
(1) RCRBM weight learning algorithm
Let r = 0.6. From formulas (4), (5) and (6), the weight update formula is:
After the set number of iterations is reached, the final weight value is restored; from formula (7), the final weight is:
(2) RCRBM input-layer bias learning algorithm
From Step 1, n = 7 and m = 3; from formulas (8), (9), (10) and (11), the input-layer bias update formula is:
(3) RCRBM hidden-layer bias learning algorithm
From Step 1, n = 7, m = 3 and r₁ = 0.8; from formulas (12) to (19), the hidden-layer bias update formula is:
After the set number of iterations is reached, the final hidden-layer bias is restored; from formula (20), the final hidden-layer bias is:
Step 4: Fine-tune the MT-RDBN model parameters
From Step 1, the input layer consists of the characterization factor at 8 moments in total (the current moment and the historical moments) and the 3 influence factors at 8 moments each. Taking the input layer at moment t-1 and the hidden layer at moment t as an example, the weight change is:
The weight changes at the other moments have the same form as the above formula.
After the above four steps are completed, two RCRBM layers and a BP neural network are selected in the training stage to establish the multi-factor timing-random deep belief network model. The input layer consists of the chlorophyll concentration and the three influence factors pH value, ammonia nitrogen and water temperature at the current moment and the historical moments, 32 moments in total; the output layer consists of 1 future moment of the chlorophyll concentration, i.e. the chlorophyll concentration is predicted one step ahead. The first RCRBM layer has 100 neurons and the second layer 50 neurons; the learning rate is 0.1 and the number of iterations is 1500. In the BP neural network, the learning rate is 1 and the number of iterations is 3000. After training, the model is tested and verified with the test-set data. The comparison of the prediction results and the actual results is shown in Fig. 7; the variation trends of the prediction results and the actual values are essentially identical. The calculated root-mean-square error of the one-step-ahead prediction is 4.02%, showing that the prediction method has high accuracy, so the multi-factor timing-random deep belief network model established by the method of the present invention can effectively realise water bloom prediction.
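For reference, the training settings reported above can be collected as follows (the dictionary keys and the rmse helper are illustrative, not part of the patent text):

```python
import numpy as np

# Training configuration reported for this embodiment
config = {
    "rcrbm_layers": [100, 50],   # neurons in the first and second RCRBM layers
    "rcrbm_lr": 0.1,             # pre-training learning rate
    "rcrbm_iters": 1500,         # pre-training iterations
    "bp_lr": 1.0,                # BP fine-tuning learning rate
    "bp_iters": 3000,            # BP fine-tuning iterations
    "input_moments": 32,         # 8 chlorophyll moments + 3 factors x 8 moments
    "output_moments": 1,         # chlorophyll one step ahead
}

def rmse(pred, actual):
    """Root-mean-square error used to evaluate the one-step-ahead prediction."""
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(actual)) ** 2)))
```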

Claims (5)

1. A cyanobacterial bloom prediction method based on a multi-factor timing-random deep belief network model, characterized by comprising the following steps:
Step 1: Establish the MT-RDBN model;
Chlorophyll is taken as the characterization factor reflecting the cyanobacterial bloom phenomenon, and the factors pH value, ammonia nitrogen and water temperature are taken as the influence factors reflecting the water bloom phenomenon; the timing relationship between the characterization factor at the future moment and the characterization factor and influence factors at the current and historical moments is established; the input layer receives the chlorophyll concentration and influence-factor values at the historical moments and the current moment, and an autoregression model and a multi-factor regression model are established between the input data at the input layer; the hidden layer extracts the features of the input-layer data, and the output layer represents the future chlorophyll concentration, finally establishing the multi-factor timing-random deep belief network model, abbreviated as the MT-RDBN model;
Step 2: De-discretization of the MT-RDBN model;
The sigmoid function is de-discretized and a Gaussian random sequence is added, and forward and backward computation is then carried out with the contrastive divergence algorithm, so that continuous time-series data can be handled better;
Step 3: The RCRBM learning algorithms, including the RCRBM weight learning algorithm, the RCRBM input-layer bias algorithm and the RCRBM hidden-layer bias algorithm;
Step 4: Fine-tune the MT-RDBN model parameters;
When the BP neural network back-propagation algorithm is used, the parameters produced by all connections in the multi-factor timing-random deep belief network model are fine-tuned globally.
2. The cyanobacterial bloom prediction method based on a multi-factor timing-random deep belief network model according to claim 1, characterized in that Step 2 is specifically:
Before the contrastive divergence algorithm computes the values of the hidden layer and the input layer, the sigmoid function is de-discretized and a Gaussian random sequence with mean 0 and variance 1 is added, with the specific formulas:
where W_t denotes the weight matrix at moment t, a_t the input-layer bias matrix at moment t, b_t the hidden-layer bias matrix at moment t, and N(0,1) a Gaussian random distribution with mean 0 and variance 1; the remaining symbol denotes a sigmoid function, whose expression is:
3. The cyanobacterial bloom prediction method based on a multi-factor timing-random deep belief network model according to claim 1, characterized in that the RCRBM weight learning algorithm in Step 3 is specifically:
The RCRBM uses the contrastive divergence algorithm to compute the weight change, and the weights are updated after each model iteration, with the specific update formula:
where the two matrices denote the weight matrix at moment t before and after the update, respectively; ΔW_t denotes the weight change obtained with the contrastive divergence algorithm, with the specific formula:
ΔW_t = η(⟨v_t h_t⟩₀ − ⟨v_t h_t⟩₁)  (5)
where ⟨·⟩₀ denotes the mathematical expectation over the data set, ⟨·⟩₁ the mathematical expectation over the values reconstructed by one step of the contrastive divergence algorithm, and η the learning rate of the RCRBM;
After the weights are updated, a Bernoulli random term is added, so that in each iteration the connection pattern of the neurons between the input-layer current moment and the hidden-layer current moment differs, with the specific formula:
where r denotes the probability that a connection is preserved after the Bernoulli random term is added, and the operator denotes element-wise multiplication of a Bernoulli random matrix generated with probability r (on its left) with the matrix on its right;
After the iterations, the updated weight matrix W_t at moment t is restored to obtain the final weight matrix, with the specific formula:
where the resulting matrix denotes the final weight matrix after the iterations.
4. The cyanobacterial bloom prediction method based on a multi-factor timing-random deep belief network model according to claim 1, characterized in that the RCRBM input-layer bias learning algorithm in Step 3 is specifically:
The update formula of the RCRBM input-layer bias algorithm is:
where the first matrix denotes the input-layer bias matrix before the update, ΔA_{t-p} denotes the weight change produced by the connection between the input-layer chlorophyll concentration at moment t-p and the input-layer chlorophyll concentration at moment t, a further term denotes the weight change produced by the connection between the j-th input-layer influence factor at moment t-p and the input-layer chlorophyll concentration at moment t, and ΔA'_t denotes the bias difference that the input-layer chlorophyll concentration at moment t produces for itself after the contrastive divergence algorithm, where the formulas for ΔA_{t-p}, ΔA'_t and the influence-factor term are as follows:
ΔA_{t-p} = v_{t-p}(⟨v_t⟩₀ − ⟨v_t⟩₁)  (9)
ΔA'_t = ⟨v_t⟩₀ − ⟨v_t⟩₁  (10)
5. The cyanobacterial bloom prediction method based on a multi-factor timing-random deep belief network model according to claim 1, characterized in that the RCRBM hidden-layer bias learning algorithm in Step 3 is specifically:
In the first step, the update formula of the RBM hidden-layer bias algorithm is:
where the two matrices denote the hidden-layer bias before and after the update, respectively, ΔC_{t-p} denotes the weight change produced by the connection between the input-layer chlorophyll concentration at moment t-p and the hidden layer at moment t, a further term denotes the weight change produced by the connection between the j-th input-layer influence factor at moment t-p and the hidden layer at moment t, and ΔC'_t denotes the bias difference that the hidden layer produces for itself after the contrastive divergence algorithm, where the formulas for ΔC_{t-p}, ΔC'_t and the influence-factor term are as follows:
ΔC_{t-p} = v_{t-p}(⟨h_t⟩₀ − ⟨h_t⟩₁)  (13)
ΔC'_t = ⟨h_t⟩₀ − ⟨h_t⟩₁  (14)
In the second step, when the hidden-layer bias is updated, a Bernoulli random term is added to randomly close part of the hidden-layer neurons; in each iteration the hidden-layer bias is therefore updated in a different way, because the random closing of part of the hidden-layer neurons makes the connections between the neurons of the input-layer chlorophyll-concentration historical moments and influence-factor current and historical moments and the hidden-layer current-moment neurons different, with the specific formulas:
where r₁ denotes the probability that a hidden-layer neuron is open, ΔC'_{t-p} denotes the weight change produced, after the Bernoulli random term is added, by the connection between the input-layer chlorophyll concentration at moment t-p and the hidden layer at moment t, the doubly primed term denotes the bias difference that the hidden layer produces for itself after the Bernoulli random term is added and the contrastive divergence algorithm is applied, and a further term denotes the weight change produced, after the Bernoulli random term is added, by the connection between the j-th input-layer influence factor at moment t-p and the hidden layer at moment t;
Therefore, the final update formula for the RCRBM hidden-layer bias is:
where the indicated matrix denotes the hidden-layer bias matrix before the update after the Bernoulli random term has been added;
After the iterations, the finally updated hidden-layer bias b_t is restored, with the specific formula:
where the resulting matrix denotes the final hidden-layer bias matrix after the iterations.
CN201810059108.6A 2018-01-19 2018-01-22 Blue algae bloom prediction method based on multi-factor time sequence-random depth confidence network model Active CN108416460B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2018100523746 2018-01-19
CN201810052374 2018-01-19

Publications (2)

Publication Number Publication Date
CN108416460A true CN108416460A (en) 2018-08-17
CN108416460B CN108416460B (en) 2022-01-28

Family

ID=63126038

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810059108.6A Active CN108416460B (en) 2018-01-19 2018-01-22 Blue algae bloom prediction method based on multi-factor time sequence-random depth confidence network model

Country Status (1)

Country Link
CN (1) CN108416460B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109308544A (en) * 2018-08-21 2019-02-05 北京师范大学 Based on to sdpecific dispersion-shot and long term memory network cyanobacterial bloom prediction technique
CN109413663A (en) * 2018-09-29 2019-03-01 联想(北京)有限公司 A kind of information processing method and equipment
CN110689179A (en) * 2019-09-18 2020-01-14 北京工商大学 Water bloom prediction method based on space-time sequence mixed model
CN112862173A (en) * 2021-01-29 2021-05-28 北京工商大学 Lake and reservoir cyanobacterial bloom prediction method based on self-organizing deep confidence echo state network
CN112884197A (en) * 2021-01-05 2021-06-01 福建省厦门环境监测中心站(九龙江流域生态环境监测中心) Water bloom prediction method and device based on double models
CN113011397A (en) * 2021-04-27 2021-06-22 北京工商大学 Multi-factor cyanobacterial bloom prediction method based on remote sensing image 4D-FractalNet
CN115330070A (en) * 2022-08-18 2022-11-11 长沙学院 Power transmission and transformation water environment index prediction method based on multi-factor coupling

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100023307A1 (en) * 2008-07-24 2010-01-28 University Of Cincinnati Methods for prognosing mechanical systems
CN102135531A (en) * 2010-12-24 2011-07-27 中国科学院南京地理与湖泊研究所 Method for forecasting blue-green algae water bloom in large-scale shallow lake within 72 hours
CN104899653A (en) * 2015-06-02 2015-09-09 北京工商大学 Lake and reservoir cyanobacterial bloom prediction method based on expert system and cyanobacterial growth mechanism timing model
CN106198909A (en) * 2016-06-30 2016-12-07 中南大学 A kind of aquaculture water quality Forecasting Methodology based on degree of depth study

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100023307A1 (en) * 2008-07-24 2010-01-28 University Of Cincinnati Methods for prognosing mechanical systems
CN102135531A (en) * 2010-12-24 2011-07-27 中国科学院南京地理与湖泊研究所 Method for forecasting blue-green algae water bloom in large-scale shallow lake within 72 hours
CN104899653A (en) * 2015-06-02 2015-09-09 北京工商大学 Lake and reservoir cyanobacterial bloom prediction method based on expert system and cyanobacterial growth mechanism timing model
CN106198909A (en) * 2016-06-30 2016-12-07 中南大学 A kind of aquaculture water quality Forecasting Methodology based on degree of depth study

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG Li et al., "Nonlinear dynamics analysis of the time-varying cyanobacteria growth system and a water bloom prediction method", CIESC Journal (化工学报) *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109308544A (en) * 2018-08-21 2019-02-05 北京师范大学 Based on to sdpecific dispersion-shot and long term memory network cyanobacterial bloom prediction technique
CN109308544B (en) * 2018-08-21 2021-07-20 北京师范大学 Blue algae bloom prediction method based on contrast divergence-long and short term memory network
CN109413663A (en) * 2018-09-29 2019-03-01 联想(北京)有限公司 A kind of information processing method and equipment
CN109413663B (en) * 2018-09-29 2021-11-16 联想(北京)有限公司 Information processing method and equipment
CN110689179A (en) * 2019-09-18 2020-01-14 北京工商大学 Water bloom prediction method based on space-time sequence mixed model
CN112884197A (en) * 2021-01-05 2021-06-01 福建省厦门环境监测中心站(九龙江流域生态环境监测中心) Water bloom prediction method and device based on double models
CN112884197B (en) * 2021-01-05 2022-08-16 福建省厦门环境监测中心站(九龙江流域生态环境监测中心) Water bloom prediction method and device based on double models
CN112862173A (en) * 2021-01-29 2021-05-28 北京工商大学 Lake and reservoir cyanobacterial bloom prediction method based on self-organizing deep confidence echo state network
CN113011397A (en) * 2021-04-27 2021-06-22 北京工商大学 Multi-factor cyanobacterial bloom prediction method based on remote sensing image 4D-FractalNet
CN113011397B (en) * 2021-04-27 2024-03-29 北京工商大学 Multi-factor cyanobacterial bloom prediction method based on remote sensing image 4D-Fractalnet
CN115330070A (en) * 2022-08-18 2022-11-11 长沙学院 Power transmission and transformation water environment index prediction method based on multi-factor coupling
CN115330070B (en) * 2022-08-18 2023-04-07 长沙学院 Power transmission and transformation water environment index prediction method based on multi-factor coupling

Also Published As

Publication number Publication date
CN108416460B (en) 2022-01-28

Similar Documents

Publication Publication Date Title
CN108416460A (en) Cyanobacterial bloom prediction technique based on the random depth confidence network model of multifactor sequential-
CN108764540B (en) Water supply network pressure prediction method based on parallel LSTM series DNN
CN104899653B (en) Lake storehouse blue-green alga bloom Forecasting Methodology based on expert system and blue algae growth mechanism temporal model
CN102693450B (en) A prediction method for crankshaft fatigue life based on genetic nerve network
CN102622515B (en) A kind of weather prediction method
CN107563567A (en) Core extreme learning machine Flood Forecasting Method based on sparse own coding
Recknagel et al. Comparative application of artificial neural networks and genetic algorithms for multivariate time-series modelling of algal blooms in freshwater lakes
CN109308544B (en) Blue algae bloom prediction method based on contrast divergence-long and short term memory network
CN109840595B (en) Knowledge tracking method based on group learning behavior characteristics
Wang et al. An approach of improved Multivariate Timing-Random Deep Belief Net modelling for algal bloom prediction
CN106022954A (en) Multiple BP neural network load prediction method based on grey correlation degree
CN103942461A (en) Water quality parameter prediction method based on online sequential extreme learning machine
CN108334943A (en) The semi-supervised soft-measuring modeling method of industrial process based on Active Learning neural network model
Ning et al. GA-BP air quality evaluation method based on fuzzy theory.
Omidi et al. Forecasting stock prices using financial data mining and Neural Network
CN107729988B (en) Blue algae bloom prediction method based on dynamic deep belief network
CN106529185A (en) Historic building displacement combined prediction method and system
CN109408896B (en) Multi-element intelligent real-time monitoring method for anaerobic sewage treatment gas production
CN108334977B (en) Deep learning-based water quality prediction method and system
CN110298506A (en) A kind of urban construction horizontal forecast system
Mi et al. Prediction of accumulated temperature in vegetation period using artificial neural network
CN109858127B (en) Blue algae bloom prediction method based on recursive time sequence deep confidence network
CN106021924A (en) Sewage online soft-measurement method based on multi-attribute Gaussian kernel function fast relevance vector machine
CN112862173B (en) Lake and reservoir cyanobacterial bloom prediction method based on self-organizing deep confidence echo state network
CN104598770B (en) Wheat aphid quantitative forecasting technique and system based on human evolution's gene expression programming

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant