CN118277959A - Pig house temperature prediction method fused with resonance sparse Transformer network - Google Patents
Pig house temperature prediction method fused with a resonance sparse Transformer network
- Publication number
- CN118277959A (application CN202410644882.9A)
- Authority
- CN
- China
- Prior art keywords
- sequence
- temperature
- frequency
- prediction
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Abstract
The invention belongs to the field of agricultural information processing, and in particular relates to a pig house temperature prediction method fused with a resonance sparse Transformer network. The method uses a resonance sparse decomposition method to decompose the temperature sequence data of each temperature acquisition measuring point, obtaining a low-frequency temperature trend sequence and a high-frequency fluctuation sequence for each measuring point; each low-frequency temperature trend sequence is predicted to obtain a low-frequency temperature prediction sequence for each measuring point; each high-frequency fluctuation sequence is predicted to obtain a high-frequency temperature prediction sequence for each measuring point; the low-frequency and high-frequency temperature prediction sequences are summed to obtain the final temperature prediction data of each measuring point, which is then used to regulate and control the pigsty environment. The method accounts for the low-frequency trend and high-frequency oscillation characteristics of intensive pig house temperature sequence data, can effectively capture long-term dependencies in the time series data, and has low computational complexity and high prediction accuracy.
Description
Technical Field
The invention belongs to the technical field of agricultural engineering and information processing, and particularly relates to a pig house temperature prediction method fused with a resonance sparse Transformer network.
Background
In intensive, large-scale pig breeding monitored by an Internet of Things system, the pig house temperature has an important influence on survival rate, growth and metabolism, and disease transmission (such as diarrhea, respiratory diseases, pneumonia, fungal propagation and the like). Accurately predicting the pig house temperature, regulating the pig house environment in time according to the temperature fluctuation trend, and formulating early-warning and epidemic-prevention schemes are therefore of great significance for reducing energy consumption, reducing feed consumption, improving survival rate, and preventing and controlling disease in intensive pig raising.
At present, an intensive pig house is a closed microclimate environment affected by conditions both inside and outside the house; its internal temperature exhibits small-range fluctuation throughout the day, a peaked and heavy-tailed data distribution, multiple couplings and other characteristics. Traditional model-driven and data-driven prediction methods, such as time-series prediction methods and neural network prediction methods, are affected by the application climate scenario, the data sample scale and the model parameters, and their stability and prediction accuracy do not meet practical engineering requirements.
Disclosure of Invention
The embodiment of the invention aims to provide a pig house temperature prediction method fused with a resonance sparse Transformer network, so as to solve the problems that traditional prediction methods are affected by the application climate scenario, the data sample scale and the model parameters, and that their stability and prediction accuracy do not meet practical engineering requirements.
In order to achieve the above purpose, the present invention provides the following technical solutions.
Specifically, the invention provides a pig house temperature prediction method fused with a resonance sparse Transformer network, which comprises the following steps:
step S1: setting a plurality of temperature acquisition measuring points in a pig house, and picking up temperature sequence data of each temperature acquisition measuring point;
Step S2: decomposing the temperature sequence data of each temperature acquisition measuring point by using a resonance sparse decomposition method to obtain a low-frequency temperature trend sequence and a high-frequency fluctuation sequence of each temperature acquisition measuring point;
Step S3: predicting each low-frequency temperature trend sequence by using a Transformer network model to obtain a low-frequency temperature prediction sequence of each temperature acquisition measuring point;
Step S4: predicting each high-frequency fluctuation sequence by using a convolutional neural network and bidirectional long short-term memory (CNN-BiLSTM) model to obtain a high-frequency temperature prediction sequence of each temperature acquisition measuring point;
Step S5: summing the low-frequency temperature prediction sequence and the high-frequency temperature prediction sequence to obtain the final temperature prediction data of each temperature acquisition measuring point, calculating the error between the predicted and actual time series, and adjusting the hyperparameters of the prediction model in real time;
Step S6: and regulating and controlling the pigsty environment according to the temperature prediction data.
Further, the step S2 specifically includes:
Step S21: the temperature sequence data of each measuring point of the pigsty collected by the temperature sensors is expressed as:
y(t) = x1(t) + x2(t)    (1);
In formula (1), y(t) is the observed data; x1(t) and x2(t) are, respectively, the trend sequence component with the low-frequency oscillation characteristic and the harmonic component with the high-frequency oscillation characteristic to be estimated;
Step S22: the signal components x1 and x2 are characterized by matching them with the overcomplete wavelet bases S1 and S2:
x1 = S1 w1, x2 = S2 w2    (2);
In formula (2), w1 and w2 are the wavelet transform coefficients of the components x1 and x2, respectively;
Step S23: the constructed resonance sparse decomposition objective function is expressed as:
J(w1, w2) = ||y - S1 w1 - S2 w2||_2^2 + λ1 ||w1||_1 + λ2 ||w2||_1    (3);
In formula (3), λ1 and λ2 are regularization parameters;
Step S24: the minimum of the constructed resonance sparse decomposition objective function is solved with the split augmented Lagrangian shrinkage algorithm, iteratively updating w1 and w2 to obtain the updated wavelet transform coefficients w1* and w2*; the trend sequence component with the low-frequency oscillation characteristic and the harmonic component with the high-frequency oscillation characteristic to be estimated are then, respectively:
x1* = S1 w1*, x2* = S2 w2*    (4).
Further, the step S3 specifically includes:
Step S31: the Transformer model consists of an encoder (Encoder) and a decoder (Decoder); the encoder comprises vector position encoding, a multi-head self-attention mechanism, residual connections, layer normalization and a feedforward neural network; the decoder comprises a masked multi-head self-attention mechanism, a feedforward neural network and a fully connected layer;
Step S32: in the encoder, vector position encoding is used to add marker information to each position in the input sequence X, distinguishing the different positions and the order of the input sequence;
Step S33: in the decoder, so that the prediction of a given step does not depend on future data, a masked multi-head self-attention mechanism is constructed by creating a mask matrix that sets the attention scores of future positions to negative infinity, so that the current element is related only to historical elements, ensuring that the model predicts future values only from historical elements.
Further, in step S32:
Step S321: a linear transformation using sin and cos functions provides the model with position information, specifically:
PE(pos, 2i) = sin(pos / 10000^(2i/d_model)), PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model))    (5);
In formula (5), pos is the position in the input sequence, pos = 0, 1, 2, ..., N; i is the index over the sequence dimensions; 2i and 2i+1 denote the even and odd dimensions, respectively; d_model is the size of the embedding dimension;
Step S322: in the encoder, the multi-head self-attention mechanism uses multiple parallel self-attention mechanisms; a single self-attention mechanism captures subspace information by learning different weights, specifically:
the position encoding is added to the input vector, which is then passed through three weight matrices, namely the query matrix W_Q, the key matrix W_K and the value matrix W_V, to obtain the vectors Q, K and V required by the self-attention mechanism: Q = X W_Q, K = X W_K, V = X W_V, where X is the input vector after adding the position encoding;
Step S323: a dot-product method is used to calculate the correlation score between the elements of the input sequence: Score = Q K^T;
in order to stabilize the gradient during model training, the correlation scores are scaled by Score = Q K^T / sqrt(d_k), where d_k is the dimension of the vector K; the score vector between the elements of the position-encoded vector is then converted into a probability distribution in [0, 1] through the Softmax function, with the calculation formula:
Attention(Q, K, V) = Softmax(Q K^T / sqrt(d_k)) V    (6);
Step S324: the multi-head attention mechanism uses multiple sets of weight matrices (W_Q^i, W_K^i, W_V^i) to obtain multiple sets of vectors Q, K and V, and the outputs of the heads are concatenated and linearly transformed into the matrix Z:
head_i = Attention(Q_i, K_i, V_i), Z = Concat(head_1, ..., head_h) W_O    (7);
In formula (7), Q_i, K_i and V_i are the i-th vectors Q, K and V; W_O is the weight matrix of the attention heads;
Step S325: the residual connection and layer normalization process is as follows:
after the output of the multi-head attention mechanism in the previous step, a residual connection operation X + Z and a layer normalization operation LayerNorm(X + Z) are performed;
Step S326: the feedforward neural network is a two-layer network that first applies a linear transformation, then a ReLU nonlinear transformation, and then another linear transformation, specifically:
FFN(x) = ReLU(x W_1 + b_1) W_2 + b_2    (8);
In formula (8), x is the output of the previous layer, W_1 and W_2 are the weight coefficients of the feedforward neural network, and b_1 and b_2 are the biases of the feedforward neural network; all layers are connected by residual connections.
Further, in step S33:
the masked multi-head self-attention mechanism is specifically:
MaskedAttention(Q, K, V) = Softmax((Q K^T + M_mask) / sqrt(d_k)) V    (9);
In formula (9), M_mask is the mask matrix, and Softmax((Q K^T + M_mask) / sqrt(d_k)) is the masked attention weight matrix;
the feedforward neural network in the decoder applies a nonlinear transformation to the output of the previous layer, so that the network can capture complex nonlinear relationships, and the fully connected layer uses the ReLU function as the activation function.
Compared with the prior art, the pig house temperature prediction method fused with the resonance sparse Transformer network has the following beneficial effects:
Firstly, the pig house temperature prediction method provided by the invention accounts for the low-frequency trend and high-frequency oscillation characteristics of intensive pig house temperature sequence data, can accurately extract the pig house temperature fluctuation trend, and prevents the predicted trend from being distorted;
Secondly, compared with time-series prediction methods and data-driven prediction methods, the method provided by the invention can effectively capture long-term dependencies in the time series data, and has the advantages of low computational complexity, high prediction accuracy and fast algorithm running speed.
In conclusion, the method solves the problem that traditional prediction methods are inaccurate, in particular the prediction problems posed by small-range data fluctuation, a peaked and heavy-tailed data distribution, multiple couplings and other characteristics; it can realize accurate temperature prediction for intensive pig houses and provides a theoretical basis for fine regulation of the pig house environment and for disease prevention and control.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are necessary for the description of the embodiments or the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention and that other embodiments may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic block diagram of the pig house temperature prediction method fused with a resonance sparse Transformer network according to the present invention;
FIG. 2 is a schematic diagram of a waveform of temperature change at a certain measuring point of a pig house collected according to an embodiment of the present invention;
FIG. 3 is a diagram of low frequency components obtained by resonance sparse decomposition in an embodiment of the present invention;
FIG. 4 is a high frequency component diagram obtained by resonance sparse decomposition of an embodiment of the present invention;
FIG. 5 is a schematic diagram of the low-frequency temperature prediction result of a certain measuring point of an intensive pig house predicted based on the Transformer network model in the embodiment of the invention;
FIG. 6 is a schematic diagram of the high-frequency temperature prediction result of a certain measuring point of an intensive pig house predicted based on the convolutional neural network and bidirectional long short-term memory (CNN-BiLSTM) model in an embodiment of the invention;
FIG. 7 shows the temperature prediction result of a certain measuring point of the pig house predicted based on the pig house temperature prediction method;
FIG. 8 is a graph showing temperature prediction errors based on the pig house temperature prediction method of the present invention;
FIG. 9 is a flow chart of the pig house temperature prediction method of the present invention fused with a resonance sparse Transformer network.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
At present, an intensive pig house is a closed microclimate environment affected by conditions both inside and outside the house; its internal temperature exhibits small-range fluctuation throughout the day, a peaked and heavy-tailed data distribution, multiple couplings and other characteristics. Traditional model-driven and data-driven prediction methods, such as time-series prediction methods and neural network prediction methods, are affected by the application climate scenario, the data sample scale and the model parameters, and their stability and prediction accuracy do not meet practical engineering requirements.
In order to solve the above problems, the invention provides a pig house temperature prediction method fused with a resonance sparse Transformer network, which accounts for the low-frequency trend and high-frequency oscillation characteristics of intensive pig house temperature sequence data, can accurately extract the pig house temperature fluctuation trend, and prevents distortion of the predicted trend; compared with time-series prediction methods and data-driven prediction methods, the proposed method can effectively capture long-term dependencies in the time series data, and has low computational complexity, high prediction accuracy and fast algorithm running speed.
Specific implementations of the invention are described in detail below in connection with specific embodiments.
As shown in FIG. 1 and FIG. 9, the pig house temperature prediction method fused with a resonance sparse Transformer network provided by the invention comprises the following steps:
step S1: setting a plurality of temperature acquisition measuring points in a pig house, and picking up temperature sequence data of each temperature acquisition measuring point;
Step S2: decomposing the temperature sequence data of each temperature acquisition measuring point by using a resonance sparse decomposition method to obtain a low-frequency temperature trend sequence and a high-frequency fluctuation sequence of each temperature acquisition measuring point;
Step S3: predicting each low-frequency temperature trend sequence by using a Transformer network model to obtain a low-frequency temperature prediction sequence of each temperature acquisition measuring point;
Step S4: predicting each high-frequency fluctuation sequence by using a convolutional neural network and bidirectional long short-term memory (CNN-BiLSTM) model to obtain a high-frequency temperature prediction sequence of each temperature acquisition measuring point;
Step S5: summing the low-frequency temperature prediction sequence and the high-frequency temperature prediction sequence to obtain the final temperature prediction data of each temperature acquisition measuring point, calculating the error between the predicted and actual time series, and adjusting the hyperparameters of the prediction model in real time;
Step S6: and regulating and controlling the pigsty environment according to the temperature prediction data.
Further, the step S2 specifically includes:
Step S21: the temperature sequence data of each measuring point of the pigsty collected by the temperature sensors is expressed as:
y(t) = x1(t) + x2(t)    (1);
In formula (1), y(t) is the observed data; x1(t) and x2(t) are, respectively, the trend sequence component with the low-frequency oscillation characteristic and the harmonic component with the high-frequency oscillation characteristic to be estimated;
Step S22: the signal components x1 and x2 are characterized by matching them with the overcomplete wavelet bases S1 and S2:
x1 = S1 w1, x2 = S2 w2    (2);
In formula (2), w1 and w2 are the wavelet transform coefficients of the components x1 and x2, respectively;
Step S23: the constructed resonance sparse decomposition objective function is expressed as:
J(w1, w2) = ||y - S1 w1 - S2 w2||_2^2 + λ1 ||w1||_1 + λ2 ||w2||_1    (3);
In formula (3), λ1 and λ2 are regularization parameters;
Step S24: the minimum of the constructed resonance sparse decomposition objective function is solved with the split augmented Lagrangian shrinkage algorithm, iteratively updating w1 and w2 to obtain the updated wavelet transform coefficients w1* and w2*; the trend sequence component with the low-frequency oscillation characteristic and the harmonic component with the high-frequency oscillation characteristic to be estimated are then, respectively:
x1* = S1 w1*, x2* = S2 w2*    (4);
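As a rough illustration of this decomposition step (not the patented implementation), the following Python sketch separates a temperature series into a smooth part and an oscillatory part by minimizing an l1-regularized objective of the same form as formula (3). It stands in a DCT dictionary for S1 and the identity basis for S2, and uses a plain iterative shrinkage (ISTA) loop instead of SALSA; all names and parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.fft import dct, idct

def soft(u, t):
    """Soft-thresholding operator used by iterative shrinkage algorithms."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def sparse_decompose(y, lam1=0.5, lam2=0.5, n_iter=300, step=0.4):
    """Minimize 0.5*||y - S1 w1 - S2 w2||^2 + lam1*||w1||_1 + lam2*||w2||_1
    with S1 = inverse DCT (smooth atoms) and S2 = identity (spiky atoms)."""
    w1 = np.zeros_like(y)            # coefficients of the low-frequency component
    w2 = np.zeros_like(y)            # coefficients of the high-frequency component
    for _ in range(n_iter):
        r = y - idct(w1, norm='ortho') - w2               # current residual
        w1 = soft(w1 + step * dct(r, norm='ortho'), step * lam1)
        w2 = soft(w2 + step * r, step * lam2)
    return idct(w1, norm='ortho'), w2                     # (trend, fluctuation)

# Example on a synthetic daily temperature series (hypothetical data):
# t = np.arange(333)
# y = 20 + 5 * np.sin(2 * np.pi * t / 333) + 0.8 * np.random.randn(333)
# x_low, x_high = sparse_decompose(y)
```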
Further, the step S3 specifically includes:
Step S31: the Transformer model consists of an encoder (Encoder) and a decoder (Decoder); the encoder comprises vector position encoding, a multi-head self-attention mechanism, residual connections, layer normalization and a feedforward neural network; the decoder comprises a masked multi-head self-attention mechanism, a feedforward neural network and a fully connected layer;
Step S32: in the encoder, vector position encoding is used to add marker information to each position in the input sequence X, distinguishing the different positions and the order of the input sequence;
Step S33: in the decoder, so that the prediction of a given step does not depend on future data, a masked multi-head self-attention mechanism is constructed by creating a mask matrix that sets the attention scores of future positions to negative infinity, so that the current element is related only to historical elements, ensuring that the model predicts future values only from historical elements.
Further, step S32 includes:
Step S321: a linear transformation using sin and cos functions provides the model with position information, specifically:
PE(pos, 2i) = sin(pos / 10000^(2i/d_model)), PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model))    (5);
In formula (5), pos is the position in the input sequence, pos = 0, 1, 2, ..., N; i is the index over the sequence dimensions; 2i and 2i+1 denote the even and odd dimensions, respectively; d_model is the size of the embedding dimension;
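A minimal NumPy sketch of the sinusoidal position encoding of formula (5) could look as follows; the function name and array shapes are illustrative assumptions, not part of the patented method.

```python
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal position encoding: sin on even dimensions, cos on odd ones."""
    pos = np.arange(seq_len)[:, None]                 # positions 0 .. seq_len-1
    i = np.arange(d_model)[None, :]                   # embedding dimensions
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle[:, 0::2])              # even dimensions: sin
    pe[:, 1::2] = np.cos(angle[:, 1::2])              # odd dimensions: cos
    return pe

# The encoding is added to the embedded low-frequency temperature sequence, e.g.:
# X = embedded_sequence + positional_encoding(len(embedded_sequence), d_model)
```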
Step S322: in the encoder, the multi-head self-attention mechanism uses multiple parallel self-attention mechanisms; a single self-attention mechanism captures subspace information by learning different weights, specifically:
the position encoding is added to the input vector, which is then passed through three weight matrices, namely the query matrix W_Q, the key matrix W_K and the value matrix W_V, to obtain the vectors Q, K and V required by the self-attention mechanism: Q = X W_Q, K = X W_K, V = X W_V, where X is the input vector after adding the position encoding;
Step S323: a dot-product method is used to calculate the correlation score between the elements of the input sequence: Score = Q K^T;
in order to stabilize the gradient during model training, the correlation scores are scaled by Score = Q K^T / sqrt(d_k), where d_k is the dimension of the vector K; the score vector between the elements of the position-encoded vector is then converted into a probability distribution in [0, 1] through the Softmax function, with the calculation formula:
Attention(Q, K, V) = Softmax(Q K^T / sqrt(d_k)) V    (6);
Step S324: the multi-head attention mechanism uses multiple sets of weight matrices (W_Q^i, W_K^i, W_V^i) to obtain multiple sets of vectors Q, K and V, and the outputs of the heads are concatenated and linearly transformed into the matrix Z:
head_i = Attention(Q_i, K_i, V_i), Z = Concat(head_1, ..., head_h) W_O    (7);
In formula (7), Q_i, K_i and V_i are the i-th vectors Q, K and V; W_O is the weight matrix of the attention heads;
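The NumPy sketch below illustrates the scaled dot-product attention of formula (6) and the multi-head combination of formula (7); the weight matrices are assumed to be given, and all names and shapes are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention, formula (6): Softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)
    return softmax(scores) @ V

def multi_head_attention(X, Wq, Wk, Wv, Wo, n_heads):
    """Multi-head attention, formula (7): heads run in parallel and are concatenated."""
    N, d_model = X.shape
    d_head = d_model // n_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv                  # each (N, d_model)
    split = lambda M: M.reshape(N, n_heads, d_head).transpose(1, 0, 2)
    heads = attention(split(Q), split(K), split(V))   # (n_heads, N, d_head)
    concat = heads.transpose(1, 0, 2).reshape(N, d_model)
    return concat @ Wo                                # output projection W_O

# Usage (illustrative): X of shape (N, d_model), weights of shape (d_model, d_model).
```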
Step S325: the residual connection and layer normalization process is as follows:
after the output of the multi-head attention mechanism in the previous step, a residual connection operation X + Z and a layer normalization operation LayerNorm(X + Z) are performed;
Step S326: the feedforward neural network is a two-layer network that first applies a linear transformation, then a ReLU nonlinear transformation, and then another linear transformation, specifically:
FFN(x) = ReLU(x W_1 + b_1) W_2 + b_2    (8);
In formula (8), x is the output of the previous layer, W_1 and W_2 are the weight coefficients of the feedforward neural network, and b_1 and b_2 are the biases of the feedforward neural network; all layers are connected by residual connections.
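A compact sketch of the residual connection, layer normalization and two-layer feedforward network of steps S325 and S326 (formula (8)); the learnable scale and bias of layer normalization are omitted for brevity, and all names are illustrative.

```python
import numpy as np

def layer_norm(x, eps=1e-6):
    """Layer normalization over the feature dimension (learnable gain/bias omitted)."""
    mu = x.mean(-1, keepdims=True)
    sigma = x.std(-1, keepdims=True)
    return (x - mu) / (sigma + eps)

def feed_forward(x, W1, b1, W2, b2):
    """Two-layer position-wise network, formula (8): ReLU(x W1 + b1) W2 + b2."""
    return np.maximum(0.0, x @ W1 + b1) @ W2 + b2

def encoder_sublayers(X, attn_out, W1, b1, W2, b2):
    """Residual connection + layer norm around the attention and feed-forward sublayers."""
    h = layer_norm(X + attn_out)                                  # Add & Norm after attention
    return layer_norm(h + feed_forward(h, W1, b1, W2, b2))        # Add & Norm after FFN
```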
Further, in step S33:
the masked multi-head self-attention mechanism is specifically:
MaskedAttention(Q, K, V) = Softmax((Q K^T + M_mask) / sqrt(d_k)) V    (9);
In formula (9), M_mask is the mask matrix, and Softmax((Q K^T + M_mask) / sqrt(d_k)) is the masked attention weight matrix;
the feedforward neural network in the decoder applies a nonlinear transformation to the output of the previous layer, so that the network can capture complex nonlinear relationships; the fully connected layer uses the ReLU function as the activation function, and the calculation process is similar to that of the encoder.
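The masked attention of formula (9) can be illustrated by adding a lower-triangular mask to the score matrix before the softmax, as in the sketch below (names are illustrative):

```python
import numpy as np

def causal_mask(n: int) -> np.ndarray:
    """Mask matrix M_mask of formula (9): 0 on and below the diagonal, -inf above it,
    so future positions receive zero attention weight after the softmax."""
    upper = np.triu(np.ones((n, n)), k=1)             # 1s strictly above the diagonal
    return np.where(upper == 1, -np.inf, 0.0)

def masked_attention(Q, K, V):
    d_k = K.shape[-1]
    scores = (Q @ K.T + causal_mask(Q.shape[0])) / np.sqrt(d_k)
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))       # exp(-inf) -> 0
    weights = e / e.sum(axis=-1, keepdims=True)
    return weights @ V
```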
The pig house temperature prediction method provided by the invention accounts for the low-frequency trend and high-frequency oscillation characteristics of intensive pig house temperature sequence data, can accurately extract the pig house temperature fluctuation trend, and prevents distortion of the predicted trend;
Compared with time-series prediction methods and data-driven prediction methods, the method provided by the invention can effectively capture long-term dependencies in the time series data, and has the advantages of low computational complexity, high prediction accuracy and fast algorithm running speed.
In conclusion, the method solves the problem that traditional prediction methods are inaccurate, in particular the prediction problems posed by small-range data fluctuation, a peaked and heavy-tailed data distribution, multiple couplings and other characteristics; it can realize accurate temperature prediction for intensive pig houses and provides a theoretical basis for fine regulation of the pig house environment and for disease prevention and control.
Using integrated Ethernet and sensor technology, temperature sequence data were picked up at a measuring point of a certain intensive pig-raising enterprise; the temperature change sequence of the measuring point over about 11 months is shown in FIG. 2, with 1 temperature point collected every day and 333 data points in total.
The temperature time series of the measuring point is decomposed with the resonance sparse decomposition method (see I. W. Selesnick, Resonance-based signal decomposition: a new sparsity-enabled signal analysis method, Signal Processing, 2011, 91(12): 2793-2809) to obtain the low-frequency temperature trend sequence and the high-frequency fluctuation sequence of the measuring point. FIG. 3 shows the low-frequency component obtained by resonance sparse decomposition in the embodiment of the invention, and FIG. 4 shows the high-frequency component. It can be seen that the low-frequency component reflects the temperature fluctuation trend of the pigsty, while the high-frequency component reflects the influence of external interference factors on the environmental temperature.
The low-frequency temperature trend sequence of the measuring point is predicted with the Transformer network model (see A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, L. Kaiser, I. Polosukhin, Attention is all you need, Advances in Neural Information Processing Systems, 2017, 30: 1-15) to obtain the low-frequency temperature prediction sequence of the measuring point; the result is shown in FIG. 5 and shows that the Transformer network model can accurately track the temperature fluctuation trend of the pigsty.
The high-frequency fluctuation sequence of the measuring point is predicted with the convolutional neural network and bidirectional long short-term memory (CNN-BiLSTM) model (see A. Graves, J. Schmidhuber, Framewise phoneme classification with bidirectional LSTM and other neural network architectures, Neural Networks, 2005, 18(5-6): 602-610) to obtain the high-frequency temperature prediction sequence of the measuring point; the result is shown in FIG. 6 and shows that the CNN-BiLSTM model can accurately track the oscillation of the high-frequency temperature component. Finally, the low-frequency and high-frequency temperature prediction sequences are summed to obtain the temperature prediction data of the measuring point, as shown in FIG. 7; FIG. 8 shows the temperature prediction error of the method according to the embodiment of the invention. The proposed method therefore has a small prediction error range, high prediction accuracy and good industrial application value.
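As a rough sketch of the high-frequency branch of step S4 and the summation of step S5 (a PyTorch stand-in with illustrative layer sizes and names, not the exact configuration used in the embodiment):

```python
import torch
import torch.nn as nn

class CNNBiLSTM(nn.Module):
    """Illustrative CNN-BiLSTM: a 1-D convolution extracts local features of the
    high-frequency fluctuation window, a bidirectional LSTM models its dynamics."""
    def __init__(self, channels: int = 16, hidden: int = 32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, channels, kernel_size=3, padding=1), nn.ReLU())
        self.bilstm = nn.LSTM(channels, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, 1)             # next high-frequency value

    def forward(self, x):                              # x: (batch, window)
        h = self.conv(x.unsqueeze(1))                  # (batch, channels, window)
        out, _ = self.bilstm(h.transpose(1, 2))        # (batch, window, 2*hidden)
        return self.fc(out[:, -1, :]).squeeze(-1)      # (batch,)

# Step S5 (sketch): the final prediction is the sum of the two branches, and the
# error against the measured series can guide hyperparameter adjustment.
# y_pred = y_low_pred + y_high_pred
# rmse = torch.sqrt(torch.mean((y_pred - y_true) ** 2))
```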
Although embodiments of the invention have been disclosed above, they are not limited to the uses listed in the specification and the embodiments, and can be applied to various fields suitable for the invention. Additional modifications will readily occur to those skilled in the art. Therefore, the invention is not limited to the specific details and illustrations shown and described herein, without departing from the general concept defined by the claims and their equivalents.
Claims (5)
1. A pig house temperature prediction method fused with a resonance sparse Transformer network, characterized by comprising the following steps:
step S1: setting a plurality of temperature acquisition measuring points in a pig house, and picking up temperature sequence data of each temperature acquisition measuring point;
Step S2: decomposing the temperature sequence data of each temperature acquisition measuring point by using a resonance sparse decomposition method to obtain a low-frequency temperature trend sequence and a high-frequency fluctuation sequence of each temperature acquisition measuring point;
Step S3: predicting each low-frequency temperature trend sequence by using a Transformer network model to obtain a low-frequency temperature prediction sequence of each temperature acquisition measuring point;
Step S4: predicting each high-frequency fluctuation sequence by using a convolutional neural network and bidirectional long short-term memory (CNN-BiLSTM) model to obtain a high-frequency temperature prediction sequence of each temperature acquisition measuring point;
Step S5: summing the low-frequency temperature prediction sequence and the high-frequency temperature prediction sequence to obtain the final temperature prediction data of each temperature acquisition measuring point, calculating the error between the predicted and actual time series, and adjusting the hyperparameters of the prediction model in real time;
Step S6: and regulating and controlling the pigsty environment according to the temperature prediction data.
2. The pig house temperature prediction method fused with a resonance sparse Transformer network according to claim 1, wherein the step S2 specifically comprises:
Step S21: the temperature sequence data of each measuring point of the pigsty collected by the temperature sensors is expressed as:
y(t) = x1(t) + x2(t)    (1);
In formula (1), y(t) is the observed data; x1(t) and x2(t) are, respectively, the trend sequence component with the low-frequency oscillation characteristic and the harmonic component with the high-frequency oscillation characteristic to be estimated;
Step S22: the signal components x1 and x2 are characterized by matching them with the overcomplete wavelet bases S1 and S2:
x1 = S1 w1, x2 = S2 w2    (2);
In formula (2), w1 and w2 are the wavelet transform coefficients of the components x1 and x2, respectively;
Step S23: the constructed resonance sparse decomposition objective function is expressed as:
J(w1, w2) = ||y - S1 w1 - S2 w2||_2^2 + λ1 ||w1||_1 + λ2 ||w2||_1    (3);
In formula (3), λ1 and λ2 are regularization parameters;
Step S24: the minimum of the constructed resonance sparse decomposition objective function is solved with the split augmented Lagrangian shrinkage algorithm, iteratively updating w1 and w2 to obtain the updated wavelet transform coefficients w1* and w2*; the trend sequence component with the low-frequency oscillation characteristic and the harmonic component with the high-frequency oscillation characteristic to be estimated are then, respectively:
x1* = S1 w1*, x2* = S2 w2*    (4).
3. The pig house temperature prediction method fused with a resonance sparse Transformer network according to claim 2, wherein the step S3 specifically comprises:
Step S31: the Transformer model consists of an encoder (Encoder) and a decoder (Decoder); the encoder comprises vector position encoding, a multi-head self-attention mechanism, residual connections, layer normalization and a feedforward neural network; the decoder comprises a masked multi-head self-attention mechanism, a feedforward neural network and a fully connected layer;
Step S32: in the encoder, vector position encoding is used to add marker information to each position in the input sequence X, distinguishing the different positions and the order of the input sequence;
Step S33: in the decoder, a masked multi-head self-attention mechanism is constructed by creating a mask matrix that sets the attention scores of future positions to negative infinity, so that the current element is related only to historical elements.
4. The pig house temperature prediction method fused with a resonance sparse Transformer network according to claim 3, wherein in step S32:
Step S321: a linear transformation using sin and cos functions provides the model with position information, specifically:
PE(pos, 2i) = sin(pos / 10000^(2i/d_model)), PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model))    (5);
In formula (5), pos is the position in the input sequence, pos = 0, 1, 2, ..., N; i is the index over the sequence dimensions; 2i and 2i+1 denote the even and odd dimensions, respectively; d_model is the size of the embedding dimension;
Step S322: in the encoder, the multi-head self-attention mechanism uses multiple parallel self-attention mechanisms; a single self-attention mechanism captures subspace information by learning different weights, specifically:
the position encoding is added to the input vector, which is then passed through three weight matrices, namely the query matrix W_Q, the key matrix W_K and the value matrix W_V, to obtain the vectors Q, K and V required by the self-attention mechanism:
Q = X W_Q, K = X W_K, V = X W_V, where X is the input vector after adding the position encoding;
Step S323: a dot-product method is used to calculate the correlation score between the elements of the input sequence: Score = Q K^T;
the correlation scores are scaled by Score = Q K^T / sqrt(d_k), where d_k is the dimension of the vector K;
the score vector between the elements of the position-encoded vector is then converted into a probability distribution in [0, 1] through the Softmax function, with the calculation formula:
Attention(Q, K, V) = Softmax(Q K^T / sqrt(d_k)) V    (6);
Step S324: the multi-head attention mechanism uses multiple sets of weight matrices (W_Q^i, W_K^i, W_V^i) to obtain multiple sets of vectors Q, K and V, and the outputs of the heads are concatenated and linearly transformed into the matrix Z:
head_i = Attention(Q_i, K_i, V_i), Z = Concat(head_1, ..., head_h) W_O    (7);
In formula (7), Q_i, K_i and V_i are the i-th vectors Q, K and V; W_O is the weight matrix of the attention heads;
Step S325: the residual connection and layer normalization process is as follows:
after the output of the multi-head attention mechanism in the previous step, a residual connection operation X + Z and a layer normalization operation LayerNorm(X + Z) are performed;
Step S326: the feedforward neural network is a two-layer network that first applies a linear transformation, then a ReLU nonlinear transformation, and then another linear transformation, specifically:
FFN(x) = ReLU(x W_1 + b_1) W_2 + b_2    (8);
In formula (8), x is the output of the previous layer, W_1 and W_2 are the weight coefficients of the feedforward neural network, and b_1 and b_2 are the biases of the feedforward neural network; all layers are connected by residual connections.
5. The pig house temperature prediction method fused with a resonance sparse Transformer network according to claim 4, wherein in step S33:
the masked multi-head self-attention mechanism is specifically:
MaskedAttention(Q, K, V) = Softmax((Q K^T + M_mask) / sqrt(d_k)) V    (9);
In formula (9), M_mask is the mask matrix, and Softmax((Q K^T + M_mask) / sqrt(d_k)) is the masked attention weight matrix;
the feedforward neural network in the decoder applies a nonlinear transformation to the output of the previous layer, so that the network can capture complex nonlinear relationships, and the fully connected layer uses the ReLU function as the activation function.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410644882.9A CN118277959B (en) | 2024-05-23 | 2024-05-23 | Pig house temperature prediction method fused with resonance sparse Transformer network
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410644882.9A CN118277959B (en) | 2024-05-23 | 2024-05-23 | Pig house temperature prediction method fused with resonance sparse Transformer network
Publications (2)
Publication Number | Publication Date |
---|---|
CN118277959A true CN118277959A (en) | 2024-07-02 |
CN118277959B CN118277959B (en) | 2024-08-23 |
Family
ID=91640157
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410644882.9A Active CN118277959B (en) | 2024-05-23 | 2024-05-23 | Pig house temperature prediction method fused with resonance sparse transducer network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118277959B (en) |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110119767A (en) * | 2019-04-19 | 2019-08-13 | 淮阴工学院 | A kind of cucumber green house temperature intelligent detection device based on LVQ neural network |
US20210190362A1 (en) * | 2019-06-04 | 2021-06-24 | Lg Electronics Inc. | Apparatus for generating temperature prediction model and method for providing simulation environment |
WO2024077969A1 (en) * | 2022-10-14 | 2024-04-18 | 南京国电南自轨道交通工程有限公司 | Lstm-svr subway station temperature prediction method based on characteristic of multiple periods |
CN115890340A (en) * | 2022-12-15 | 2023-04-04 | 安徽农业大学 | Tool wear prediction method of compressed learning adaptive network under sparse frame |
CN116029435A (en) * | 2023-01-10 | 2023-04-28 | 盐城工学院 | Environmental comfort early warning system is bred to live pig facility |
CN117436562A (en) * | 2023-04-14 | 2024-01-23 | 青岛海洋科技中心 | Ocean temperature long-term prediction method and device based on improved transducer model |
CN117132132A (en) * | 2023-09-06 | 2023-11-28 | 大连民族大学 | Photovoltaic power generation power prediction method based on meteorological data |
CN117973440A (en) * | 2024-04-02 | 2024-05-03 | 长江三峡集团实业发展(北京)有限公司 | Regional ionosphere delay prediction method based on LSTM-transducer model |
Non-Patent Citations (2)
Title |
---|
XIANQI ZHANG ET AL: "A monthly temperature prediction based on the CEEMDAN–BO–BiLSTM coupled model", NATURE, 8 January 2024 (2024-01-08) * |
ZENG Zhixiong et al.: "Temperature prediction of intensive pig houses based on time series and multivariate models", Journal of South China Agricultural University, 3 March 2021 (2021-03-03) *
Also Published As
Publication number | Publication date |
---|---|
CN118277959B (en) | 2024-08-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112116162B (en) | Power transmission line icing thickness prediction method based on CEEMDAN-QFAOA-LSTM | |
CN112116080A (en) | CNN-GRU water quality prediction method integrated with attention mechanism | |
CN110443417A (en) | Multi-model integrated load prediction method based on wavelet transformation | |
CN114154401B (en) | Soil erosion modulus calculation method and system based on machine learning and observation data | |
CN111222992A (en) | Stock price prediction method of long-short term memory neural network based on attention mechanism | |
Li et al. | A PLS-based pruning algorithm for simplified long–short term memory neural network in time series prediction | |
CN113988449A (en) | Wind power prediction method based on Transformer model | |
CN116050621A (en) | Multi-head self-attention offshore wind power ultra-short-time power prediction method integrating lifting mode | |
CN115169742A (en) | Short-term wind power generation power prediction method | |
CN111210089A (en) | Stock price prediction method of gated cyclic unit neural network based on Kalman filtering | |
CN116504060A (en) | Diffusion diagram attention network traffic flow prediction method based on Transformer | |
CN113850438A (en) | Public building energy consumption prediction method, system, equipment and medium | |
CN113128666A (en) | Mo-S-LSTMs model-based time series multi-step prediction method | |
CN116629126A (en) | Soft measurement modeling method based on dynamic multi-head attention mechanism | |
CN116483036B (en) | Transformer-based self-encoder soft measurement modeling method | |
CN118277959B (en) | Pig house temperature prediction method fused with resonance sparse Transformer network | |
CN114662389A (en) | Air pollutant-oriented self-correlation error Informer model long time sequence prediction method and system | |
CN117852714A (en) | Non-constant temperature numerical control workshop short-term temperature prediction method based on AFSA-optimized Transformer-GRU | |
CN109061544B (en) | Electric energy metering error estimation method | |
CN116090645A (en) | Air quality prediction method, storage medium and equipment for public area of underground track | |
CN115719115A (en) | Multi-factor wind power plant generating capacity prediction method | |
CN111460738B (en) | RNN-ARX modeling method and RNN-ARX model of magnetic suspension system | |
CN111797987A (en) | Dynamic machine learning method | |
CN116627035A (en) | Internet of things intelligent monitoring method and system for livestock and poultry housing cultivation environment with big data | |
CN115907081A (en) | Training method of wind speed correction model and wind power prediction method applying model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |