CN111222798B - Complex industrial process key index soft measurement method - Google Patents

Complex industrial process key index soft measurement method

Info

Publication number
CN111222798B
Authority
CN
China
Prior art keywords
time
data
network
output
soft measurement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010030254.3A
Other languages
Chinese (zh)
Other versions
CN111222798A (en)
Inventor
刘金平
蒋楚蓉
何捷舟
史雅琴
赵爽爽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Normal University
Original Assignee
Hunan Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Normal University
Priority to CN202010030254.3A
Publication of CN111222798A
Application granted
Publication of CN111222798B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06393 Score-carding, benchmarking or key performance indicator [KPI] analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/253 Fusion techniques of extracted features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/048 Activation functions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06395 Quality analysis or management

Abstract

The invention discloses a soft measurement method for key indexes of a complex industrial process, which comprises the following steps: 1. Collect video data and process parameter data, and obtain the corresponding key index values through later laboratory testing. 2. Construct a spatio-temporal sequence feature extraction network model on the video data using the cross-frame fusion convolutional neural network (DCFNN) provided by the invention. 3. Construct a time-series feature extraction model on the industrial parameter data using a GRU network. 4. Feed the collected video data and process parameters into the two channels to concurrently extract feature data with time series, and perform soft measurement of the key indexes with an attention-based fully connected network; train the model backwards against the real key index values obtained by testing. 5. Use the trained soft measurement model to estimate, online and in real time, key indexes that are difficult to monitor.

Description

Complex industrial process key index soft measurement method
Technical Field
The invention belongs to the field of online monitoring of key indexes in complex industrial processes, and particularly relates to a soft measurement method for key indexes of complex industrial processes based on dual-channel fusion of machine vision and engineering parameters.
Background
In modern industrial processes, real-time monitoring of key indexes is of great significance for ensuring process safety and production quality. However, in many cases the key indexes of complex industrial processes are difficult to detect online owing to long process flows, unclear internal mechanisms, numerous influencing factors, and the like.
In recent years, soft measurement technology has been widely applied to the online monitoring of key indexes in complex industrial processes thanks to its fast response, low maintenance cost and accurate predictions. Machine vision, being rapid, real-time and non-contact, has likewise been widely used for industrial process monitoring. Many experts and scholars at home and abroad have therefore carried out a series of studies on machine-vision-based soft measurement of key indexes in complex industrial processes, in which certain industrial characteristics are acquired intuitively through machine vision and a soft measurement model of the key indexes is established in combination with industrial process parameters to realize online monitoring.
Traditional machine-vision-based soft measurement models usually rely on image processing: features such as color and contour are extracted from the image according to human experience and used as the characteristic parameters for building the soft measurement model. In recent years, with the rapid development of machine learning, many researchers have applied machine learning to the construction of machine-vision-based soft measurement models. In particular, deep convolutional neural networks can extract effective image features adaptively, avoiding the subjective defects of hand-crafted feature extraction, and have therefore achieved good application results.
Complex industrial processes often contain time-varying sub-processes whose image data and industrial parameters exhibit typical time-series characteristics; for example, the chemical reactions in a pharmaceutical process need a certain amount of time before the final product is obtained. Conventional sequence-learning methods such as LSTM, GRU and RNN can only extract time-series features from traditional industrial process variables (such as temperature and flow rate), and the sampling rate of these industrial parameters is generally inconsistent with that of machine vision images. Effectively extracting the time-series features contained in video data and fusing them with industrial process parameter features therefore provides the important information needed for monitoring, optimizing and controlling complex industrial processes, and in turn helps to achieve energy savings and maximize benefits.
According to the above analysis, complex industrial processes are highly complex and involve many influencing factors; conventional manual monitoring cannot accurately track product quality, and key indexes are difficult to detect online, which leads to low product yield, low raw material utilization and high resource consumption. Machine vision, as the most direct source of information, can effectively extract feature information related to the key indexes. The invention provides a soft measurement method for key indexes of complex industrial processes based on dual-channel fusion of machine vision and engineering parameters; the method is applied to the prediction of clinker quality in a rotary cement kiln, and the results agree with the actual situation. The method helps to realize online prediction of key indexes of complex industrial processes and thereby guides process monitoring and optimization.
Explanation of terms:
DepthConcat: concatenation of two or more feature maps with matching height and width along the channel dimension (see the sketch after these definitions).
DCFNN network: the cross-frame fusion convolutional neural network.
GRU network: a very effective variant of the LSTM network, simpler in structure than LSTM while remaining effective.
Attention mechanism: a mechanism that quickly screens high-value information out of a large amount of information. It is mainly used to address the difficulty of obtaining a reasonable final vector representation when the input sequence of an LSTM/RNN model is long; its characteristic is that the intermediate results of the LSTM are retained, learned by a new model and associated with the output, thereby achieving information screening.
Fully connected network: a fully connected neural network.
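For illustration, DepthConcat amounts to channel-wise concatenation. A minimal Python (PyTorch) sketch is given below; the tensor shapes and variable names are assumptions chosen for the example, not values taken from the patent:

```python
import torch

# Two feature maps from adjacent frames with matching height and width:
# shape (batch, channels, height, width). Shapes here are assumed for illustration.
feat_a = torch.randn(1, 16, 32, 32)
feat_b = torch.randn(1, 16, 32, 32)

# DepthConcat: stack the maps along the channel dimension (dim=1),
# producing a (1, 32, 32, 32) tensor that fuses the two frames.
fused = torch.cat([feat_a, feat_b], dim=1)
print(fused.shape)  # torch.Size([1, 32, 32, 32])
```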
Disclosure of Invention
The invention aims to provide a soft measurement method for key indexes of a complex industrial process. By combining qualitative trend analysis with process state prediction, the invention establishes a fault early-warning method based on the trend of the industrial process running state, which can accurately and intuitively reflect the running state of the industrial process.
The content of the invention comprises:
A soft measurement method for key indexes of a complex industrial process, characterized by comprising the following steps:
S1: Using a dual-channel network as the basis for building the soft measurement model, collect the industrial parameter data related to the key index to be measured together with the machine vision data; that is, collect industrial parameter feature data at a time interval T and collect the machine vision data corresponding to each interval T; obtain the key index values corresponding to the time points through testing, and use them as labels for training the soft measurement model;
S2: Perform dual-channel concurrent extraction on the video data and the industrial parameter feature data respectively to obtain feature data with time series; the video data stream uses the cross-frame fusion convolutional neural network to extract the spatio-temporal sequence features of the foam video, and the industrial parameter data uses a GRU network to extract the time-series features;
S3: Fuse the features extracted by the cross-frame fusion convolutional neural network and the features extracted by the GRU network into the feature vector h_t = (h_t^1, h_t^2, ..., h_t^m), where m is the feature dimension, t is the time, and h_t is the fused feature vector;
S4: For the data sample at time t, use an attention mechanism to calculate the attention distribution probability of each dimension of the output features of the dual-channel network, and weight the dual-channel output features accordingly so as to reflect the influence of each feature dimension on the final prediction result (a sketch of this attention weighting is given after the method steps below):
For the output y_t at any time t, construct the soft measurement model y_t = F(C_t, y_1, y_2, y_3, ..., y_{t-1}), where F(·) is a nonlinear mapping, meaning that the value y_t currently requiring soft measurement depends on the previous outputs y_1, y_2, y_3, ..., y_{t-1} and on the attention-weighted features at the current time; here C_t is the attention-weighted output corresponding to the input h_t, calculated as follows:
C_t = Σ_{i=1}^{m} a_t^i · S(x_t)^i
where S(x_t)^i denotes the output value of the dual-channel network in the i-th dimension for the input x_t at time t, i.e. S(x_t)^i = h_t^i, and a_t^i denotes the attention distribution coefficient, i.e. the attention weight of the input x_t in the i-th dimension, i ∈ {1, ..., m}, calculated as follows:
a_t^i = exp(e_t^i) / Σ_{j=1}^{m} exp(e_t^j)
where e_t^i is the attention score of h_t^i, calculated as follows:
e_t^i = V·tanh(W·h_t^i + U·y_{t-1} + b)
in which V, W and U are weight transformation matrices and b is a bias term; the finally formed output feature C_t serves as the input to the fully connected network;
S5: Calculate the error between the output of the fully connected network and the result from the real industrial process, and modify the model backwards through the error function to obtain the final soft measurement model;
S6: Use the final soft measurement model to measure the key indexes of the industrial process.
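For illustration, the attention weighting of step S4 can be sketched in Python as follows. This is a minimal sketch under the reconstruction above: the additive score form, the shapes of V, W, U and b, the use of the previous output y_prev, and whether C_t is the summed context or the weighted feature vector are assumptions for the example, not the patent's exact implementation:

```python
import numpy as np

def softmax(e):
    e = e - e.max()
    return np.exp(e) / np.exp(e).sum()

def attention_context(h_t, y_prev, V, W, U, b):
    """Attention-weight the m fused features h_t (a reconstruction of step S4).

    h_t    : (m,) fused dual-channel feature vector at time t
    y_prev : previous soft-measured output y_{t-1} (a scalar here, an assumption)
    V, W, U, b : parameters of the additive score function (assumed shape (d,))
    """
    m = h_t.shape[0]
    # e_t^i = V . tanh(W * h_t^i + U * y_prev + b), one score per feature dimension i
    e = np.array([V @ np.tanh(W * h_t[i] + U * y_prev + b) for i in range(m)])
    a = softmax(e)               # attention weights a_t^i, summing to 1
    weighted = a * h_t           # attention-weighted features
    C_t = weighted.sum()         # summed context (one possible reading of C_t)
    return C_t, weighted, a

# toy usage with m = 4 fused features (all values are illustrative)
rng = np.random.default_rng(0)
h_t = rng.normal(size=4)
d = 8                            # assumed hidden size of the score function
V, W, U, b = (rng.normal(size=d) for _ in range(4))
C_t, weighted, a = attention_context(h_t, y_prev=0.3, V=V, W=W, U=U, b=b)
print(C_t, a.sum())              # the weights a sum to 1
```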
In a further improvement, the machine vision data is video data.
In a further improvement, the steps for extracting the spatio-temporal sequence features of the foam video from the video data stream with the cross-frame fusion convolutional neural network are as follows (a sketch of this procedure is given after these steps):
3.1) Sample the video data of a given sample into 2^n pictures;
3.2) Apply one convolution to each of the 2^n pictures, and pool the convolved feature maps to obtain feature maps with a larger receptive field; feed pairwise-adjacent feature maps (with a frame span of 1) into DepthConcat for fusion;
3.3) Let n = n - 1; if n is not equal to 0, return to step 3.2); otherwise flatten the final feature map into a vector as the spatio-temporal sequence feature data of the video data.
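A minimal Python (PyTorch) sketch of the recursive pairwise fusion in steps 3.1) to 3.3) is given below. It only illustrates the idea; the channel counts, kernel sizes, the 1x1 reduction convolution and the placement of pooling are assumptions, not the patent's exact DCFNN architecture:

```python
import torch
import torch.nn as nn

class CrossFrameFusion(nn.Module):
    """Sketch of a DCFNN-style extractor: convolve/pool each frame, then
    repeatedly fuse adjacent feature maps along the channel dimension
    (DepthConcat) until one map remains, which is flattened into a vector."""

    def __init__(self, in_ch=3, ch=8):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, ch, kernel_size=3, padding=1)
        self.pool = nn.MaxPool2d(2)
        # after each DepthConcat the channel count doubles; a 1x1 conv
        # (an assumption of this sketch) maps it back so the block can be reused
        self.reduce = nn.Conv2d(2 * ch, ch, kernel_size=1)

    def forward(self, frames):                # frames: (2**n, C, H, W)
        feats = self.pool(torch.relu(self.conv(frames)))
        while feats.shape[0] > 1:             # step 3.3): repeat until one map is left
            pairs = []
            for i in range(0, feats.shape[0], 2):   # pairwise-adjacent feature maps
                fused = torch.cat([feats[i:i+1], feats[i+1:i+2]], dim=1)  # DepthConcat
                pairs.append(self.reduce(self.pool(fused)))
            feats = torch.cat(pairs, dim=0)
        return feats.flatten(1)               # spatio-temporal feature vector

# toy usage: 2**3 = 8 frames of size 3x64x64 (shapes are illustrative)
frames = torch.randn(8, 3, 64, 64)
vec = CrossFrameFusion()(frames)
print(vec.shape)
```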
Further, the steps for extracting the time-series features of the industrial parameter data with the GRU network are as follows (a sketch of one GRU step is given after these steps):
4.1) Initialize the network parameters and the historical output h_0 at time 1 according to the GRU network model; for the GRU network at time t, the GRU first performs gate calculations on the input data x_t = (x_t^1, x_t^2, ..., x_t^m) and the historical output h_{t-1} corresponding to the previous-time data x_{t-1}; here x_t^m is the feature value of the data at time t in the m-th dimension;
4.2) The reset gate determines how much of the output information of the previous time is written into the candidate output h̃_t; the smaller its value, the less data is written; it is calculated as follows:
r_t = σ(W_r·x_t + U_r·h_{t-1} + b_r)
where W_r, U_r, b_r are trainable parameters and σ is the sigmoid activation function;
4.3) The update gate determines how much state information of the previous time is brought into the current state; the larger its value, the more state information of the previous time is brought in; it is calculated as follows:
z_t = σ(W_z·x_t + U_z·h_{t-1} + b_z)
where W_z, U_z, b_z are trainable parameters and σ is the sigmoid activation function;
4.4) From the reset gate r_t calculated in step 4.2) and the output information h_{t-1} of the previous time, calculate the candidate output h̃_t as follows:
h̃_t = tanh(W_c·x_t + U_c·(r_t ⊙ h_{t-1}) + b_c)
where W_c, U_c, b_c are trainable parameters and tanh is the activation function;
4.5) From the update gate z_t calculated in step 4.3), the output information h_{t-1} of the previous time and the candidate output h̃_t of the current time, calculate the output information h_t of the current time:
h_t = z_t ⊙ h_{t-1} + (1 - z_t) ⊙ h̃_t
so the final output of the GRU network is a weighted sum of the previous output h_{t-1} and the current candidate output h̃_t; the time-series feature information in the process parameter data is thereby effectively extracted.
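A minimal Python (NumPy) sketch of one GRU step as described in 4.1) to 4.5) is shown below; the hidden size, parameter initialization and toy sequence are assumptions for the example (in practice a library cell such as torch.nn.GRU would typically be used, and its gate convention may differ from the one written here):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x_t, h_prev, P):
    """One GRU step: reset gate, update gate, candidate output, new output."""
    r = sigmoid(P["Wr"] @ x_t + P["Ur"] @ h_prev + P["br"])             # 4.2) reset gate
    z = sigmoid(P["Wz"] @ x_t + P["Uz"] @ h_prev + P["bz"])             # 4.3) update gate
    h_cand = np.tanh(P["Wc"] @ x_t + P["Uc"] @ (r * h_prev) + P["bc"])  # 4.4) candidate
    # 4.5) weighted sum; larger z keeps more of the previous state, as described above
    return z * h_prev + (1.0 - z) * h_cand

# toy usage: m = 5 industrial parameters per sample, hidden size 4 (assumed)
m, d = 5, 4
rng = np.random.default_rng(1)
P = {k: rng.normal(scale=0.1, size=(d, m) if k[0] == "W" else (d, d))
     for k in ["Wr", "Wz", "Wc", "Ur", "Uz", "Uc"]}
P.update({k: np.zeros(d) for k in ["br", "bz", "bc"]})
h = np.zeros(d)                                   # 4.1) initialize h_0
for x_t in rng.normal(size=(10, m)):              # a sequence of 10 samples
    h = gru_step(x_t, h, P)
print(h)                                          # time-series feature of the sequence
```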
In a further refinement, the method is used to measure rotary cement kiln data.
The invention has the following beneficial effects:
1. The cross-frame fusion convolutional neural network provided by the invention can effectively extract the spatio-temporal sequence features in the foam video and, at the same time, solves the problem that the sampling rate of the video data is inconsistent with that of the industrial parameter data.
2. The attention-based dual-channel fused feature weighting soft measurement model provided by the invention can effectively weight each feature according to its influence on the detection of the key index, which enhances the detection effect and improves the detection accuracy.
3. Effective real-time detection of the key indexes of a complex industrial process provides accurate reference indicators for industrial process control, stabilizes the production flow, reduces production consumption and lowers the cost of manual monitoring.
Drawings
FIG. 1 is a structural diagram of the dilated (hole) convolution fusion neural network in the algorithm of the present invention;
FIG. 2 is a flow chart of the method;
FIG. 3 shows the soft measurement results for cement clinker quality according to the present invention;
FIG. 4 compares the results of the present invention with those of a GRU model.
Detailed Description
S1 (as shown in FIG. 1 and FIG. 2): Collect rotary cement kiln industrial parameter feature data such as the feed amount, coal feed amount, temperature of the raw meal entering the kiln, secondary air temperature and rotary kiln speed at a time interval T, and at the same time collect the kiln fire-watching video whose time span from the previous sampling moment to the current sampling moment is T. The 3-day strength of the cement at the corresponding time point, obtained through later laboratory testing, serves as the clinker quality index and as the supervision target of the soft measurement model.
S2: Perform dual-channel concurrent extraction on the fire-watching video and the industrial parameter data respectively to obtain feature data with time series. The fire-watching video data uses the cross-frame fusion convolutional neural network (DCFNN) provided by the invention to extract the spatio-temporal sequence features of the fire-watching video (steps S3.1 to S3.3), and the industrial parameter data uses a GRU network to extract the time-series features (step S4).
S3.1: For a video segment of duration T, sample one frame at a fixed interval such that the number of sampled images is 2^n, a power of two (an index-sampling sketch follows step S3.3).
S3.2: Apply one convolution to each of the 2^n pictures and pool once to obtain feature maps with a larger receptive field, then feed the feature maps into DepthConcat in pairs for fusion.
S3.3: Let n = n - 1; if n is not equal to 0, return to step S3.2; otherwise proceed to S5.
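A small Python sketch of the even sampling in step S3.1 (index computation only; the total frame count and n below are assumed example values):

```python
import numpy as np

def sample_indices(total_frames, n):
    """Indices of 2**n frames spread evenly over a clip of total_frames frames."""
    return np.linspace(0, total_frames - 1, 2 ** n).astype(int)

print(sample_indices(total_frames=250, n=3))   # 8 evenly spaced frame indices
```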
S4: Extract the time-series features of the industrial parameter feature data with the GRU network as follows:
S4.1: The GRU has a two-gate structure; for the GRU network at time t, the GRU first performs gate calculations on the input data x_t = (x_t^1, x_t^2, ..., x_t^m) and the historical output h_{t-1} corresponding to the previous-time data x_{t-1}.
S4.2: The reset gate determines how much of the output information of the previous time is written into the candidate output h̃_t; the smaller its value, the less data is written:
r_t = σ(W_r·x_t + U_r·h_{t-1} + b_r)
where W_r, U_r, b_r are trainable parameters and σ is the sigmoid activation function.
S4.3: The update gate determines how much state information of the previous time is brought into the current state; the larger its value, the more is brought in:
z_t = σ(W_z·x_t + U_z·h_{t-1} + b_z)
where W_z, U_z, b_z are trainable parameters and σ is the sigmoid activation function.
S4.4: From the reset gate r_t calculated in S4.2 and the output information h_{t-1} of the previous time, calculate the candidate output h̃_t:
h̃_t = tanh(W_c·x_t + U_c·(r_t ⊙ h_{t-1}) + b_c)
where W_c, U_c, b_c are trainable parameters and tanh is the activation function.
S4.5: From the update gate z_t calculated in S4.3, the output information h_{t-1} of the previous time and the candidate output h̃_t of the current time, calculate the output information of the current time:
h_t = z_t ⊙ h_{t-1} + (1 - z_t) ⊙ h̃_t
So the final output of the GRU network is a weighted sum of the previous output h_{t-1} and the current candidate output h̃_t. This adaptive recursive weighting automatically records the specific information of specific time points and at the same time has a filtering effect that stabilizes the output, so the time-series feature information in the data is effectively extracted.
S5: Fuse the features extracted by the DCFNN network and the GRU network into the feature vector h_t = (h_t^1, h_t^2, ..., h_t^m).
S6: For the data sample at time t, calculate the Attention distribution probability of each dimension of the corresponding dual-channel output feature h_t = (h_t^1, h_t^2, ..., h_t^m) with the Attention mechanism and weight the features accordingly, so as to reflect the influence of each feature dimension on the final prediction result. The steps are as follows:
S6.1: The output y_t at any time t can be expressed as y_t = F(C_t, y_1, y_2, ..., y_{t-1}), where C_t is the attention-weighted version of the dual-channel fused output feature h_t corresponding to the input x_t, calculated as follows:
C_t = Σ_{i=1}^{m} a_t^i · S(x_t)^i
where S(x_t)^i denotes the output value of the dual-channel network in the i-th dimension (i.e. the i-th feature) for the input x_t at time t, S(x_t)^i = h_t^i, obtained as in steps S3 to S5; a_t^i denotes the attention distribution coefficient, i.e. the attention weight of the input x_t in the i-th dimension, calculated as follows:
a_t^i = exp(e_t^i) / Σ_{j=1}^{m} exp(e_t^j)
where e_t^i is the attention score of h_t^i, calculated as follows:
e_t^i = V·tanh(W·h_t^i + U·y_{t-1} + b)
in which V, W and U are weight transformation matrices and b is a bias term; the finally formed output feature C_t serves as the input to the fully connected network. Here t denotes time t, i.e. the t-th sample, m denotes the feature dimension of each sample, and i is any value from 1 to m.
S7: Calculate the error between the output of the fully connected network and the real laboratory test result, and modify the model backwards through the error function so that the whole soft measurement model can accurately detect the quality of the cement clinker (a sketch of such a training loop is given after step S8).
S8: Use the soft measurement model trained above to perform real-time soft measurement of the clinker quality in the rotary-kiln cement production process. The prediction results of the clinker quality soft measurement (3-day cement strength) applied to the cement production process are shown in FIG. 3, and the error comparison with the clinker quality predictions of an ordinary GRU model is shown in FIG. 4.
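For illustration, the error calculation and backward modification of step S7 correspond to a standard supervised training loop. The Python (PyTorch) sketch below uses a simple stand-in model and random data in place of the dual-channel DCFNN-GRU-attention network and the real clinker strength labels; all names and sizes are assumptions, not the patent's code:

```python
import torch
import torch.nn as nn

# Stand-in for the dual-channel soft measurement model (DCFNN + GRU + attention + FC);
# a small fully connected network keeps the sketch self-contained.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))
criterion = nn.MSELoss()                      # error between prediction and lab value
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Illustrative data: fused dual-channel features and 3-day strength labels (random here)
features = torch.randn(64, 16)
labels = torch.randn(64, 1)

for epoch in range(20):
    optimizer.zero_grad()
    pred = model(features)                    # forward pass through the soft sensor
    loss = criterion(pred, labels)            # S7: error against laboratory results
    loss.backward()                           # S7: backward modification via the error
    optimizer.step()
print(float(loss))
```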
While embodiments of the invention have been disclosed above, the invention is not limited to the applications set forth in the specification and the embodiments; it is fully applicable to various fields suited to the invention, and further modifications may readily be made by those skilled in the art. The invention is therefore not limited to the specific details shown and described herein, provided these do not depart from the general concept defined by the appended claims and their equivalents.

Claims (5)

1. A soft measurement method for key indexes of a complex industrial process is characterized by comprising the following steps:
S1: Using a dual-channel network as the basis for building the soft measurement model, collect the industrial parameter data related to the key index to be measured together with the machine vision data; that is, collect industrial parameter feature data at a time interval T and collect the machine vision data corresponding to each interval T; obtain the key index values corresponding to the time points through testing, and use them as labels for training the soft measurement model;
S2: Perform dual-channel concurrent extraction on the video data and the industrial parameter feature data respectively to obtain feature data with time series; the video data stream uses the cross-frame fusion convolutional neural network to extract the spatio-temporal sequence features of the foam video, and the industrial parameter data uses a GRU network to extract the time-series features;
S3: Fuse the features extracted by the cross-frame fusion convolutional neural network and the features extracted by the GRU network into the feature vector h_t = (h_t^1, h_t^2, ..., h_t^m), where m is the feature dimension, t is the time, and h_t is the fused feature vector;
S4: For the data sample at time t, use an attention mechanism to calculate the attention distribution probability of each dimension of the output features of the dual-channel network, and weight the dual-channel output features accordingly so as to reflect the influence of each feature dimension on the final prediction result:
For the output y_t at any time t, construct the soft measurement model y_t = F(C_t, y_1, y_2, y_3, ..., y_{t-1}), where F(·) is a nonlinear mapping, meaning that the value y_t currently requiring soft measurement depends on the previous outputs y_1, y_2, y_3, ..., y_{t-1} and on the attention-weighted features at the current time; here C_t is the attention-weighted output corresponding to the input h_t, calculated as follows:
C_t = Σ_{i=1}^{m} a_t^i · S(x_t)^i
where S(x_t)^i denotes the output value of the dual-channel network in the i-th dimension for the input x_t at time t, i.e. S(x_t)^i = h_t^i, and a_t^i denotes the attention distribution coefficient, i.e. the attention weight of the input x_t in the i-th dimension, i ∈ {1, ..., m}, calculated as follows:
a_t^i = exp(e_t^i) / Σ_{j=1}^{m} exp(e_t^j)
where e_t^i is the attention score of h_t^i, calculated as follows:
e_t^i = V·tanh(W·h_t^i + U·y_{t-1} + b)
in which V, W and U are weight transformation matrices and b is a bias term; the finally formed output feature C_t serves as the input to the fully connected network;
S5: Calculate the error between the output of the fully connected network and the result from the real industrial process, and modify the model backwards through the error function to obtain the final soft measurement model;
S6: Use the final soft measurement model to measure the key indexes of the industrial process.
2. A method as defined in claim 1, wherein the machine vision data is video data.
3. The method for soft measurement of key indexes of a complex industrial process as claimed in claim 1, wherein the steps of performing foam video spatio-temporal sequence feature extraction on the video data stream with the cross-frame fusion convolutional neural network are as follows:
3.1) Sample the video data of a given sample into 2^n pictures;
3.2) Apply one convolution to each of the 2^n pictures, and pool the convolved feature maps to obtain feature maps with a larger receptive field; feed pairwise-adjacent feature maps (with a frame span of 1) into DepthConcat for fusion;
3.3) Let n = n - 1; if n is not equal to 0, return to step 3.2); otherwise flatten the final feature map into a vector as the spatio-temporal sequence feature data of the video data.
4. The method for soft measurement of key indexes of a complex industrial process as claimed in claim 2, wherein the steps of extracting the time-series features of the industrial parameter data with the GRU network are as follows:
4.1) Initialize the network parameters and the historical output h_0 at time 1 according to the GRU network model; for the GRU network at time t, the GRU first performs gate calculations on the input data x_t = (x_t^1, x_t^2, ..., x_t^m) and the historical output h_{t-1} corresponding to the previous-time data x_{t-1}; here x_t^m is the feature value of the data at time t in the m-th dimension;
4.2) The reset gate determines how much of the output information of the previous time is written into the candidate output h̃_t; the smaller its value, the less data is written; it is calculated as follows:
r_t = σ(W_r·x_t + U_r·h_{t-1} + b_r)
where W_r, U_r, b_r are trainable parameters and σ is the sigmoid activation function;
4.3) The update gate determines how much state information of the previous time is brought into the current state; the larger its value, the more state information of the previous time is brought in; it is calculated as follows:
z_t = σ(W_z·x_t + U_z·h_{t-1} + b_z)
where W_z, U_z, b_z are trainable parameters and σ is the sigmoid activation function;
4.4) From the reset gate r_t calculated in step 4.2) and the output information h_{t-1} of the previous time, calculate the candidate output h̃_t as follows:
h̃_t = tanh(W_c·x_t + U_c·(r_t ⊙ h_{t-1}) + b_c)
where W_c, U_c, b_c are trainable parameters and tanh is the activation function;
4.5) From the update gate z_t calculated in step 4.3), the output information h_{t-1} of the previous time and the candidate output h̃_t of the current time, calculate the output information h_t of the current time:
h_t = z_t ⊙ h_{t-1} + (1 - z_t) ⊙ h̃_t
so the final output of the GRU network is a weighted sum of the previous output h_{t-1} and the current candidate output h̃_t; the time-series feature information in the process parameter data is thereby effectively extracted.
5. The method for soft measurement of key indexes of a complex industrial process as claimed in claim 1, wherein the method is used for measuring a key index of a rotary cement kiln.
CN202010030254.3A 2020-01-13 2020-01-13 Complex industrial process key index soft measurement method Active CN111222798B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010030254.3A CN111222798B (en) 2020-01-13 2020-01-13 Complex industrial process key index soft measurement method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010030254.3A CN111222798B (en) 2020-01-13 2020-01-13 Complex industrial process key index soft measurement method

Publications (2)

Publication Number Publication Date
CN111222798A CN111222798A (en) 2020-06-02
CN111222798B true CN111222798B (en) 2023-04-07

Family

ID=70829404

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010030254.3A Active CN111222798B (en) 2020-01-13 2020-01-13 Complex industrial process key index soft measurement method

Country Status (1)

Country Link
CN (1) CN111222798B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111832910B (en) * 2020-06-24 2024-03-12 陕西法士特齿轮有限责任公司 Multi-index abnormal sound judgment threshold value determining method, system and computer equipment
CN112001527B (en) * 2020-07-29 2024-01-30 中国计量大学 Industrial production process target data prediction method of multi-feature fusion depth neural network
CN113277761A (en) * 2021-06-23 2021-08-20 湖南师范大学 Cement formula limestone proportion adjusting method based on model prediction framework

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108629144A (en) * 2018-06-11 2018-10-09 湖北交投智能检测股份有限公司 A kind of bridge health appraisal procedure
CN108985376A (en) * 2018-07-17 2018-12-11 东北大学 It is a kind of based on convolution-Recognition with Recurrent Neural Network rotary kiln sequence operating mode's switch method
CN110378044A (en) * 2019-07-23 2019-10-25 燕山大学 Multiple Time Scales convolutional neural networks flexible measurement method based on attention mechanism
EP3564862A1 (en) * 2018-05-03 2019-11-06 Siemens Aktiengesellschaft Determining influence of attributes in recurrent neural networks trained on therapy prediction
CN110597240A (en) * 2019-10-24 2019-12-20 福州大学 Hydroelectric generating set fault diagnosis method based on deep learning

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3564862A1 (en) * 2018-05-03 2019-11-06 Siemens Aktiengesellschaft Determining influence of attributes in recurrent neural networks trained on therapy prediction
CN108629144A (en) * 2018-06-11 2018-10-09 湖北交投智能检测股份有限公司 A kind of bridge health appraisal procedure
CN108985376A (en) * 2018-07-17 2018-12-11 东北大学 It is a kind of based on convolution-Recognition with Recurrent Neural Network rotary kiln sequence operating mode's switch method
CN110378044A (en) * 2019-07-23 2019-10-25 燕山大学 Multiple Time Scales convolutional neural networks flexible measurement method based on attention mechanism
CN110597240A (en) * 2019-10-24 2019-12-20 福州大学 Hydroelectric generating set fault diagnosis method based on deep learning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
耿志强 (Geng Zhiqiang) et al., "Research and application of a soft sensor model for complex chemical processes based on deep learning," Journal of Electronic Measurement and Instrumentation (电子测量与仪器学报), 2019, vol. 70, no. 2, full text. *

Also Published As

Publication number Publication date
CN111222798A (en) 2020-06-02

Similar Documents

Publication Publication Date Title
CN111222798B (en) Complex industrial process key index soft measurement method
CN110598736B (en) Power equipment infrared image fault positioning, identifying and predicting method
CN109765053B (en) Rolling bearing fault diagnosis method using convolutional neural network and kurtosis index
CN105678332B (en) Converter steelmaking end point judgment method and system based on flame image CNN recognition modeling
CN107909206B (en) PM2.5 prediction method based on deep structure recurrent neural network
CN112488235A (en) Elevator time sequence data abnormity diagnosis method based on deep learning
CN113723010B (en) Bridge damage early warning method based on LSTM temperature-displacement correlation model
CN108647643B (en) Packed tower flooding state online identification method based on deep learning
CN112257911B (en) TCN multivariate time sequence prediction method based on parallel space-time attention mechanism
CN108197743A (en) A kind of prediction model flexible measurement method based on deep learning
CN112685950B (en) Method, system and equipment for detecting abnormality of ocean time sequence observation data
CN111191726B (en) Fault classification method based on weak supervision learning multilayer perceptron
WO2021114320A1 (en) Wastewater treatment process fault monitoring method using oica-rnn fusion model
CN114169638A (en) Water quality prediction method and device
CN115673596B (en) Welding abnormity real-time diagnosis method based on Actor-Critic reinforcement learning model
CN110222825B (en) Cement product specific surface area prediction method and system
CN113110398B (en) Industrial process fault diagnosis method based on dynamic time consolidation and graph convolution network
CN114239397A (en) Soft measurement modeling method based on dynamic feature extraction and local weighted deep learning
CN111832479B (en) Video target detection method based on improved self-adaptive anchor point R-CNN
CN110045691B (en) Multi-task processing fault monitoring method for multi-source heterogeneous big data
Jiang et al. A new monitoring method for the blocking time of the taphole of blast furnace using molten iron flow images
CN115169660A (en) Cutter wear prediction method based on multi-scale space-time feature fusion neural network
CN115274009A (en) Polyester melt quality online prediction method based on semi-supervised GRU regression model
CN113988210A (en) Method and device for restoring distorted data of structure monitoring sensor network and storage medium
CN117491357B (en) Quality monitoring method and system for paint

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant