CN111222798A - Soft measurement method for key indexes of complex industrial process - Google Patents

Soft measurement method for key indexes of complex industrial process Download PDF

Info

Publication number
CN111222798A
Authority
CN
China
Prior art keywords
data
time
network
output
soft measurement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010030254.3A
Other languages
Chinese (zh)
Other versions
CN111222798B (en)
Inventor
刘金平
蒋楚蓉
何捷舟
史雅琴
赵爽爽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Normal University
Original Assignee
Hunan Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Normal University filed Critical Hunan Normal University
Priority to CN202010030254.3A priority Critical patent/CN111222798B/en
Publication of CN111222798A publication Critical patent/CN111222798A/en
Application granted granted Critical
Publication of CN111222798B publication Critical patent/CN111222798B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06393Score-carding, benchmarking or key performance indicator [KPI] analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/048Activation functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06395Quality analysis or management

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • General Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Quality & Reliability (AREA)
  • Operations Research (AREA)
  • Marketing (AREA)
  • Game Theory and Decision Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a soft measurement method for key indexes of a complex industrial process, which comprises the following steps: 1. Collect visual data and process parameter data, and obtain the corresponding values of the key indexes through later laboratory assay. 2. Construct a spatio-temporal sequence feature extraction network model on the video data using the cross-frame fusion convolutional neural network (DCFNN) proposed by the invention. 3. Construct a time-series feature extraction model on the industrial parameter data using a GRU network. 4. Feed the collected video data and process parameters into the two channels to concurrently extract time-series feature data, and perform soft measurement of the key indexes with a fully connected network equipped with an attention mechanism; train the model in reverse (back-propagation) against the true key index values obtained by assay. 5. Use the trained soft measurement model to estimate, online and in real time, the key indexes that are difficult to monitor.

Description

Soft measurement method for key indexes of complex industrial process
Technical Field
The invention belongs to the field of online monitoring of key indexes of a complex industrial process, and particularly relates to a soft measurement method of key indexes of the complex industrial process based on machine vision and engineering parameter dual-channel fusion.
Background
In modern industrial processes, real-time monitoring of key indexes is of great significance for ensuring process safety and production quality. However, in many cases the key indexes of complex industrial processes are difficult to detect online, owing to problems such as long process flows, unclear internal mechanisms and numerous influencing factors.
In recent years, soft measurement technology has been widely applied to online monitoring of key indexes in complex industrial processes thanks to its fast response, low maintenance cost and accurate prediction results. Machine vision, being rapid, real-time and non-contact, has likewise been widely used in industrial process monitoring. Many experts and scholars at home and abroad have therefore carried out a series of studies on machine-vision-based soft measurement of key indexes of complex industrial processes, in which some industrial characteristics are acquired intuitively through machine vision and a soft measurement model of the key indexes is established in combination with industrial process parameters to realize online monitoring.
Traditional machine-vision-based soft measurement models usually rely on image processing: features of the image such as color and contour are extracted on the basis of human experience and used as the characteristic parameters for building the soft measurement model. In recent years, with the rapid development of machine learning, many experts and scholars have applied machine learning to the construction of machine-vision-based soft measurement models. In particular, deep convolutional neural networks can adaptively extract effective image features and avoid the subjective defects of manual feature extraction, and have therefore achieved good application results.
Complex industrial processes often contain time-varying sub-processes whose image data and industrial parameters exhibit typical time-series characteristics; for example, the chemical reactions of substances in a chemical process need a certain amount of time before the final product is obtained. Conventional learning methods for time series, such as LSTM, GRU and RNN, can only extract time-series features from traditional industrial process parameter variables (such as temperature and flow rate), and the sampling rate of the industrial parameters is not consistent with the sampling rate of the machine vision images. Effectively extracting the time-series features in the video data and fusing them with the industrial process parameter features therefore provides the important information required for monitoring, optimizing and controlling the complex industrial process, and in turn helps achieve energy conservation and benefit maximization of the industrial process.
As the above analysis shows, complex industrial processes are highly complex and affected by many factors; conventional manual monitoring cannot accurately track product quality, and the key indexes are difficult to detect online, which leads to problems such as low yield of industrial products, low utilization of raw materials and high resource consumption. Machine vision, as the most direct source of indication, can effectively extract feature information related to the key indexes. The invention provides a soft measurement method for key indexes of complex industrial processes based on the dual-channel fusion of machine vision and engineering parameters; the method is applied to the prediction of clinker quality in a rotary cement kiln, and the results agree with the actual situation. The method facilitates online prediction of key indexes of complex industrial processes and thereby guides their monitoring and optimization.
Explanation of terms:
DepthConcat: concatenation of two or more feature maps of the same spatial size (rows and columns) along the channel dimension (a code sketch follows this list).
DCFNN network: the cross-frame fusion convolutional neural network.
GRU network: a highly effective variant of the LSTM network, simpler than the LSTM while remaining effective.
Attention mechanism: a mechanism that rapidly screens high-value information out of a large amount of information. It is mainly used to address the difficulty of obtaining a reasonable final vector representation when the input sequence of an LSTM/RNN model is long; the intermediate outputs of the LSTM are retained, a new model learns weights over them and relates them to the output, thereby achieving information screening.
Fully connected network: a fully connected neural network.
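For illustration only, the following minimal PyTorch sketch shows what the DepthConcat operation above amounts to; the function name depth_concat and the tensor shapes are assumptions of this sketch, not part of the patent.

```python
import torch

def depth_concat(feature_maps):
    """Concatenate feature maps of identical spatial size along the channel dimension."""
    # All maps must share the same (batch, height, width); only the channel counts may differ.
    return torch.cat(feature_maps, dim=1)

# Example: two 8x32x28x28 feature maps fuse into one 8x64x28x28 map.
a = torch.randn(8, 32, 28, 28)
b = torch.randn(8, 32, 28, 28)
fused = depth_concat([a, b])
print(fused.shape)  # torch.Size([8, 64, 28, 28])
```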
Disclosure of Invention
The invention aims to provide a soft measurement method for key indexes of a complex industrial process. By combining qualitative trend analysis with process state prediction, the invention establishes a fault early-warning method based on the running-state trend of the industrial process, which can accurately and intuitively reflect the running state of the industrial process.
The content of the invention comprises:
a soft measurement method for key indexes of a complex industrial process is characterized by comprising the following steps:
S1: collect the industrial parameter data related to the key indexes to be measured and the machine vision data, processed by a two-channel network, as the basis for establishing the corresponding soft measurement model; that is, collect the industrial parameter characteristic data at a time interval T and collect the machine vision data corresponding to the time interval T; obtain, by assay, the key index corresponding to each time point as the label for training the soft measurement model;
S2: perform dual-channel concurrent extraction on the video data and the industrial parameter characteristic data respectively to extract feature data with time series; the video data stream uses the cross-frame fusion convolutional neural network to extract the spatio-temporal sequence features of the foam video, and the GRU network extracts the time-series features of the industrial parameter data;
S3, fuse the features extracted by the cross-frame fusion convolutional neural network and the features extracted by the GRU network into the feature h_t = (h_t^1, h_t^2, ..., h_t^m), where m denotes the feature dimension, t denotes the time, and h_t is the fused feature vector;
S4, for the data sample at time t, compute, with an attention mechanism, the attention distribution probability of each feature dimension of the output feature corresponding to the two-channel network, and weight the output feature of the two-channel network accordingly so as to adjust the influence of each feature dimension on the final prediction result:
For the output y_t at any time t, construct a soft measurement model denoted y_t = F(C_t, y_1, y_2, y_3, ..., y_{t-1}), where F(·) is a nonlinear mapping, expressing that the value y_t currently to be soft-measured is related to the previous outputs y_1, y_2, y_3, ..., y_{t-1} and to the attention-weighted feature at the current time; C_t is obtained from the input h_t under its attention assignment probability distribution, computed as follows:
C_t = Σ_{i=1}^{m} a_t^i · S(x_t)^i
where S(x_t)^i denotes the output value of the two-channel network in the i-th dimension for the input x_t at time t, i.e. S(x_t)^i = h_t^i; a_t^i is the attention distribution coefficient, i.e. the attention weight of the input x_t in the i-th dimension, i ∈ (1, …, m), computed as follows:
a_t^i = exp(e_t^i) / Σ_{j=1}^{m} exp(e_t^j)
where e_t^i is the attention score of h_t^i, computed as follows:
e_t^i = V · tanh(W · h_t^i + U · y_{t-1} + b)
where V, W and U are weight transformation matrices and b is a bias term; the finally formed output feature C_t serves as the input to the fully connected network (a code sketch of this weighting follows below);
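For illustration only, a minimal PyTorch sketch of this attention weighting is given below. The score form e_t^i = V·tanh(W·h_t^i + U·y_{t-1} + b), the element-wise weighted output (a_t^i · h_t^i rather than a summed context value) and the names AttentionWeighting and hidden are assumptions of this sketch, not the patent's reference implementation.

```python
import torch
import torch.nn as nn

class AttentionWeighting(nn.Module):
    """Weights each dimension of the fused feature h_t by an attention coefficient a_t^i."""
    def __init__(self, hidden: int = 16):
        super().__init__()
        self.W = nn.Linear(1, hidden, bias=False)  # acts on the scalar component h_t^i
        self.U = nn.Linear(1, hidden, bias=False)  # acts on the previous output y_{t-1}
        self.b = nn.Parameter(torch.zeros(hidden))
        self.V = nn.Linear(hidden, 1, bias=False)

    def forward(self, h_t: torch.Tensor, y_prev: torch.Tensor) -> torch.Tensor:
        # h_t: (batch, m) fused two-channel feature; y_prev: (batch, 1) previous soft-measured value
        h = h_t.unsqueeze(-1)                       # (batch, m, 1)
        y = y_prev.unsqueeze(1)                     # (batch, 1, 1)
        e = self.V(torch.tanh(self.W(h) + self.U(y) + self.b)).squeeze(-1)  # scores e_t^i, (batch, m)
        a = torch.softmax(e, dim=-1)                # attention coefficients a_t^i (softmax over i)
        return a * h_t                              # weighted feature, fed to the fully connected network

# Usage: c_t = AttentionWeighting()(h_t, y_prev) with h_t of shape (batch, m) and y_prev of shape (batch, 1).
```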
S5, compute the error between the output of the fully connected network and the result from the real industrial process, and modify the model in reverse (back-propagation) through the error function to obtain the final soft measurement model;
S6, measure the key indexes of the industrial process with the final soft measurement model.
In a further improvement, the machine vision data is video data.
In a further improvement, the steps of extracting the spatio-temporal sequence features of the foam video from the video data stream with the cross-frame fusion convolutional neural network are as follows (an illustrative code sketch follows these steps):
3.1) sample the video data of a given sample into 2^n pictures;
3.2) convolve each of the 2^n pictures and pool the convolved feature maps to obtain feature maps with a larger receptive field; feed adjacent feature maps, taken two at a time with a span of 1, into DepthConcat for fusion;
3.3) let n = n - 1; if n is not equal to 0, return to step 3.2); otherwise, flatten the last feature map into a vector as the spatio-temporal sequence feature data of the video data.
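The following PyTorch sketch illustrates one possible reading of steps 3.1)-3.3): each of the 2^n frames is convolved and pooled, adjacent feature maps are fused with DepthConcat, and the procedure repeats until a single map remains, which is flattened into the spatio-temporal feature vector. The layer sizes, the pairwise grouping of adjacent maps and the class name CrossFrameFusionSketch are assumptions of this sketch, not the patent's DCFNN definition.

```python
import torch
import torch.nn as nn

class CrossFrameFusionSketch(nn.Module):
    """Illustrative cross-frame fusion: convolve/pool each frame, then repeatedly fuse adjacent maps."""
    def __init__(self, in_ch: int = 3, ch: int = 16):
        super().__init__()
        self.stem = nn.Sequential(nn.Conv2d(in_ch, ch, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        # After each DepthConcat the channel count doubles; a 1x1 convolution restores it.
        self.fuse = nn.Sequential(nn.Conv2d(2 * ch, ch, 1), nn.ReLU(), nn.MaxPool2d(2))

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, 2**n, in_ch, H, W), a video clip sampled to a power-of-two frame count
        maps = [self.stem(frames[:, k]) for k in range(frames.shape[1])]
        while len(maps) > 1:                      # step 3.3): repeat until a single map remains
            maps = [self.fuse(torch.cat([maps[k], maps[k + 1]], dim=1))  # DepthConcat + conv/pool
                    for k in range(0, len(maps), 2)]
        return maps[0].flatten(1)                 # flatten the last map into the feature vector

# Usage: feats = CrossFrameFusionSketch()(torch.randn(2, 8, 3, 64, 64))  # 2^3 = 8 frames per clip
```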
In a further improvement, the steps of extracting the time-series features of the industrial parameter data with the GRU network are as follows (a minimal GRU-cell sketch follows these steps):
4.1) initialize the network parameters and the historical output h_0 at time 1 according to the GRU network model; for the GRU network at time t, the GRU first performs the gate calculations on the input data x_t = (x_t^1, x_t^2, ..., x_t^m) and on the historical output h_{t-1} corresponding to the data x_{t-1} of the previous time, where x_t^m is the value of the data at time t in the m-th dimension;
4.2) the reset gate computes how much of the output information of the previous time is written into the candidate output h̃_t; the smaller its value, the less data is written. The calculation formula is:
r_t = σ(W_r x_t + U_r h_{t-1} + b_r)
where W_r, U_r, b_r are trainable parameters and σ is the sigmoid activation function;
4.3) the update gate computes how much state information of the previous time is brought into the current state; the larger its value, the more information of the previous time is brought in. The calculation formula is:
z_t = σ(W_z x_t + U_z h_{t-1} + b_z)
where W_z, U_z, b_z are trainable parameters and σ is the sigmoid activation function;
4.4) from the reset gate r_t computed in step 4.2) and the output information h_{t-1} of the previous time, compute the candidate output h̃_t as:
h̃_t = tanh(W_c x_t + U_c (r_t ⊙ h_{t-1}) + b_c)
where W_c, U_c, b_c are trainable parameters, tanh is the activation function and ⊙ denotes element-wise multiplication;
4.5) from the update gate z_t computed in step 4.3), the output information h_{t-1} of the previous time and the candidate output h̃_t of the current time, compute the output information h_t of the current time:
h_t = (1 - z_t) ⊙ h_{t-1} + z_t ⊙ h̃_t
Thus the final output of the GRU network is a weighted sum of the previous output h_{t-1} and the current candidate output h̃_t, so that the time-series feature information in the process parameter data is effectively extracted.
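A minimal GRU cell following equations 4.2)-4.5) above, written in plain PyTorch for illustration (in practice torch.nn.GRU implements the same recurrence); the dimensions and the class name GRUCellSketch are assumptions of this sketch.

```python
import torch
import torch.nn as nn

class GRUCellSketch(nn.Module):
    """One GRU step: reset gate r_t, update gate z_t, candidate h~_t, output h_t."""
    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.Wr, self.Ur = nn.Linear(in_dim, hid_dim), nn.Linear(hid_dim, hid_dim, bias=False)
        self.Wz, self.Uz = nn.Linear(in_dim, hid_dim), nn.Linear(hid_dim, hid_dim, bias=False)
        self.Wc, self.Uc = nn.Linear(in_dim, hid_dim), nn.Linear(hid_dim, hid_dim, bias=False)

    def forward(self, x_t: torch.Tensor, h_prev: torch.Tensor) -> torch.Tensor:
        r_t = torch.sigmoid(self.Wr(x_t) + self.Ur(h_prev))          # 4.2) reset gate
        z_t = torch.sigmoid(self.Wz(x_t) + self.Uz(h_prev))          # 4.3) update gate
        h_cand = torch.tanh(self.Wc(x_t) + self.Uc(r_t * h_prev))    # 4.4) candidate output
        return (1 - z_t) * h_prev + z_t * h_cand                     # 4.5) weighted sum

# Usage: iterate over the parameter sequence, h = GRUCellSketch(m, 32)(x_t, h), starting from h_0 = 0.
```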
In a further refinement, the method is used to measure rotary cement kiln data.
The invention has the beneficial effects that:
1. The cross-frame fusion convolutional neural network provided by the invention can effectively extract the spatio-temporal features in the foam video and, at the same time, can solve the problem that the sampling rate of the video data is inconsistent with that of the industrial parameter data.
2. The attention-mechanism-based dual-channel fusion feature weighting soft measurement model provided by the invention can weight each feature according to its influence on the detection of the key indexes, thereby enhancing the detection effect and improving the detection precision.
3. Effective real-time detection of key indexes of a complex industrial process can provide accurate indicators for industrial process control, stabilize the production flow, reduce production consumption and reduce the cost of manual monitoring.
Drawings
FIG. 1 is a diagram of a hole convolution fusion neural network structure in the algorithm of the present invention;
FIG. 2 is a flow chart of a method;
FIG. 3 is a graph showing the soft measurement results of the cement clinker according to the present invention;
FIG. 4 is a graph comparing the results of the present invention with GRUs.
Detailed Description
As shown in FIG. 1 and FIG. 2, S1: collect, at a time interval T, the industrial parameter characteristic data of the rotary cement kiln, such as the feed amount, the coal feeding amount, the temperature of the raw material entering the kiln, the secondary air temperature and the rotating speed of the rotary kiln, and simultaneously collect the fire-watching video of the rotary kiln spanning the interval T from the previous sampling moment to the current sampling moment (a data-pairing sketch is given below). The 3-day strength of the cement at the corresponding time point, obtained by later laboratory testing, is used as the clinker quality index and as the supervision target of the soft measurement model.
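To make the dual-rate sampling in S1 concrete, the sketch below pairs each parameter sample taken at interval T with the video clip spanning the preceding interval and with its laboratory label; pandas is used for illustration, and the function name pair_samples, the column names and the clip-indexing convention are assumptions of this sketch, not the patent's data format.

```python
import pandas as pd

def pair_samples(params: pd.DataFrame, clips: dict, labels: pd.Series):
    """Pair each parameter row sampled at interval T with the video clip covering (t-T, t] and its 3-day strength label."""
    samples = []
    for t, row in params.iterrows():            # index t = sampling timestamp
        clip = clips.get(t)                     # fire-watching video frames recorded over the preceding interval T
        label = labels.get(t)                   # laboratory-tested 3-day cement strength
        if clip is not None and label is not None:
            samples.append((row.to_numpy(), clip, label))
    return samples

# params columns might be e.g. feed amount, coal feed, kiln-inlet raw-material temperature,
# secondary air temperature and rotary-kiln speed, one row per sampling interval T.
```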
S2: perform dual-channel concurrent extraction on the fire-watching video and the industrial parameter data respectively to obtain feature data with time series. The fire-watching video data uses the cross-frame fusion neural network (DCFNN) proposed by the invention to extract the spatio-temporal sequence features of the video, with the specific steps S3.1-S3.3; the industrial parameter data uses the GRU network to extract the time-series features, with the specific step S4.
S3.1: for a video image with the time length T, sampling a frame of image at intervals of T, and ensuring that the number of the sampled images is 2 to the power of n.
S3.2: and performing primary convolution on the 2^ n pictures, and performing primary pooling to obtain a feature map with a larger receptive field. And (5) sending the feature maps into DepthConcat in pairs for fusion.
S3.3: let n be n-1, go back to step S3.2 if n is not equal to 0, otherwise go to S5.
S4: the method for extracting the time series characteristics of the industrial parameter characteristic data by adopting the GRU network comprises the following steps:
s4.1: the GRU has a double gate structure, and for a GRU network at time t, the GRU first inputs data at time t
Figure BDA0002364053430000051
And the last time data xt-1Corresponding historical output ht-1A gate calculation is performed.
S4.2: calculating how much output information of the previous time is written into the candidate output by the reset gate
Figure BDA0002364053430000052
In the above, the smaller the value, the less the written data, the calculation formula is as follows:
rt=σ(WrXt+Urht-1+br)
wherein Wr,Ur,brAre trainable parameters. Sigma is sigmoid activation function.
S4.3: the method comprises the steps of calculating how much state information at the previous moment is brought into the current state through an updating gate, wherein the larger the value of the state information is, the more the state information is brought into the previous moment, and the calculation formula is as follows
zt=σ(WzXt+Uzht-1+bz)
Wherein Wz,Uz,bzAre trainable parameters. Sigma is sigmoid activation function.
S4.4: reproduced door r calculated according to S3.1tAnd output information h of the previous momentt-1Computing candidate outputs
Figure BDA0002364053430000055
The calculation steps are as follows:
Figure BDA0002364053430000054
wherein Wc,Uc,bcFor trainable parameters, tanh is an activation function.
S4.5: updated gate z calculated from S3.2tOutputting information h at the previous momentt-1And current time candidate output
Figure BDA00023640534300000613
Calculating output information of the current moment:
Figure BDA0002364053430000062
so the final output of the GRU network is the last output ht-1And current candidate output
Figure BDA0002364053430000063
Is calculated as a weighted sum of. The self-adaptive recursive weighting can automatically record the specific information of a specific time point, has a filtering effect and stably outputs, thereby effectively extracting the time sequence characteristic information in the data.
S5: connecting DCFNN network with GRU networkThe extracted features are fused into features
Figure BDA0002364053430000064
S6: for the data sample at the T moment, the corresponding output characteristics of the two-channel network
Figure BDA0002364053430000065
And calculating the Attention distribution probability of each dimension characteristic by adopting an Attention mechanism, and weighting the Attention distribution probability to improve the influence of each dimension characteristic on a final prediction result. The method comprises the following steps:
s6.1 output y for any time ttCan be represented as yt=F(Ct,y1,y2,…,yt-1) In which C istCorresponding to the input xtOutput characteristic h after two-channel fusiontThe attention assignment probability distribution of (2), which is calculated as follows:
Figure BDA0002364053430000066
wherein, S (x)i) Representing input x at time ttThe output value through the two-channel network in the ith dimension (i.e. the ith characteristic) is
Figure BDA0002364053430000067
The calculation is as shown in steps S3-S5,
Figure BDA0002364053430000068
representing the attention-distribution coefficient, representing the input x at time ttThe attention weight in the ith dimension is calculated as follows:
Figure BDA0002364053430000069
wherein
Figure BDA00023640534300000610
Is composed of
Figure BDA00023640534300000611
Attention score, which is calculated as follows:
Figure BDA00023640534300000612
wherein V, W and U represent weight conversion matrix, b is bias term, and finally formed output characteristic CtAs an input to a fully connected network; the above t represents the time t, i.e., the tth sample data. m denotes a characteristic dimension representing each sample, and i is an arbitrary value from 1 to m.
S7: and error calculation is carried out according to the output result of the full-connection network and the real laboratory test result, and reverse modification is carried out through an error function, so that the whole soft measurement model can accurately detect the quality of the cement clinker.
S8: the soft measurement model trained in the process is adopted to carry out real-time soft measurement on the clinker quality in the rotary kiln cement production process, the prediction result of the soft measurement of the clinker quality (cement strength after 3 days) applied to the cement production process is shown in figure 3, the error comparison is carried out on the prediction result of the clinker quality of the common GRU model, and the error result is shown in figure 4.
While embodiments of the invention have been disclosed above, the invention is not limited to the applications set forth in the specification and the embodiments; it is fully applicable to various fields suitable for the invention, and further modifications may readily be made by those skilled in the art. The invention is therefore not limited to the specific details shown and described herein, without departing from the general concept defined by the appended claims and their equivalents.

Claims (5)

1. A soft measurement method for key indexes of a complex industrial process, characterized by comprising the following steps:
S1: collecting the industrial parameter data related to the key indexes to be measured and the machine vision data, processed by a two-channel network, as the basis for establishing the corresponding soft measurement model, namely collecting the industrial parameter characteristic data at a time interval T and collecting the machine vision data corresponding to the time interval T; obtaining, by assay, the key index corresponding to each time point as the label for training the soft measurement model;
S2: performing dual-channel concurrent extraction on the video data and the industrial parameter characteristic data respectively to extract feature data with time series; wherein the video data stream uses the cross-frame fusion convolutional neural network to extract the spatio-temporal sequence features of the foam video, and the GRU network extracts the time-series features of the industrial parameter data;
S3, fusing the features extracted by the cross-frame fusion convolutional neural network and the features extracted by the GRU network into the feature h_t = (h_t^1, h_t^2, ..., h_t^m), where m denotes the feature dimension, t denotes the time, and h_t is the fused feature vector;
S4, for the data sample at time t, computing, with an attention mechanism, the attention distribution probability of each feature dimension of the output feature corresponding to the two-channel network, and weighting the output feature of the two-channel network accordingly so as to adjust the influence of each feature dimension on the final prediction result:
for the output y_t at any time t, constructing a soft measurement model denoted y_t = F(C_t, y_1, y_2, y_3, ..., y_{t-1}), where F(·) is a nonlinear mapping, expressing that the value y_t currently to be soft-measured is related to the previous outputs y_1, y_2, y_3, ..., y_{t-1} and to the attention-weighted feature at the current time; C_t is obtained from the input h_t under its attention assignment probability distribution, computed as follows:
C_t = Σ_{i=1}^{m} a_t^i · S(x_t)^i
where S(x_t)^i denotes the output value of the two-channel network in the i-th dimension for the input x_t at time t, i.e. S(x_t)^i = h_t^i; a_t^i is the attention distribution coefficient, i.e. the attention weight of the input x_t in the i-th dimension, i ∈ (1 … m), computed as follows:
a_t^i = exp(e_t^i) / Σ_{j=1}^{m} exp(e_t^j)
where e_t^i is the attention score of h_t^i, computed as follows:
e_t^i = V · tanh(W · h_t^i + U · y_{t-1} + b)
where V, W and U are weight transformation matrices and b is a bias term; the finally formed output feature C_t serves as the input to the fully connected network;
S5, computing the error between the output of the fully connected network and the result from the real industrial process, and modifying the model in reverse (back-propagation) through the error function to obtain the final soft measurement model;
S6, measuring the key indexes of the industrial process with the final soft measurement model.
2. A method as claimed in claim 1, wherein the machine vision data is video data.
3. The method for soft measurement of key indexes of a complex industrial process as claimed in claim 1, wherein the steps of extracting the spatio-temporal sequence features of the foam video from the video data stream with the cross-frame fusion convolutional neural network are as follows:
3.1) sampling the video data of a given sample into 2^n pictures;
3.2) convolving each of the 2^n pictures and pooling the convolved feature maps to obtain feature maps with a larger receptive field; feeding adjacent feature maps, taken two at a time with a span of 1, into DepthConcat for fusion;
3.3) letting n = n - 1; if n is not equal to 0, returning to step 3.2); otherwise, flattening the last feature map into a vector as the spatio-temporal sequence feature data of the video data.
4. The method for soft measurement of key indexes of a complex industrial process as claimed in claim 2, wherein the steps of extracting the time-series features of the industrial parameter data with the GRU network are as follows:
4.1) initializing the network parameters and the historical output h_0 at time 1 according to the GRU network model; for the GRU network at time t, the GRU first performs the gate calculations on the input data x_t = (x_t^1, x_t^2, ..., x_t^m) and on the historical output h_{t-1} corresponding to the data x_{t-1} of the previous time, where x_t^m is the value of the data at time t in the m-th dimension;
4.2) the reset gate computes how much of the output information of the previous time is written into the candidate output h̃_t; the smaller its value, the less data is written; the calculation formula is:
r_t = σ(W_r x_t + U_r h_{t-1} + b_r)
where W_r, U_r, b_r are trainable parameters and σ is the sigmoid activation function;
4.3) the update gate computes how much state information of the previous time is brought into the current state; the larger its value, the more information of the previous time is brought in; the calculation formula is:
z_t = σ(W_z x_t + U_z h_{t-1} + b_z)
where W_z, U_z, b_z are trainable parameters and σ is the sigmoid activation function;
4.4) from the reset gate r_t computed in step 4.2) and the output information h_{t-1} of the previous time, computing the candidate output h̃_t as:
h̃_t = tanh(W_c x_t + U_c (r_t ⊙ h_{t-1}) + b_c)
where W_c, U_c, b_c are trainable parameters, tanh is the activation function and ⊙ denotes element-wise multiplication;
4.5) from the update gate z_t computed in step 4.3), the output information h_{t-1} of the previous time and the candidate output h̃_t of the current time, computing the output information h_t of the current time:
h_t = (1 - z_t) ⊙ h_{t-1} + z_t ⊙ h̃_t
so that the final output of the GRU network is a weighted sum of the previous output h_{t-1} and the current candidate output h̃_t; thereby the time-series feature information in the process parameter data is effectively extracted.
5. A method for soft measurement of a critical indicator of a complex industrial process as claimed in claim 1, wherein the method is used for measuring a critical indicator of a rotary cement kiln.
CN202010030254.3A 2020-01-13 2020-01-13 Complex industrial process key index soft measurement method Active CN111222798B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010030254.3A CN111222798B (en) 2020-01-13 2020-01-13 Complex industrial process key index soft measurement method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010030254.3A CN111222798B (en) 2020-01-13 2020-01-13 Complex industrial process key index soft measurement method

Publications (2)

Publication Number Publication Date
CN111222798A true CN111222798A (en) 2020-06-02
CN111222798B CN111222798B (en) 2023-04-07

Family

ID=70829404

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010030254.3A Active CN111222798B (en) 2020-01-13 2020-01-13 Complex industrial process key index soft measurement method

Country Status (1)

Country Link
CN (1) CN111222798B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111832910A (en) * 2020-06-24 2020-10-27 陕西法士特齿轮有限责任公司 Method and system for determining multi-index abnormal sound judgment threshold value and computer equipment
CN112001527A (en) * 2020-07-29 2020-11-27 中国计量大学 Industrial production process target data prediction method of multi-feature fusion deep neural network
CN113277761A (en) * 2021-06-23 2021-08-20 湖南师范大学 Cement formula limestone proportion adjusting method based on model prediction framework

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108629144A (en) * 2018-06-11 2018-10-09 湖北交投智能检测股份有限公司 A kind of bridge health appraisal procedure
CN108985376A (en) * 2018-07-17 2018-12-11 东北大学 It is a kind of based on convolution-Recognition with Recurrent Neural Network rotary kiln sequence operating mode's switch method
CN110378044A (en) * 2019-07-23 2019-10-25 燕山大学 Multiple Time Scales convolutional neural networks flexible measurement method based on attention mechanism
EP3564862A1 (en) * 2018-05-03 2019-11-06 Siemens Aktiengesellschaft Determining influence of attributes in recurrent neural networks trained on therapy prediction
CN110597240A (en) * 2019-10-24 2019-12-20 福州大学 Hydroelectric generating set fault diagnosis method based on deep learning

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3564862A1 (en) * 2018-05-03 2019-11-06 Siemens Aktiengesellschaft Determining influence of attributes in recurrent neural networks trained on therapy prediction
CN108629144A (en) * 2018-06-11 2018-10-09 湖北交投智能检测股份有限公司 A kind of bridge health appraisal procedure
CN108985376A (en) * 2018-07-17 2018-12-11 东北大学 It is a kind of based on convolution-Recognition with Recurrent Neural Network rotary kiln sequence operating mode's switch method
CN110378044A (en) * 2019-07-23 2019-10-25 燕山大学 Multiple Time Scales convolutional neural networks flexible measurement method based on attention mechanism
CN110597240A (en) * 2019-10-24 2019-12-20 福州大学 Hydroelectric generating set fault diagnosis method based on deep learning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
耿志强 et al.: "Research and application of soft sensor models for complex chemical processes based on deep learning" (基于深度学习的复杂化工过程软测量模型研究与应用) *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111832910A (en) * 2020-06-24 2020-10-27 陕西法士特齿轮有限责任公司 Method and system for determining multi-index abnormal sound judgment threshold value and computer equipment
CN111832910B (en) * 2020-06-24 2024-03-12 陕西法士特齿轮有限责任公司 Multi-index abnormal sound judgment threshold value determining method, system and computer equipment
CN112001527A (en) * 2020-07-29 2020-11-27 中国计量大学 Industrial production process target data prediction method of multi-feature fusion deep neural network
CN112001527B (en) * 2020-07-29 2024-01-30 中国计量大学 Industrial production process target data prediction method of multi-feature fusion depth neural network
CN113277761A (en) * 2021-06-23 2021-08-20 湖南师范大学 Cement formula limestone proportion adjusting method based on model prediction framework

Also Published As

Publication number Publication date
CN111222798B (en) 2023-04-07

Similar Documents

Publication Publication Date Title
CN110598736B (en) Power equipment infrared image fault positioning, identifying and predicting method
CN111222798B (en) Complex industrial process key index soft measurement method
CN105678332B (en) Converter steelmaking end point judgment method and system based on flame image CNN recognition modeling
CN107909206B (en) PM2.5 prediction method based on deep structure recurrent neural network
CN112488235A (en) Elevator time sequence data abnormity diagnosis method based on deep learning
CN113723010B (en) Bridge damage early warning method based on LSTM temperature-displacement correlation model
CN108647643B (en) Packed tower flooding state online identification method based on deep learning
CN110633750A (en) Electric valve fault detection method based on LSTM model
CN112257911B (en) TCN multivariate time sequence prediction method based on parallel space-time attention mechanism
CN108197743A (en) A kind of prediction model flexible measurement method based on deep learning
CN112685950B (en) Method, system and equipment for detecting abnormality of ocean time sequence observation data
CN115673596B (en) Welding abnormity real-time diagnosis method based on Actor-Critic reinforcement learning model
CN113110398B (en) Industrial process fault diagnosis method based on dynamic time consolidation and graph convolution network
WO2021114320A1 (en) Wastewater treatment process fault monitoring method using oica-rnn fusion model
CN114282443A (en) Residual service life prediction method based on MLP-LSTM supervised joint model
CN110222825B (en) Cement product specific surface area prediction method and system
CN114239397A (en) Soft measurement modeling method based on dynamic feature extraction and local weighted deep learning
CN113988210A (en) Method and device for restoring distorted data of structure monitoring sensor network and storage medium
CN110045691B (en) Multi-task processing fault monitoring method for multi-source heterogeneous big data
CN113203953A (en) Lithium battery residual service life prediction method based on improved extreme learning machine
CN116739304A (en) Production error monitoring system and method based on product history data
CN113033845B (en) Construction method and device for power transmission resource co-construction and sharing
Jiang et al. A new monitoring method for the blocking time of the taphole of blast furnace using molten iron flow images
CN115169660A (en) Cutter wear prediction method based on multi-scale space-time feature fusion neural network
CN112423031A (en) KPI monitoring method, device and system based on IPTV

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant