CN112418324B - Cross-modal data fusion method for electrical equipment state perception - Google Patents
Cross-modal data fusion method for electrical equipment state perception

- Publication number: CN112418324B (application CN202011334424.3A)
- Authority: CN (China)
- Prior art keywords: sensor, data, electrical equipment, parameter, cross
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F18/253 (Physics; Computing; Electric digital data processing; Pattern recognition; Analysing): Fusion techniques of extracted features
- G06F18/254: Fusion techniques of classification results, e.g. of results related to same input data
- G06N3/045 (Computing arrangements based on biological models; Neural networks; Architecture): Combinations of networks
- G06N3/08: Learning methods
Abstract
The invention discloses a cross-modal data fusion method for electrical equipment state perception, which fuses two types of cross-modal data, multi-sensor data and image data, to perceive the state of electrical equipment. The method first converts the multi-sensor time series into recurrence plots; it then extracts features from the recurrence plots and from the equipment image data with separate convolutional neural networks; next, the two sets of features are concatenated according to weights; finally, further feature extraction and state-grade perception are performed on the fused features. By making full use of the multi-sensor and image modalities present in equipment monitoring data, the method alleviates, to a certain extent, the low accuracy and poor fault tolerance of perception based on single-modal data.
Description
Technical Field

The invention belongs to the technical field of electrical equipment condition monitoring, and in particular relates to a cross-modal data fusion method for electrical equipment state perception.
Background Art

Effective state perception of electrical equipment, with timely detection of latent faults and corresponding countermeasures, is key to the safe and stable operation of the power grid.

The state of a piece of electrical equipment results from the joint action of many parameters. Effectively fusing the influencing factors so that they complement and reinforce one another allows the state to be perceived from multiple angles and in all aspects, which improves perception accuracy. Moreover, when the quality of one type of parameter degrades because of communication or measurement errors, the other data types still provide supplementary information, which improves the fault tolerance of perception.

With the gradual development of digital infrastructure and the power Internet of Things, power sensing terminals are growing in both number and variety, providing an effective data acquisition channel for all-round, multi-angle state perception of electrical equipment. The multi-modal parameters produced by multi-source sensing terminals in the power Internet of Things include structured parameters, such as the time series typified by electrical measurements, as well as unstructured parameters, such as images and maintenance reports; the two differ greatly in physical meaning and representation, hence the term multi-modal parameters. Data of the same modality share the same form and can be regarded as parameters in a single coordinate system, so fusing them is relatively easy; cross-modal data differ in both structure and physical meaning, are hard to describe uniformly, and are therefore difficult to fuse. Thus, while the rapid development of the power grid supplies ample data sources for equipment state perception based on cross-modal data fusion, cross-type, multi-dimensional data analysis techniques remain weak, the ability to mine correlations among state quantities is insufficient, and state perception of electrical equipment is inadequate.

At present, image parameters and various structured parameters are widely used in the state assessment of electrical equipment, for example using infrared image data to monitor equipment temperature, or using various sensors to monitor gas levels in the equipment's operating environment. However, existing methods still rely mainly on single-modal state assessment, which offers only a single descriptive angle, low accuracy, and poor fault tolerance.
Summary of the Invention

The technical problem solved by the present invention is to provide a cross-modal data fusion method for electrical equipment state perception, addressing the fact that existing state perception methods rely on single-modal data alone and therefore suffer from low perception accuracy and poor fault tolerance.
The present invention provides a cross-modal data fusion method for electrical equipment state perception, comprising the following steps:

S1. Normalize the multi-sensor time-series data and generate a multi-parameter recurrence plot.

S2. Extract features from the multi-parameter recurrence plot with a convolutional neural network to obtain the multi-sensor parameter features.

S3. Extract features from the image data with a Faster R-CNN network to obtain the image parameter features.

S4. Build a cross-modal data fusion model that fuses the multi-sensor parameter features and the image parameter features according to weights, performs further feature extraction on the fused features, and finally classifies the equipment state with a multi-class neural network.

S5. Train the cross-modal data fusion model on corresponding multi-sensor time-series data, image data, and equipment state labels.

S6. Use the trained cross-modal data fusion model to perceive the state of the electrical equipment.
Further, in step S1, the multi-sensor time-series data are normalized by min-max scaling:

x̃_i(t_j) = (x_i(t_j) - min_j x_i(t_j)) / (max_j x_i(t_j) - min_j x_i(t_j)), j = 1, ..., n

where x_i(t_1), ..., x_i(t_n) are the sampled values of a sensor parameter i from time t_1 to t_n, and n is the length of the time series.
Further, in step S1, generating the multi-parameter recurrence plot specifically comprises:

S11. Let the length of the sensor time series be n, and generate the recurrence matrix R_n with elements

R_ij = ||X(t_i) - X(t_j)||, i, j = 1, ..., n

where R_ij denotes an element of the recurrence matrix R_n, X(t_i) = (x_1(t_i), ..., x_k(t_i)) collects the sampled values of the k sensor types at time t_i, and x_j(t_i) is the sampled value of the j-th sensor type at time t_i.

S12. Multiply every element of the recurrence matrix by a preset matching coefficient so that the matrix matches the element values of the image, and draw the recurrence plot using the matched element values as pixel values.
Further, the convolutional neural network in step S2 is structured as follows:

First layer: input layer, receiving the n×n multi-sensor recurrence plot.

Second layer: convolutional layer with 32 kernels of size 5×5, stride s=1, padding p=2, followed by a ReLU non-linearity.

Third layer: pooling layer, max pooling of size 2×2, stride s=2, padding p=0.

Fourth layer: convolutional layer with 32 kernels of size 5×5, stride s=1, padding p=2, followed by a ReLU non-linearity.

Fifth layer: pooling layer, max pooling of size 2×2, stride s=2, padding p=0.

Sixth layer: fully connected layer of size 1×125.
Further, in step S3, extracting features from the image data with the Faster R-CNN network specifically means: use the first fully connected layer after ROI pooling in Faster R-CNN, together with the network preceding it, as the feature extraction network for the image data.
Further, in step S4, the weight factors of the multi-sensor parameter features and the image parameter features are obtained through training.

Further, in step S4, fusing the multi-sensor parameter features and the image parameter features according to weights specifically means: let the weight of the multi-sensor parameter features be W1 and the weight of the image data features be W2; multiply each feature vector by its corresponding weight and then concatenate the two.

Further, in step S4, performing feature extraction on the fused features and finally classifying the equipment state with a multi-class neural network specifically means: pass the fused features through a fully connected layer of size 4096 and then a fully connected layer of size m, and finally connect to a 1×m multi-class neural network, where m is the output dimension of the multi-class neural network, i.e., the number of equipment state categories.
Further, in step S5, training the cross-modal data fusion model on corresponding multi-sensor time-series data, image data, and equipment states specifically means: take corresponding pairs of multi-sensor time-series data and image data as the model input and the equipment state category as the output, and train the model.

Further, the objective function of model training is L_cls:

L_cls = -(1/m) Σ_{i=1}^{m} ŷ_i · log(p_i)

where m is the number of training samples, p_i is the predicted classification result of the i-th sample, computed with the multi-class neural network's classification formula, and ŷ_i is the true classification result of the i-th sample.
The beneficial effects of the present invention are as follows: the cross-modal data fusion method for electrical equipment state perception fuses two types of cross-modal data, multi-sensor time-series data and image data, to perceive the equipment state. It makes full use of the multi-sensor and image modalities in equipment monitoring data and thereby mitigates the low accuracy and poor fault tolerance of perception based on single-modal data.
Brief Description of the Drawings

FIG. 1 is a flow chart of the cross-modal data fusion method for electrical equipment state perception according to the present invention.

FIG. 2 is a diagram of the recurrence-plot-based multi-sensor parameter feature extraction network for electrical equipment according to the present invention.

FIG. 3 is a diagram of the image feature extraction network for electrical equipment according to the present invention.

FIG. 4 is a diagram of the weight-factor-based feature fusion network according to the present invention.

Detailed Description
The present invention is further described below with reference to the accompanying drawings.

The cross-modal data fusion method for electrical equipment state perception of the present invention fuses two types of cross-modal data, multi-sensor data and image data, to perceive the state of electrical equipment. The method first converts the multi-sensor time series into recurrence plots; it then extracts features from the recurrence plots and from the equipment image data with separate convolutional neural networks; next, the two sets of features are concatenated according to weights; finally, further feature extraction and state-grade perception are performed on the fused features. The invention makes full use of the multi-sensor and image modalities in equipment monitoring data and thereby alleviates, to a certain extent, the low accuracy and poor fault tolerance of perception based on single-modal data.
The cross-modal data fusion method for electrical equipment state perception according to an embodiment of the present invention, as shown in FIG. 1, comprises the following steps:

S1. Normalize the multi-sensor time-series data and generate a multi-parameter recurrence plot.

S2. Extract features from the multi-parameter recurrence plot with a convolutional neural network to obtain the multi-sensor parameter features.

S3. Extract features from the image data with a Faster R-CNN network to obtain the image parameter features.

S4. Build a cross-modal data fusion model that fuses the multi-sensor parameter features and the image parameter features according to weights, performs further feature extraction on the fused features, and finally classifies the equipment state with a multi-class neural network.

S5. Train the cross-modal data fusion model on corresponding multi-sensor time-series data, image data, and equipment state labels.

S6. Use the trained cross-modal data fusion model to perceive the state of the electrical equipment.
Further, in step S1, the multi-sensor time-series data are normalized by min-max scaling:

x̃_i(t_j) = (x_i(t_j) - min_j x_i(t_j)) / (max_j x_i(t_j) - min_j x_i(t_j)), j = 1, ..., n

where x_i(t_1), ..., x_i(t_n) are the sampled values of a sensor parameter i from time t_1 to t_n, and n is the length of the time series, determined by the sampling frequencies of the image data and the sensor data. So that the time-series data correspond to the image data, i.e., match it in time, the sampling instant of the last point of the multi-sensor time series is the same as, or close to, the instant at which the image is sampled.
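This normalization step can be sketched as follows (a minimal example assuming min-max scaling to [0, 1]; the function name and sample values are invented for illustration):

```python
def normalize(series):
    """Min-max normalize one sensor parameter's time series to [0, 1]."""
    lo, hi = min(series), max(series)
    if hi == lo:
        # A constant series carries no variation; map it to all zeros.
        return [0.0 for _ in series]
    return [(v - lo) / (hi - lo) for v in series]

# A short winding-temperature series sampled at t1..t5 (made-up values)
print(normalize([61.0, 63.5, 62.0, 66.0, 64.5]))
```

Each of the k sensor parameters would be normalized independently in this way before the recurrence plot is built.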
Further, in step S1, generating the multi-parameter recurrence plot specifically comprises:

S11. Let the length of the sensor time series be n, and generate the recurrence matrix R_n with elements

R_ij = ||X(t_i) - X(t_j)||, i, j = 1, ..., n

where R_ij denotes an element of the recurrence matrix R_n, X(t_i) = (x_1(t_i), ..., x_k(t_i)) collects the sampled values of the k sensor types at time t_i, and x_j(t_i) is the sampled value of the j-th sensor type at time t_i.

S12. To match the element values of the recurrence plot to those of the image data, multiply every element of the recurrence matrix by a preset matching coefficient, 100 in this embodiment, and draw the recurrence plot using the matched element values as pixel values.
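Steps S11-S12 can be sketched as follows, assuming the unthresholded recurrence formulation in which each matrix entry is the Euclidean distance between the normalized sensor vectors at two sampling instants (function names and toy values are illustrative):

```python
import math

def recurrence_matrix(samples, coeff=100):
    """Build the n x n recurrence matrix from n time steps of k normalized
    sensor values each, scaling entries by the matching coefficient."""
    n = len(samples)
    R = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            # Euclidean distance between the sensor vectors at t_i and t_j
            d = math.sqrt(sum((a - b) ** 2 for a, b in zip(samples[i], samples[j])))
            R[i][j] = coeff * d
    return R

# Two sensor channels sampled at three time steps (toy values in [0, 1])
R = recurrence_matrix([[0.0, 0.0], [0.3, 0.4], [1.0, 0.0]])
```

The matrix is symmetric with a zero diagonal, and after scaling its entries serve directly as grayscale pixel values of the recurrence plot.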
Further, the convolutional neural network in step S2 is structured as shown in FIG. 2, specifically:

First layer: input layer, receiving the n×n multi-sensor recurrence plot.

Second layer: convolutional layer with 32 kernels of size 5×5, stride s=1, padding p=2, followed by a ReLU non-linearity.

Third layer: pooling layer, max pooling of size 2×2, stride s=2, padding p=0.

Fourth layer: convolutional layer with 32 kernels of size 5×5, stride s=1, padding p=2, followed by a ReLU non-linearity.

Fifth layer: pooling layer, max pooling of size 2×2, stride s=2, padding p=0.

Sixth layer: fully connected layer of size 1×125.

Here k is the number of sensor types and n is the length of the time series. conv(32*5*5) denotes 32 convolution kernels of size 5×5; maxpooling1(2*2) denotes max pooling of size 2×2; s is the stride of the convolution along each image dimension; p is the padding added around the image, so p=2 means the image is padded with two rings of zeros.
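The spatial sizes through this stack follow the standard relation out = floor((in + 2p - k)/s) + 1; a small helper (hypothetical, for illustration only) traces an n×n recurrence plot through the five layers:

```python
def out_size(n, k, s, p):
    """Spatial size after a conv/pool layer: floor((n + 2p - k) / s) + 1."""
    return (n + 2 * p - k) // s + 1

def trace(n):
    n = out_size(n, 5, 1, 2)   # conv1: 5x5, s=1, p=2 keeps the size
    n = out_size(n, 2, 2, 0)   # pool1: 2x2, s=2 halves the size
    n = out_size(n, 5, 1, 2)   # conv2
    n = out_size(n, 2, 2, 0)   # pool2
    return n

print(trace(20))  # 20 -> 20 -> 10 -> 10 -> 5
```

With the embodiment's 20×20 recurrence plot, the final maps are 5×5 over 32 channels; on this reading (not stated explicitly in the text), the 1×125 fully connected layer maps that flattened output down to the sensor feature vector.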
Further, in step S3, extracting features from the image data with the Faster R-CNN network specifically means: use the first fully connected layer after ROI pooling (region-of-interest pooling) in Faster R-CNN, together with the network preceding it, as the feature extraction network for the image data, as shown in FIG. 3.
Further, in step S4, fusing the multi-sensor parameter features and the image parameter features according to weights specifically means: let the weight of the multi-sensor parameter features be W1 and the weight of the image data features be W2; multiply each feature vector by its corresponding weight and then concatenate the two. To avoid the influence of human factors, the weight factors of the multi-sensor parameter features and the image parameter features in step S4 can be obtained through training.

Further, in step S4, performing feature extraction on the fused features and finally classifying the equipment state with a multi-class neural network specifically means: pass the fused features through a fully connected layer of size 4096 and then one of size m, and finally connect to a 1×m multi-class neural network, where m is the number of equipment state categories. In this embodiment, the feature extraction network for the multi-parameter sensor data outputs a feature matrix of size 1×125 and the image feature extraction network outputs one of size 1×4096. Let the weight corresponding to the multi-sensor parameter features be W1 and the weight corresponding to the image data features be W2; as shown in FIG. 4, the two are concatenated according to these weights into a feature of size 1×4221. Feature extraction is then performed on the fused feature: taking it as input, feature extraction and parameter training are carried out with equipment state perception as the objective, i.e., the feature passes in turn through fully connected layers of size 4096 and m, and is finally connected to a 1×m softmax (multi-class neural network) for equipment state classification. The equipment states are divided into m categories, so the softmax outputs a 1×m matrix whose i-th element is the probability that the input belongs to the i-th category.
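The weighted concatenation can be sketched as follows (the weight values here are placeholders; in the method they are learned during training):

```python
def fuse(sensor_feat, image_feat, w1, w2):
    """Scale each modality's feature vector by its weight, then concatenate."""
    return [w1 * v for v in sensor_feat] + [w2 * v for v in image_feat]

sensor_feat = [0.1] * 125   # stands in for the 1x125 recurrence-plot features
image_feat = [0.2] * 4096   # stands in for the 1x4096 Faster R-CNN features
fused = fuse(sensor_feat, image_feat, w1=0.6, w2=0.4)
print(len(fused))  # 4221 = 125 + 4096
```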
For the softmax, let its input be x; the probability assigned to the j-th class is

p(y = j | x) = exp(W_j x) / Σ_{l=1}^{m} exp(W_l x)

where W is the parameter matrix mapping x to y, and W_j is its j-th row.
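A numerically stable version of this classification formula can be sketched as follows (illustrative; the max-subtraction trick is an implementation detail, not part of the patent text):

```python
import math

def softmax(scores):
    """Turn the m class scores W_j . x into a probability distribution."""
    mx = max(scores)  # subtracting the max avoids overflow in exp()
    exps = [math.exp(s - mx) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Scores for m = 2 transformer states (abnormal vs. normal), toy values
probs = softmax([2.0, 0.5])
```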
Further, in step S5, training the cross-modal data fusion model on corresponding multi-sensor time-series data, image data, and equipment states specifically means: take corresponding pairs of multi-sensor time-series data and image data as the model input and the equipment state category as the output, and train the model. A joint label is generated for the multi-sensor data and the image data, i.e., the multi-sensor data and the image data of corresponding instants form a data pair that is given a common label; these labeled pairs then serve as the model input, with the equipment state category as the output, for model training.
Further, the objective function of model training is L_cls:

L_cls = -(1/m) Σ_{i=1}^{m} ŷ_i · log(p_i)

where m is the number of training samples, p_i is the predicted classification result of the i-th sample, computed with the multi-class neural network's classification formula, and ŷ_i is the true classification result of the i-th sample.

Training stops once L_cls ≤ σ, where σ is an error threshold that can be set according to the actual situation.
After training is complete, the trained cross-modal data fusion model can be used to perceive the state of the electrical equipment, taking corresponding multi-sensor and image data pairs as input.
The present invention further provides another embodiment, in which multi-sensor data and infrared data are used to perceive the state of a transformer. The sensor data are structured, small in volume, and therefore cheap to transmit; they are collected every 10 minutes, and Table 1 lists the sensor data categories. The camera collects unstructured data that are large in volume and costly to transmit, typically collected every 2 hours.

Table 1. Multi-sensor parameter types and sampling frequencies

The data set used in this embodiment contains 6432 infrared images. Following the multi-parameter recurrence plot conversion of step 1, each infrared image corresponds to a 10×20 sensor parameter matrix, giving 10×20×6432 sensor samples in total. As shown in Table 2, 5432 data pairs form the training set and 1000 data pairs form the test set. To simplify verification of the proposed method, this embodiment divides the transformer state into two categories: abnormal and normal.

Table 2. Data distribution
The specific steps are as follows:

Step 1. Choose a multi-sensor time-series length of n=20, normalize the ten parameter types, including winding temperature, total current, and capacitance, and generate a 20×20 multi-parameter recurrence plot.

Step 2. Extract features from the generated multi-parameter recurrence plot, outputting a feature vector of size 1×125.

Step 3. Use Faster R-CNN to extract features from the infrared images corresponding to the multi-sensor parameters, outputting a feature vector of size 1×4096.

Step 4. Concatenate the two types of features according to their weights, producing a fused feature vector of size 1×4221.

Step 5. With the transformer state grade as the target, train on the fused features through fully connected layers of size 1×4096 and 1×2.

Step 6. Repeat steps 1-5 to train the model.

Step 7. Test with the trained model.

In this embodiment, precision (P), recall (R), average precision (AP), and average recall (AR) are used as the evaluation metrics for model testing. The test results are shown in Table 3. The proposed cross-modal data fusion method for electrical equipment state perception perceives the state of power equipment fairly accurately, with an average precision of 76.3% and an average recall of 72.9%.
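These per-class metrics are computed from confusion counts as P = TP/(TP+FP) and R = TP/(TP+FN); a short sketch with made-up counts (not the counts behind Table 3):

```python
def precision_recall(tp, fp, fn):
    """Precision P = TP / (TP + FP); recall R = TP / (TP + FN)."""
    p = tp / (tp + fp) if (tp + fp) else 0.0
    r = tp / (tp + fn) if (tp + fn) else 0.0
    return p, r

# Illustrative confusion counts for the "abnormal" transformer class
p, r = precision_recall(tp=8, fp=2, fn=2)
print(p, r)  # 0.8 0.8
```

Averaging P and R over the classes gives the reported AP and AR.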
Table 3. Comparison of results across models
The specific embodiments described herein merely illustrate the spirit of the present invention. Those skilled in the art may make various modifications or additions to the described embodiments, or substitute them in similar ways, without departing from the spirit of the present invention or exceeding the scope defined by the appended claims.
Claims (5)
Priority Applications (1)
- CN202011334424.3A (granted as CN112418324B): priority date 2020-11-25, filing date 2020-11-25, "Cross-modal data fusion method for electrical equipment state perception"

Publications (2)
- CN112418324A, published 2021-02-26
- CN112418324B, published 2022-06-24
Family
- Family ID: 74842057
- Family application: CN202011334424.3A, filed 2020-11-25, granted as CN112418324B, status Active
Families Citing this family (2)
- CN113518114B (priority 2021-05-12): Artificial intelligence control method and system based on multi-mode ad hoc network
- CN113344137B (priority 2021-07-06): SOM-based data fusion method and device, storage medium and electronic equipment
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108182441A (en) * | 2017-12-29 | 2018-06-19 | 华中科技大学 | Parallel multichannel convolutive neural network, construction method and image characteristic extracting method |
CN108614548A (en) * | 2018-04-03 | 2018-10-02 | 北京理工大学 | A kind of intelligent failure diagnosis method based on multi-modal fusion deep learning |
CN109738512A (en) * | 2019-01-08 | 2019-05-10 | 重庆大学 | Nondestructive testing system and method based on multiphysics fusion |
CN111507233A (en) * | 2020-04-13 | 2020-08-07 | 吉林大学 | Multi-mode information fusion intelligent vehicle pavement type identification method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10902343B2 (en) * | 2016-09-30 | 2021-01-26 | Disney Enterprises, Inc. | Deep-learning motion priors for full-body performance capture in real-time |
- 2020-11-25: CN application CN202011334424.3A filed, patent CN112418324B (status: Active)
Non-Patent Citations (8)
Title |
---|
Chinese Society for Electrical Engineering, 2016, pp. 315-320. * |
State Grid Shanghai Electric Power Company, 2020, pp. 24-32. * |
Li Yongde. Research on fault diagnosis of electrical equipment based on multi-source sensor data fusion. *Information & Communications*, 2018, pp. 104-106. * |
Li Jin, Zhang Meng. A fault diagnosis method for power equipment based on multi-sensor information fusion. *Electronics World*, Chinese Institute of Electronics, 2016, pp. 131, 140. * |
Lin Gang, Wang Bo, Peng Hui, Wang Xiaoyang, Chen Siyuan, Zhang Liming. Multi-target detection and localization in transmission-line inspection images based on improved Faster-RCNN. *Electric Power Automation Equipment*, Nanjing Electric Power Automation Research Institute Co., Ltd., 2019, pp. 213-218. * |
Wang Xunting, Wang Bo. Assessment method for vulnerable communities in power grids considering cyber-physical fusion. *Electric Power Automation Equipment*, Nanjing Electric Power Automation Research Institute Co., Ltd., 2017, pp. 43-51. * |
Wang Hongxia, Wang Bo, Chen Hongkun, Liu Chang, Ma Fuqi, Luo Peng, Yang Yan. Power data fusion: basic concepts, abstract structure, key technologies and application scenarios. *Distribution & Utilization*, Yingda Media (Shanghai) Co., Ltd. * |
Wei Daqian, Wang Bo, Liu Dichen, Chen Dezhi, Tang Fei, Guo Ke. A WAMS/SCADA data fusion method based on correlation mining of time-series data. *High Voltage Engineering*, National High Voltage Metrology Station * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106529090A (en) | Evaluation method of reliability of aerospace electronic product | |
CN112418324B (en) | Cross-modal data fusion method for electrical equipment state perception | |
CN114118251A (en) | Fault diagnosis and early warning method based on multi-source data fusion and convolutional Siamese neural network | |
CN116610998A (en) | Switch cabinet fault diagnosis method and system based on multi-mode data fusion | |
CN117407770A (en) | Classification and prediction method of high-voltage switchgear failure modes based on neural network | |
CN115013298A (en) | Real-time performance on-line monitoring system and monitoring method of sewage pump | |
CN117269644A (en) | Line fault monitoring system and method for current transformer | |
CN113960090A (en) | LSTM neural network algorithm-based soil Cd element spectrum qualitative analysis method | |
CN117484031A (en) | Photovoltaic module welding processing equipment | |
CN115659252A (en) | A GIS Partial Discharge Pattern Recognition Method Based on PRPD Multi-Feature Information Fusion | |
CN117332352B (en) | Lightning arrester signal defect identification method based on BAM-AlexNet | |
CN118710376A (en) | Intelligent financial product recommendation method and system based on multimodal user portrait | |
CN112926016A (en) | Multivariable time series change point detection method | |
CN118822956A (en) | Industrial wiring harness quality detection method, medium and system based on machine vision | |
CN118295842A (en) | Data processing method, device and server for transaction system abnormal event | |
CN116739996A (en) | Power transmission line insulator fault diagnosis method based on deep learning | |
CN118311434A (en) | Lithium-ion battery SOH estimation method and system based on electrochemical impedance spectroscopy | |
CN117791856A (en) | Power grid fault early warning method and device based on inspection robot | |
CN110908365A (en) | A kind of unmanned aerial vehicle sensor fault diagnosis method, system and readable storage medium | |
CN116577602A (en) | Cable defect positioning method based on broadband impedance spectrum and self-attention mechanism coupling | |
CN116908184A (en) | A ground wire crimp detection system and its detection method | |
CN116559680A (en) | Battery fault diagnosis method and device, electronic equipment and storage medium | |
CN115729200B (en) | A method and device for constructing a UAV servo fault detection model, and a method and device for detecting a UAV servo fault | |
CN113468823B (en) | Optical module damage detection method and system based on machine learning | |
CN113807267A (en) | Suspension insulator discharge severity assessment method based on ultraviolet video and deep learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right |
Effective date of registration: 2025-01-22

Address after: 430000 Luojiashan, Wuchang, Wuhan, Hubei Province
Patentee after: WUHAN University
Patentee after: CHINA SOUTHERN POWER GRID Co.,Ltd.
Patentee after: NARI TECHNOLOGY Co.,Ltd.
Country or region after: China

Address before: 430072 No. 299 Bayi Road, Wuchang District, Wuhan, Hubei Province
Patentee before: WUHAN University
Country or region before: China