CN112418324B - Cross-modal data fusion method for electrical equipment state perception - Google Patents



Publication number
CN112418324B
Authority
CN
China
Prior art keywords
sensor
data
electrical equipment
parameter
cross
Prior art date
Legal status
Active
Application number
CN202011334424.3A
Other languages
Chinese (zh)
Other versions
CN112418324A (en)
Inventor
王波
王红霞
罗鹏
马富齐
李怡凡
周胤宇
张嘉鑫
张迎晨
冯磊
王雷雄
马恒瑞
Current Assignee
Wuhan University WHU
China Southern Power Grid Co Ltd
Nari Technology Co Ltd
Original Assignee
Wuhan University WHU
Priority date
Filing date
Publication date
Application filed by Wuhan University WHU
Priority to CN202011334424.3A
Publication of CN112418324A
Application granted
Publication of CN112418324B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G06F 18/253 Fusion techniques of extracted features
    • G06F 18/254 Fusion techniques of classification results, e.g. of results related to same input data
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G06N 3/08 Learning methods


Abstract

The invention discloses a cross-modal data fusion method for electrical equipment state perception, which fuses two types of cross-modal data, multi-sensor data and image data, to perceive the state of electrical equipment. The method first converts the multi-sensor time series into recurrence plots; different convolutional neural networks then extract features from the recurrence plots and from the electrical equipment image data; the two types of features are then concatenated according to weights; finally, further feature extraction and state-level perception are performed on the fused features. By making full use of the two types of cross-modal data (multi-sensor and image) present in electrical equipment monitoring data, the method alleviates, to a certain extent, the low accuracy and poor fault tolerance of perception based on single-modal data.

Description

Cross-modal Data Fusion Method for Electrical Equipment State Perception

Technical Field

The present invention belongs to the technical field of electrical equipment state monitoring, and in particular relates to a cross-modal data fusion method for electrical equipment state perception.

Background Art

Effective state perception of electrical equipment, timely discovery of hidden dangers, and corresponding countermeasures are key to the safe and stable operation of the power grid.

The state of electrical equipment is the result of the joint action of multiple parameters. Effectively fusing the various influencing factors so that they complement and reinforce each other enables state perception from multiple angles and in all directions, which improves perception accuracy. Moreover, when the data quality of one type of parameter degrades because of communication or measurement errors, other types of data remain available as a supplement, which effectively improves the fault tolerance of perception.

With the gradual development of digital infrastructure and the power Internet of Things, power sensing terminals are growing in both number and variety, providing an effective way to acquire data for all-round, multi-angle state perception of electrical equipment. The multi-modal parameters produced by multi-source sensing terminals in the power Internet of Things include structured parameters, such as the time series produced by electrical measurements, as well as unstructured parameters, such as images and maintenance reports; the two differ greatly in physical meaning and form of representation, hence the term multi-modal parameters. Data of the same modality share the same data form and can be regarded as parameters in the same coordinate system, so fusing them is relatively easy; cross-modal data differ in both structural form and physical meaning, are hard to describe in a unified way, and are therefore difficult to fuse. The rapid development of the power grid thus provides ample data sources for electrical equipment state perception based on cross-modal data fusion, but cross-type, multi-dimensional data analysis techniques remain weak, the ability to mine correlations among state quantities is insufficient, and state perception of electrical equipment is inadequate.

At present, image parameters and various structured parameters are widely used in the state assessment of electrical equipment, for example using infrared image data to monitor equipment temperature, and using various sensors to monitor gas levels in the operating environment of the equipment. However, existing methods are still dominated by state assessment based on single-modal data, which describes the equipment from a single angle and suffers from low accuracy and poor fault tolerance.

Summary of the Invention

The technical problem solved by the present invention is to provide a cross-modal data fusion method for electrical equipment state perception, addressing the low perception accuracy and poor fault tolerance of existing electrical equipment state perception methods that rely only on single-modal data.

The present invention provides a cross-modal data fusion method for electrical equipment state perception, comprising the following steps:

S1. Normalize the multi-sensor time series data and generate a multi-parameter recurrence plot;

S2. Extract features from the multi-parameter recurrence plot with a convolutional neural network to obtain the multi-sensor parameter features;

S3. Extract features from the image data with a Faster R-CNN network to obtain the image parameter features;

S4. Build a cross-modal data fusion model that fuses the multi-sensor parameter features and the image parameter features according to weights, performs further feature extraction on the fused features, and finally classifies the electrical equipment state with a multi-class neural network;

S5. Train the cross-modal data fusion model with corresponding multi-sensor time series data, image data, and electrical equipment states;

S6. Use the trained cross-modal data fusion model to perceive the state of the electrical equipment.

Further, in step S1, the multi-sensor time series data are normalized as follows:

Figure BDA0002796755050000021

where
Figure BDA0002796755050000022
denotes the sampled values of a sensor parameter i at times t_1 to t_n, and n denotes the length of the time series.

Further, in step S1, generating the multi-parameter recurrence plot specifically comprises:

S11. Let the length of the sensor time series be n, and generate the recurrence matrix R_n:

Figure BDA0002796755050000023

Figure BDA0002796755050000024

where R_ij denotes an element of the recurrence matrix R_n,
Figure BDA0002796755050000025
denotes the sampled value of a class-j sensor at time t_i, and x_j(t_i) is the sampled value of the class-j sensor at time t_i.

S12. Multiply every element of the recurrence matrix by a preset matching coefficient so that the matrix values match the element values of an image, and draw the recurrence plot using the matched element values as pixel values.

Further, the convolutional neural network in step S2 is structured as follows:

Layer 1: input layer, which takes the n×n multi-sensor recurrence plot;

Layer 2: convolutional layer with 32 kernels of size 5×5, stride s=1, padding=2, followed by a ReLU non-linearity;

Layer 3: pooling layer, 2×2 max pooling with stride s=2 and padding=0;

Layer 4: convolutional layer with 32 kernels of size 5×5, stride s=1, padding=2, followed by a ReLU non-linearity;

Layer 5: pooling layer, 2×2 max pooling with stride s=2 and padding=0;

Layer 6: fully connected layer of size 1×125.
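As a check on the architecture above, a short pure-Python sketch (an illustration, not part of the patent) traces the feature-map sizes through the described layers:

```python
def conv_out(n, k, s, p):
    """Spatial output size of a convolution/pooling layer: floor((n + 2p - k) / s) + 1."""
    return (n + 2 * p - k) // s + 1

def trace_shapes(n):
    """Trace the feature-map size through the five feature layers described in step S2."""
    shapes = [("input", n, 1)]
    n = conv_out(n, 5, 1, 2); shapes.append(("conv 32@5x5 s=1 p=2 + relu", n, 32))
    n = conv_out(n, 2, 2, 0); shapes.append(("maxpool 2x2 s=2 p=0", n, 32))
    n = conv_out(n, 5, 1, 2); shapes.append(("conv 32@5x5 s=1 p=2 + relu", n, 32))
    n = conv_out(n, 2, 2, 0); shapes.append(("maxpool 2x2 s=2 p=0", n, 32))
    return shapes

for name, size, ch in trace_shapes(20):
    print(f"{name}: {size}x{size}x{ch}")
```

With the n=20 recurrence plots of the later embodiment, the trace runs 20 → 20 → 10 → 10 → 5, so the flattened feature has 5×5×32 = 800 values, which the final fully connected layer maps to 1×125.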

Further, in step S3, feature extraction from the image data with the Faster R-CNN network is specifically: the first fully connected layer after ROI Pooling in Faster R-CNN, together with the network before it, is used as the feature extraction network for the image data.

Further, in step S4, the weighting factors of the multi-sensor parameter features and the image parameter features are obtained through training.

Further, in step S4, fusing the multi-sensor parameter features and the image parameter features according to weights is specifically: let the weight of the multi-sensor parameter features be W_1 and the weight of the image data features be W_2; multiply each feature by its corresponding weight and then concatenate the two.

Further, in step S4, feature extraction on the fused features followed by electrical equipment state classification with a multi-class neural network is specifically: the fused features pass through a fully connected layer of size 4096 and a fully connected layer of size m, and are finally connected to a 1×m multi-class neural network, where m is the output size of the multi-class neural network, i.e., the number of electrical equipment state classes.

Further, in step S5, training the cross-modal data fusion model with corresponding multi-sensor time series data, image data, and electrical equipment states is specifically: take corresponding multi-sensor time series / image data pairs as the model input and the electrical equipment state class as the output, and train the model.

Further, the objective function of model training is L_cls:

Figure BDA0002796755050000031

where m denotes the number of training samples, p_i is the predicted classification result of the i-th sample, computed by the multi-class neural network classification formula, and
Figure BDA0002796755050000032
denotes the true classification result of the i-th sample.

The beneficial effects of the present invention are as follows: the cross-modal data fusion method for electrical equipment state perception fuses two types of cross-modal data, multi-sensor time series data and image data, to perceive the electrical equipment state. It makes full use of the multi-sensor and image cross-modal data in electrical equipment monitoring data and alleviates the low accuracy and poor fault tolerance of perception based on single-modal data.

Brief Description of the Drawings

Fig. 1 is a flow chart of the cross-modal data fusion method for electrical equipment state perception according to the present invention.

Fig. 2 is a diagram of the recurrence-plot-based multi-sensor parameter feature extraction network for electrical equipment according to the present invention.

Fig. 3 is a diagram of the electrical equipment image feature extraction network according to the present invention.

Fig. 4 is a diagram of the weight-factor-based feature fusion network according to the present invention.

Detailed Description of the Embodiments

The present invention is further described below with reference to the accompanying drawings.

The cross-modal data fusion method for electrical equipment state perception of the present invention fuses two types of cross-modal data, multi-sensor data and image data, to perceive the state of electrical equipment. The method first converts the multi-sensor time series into recurrence plots; different convolutional neural networks then extract features from the recurrence plots and from the electrical equipment image data; the two types of features are then concatenated according to weights; finally, further feature extraction and state-level perception are performed on the fused features. The method makes full use of the multi-sensor and image cross-modal data in electrical equipment monitoring data and alleviates, to a certain extent, the low accuracy and poor fault tolerance of perception based on single-modal data.

The cross-modal data fusion method for electrical equipment state perception according to this embodiment of the present invention, as shown in Fig. 1, comprises the following steps:

S1. Normalize the multi-sensor time series data and generate a multi-parameter recurrence plot;

S2. Extract features from the multi-parameter recurrence plot with a convolutional neural network to obtain the multi-sensor parameter features;

S3. Extract features from the image data with a Faster R-CNN network to obtain the image parameter features;

S4. Build a cross-modal data fusion model that fuses the multi-sensor parameter features and the image parameter features according to weights, performs further feature extraction on the fused features, and finally classifies the electrical equipment state with a multi-class neural network;

S5. Train the cross-modal data fusion model with corresponding multi-sensor time series data, image data, and electrical equipment states;

S6. Use the trained cross-modal data fusion model to perceive the state of the electrical equipment.

Further, in step S1, the multi-sensor time series data are normalized as follows:

Figure BDA0002796755050000041

where
Figure BDA0002796755050000042
denotes the sampled values of a sensor parameter i at times t_1 to t_n, and n denotes the length of the time series, which is determined by the sampling frequencies of the image data and the sensor data. So that the time series data correspond to the image data, i.e., match in time, the sampling instant of the last point in the multi-sensor time series is the same as, or close to, the image sampling instant.

Further, in step S1, generating the multi-parameter recurrence plot specifically comprises:

S11. Let the length of the sensor time series be n, and generate the recurrence matrix R_n:

Figure BDA0002796755050000043

Figure BDA0002796755050000044

where R_ij denotes an element of the recurrence matrix R_n,
Figure BDA0002796755050000051
denotes the sampled value of a class-j sensor at time t_i, and x_j(t_i) is the sampled value of the class-j sensor at time t_i.

S12. To match the element values of the recurrence plot to those of the image data, multiply every element of the recurrence matrix by a preset matching coefficient (100 in this embodiment) so that the matrix values match the element values of an image, and draw the recurrence plot using the matched element values as pixel values.
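A minimal NumPy sketch of steps S11-S12 follows. The patent's formula images are not reproduced in this text, so the recurrence measure used here is an assumption (the unthresholded Euclidean distance between the multi-sensor samples at two time instants), and the (k, n) data layout and function name are illustrative:

```python
import numpy as np

def recurrence_image(x, coeff=100.0):
    """Build an n x n recurrence image from normalized multi-sensor data.

    x     : array of shape (k, n) -- k sensor types, n time steps (assumed layout).
    coeff : the preset matching coefficient of step S12 (100 in the embodiment).

    ASSUMPTION: R_ij is taken to be the Euclidean distance between the
    k-dimensional samples at times t_i and t_j, since the patent's own
    formula is only available as an image.
    """
    d = x[:, :, None] - x[:, None, :]        # (k, n, n) pairwise differences
    R = np.sqrt((d ** 2).sum(axis=0))        # (n, n) recurrence matrix R_n
    return np.clip(R * coeff, 0, 255).astype(np.uint8)   # pixel values

# k = 10 sensor types, n = 20 samples, as in the transformer embodiment
x = np.random.default_rng(0).random((10, 20))
img = recurrence_image(x)
print(img.shape)  # (20, 20)
```

The resulting matrix is symmetric with a zero diagonal, as expected of a recurrence plot.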

Further, the convolutional neural network in step S2 is shown in Fig. 2 and structured as follows:

Layer 1: input layer, which takes the n×n multi-sensor recurrence plot;

Layer 2: convolutional layer with 32 kernels of size 5×5, stride s=1, padding=2, followed by a ReLU non-linearity;

Layer 3: pooling layer, 2×2 max pooling with stride s=2 and padding=0;

Layer 4: convolutional layer with 32 kernels of size 5×5, stride s=1, padding=2, followed by a ReLU non-linearity;

Layer 5: pooling layer, 2×2 max pooling with stride s=2 and padding=0;

Layer 6: fully connected layer of size 1×125.

Here k is the number of sensor types and n is the length of the time series. conv(32*5*5) denotes 32 convolution kernels of size 5*5; maxpooling1(2*2) denotes max pooling of size 2*2; s is the stride of the convolution in each image dimension; p is the padding added around the image: p=2 means the image is padded with two rings of zeros.

Further, in step S3, feature extraction from the image data with the Faster R-CNN network is specifically: the first fully connected layer after ROI Pooling (region-of-interest pooling) in Faster R-CNN, together with the network before it, is used as the feature extraction network for the image data, as shown in Fig. 3.

Further, in step S4, fusing the multi-sensor parameter features and the image parameter features according to weights is specifically: let the weight of the multi-sensor parameter features be W_1 and the weight of the image data features be W_2; multiply each feature by its corresponding weight and then concatenate the two. To avoid the influence of human factors, the weighting factors of the multi-sensor parameter features and the image parameter features in step S4 can be obtained through training.

Further, in step S4, feature extraction on the fused features followed by electrical equipment state classification with a multi-class neural network is specifically: the fused features pass through a fully connected layer of size 4096 and a fully connected layer of size m, and are finally connected to a 1×m multi-class neural network, where m is the number of electrical equipment state classes. In this embodiment, the feature extraction network for the multi-parameter sensor data outputs a feature matrix of size 1×125, and the image feature extraction network outputs a feature matrix of size 1×4096. Let the weight corresponding to the multi-sensor parameter features be W_1 and the weight corresponding to the image data features be W_2; as shown in Fig. 4, the two are concatenated according to their weights to form a feature of size 1×4221. Feature extraction is then performed on the fused feature: taking it as input, feature extraction and parameter training are carried out with electrical equipment state perception as the goal, i.e., the feature passes through fully connected layers of size 4096 and m in turn and is finally connected to a 1×m softmax (multi-class neural network) for electrical equipment state classification. The states of the electrical equipment are divided into m classes, so the softmax outputs a 1×m matrix in which the i-th element is the probability that the input belongs to the i-th class.
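The weighted concatenation and classification head can be sketched in NumPy as follows; the weights w1, w2 and the layer matrices are random placeholders for illustration (in the patent they are learned), and m = 2 follows the transformer embodiment:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def fuse_and_classify(f_sensor, f_image, w1, w2, W_h, W_o):
    """Weighted concatenation (1x125 and 1x4096 -> 1x4221), a 4096-unit FC layer,
    an m-unit FC layer, and a softmax over the m equipment-state classes."""
    fused = np.concatenate([w1 * f_sensor, w2 * f_image])   # 1x4221
    h = np.maximum(W_h @ fused, 0.0)                        # 4096 FC (ReLU assumed)
    return softmax(W_o @ h)                                 # 1xm class probabilities

f_sensor = rng.standard_normal(125)   # multi-sensor parameter feature
f_image = rng.standard_normal(4096)   # image parameter feature
W_h = rng.standard_normal((4096, 4221), dtype=np.float32) * 0.01  # placeholder weights
W_o = rng.standard_normal((2, 4096), dtype=np.float32) * 0.01     # m = 2 states
p = fuse_and_classify(f_sensor, f_image, 0.5, 0.5, W_h, W_o)
print(p)  # two probabilities summing to 1
```

The output vector is the 1×m softmax matrix described above, whose i-th element is the probability of the i-th state class.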

For the softmax, let its input be x and the probability of the j-th target class be p(y=j|x):

Figure BDA0002796755050000061

where W is the model parameter mapping x to y.
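The softmax formula itself is only present as an image in the source; a standard form consistent with the surrounding description, written here as an assumption, is:

```latex
% Assumed standard multi-class softmax, with W_j the j-th row of the parameter W:
p(y = j \mid x) \;=\; \frac{\exp\!\left(W_j^{\top} x\right)}
                           {\sum_{l=1}^{m} \exp\!\left(W_l^{\top} x\right)},
\qquad j = 1, \dots, m
```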

Further, in step S5, training the cross-modal data fusion model with corresponding multi-sensor time series data, image data, and electrical equipment states is specifically: take corresponding multi-sensor time series / image data pairs as the model input and the electrical equipment state class as the output, and train the model. A joint label is generated for the multi-sensor data and the image data, i.e., the multi-sensor data and the image data at corresponding times form a data pair that is given a common label; the data pairs then serve as the model input and the electrical equipment state class as the output for model training.

Further, the objective function of model training is L_cls:

Figure BDA0002796755050000062

where m denotes the number of training samples, p_i is the predicted classification result of the i-th sample, computed by the multi-class neural network classification formula, and
Figure BDA0002796755050000063
denotes the true classification result of the i-th sample.

Training continues until L_cls ≤ σ is satisfied, where σ is an error threshold that can be set according to the actual situation.
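Since the L_cls formula is likewise only available as an image, the sketch below assumes the standard cross-entropy form over the m training samples, together with the σ stopping rule described above; the function names are illustrative:

```python
import numpy as np

def l_cls(p, y, eps=1e-12):
    """Assumed standard cross-entropy form of the training objective L_cls.

    p : (m, c) array of predicted class probabilities from the softmax
    y : (m,) array of true class indices
    (m is the number of training samples, as in the patent's description.)
    """
    m = len(y)
    return -np.mean(np.log(p[np.arange(m), y] + eps))

def train_until(step, sigma, max_epochs=1000):
    """Run training steps until L_cls <= sigma (the error threshold) is met."""
    for epoch in range(max_epochs):
        loss = step()
        if loss <= sigma:
            return epoch, loss
    return max_epochs, loss

# toy check: near-perfect predictions give a small loss
p = np.array([[0.9, 0.1], [0.2, 0.8]])
y = np.array([0, 1])
print(l_cls(p, y))
```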

After training is complete, the trained cross-modal data fusion model can be used to perceive the electrical equipment state, taking corresponding multi-sensor / image data pairs as input.

The present invention further provides another embodiment, in which multi-sensor data and infrared data are used to perceive the state of a transformer. The sensor data are structured data of small volume, so their transmission cost is low; they are collected every 10 minutes, and Table 1 lists the sensor data categories. The camera collects unstructured data of large volume and high transmission cost, usually once every 2 hours.

Table 1. Multi-sensor parameter types and sampling frequencies

Figure BDA0002796755050000064

Figure BDA0002796755050000071

The data set used in this embodiment contains 6432 infrared images. Following the multi-parameter recurrence plot conversion method of step 1, each infrared image corresponds to a sensor parameter matrix of size 10×20, giving 10×20×6432 sensor parameter values in total. As shown in Table 2, 5432 data pairs serve as the training set and 1000 data pairs as the test set. To facilitate verification of the proposed method, this embodiment divides the transformer state into two classes: abnormal and no abnormality.

Table 2. Data distribution

Figure BDA0002796755050000072

The specific steps are as follows:

Step 1: choose a multi-sensor time series length of n=20, normalize ten types of parameters including winding temperature, total current, and capacitance, and generate a 20×20 multi-parameter recurrence plot.

Step 2: extract features from the generated multi-parameter recurrence plot; the output is a feature vector of size 1×125.

Step 3: use Faster R-CNN to extract features from the infrared image data corresponding to the multi-sensor parameters; the output is a feature vector of size 1×4096.

Step 4: concatenate the two types of features according to their weights to generate a fused feature vector of size 1×4221.

Step 5: with the transformer state level as the target, train on the fused features through fully connected layers of size 1×4096 and 1×2.

Step 6: repeat steps 1-5 to train the model.

Step 7: test with the trained model.

In this embodiment, precision (P), recall (R), average precision (AP), and average recall (AR) are used as the evaluation metrics for model testing. The test results are shown in Table 3. The proposed cross-modal data fusion method for electrical equipment state perception can perceive the state of power equipment fairly accurately, with an average precision of 76.3% and an average recall of 72.9%.

Table 3. Comparison of the results of each model

Figure BDA0002796755050000081
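The per-class precision/recall and their macro averages (AP, AR) used in the evaluation can be computed with a small pure-Python sketch; the toy labels are illustrative, not the patent's data:

```python
def prf(y_true, y_pred, cls):
    """Precision and recall for one equipment-state class."""
    tp = sum(t == cls and q == cls for t, q in zip(y_true, y_pred))
    fp = sum(t != cls and q == cls for t, q in zip(y_true, y_pred))
    fn = sum(t == cls and q != cls for t, q in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

def macro_avg(y_true, y_pred, classes):
    """Average precision (AP) and average recall (AR) over the state classes."""
    pairs = [prf(y_true, y_pred, c) for c in classes]
    return (sum(p for p, _ in pairs) / len(classes),
            sum(r for _, r in pairs) / len(classes))

# toy two-class example (0 = no abnormality, 1 = abnormality)
y_true = [0, 0, 1, 1]
y_pred = [0, 1, 1, 1]
ap, ar = macro_avg(y_true, y_pred, classes=[0, 1])
print(ap, ar)  # 0.8333... 0.75
```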

The specific embodiments described herein merely illustrate the spirit of the present invention. Those skilled in the art may make various modifications or additions to the described embodiments, or substitute them in similar ways, without departing from the spirit of the present invention or exceeding the scope defined by the appended claims.

Claims (5)

1. A cross-modal data fusion method for electrical equipment state perception, characterized by comprising the following steps:
S1, normalizing the multi-sensor time-series data and generating a multi-parameter recurrence plot;
wherein the multi-sensor time-series data are normalized as follows:
Figure FDA0003604741690000011
where
Figure FDA0003604741690000012
denotes the sampling values of sensor parameter i at times t_1 to t_n, n being the length of the time series;
the generating of the multi-parameter recurrence plot specifically comprises:
S11, for a sensor time series of length n, generating a recurrence matrix R_n:
Figure FDA0003604741690000013
Figure FDA0003604741690000014
where R_ij denotes an element of the recurrence matrix R_n, and
Figure FDA0003604741690000015
denotes x_j(t_i), the sampling value of the class-j sensor at time t_i;
S12, multiplying all elements of the recurrence matrix by a preset matching coefficient to map them to image element (pixel) values, and drawing the recurrence plot with the mapped element values as pixel values;
S2, extracting features from the multi-parameter recurrence plot with a convolutional neural network to obtain the multi-sensor parameter features;
S3, extracting features from the image data with a Faster R-CNN network to obtain the image parameter features;
S4, constructing a cross-modal data fusion model: fusing the multi-sensor parameter features and the image parameter features according to their weights, performing further feature extraction on the fused features, and finally classifying the state of the electrical equipment through a multi-class neural network;
wherein the weighting factors of the multi-sensor parameter features and the image parameter features are obtained through training;
fusing the multi-sensor parameter features and the image parameter features according to weights specifically comprises: letting the weight of the multi-sensor parameter features be W_1 and the weight of the image data features be W_2, multiplying each set of features by its corresponding weight, and concatenating the results;
S5, training the cross-modal data fusion model on corresponding multi-sensor time-series data, image data and electrical equipment states, specifically: taking corresponding pairs of multi-sensor time-series data and image data as the model input and the electrical equipment state class as the output, and training the model;
S6, sensing the state of the electrical equipment with the trained cross-modal data fusion model.
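The S1 pipeline above (normalization, recurrence matrix R_n, pixel mapping) can be sketched as follows. Two assumptions are made, since the normalization and R_ij formulas appear in the source only as images: min-max scaling for the normalization, and an unthresholded recurrence definition R[i, j] = |x(t_i) − x(t_j)| with a matching coefficient of 255.

```python
import numpy as np

def normalize(x):
    """Min-max normalize a 1-D sensor time series to [0, 1]
    (assumed form of the patent's normalization formula)."""
    x = np.asarray(x, dtype=float)
    rng = x.max() - x.min()
    return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)

def recurrence_matrix(x):
    """Assumed unthresholded recurrence matrix R[i, j] = |x(t_i) - x(t_j)|."""
    x = np.asarray(x, dtype=float)
    return np.abs(x[:, None] - x[None, :])

def recurrence_plot_pixels(x, coeff=255.0):
    """S11-S12: build R_n, then multiply by a preset matching
    coefficient to map matrix elements to pixel values."""
    r = recurrence_matrix(normalize(x))
    return np.clip(r * coeff, 0, 255).astype(np.uint8)

pixels = recurrence_plot_pixels([0.1, 0.5, 0.9, 0.3])
print(pixels.shape)  # (4, 4) — an n×n single-channel image
```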
2. The cross-modal data fusion method for electrical equipment state perception according to claim 1, wherein the convolutional neural network in step S2 is structured as follows:
first layer: an input layer that takes the n×n multi-sensor recurrence plot;
second layer: a convolutional layer with 32 convolution kernels of size 5×5, stride s = 1 and padding = 2, followed by a ReLU nonlinearity;
third layer: a max-pooling layer of size 2×2, stride s = 2, padding = 0;
fourth layer: a convolutional layer with 32 convolution kernels of size 5×5, stride s = 1 and padding = 2, followed by a ReLU nonlinearity;
fifth layer: a max-pooling layer of size 2×2, stride s = 2, padding = 0;
sixth layer: a fully connected layer of size 1×125.
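The feature-map sizes in this structure can be traced with the standard convolution-arithmetic formula. The input size n is not fixed by the claim; n = 100 is used purely for illustration.

```python
def conv_out(n, kernel, stride, padding):
    """Spatial output size of a convolution or pooling layer."""
    return (n + 2 * padding - kernel) // stride + 1

def cnn_shapes(n):
    """Trace the feature-map sizes of the six-layer network in claim 2
    for an n×n recurrence-plot input."""
    shapes = [("input", n, 1)]
    n = conv_out(n, 5, 1, 2); shapes.append(("conv1 5x5 s1 p2", n, 32))
    n = conv_out(n, 2, 2, 0); shapes.append(("pool1 2x2 s2", n, 32))
    n = conv_out(n, 5, 1, 2); shapes.append(("conv2 5x5 s1 p2", n, 32))
    n = conv_out(n, 2, 2, 0); shapes.append(("pool2 2x2 s2", n, 32))
    return shapes

for name, size, ch in cnn_shapes(100):
    print(f"{name}: {size}x{size}x{ch}")
# A 5×5 convolution with padding 2 preserves the spatial size, and each
# 2×2 pool halves it, so a 100×100 input reaches the 1×125 fully
# connected layer from a 25×25×32 feature map.
```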
3. The cross-modal data fusion method for electrical equipment state perception according to claim 1, wherein in step S3, performing feature extraction on the image data with the Faster R-CNN network specifically comprises: taking the first fully connected layer after ROI Pooling in Faster R-CNN, together with the network preceding that layer, as the feature extraction network for the image data, and extracting the features of the image data with it.
4. The cross-modal data fusion method for electrical equipment state perception according to claim 1, wherein in step S4, performing feature extraction on the fused features and finally classifying the state of the electrical equipment through the multi-class neural network specifically comprises: passing the fused features through a 1×4096 fully connected layer and a 1×m fully connected layer in turn, finally connected to a 1×m multi-class neural network, where m is the number of nodes of the multi-class neural network, i.e., the number of electrical equipment state classes.
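A rough sketch of the weighted fusion of claim 1 (S4) together with the classification head of this claim: the 1×125 sensor-feature and 1×4096 image-feature sizes are taken from the embodiment, W_1 and W_2 stand in for the learned fusion weights, and the softmax output and random layer weights are illustrative assumptions, not trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def classify(sensor_feat, image_feat, w1, w2, m=2):
    """Weighted concatenation (S4), then the 4096-unit and m-unit fully
    connected layers of claim 4; layer weights are random stand-ins."""
    fused = np.concatenate([w1 * sensor_feat, w2 * image_feat])  # 1x4221
    fc1 = rng.standard_normal((fused.size, 4096), dtype=np.float32) * 0.01
    fc2 = rng.standard_normal((4096, m), dtype=np.float32) * 0.01
    return softmax(fused @ fc1 @ fc2)

probs = classify(rng.random(125), rng.random(4096), w1=0.4, w2=0.6)
print(probs.shape)  # (2,) — one probability per equipment-state class
```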
5. The method of claim 1, wherein the objective function of model training is L_cls:
Figure FDA0003604741690000021
where m denotes the number of training samples, p_i is the predicted classification result of the i-th sample, computed by the classification formula of the multi-class neural network, and
Figure FDA0003604741690000022
denotes the true classification result of the i-th sample.
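The L_cls formula itself is shown only as an image in the source. A common choice consistent with the surrounding text (a mean over m samples of the discrepancy between the prediction p_i and the true label) is cross-entropy, sketched here for the two-class case of the embodiment; treat this as an assumed reconstruction, not the claim's exact formula.

```python
import math

def cls_loss(p, y):
    """Assumed form of L_cls: mean binary cross-entropy over m samples.
    p[i] is the predicted probability for sample i; y[i] is the true
    label (0 or 1)."""
    m = len(p)
    eps = 1e-12  # numerical guard against log(0)
    return -sum(yi * math.log(pi + eps) + (1 - yi) * math.log(1 - pi + eps)
                for pi, yi in zip(p, y)) / m

loss = cls_loss([0.9, 0.2, 0.8], [1, 0, 1])  # small loss for good predictions
```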
CN202011334424.3A 2020-11-25 2020-11-25 Cross-modal data fusion method for electrical equipment state perception Active CN112418324B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011334424.3A CN112418324B (en) 2020-11-25 2020-11-25 Cross-modal data fusion method for electrical equipment state perception

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011334424.3A CN112418324B (en) 2020-11-25 2020-11-25 Cross-modal data fusion method for electrical equipment state perception

Publications (2)

Publication Number Publication Date
CN112418324A CN112418324A (en) 2021-02-26
CN112418324B true CN112418324B (en) 2022-06-24

Family

ID=74842057

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011334424.3A Active CN112418324B (en) 2020-11-25 2020-11-25 Cross-modal data fusion method for electrical equipment state perception

Country Status (1)

Country Link
CN (1) CN112418324B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113518114B (en) * 2021-05-12 2024-07-12 江苏力行电力电子科技有限公司 Artificial intelligence control method and system based on multi-mode ad hoc network
CN113344137B (en) * 2021-07-06 2022-07-19 电子科技大学成都学院 SOM-based data fusion method and device, storage medium and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108182441A (en) * 2017-12-29 2018-06-19 华中科技大学 Parallel multichannel convolutive neural network, construction method and image characteristic extracting method
CN108614548A (en) * 2018-04-03 2018-10-02 北京理工大学 A kind of intelligent failure diagnosis method based on multi-modal fusion deep learning
CN109738512A (en) * 2019-01-08 2019-05-10 重庆大学 Nondestructive testing system and method based on multiphysics fusion
CN111507233A (en) * 2020-04-13 2020-08-07 吉林大学 Multi-mode information fusion intelligent vehicle pavement type identification method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10902343B2 (en) * 2016-09-30 2021-01-26 Disney Enterprises, Inc. Deep-learning motion priors for full-body performance capture in real-time

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108182441A (en) * 2017-12-29 2018-06-19 华中科技大学 Parallel multichannel convolutive neural network, construction method and image characteristic extracting method
CN108614548A (en) * 2018-04-03 2018-10-02 北京理工大学 A kind of intelligent failure diagnosis method based on multi-modal fusion deep learning
CN109738512A (en) * 2019-01-08 2019-05-10 重庆大学 Nondestructive testing system and method based on multiphysics fusion
CN111507233A (en) * 2020-04-13 2020-08-07 吉林大学 Multi-mode information fusion intelligent vehicle pavement type identification method

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
Li Yongde. Research on Electrical Equipment Fault Diagnosis with Multi-Source Sensor Data Fusion. Information & Communications, 2018, pp. 104-106. *
Li Jin, Zhang Meng. Power Equipment Fault Diagnosis Method Based on Multi-Sensor Information Fusion. Electronics World, Chinese Institute of Electronics, 2016, pp. 131, 140. *
Lin Gang, Wang Bo, Peng Hui, Wang Xiaoyang, Chen Siyuan, Zhang Liming. Multi-Object Detection and Localization in Transmission Line Inspection Images Based on Improved Faster-RCNN. Electric Power Automation Equipment, Nanjing Electric Power Automation Research Institute Co., Ltd., 2019, pp. 213-218. *
Wang Xunting, Wang Bo. Vulnerable-Community Assessment Method for Power Grids Considering Cyber-Physical Fusion. Electric Power Automation Equipment, Nanjing Electric Power Automation Research Institute Co., Ltd., 2017, pp. 43-51. *
Wang Hongxia, Wang Bo, Chen Hongkun, Liu Chang, Ma Fuqi, Luo Peng, Yang Yan. Power Data Fusion: Basic Concepts, Abstract Structure, Key Technologies and Application Scenarios. Distribution & Utilization, Yingda Media (Shanghai) Co., Ltd. / State Grid Shanghai Electric Power Company, 2020, pp. 24-32. *
Wei Daqian, Wang Bo, Liu Dichen, Chen Dezhi, Tang Fei, Guo Ke. WAMS/SCADA Data Fusion Method Based on Correlation Mining of Time-Series Data. High Voltage Engineering, National High Voltage Metrology Station / Chinese Society for Electrical Engineering, 2016, pp. 315-320. *

Also Published As

Publication number Publication date
CN112418324A (en) 2021-02-26

Similar Documents

Publication Publication Date Title
CN106529090A (en) Evaluation method of reliability of aerospace electronic product
CN112418324B (en) Cross-modal data fusion method for electrical equipment state perception
CN114118251A (en) Fault diagnosis and early warning method based on multi-source data fusion and convolutional Siamese neural network
CN116610998A (en) Switch cabinet fault diagnosis method and system based on multi-mode data fusion
CN117407770A (en) Classification and prediction method of high-voltage switchgear failure modes based on neural network
CN115013298A (en) Real-time performance on-line monitoring system and monitoring method of sewage pump
CN117269644A (en) Line fault monitoring system and method for current transformer
CN113960090A (en) LSTM neural network algorithm-based soil Cd element spectrum qualitative analysis method
CN117484031A (en) Photovoltaic module welding processing equipment
CN115659252A (en) A GIS Partial Discharge Pattern Recognition Method Based on PRPD Multi-Feature Information Fusion
CN117332352B (en) Lightning arrester signal defect identification method based on BAM-AlexNet
CN118710376A (en) Intelligent financial product recommendation method and system based on multimodal user portrait
CN112926016A (en) Multivariable time series change point detection method
CN118822956A (en) Industrial wiring harness quality detection method, medium and system based on machine vision
CN118295842A (en) Data processing method, device and server for transaction system abnormal event
CN116739996A (en) Power transmission line insulator fault diagnosis method based on deep learning
CN118311434A (en) Lithium-ion battery SOH estimation method and system based on electrochemical impedance spectroscopy
CN117791856A (en) Power grid fault early warning method and device based on inspection robot
CN110908365A (en) A kind of unmanned aerial vehicle sensor fault diagnosis method, system and readable storage medium
CN116577602A (en) Cable defect positioning method based on broadband impedance spectrum and self-attention mechanism coupling
CN116908184A (en) A ground wire crimp detection system and its detection method
CN116559680A (en) Battery fault diagnosis method and device, electronic equipment and storage medium
CN115729200B (en) A method and device for constructing a UAV servo fault detection model, and a method and device for detecting a UAV servo fault
CN113468823B (en) Optical module damage detection method and system based on machine learning
CN113807267A (en) Suspension insulator discharge severity assessment method based on ultraviolet video and deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20250122

Address after: Luojiashan, Wuchang District, Wuhan, Hubei 430000

Patentee after: WUHAN University

Country or region after: China

Patentee after: CHINA SOUTHERN POWER GRID Co.,Ltd.

Patentee after: NARI TECHNOLOGY Co.,Ltd.

Address before: No. 299 Bayi Road, Wuchang District, Wuhan, Hubei 430072

Patentee before: WUHAN University

Country or region before: China