CN112418324B - Cross-modal data fusion method for electrical equipment state perception - Google Patents

Cross-modal data fusion method for electrical equipment state perception

Info

Publication number
CN112418324B
CN112418324B CN202011334424.3A CN202011334424A
Authority
CN
China
Prior art keywords
data
sensor
parameter
electrical equipment
state
Prior art date
Legal status
Active
Application number
CN202011334424.3A
Other languages
Chinese (zh)
Other versions
CN112418324A (en)
Inventor
王波
王红霞
罗鹏
马富齐
李怡凡
周胤宇
张嘉鑫
张迎晨
冯磊
王雷雄
马恒瑞
Current Assignee
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN202011334424.3A
Publication of CN112418324A
Application granted
Publication of CN112418324B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G06F 18/253 Fusion techniques of extracted features
    • G06F 18/254 Fusion techniques of classification results, e.g. of results related to same input data
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G06N 3/08 Learning methods


Abstract

The invention discloses a cross-modal data fusion method for electrical equipment state perception. The method first converts multi-sensor time-series data into a recurrence plot; it then extracts features from the recurrence plot and from image data of the electrical equipment with separate convolutional neural networks; the two kinds of features are concatenated according to weights, and the fused features undergo further feature extraction and state-grade perception. By fully exploiting two kinds of cross-modal data in the monitoring data of electrical equipment, multi-sensor data and images, the method alleviates, to a certain extent, the low accuracy and poor fault tolerance of state perception based on single-modal data.

Description

Cross-modal data fusion method for electrical equipment state perception
Technical Field
The invention belongs to the technical field of electrical equipment state monitoring, and particularly relates to a cross-modal data fusion method for electrical equipment state perception.
Background
Effective state perception of electrical equipment, finding hidden dangers in time and taking corresponding measures, is key to the safe and stable operation of the power grid.
The state of electrical equipment results from the combined action of multiple parameters. Effectively fusing all influencing factors so that they complement and reinforce one another enables state perception from multiple angles and in all directions, improving perception accuracy. Moreover, when communication or measurement errors degrade the data quality of one parameter, the other kinds of data still compensate, which effectively improves the fault tolerance of perception.
With the gradual development of digital infrastructure and the power Internet of Things, power sensing terminals are growing in both number and variety, providing an effective data acquisition channel for all-dimensional, multi-angle state perception of electrical equipment. The multi-modal parameters generated by multi-source sensing terminals under the power Internet of Things include structured parameters, represented by electrical measurements and other time series, as well as unstructured parameters such as images and maintenance reports; the two differ greatly in physical meaning and representation form, hence the term multi-modal parameters. Data of the same modality share the same form, can be regarded as parameters in the same coordinate system, and are relatively easy to fuse; cross-modal data, by contrast, differ in both structural form and physical meaning, are difficult to describe in a unified way, and are therefore hard to fuse. In short, the rapid development of the power grid supplies ample data sources for electrical equipment state perception based on cross-modal data fusion, but cross-type, multi-dimensional data analysis techniques remain weak, the ability to mine correlations among state quantities is insufficient, and electrical equipment state perception is consequently inadequate.
At present, image parameters and various structured parameters are widely applied to the state evaluation of electrical equipment, for example monitoring the temperature of equipment with infrared image data, or monitoring gas levels in the equipment's operating environment with various sensors. However, existing methods mainly evaluate state from single-modal data and therefore suffer from a single description angle, low accuracy, and poor fault tolerance.
Disclosure of Invention
The invention aims to provide a cross-modal data fusion method for electrical equipment state perception, solving the problems that existing state perception methods rely only on single-modal data and therefore suffer low perception accuracy and poor fault tolerance.
The invention provides a cross-modal data fusion method for electrical equipment state perception, which comprises the following steps:
S1, normalizing the multi-sensor time-series data and generating a multi-parameter recurrence plot;
S2, extracting features from the multi-parameter recurrence plot with a convolutional neural network to obtain the multi-sensor parameter features;
S3, extracting features from the image data with a Faster R-CNN network to obtain the image parameter features;
S4, constructing a cross-modal data fusion model: fusing the multi-sensor parameter features and the image parameter features according to weights, performing feature extraction on the fused features, and finally classifying the electrical equipment state with a multi-class neural network;
S5, training the cross-modal data fusion model with corresponding multi-sensor time-series data, image data, and electrical equipment states;
S6, perceiving the electrical equipment state with the trained cross-modal data fusion model.
Further, in step S1, the multi-sensor time-series data are normalized as follows:

$$\tilde{x}_i(t_k) = \frac{x_i(t_k) - \min_{1 \le l \le n} x_i(t_l)}{\max_{1 \le l \le n} x_i(t_l) - \min_{1 \le l \le n} x_i(t_l)}, \qquad k = 1, 2, \dots, n$$

where $x_i(t_1), \dots, x_i(t_n)$ denote the sampled values of sensor parameter $i$ at times $t_1$ through $t_n$, and $n$ represents the length of the time series.
Further, in step S1, generating the multi-parameter recurrence plot specifically comprises:
S11, for a sensor time series of length $n$, generating the recurrence matrix $R_n$:

$$R_n = \begin{bmatrix} R_{11} & R_{12} & \cdots & R_{1n} \\ R_{21} & R_{22} & \cdots & R_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ R_{n1} & R_{n2} & \cdots & R_{nn} \end{bmatrix}, \qquad R_{ij} = \left\| \vec{x}(t_i) - \vec{x}(t_j) \right\|$$

where $R_{ij}$ denotes an element of the recurrence matrix $R_n$, $\vec{x}(t_i) = [x_1(t_i), x_2(t_i), \dots, x_k(t_i)]^{\mathrm{T}}$ is the vector of sampled values of all $k$ sensor classes at time $t_i$, and $x_j(t_i)$ is the sampled value of the class-$j$ sensor at time $t_i$.
S12, multiplying every element of the recurrence matrix by a preset matching coefficient so that the matrix matches the value range of image pixels, and drawing the recurrence plot with the matched element values as pixel values.
Further, the structure of the convolutional neural network in step S2 is specifically:
First layer: an input layer that takes a multi-sensor recurrence plot of size n × n;
Second layer: a convolutional layer with 32 convolution kernels of size 5 × 5, stride s = 1, padding p = 2, followed by a ReLU nonlinearity;
Third layer: a max-pooling layer of size 2 × 2, stride s = 2, padding p = 0;
Fourth layer: a convolutional layer with 32 convolution kernels of size 5 × 5, stride s = 1, padding p = 2, followed by a ReLU nonlinearity;
Fifth layer: a max-pooling layer of size 2 × 2, stride s = 2, padding p = 0;
Sixth layer: a fully connected layer of size 1 × 125.
Further, in step S3, extracting features from the image data with the Faster R-CNN network specifically comprises: using the first fully connected layer after ROI Pooling in Faster R-CNN, together with the network preceding it, as the feature extraction network for the image data.
Further, in step S4, the weighting factors of the multi-sensor parameter features and the image parameter features are obtained through training.
Further, in step S4, fusing the multi-sensor parameter features and the image parameter features according to weights specifically comprises:
letting the weight of the multi-sensor parameter features be $W_1$ and the weight of the image parameter features be $W_2$, multiplying each set of features by its corresponding weight, and concatenating the results.
Further, in step S4, performing feature extraction on the fused features and finally classifying the electrical equipment state with the multi-class neural network specifically comprises: passing the fused features through a 1 × 4096 fully connected layer and then a 1 × m fully connected layer, and finally connecting to a 1 × m multi-class neural network, where m is the number of output nodes of the multi-class neural network, i.e., the number of electrical equipment state classes.
Further, in step S5, training the cross-modal data fusion model with corresponding multi-sensor time-series data, image data, and electrical equipment states specifically comprises: performing model training with corresponding multi-sensor time-series data and image data pairs as model input and the electrical equipment state class as output.
Further, the objective function of the model training is $L_{cls}$:

$$L_{cls} = -\frac{1}{m} \sum_{i=1}^{m} p_i^{*} \log p_i$$

where $m$ represents the number of training samples, $p_i$ is the predicted classification result of the $i$-th sample, computed by the multi-class neural network classification formula, and $p_i^{*}$ represents the true classification result of the $i$-th sample.
The invention has the following beneficial effects: the cross-modal data fusion method for electrical equipment state perception performs fused perception of the equipment state from two kinds of cross-modal data, multi-sensor time-series data and image data; it fully exploits these two kinds of cross-modal monitoring data of electrical equipment and overcomes the low accuracy and poor fault tolerance of single-modal data perception.
Drawings
FIG. 1 is a flow chart of the cross-modal data fusion method for electrical equipment state perception according to the present invention.
FIG. 2 is a diagram of the recurrence-plot-based multi-sensor parameter feature extraction network for electrical equipment.
FIG. 3 is a diagram of the image feature extraction network for electrical equipment according to the present invention.
FIG. 4 is a diagram of the weighting-factor-based feature fusion network according to the present invention.
Detailed Description
The invention will be further described below with reference to the accompanying drawings.
the invention discloses a cross-modal data fusion method for sensing the state of electrical equipment, which is used for carrying out fusion sensing on the state of the electrical equipment based on two types of cross-modal data, namely multi-sensor data and image data. The invention firstly converts multi-sensor time sequence data into a recursion graph; then, respectively using different convolutional neural networks to perform feature extraction on the recursive graph and the image data of the electrical equipment; and then, the two types of data features are effectively spliced according to the weight, and finally, the fused features are further subjected to feature extraction and state grade perception. The invention fully utilizes two types of cross-modal data, namely multi-sensor data and images, in the monitoring data of the electrical equipment, and solves the problems of low accuracy and poor fault tolerance in single-modal data sensing to a certain extent.
The cross-modal data fusion method for electrical equipment state perception disclosed by the embodiment of the invention, as shown in fig. 1, comprises the following steps:
S1, normalizing the multi-sensor time-series data and generating a multi-parameter recurrence plot;
S2, extracting features from the multi-parameter recurrence plot with a convolutional neural network to obtain the multi-sensor parameter features;
S3, extracting features from the image data with a Faster R-CNN network to obtain the image parameter features;
S4, constructing a cross-modal data fusion model: fusing the multi-sensor parameter features and the image parameter features according to weights, performing feature extraction on the fused features, and finally classifying the electrical equipment state with a multi-class neural network;
S5, training the cross-modal data fusion model with corresponding multi-sensor time-series data, image data, and electrical equipment states;
S6, perceiving the electrical equipment state with the trained cross-modal data fusion model.
Further, in step S1, the multi-sensor time-series data are normalized as follows:

$$\tilde{x}_i(t_k) = \frac{x_i(t_k) - \min_{1 \le l \le n} x_i(t_l)}{\max_{1 \le l \le n} x_i(t_l) - \min_{1 \le l \le n} x_i(t_l)}, \qquad k = 1, 2, \dots, n$$

where $x_i(t_1), \dots, x_i(t_n)$ denote the sampled values of sensor parameter $i$ at times $t_1$ through $t_n$, and $n$, the length of the time series, is determined by the sampling frequencies of the image data and the sensor data. So that the time-series data correspond to, i.e., temporally match, the image data, the sampling instant of the last datum in the multi-sensor time series is the same as, or close to, the image sampling instant.
Further, in step S1, generating the multi-parameter recurrence plot specifically comprises:
S11, for a sensor time series of length $n$, generating the recurrence matrix $R_n$:

$$R_n = \begin{bmatrix} R_{11} & R_{12} & \cdots & R_{1n} \\ R_{21} & R_{22} & \cdots & R_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ R_{n1} & R_{n2} & \cdots & R_{nn} \end{bmatrix}, \qquad R_{ij} = \left\| \vec{x}(t_i) - \vec{x}(t_j) \right\|$$

where $R_{ij}$ denotes an element of the recurrence matrix $R_n$, $\vec{x}(t_i) = [x_1(t_i), x_2(t_i), \dots, x_k(t_i)]^{\mathrm{T}}$ is the vector of sampled values of all $k$ sensor classes at time $t_i$, and $x_j(t_i)$ is the sampled value of the class-$j$ sensor at time $t_i$.
S12, in order to match the element values of the recurrence plot to those of the image data, every element of the recurrence matrix is multiplied by a preset matching coefficient (100 in this embodiment), and the recurrence plot is drawn with the matched element values as pixel values.
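As an illustration of step S1, the following minimal Python sketch generates a multi-parameter recurrence plot from raw sensor data. It assumes min-max normalization and the Euclidean norm between the sampled sensor vectors; the function and variable names are illustrative, not taken from the patent.

import numpy as np

def multi_parameter_recurrence_plot(x, matching_coefficient=100.0):
    # x: array of shape (k, n) -- k sensor classes, time series of length n.
    # Min-max normalize each sensor parameter independently to [0, 1].
    x_min = x.min(axis=1, keepdims=True)
    x_max = x.max(axis=1, keepdims=True)
    x_norm = (x - x_min) / (x_max - x_min + 1e-12)
    # R_ij = || x(t_i) - x(t_j) || over the k-dimensional sensor vectors.
    diff = x_norm[:, :, None] - x_norm[:, None, :]   # shape (k, n, n)
    r = np.linalg.norm(diff, axis=0)                 # shape (n, n)
    # Multiply by the matching coefficient (100 in this embodiment) so the
    # element values match the value range of image pixels.
    return matching_coefficient * r

# Example: 10 sensor parameters, time-series length 20, as in the embodiment.
plot = multi_parameter_recurrence_plot(np.random.rand(10, 20))
print(plot.shape)  # (20, 20)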
Further, the convolutional neural network structure in step S2 is shown in fig. 2, and is specifically:
First layer: an input layer that takes a multi-sensor recurrence plot of size n × n;
Second layer: a convolutional layer with 32 convolution kernels of size 5 × 5, stride s = 1, padding p = 2, followed by a ReLU nonlinearity;
Third layer: a max-pooling layer of size 2 × 2, stride s = 2, padding p = 0;
Fourth layer: a convolutional layer with 32 convolution kernels of size 5 × 5, stride s = 1, padding p = 2, followed by a ReLU nonlinearity;
Fifth layer: a max-pooling layer of size 2 × 2, stride s = 2, padding p = 0;
Sixth layer: a fully connected layer of size 1 × 125.
In fig. 2, k is the number of sensor classes and n is the length of the time series; conv(32 × 5 × 5) denotes 32 convolution kernels of size 5 × 5; maxpooling1(2 × 2) denotes max pooling of size 2 × 2; s is the stride in each dimension of the image during convolution; and p is the padding, i.e., the border filled in around the image, so p = 2 means the image is surrounded by two rings of zeros.
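For illustration, a minimal PyTorch sketch of this six-layer feature extraction network follows; the layer sizes come from the description above, and the flattened dimension assumes n is divisible by 4 (n = 20 gives a 32 × 5 × 5 feature map, i.e., 800 inputs to the fully connected layer).

import torch
import torch.nn as nn

class RecurrencePlotCNN(nn.Module):
    def __init__(self, n):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=5, stride=1, padding=2),   # second layer
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2, padding=0),       # third layer
            nn.Conv2d(32, 32, kernel_size=5, stride=1, padding=2),  # fourth layer
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2, padding=0),       # fifth layer
        )
        self.fc = nn.Linear(32 * (n // 4) * (n // 4), 125)          # sixth layer

    def forward(self, x):                # x: (batch, 1, n, n) recurrence plots
        return self.fc(self.features(x).flatten(1))                # (batch, 125)

features = RecurrencePlotCNN(n=20)(torch.randn(4, 1, 20, 20))
print(features.shape)  # torch.Size([4, 125])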
Further, in step S3, feature extraction on the image data with the Faster R-CNN network specifically comprises: using the first fully connected layer after ROI Pooling (region-of-interest pooling) in Faster R-CNN, together with the network preceding it, as the feature extraction network for the image data, as shown in fig. 3.
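As a hedged sketch of this step, the code below captures the output of the first fully connected layer after ROI pooling with a forward hook. It uses torchvision's ResNet-50-FPN Faster R-CNN for illustration, whose box head outputs 1024-dimensional features per region; the 1 × 4096 size described in this patent corresponds to the original VGG-16-based Faster R-CNN, and how per-region features are reduced to a single image feature is not shown here.

import torch
import torchvision

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

captured = {}
def save_roi_fc(module, inputs, output):
    # Activations of the first FC layer after ROI pooling, one row per region.
    captured["roi_fc"] = output.detach()

# box_head is a two-layer MLP head; fc6 is its first fully connected layer.
model.roi_heads.box_head.fc6.register_forward_hook(save_roi_fc)

with torch.no_grad():
    model([torch.rand(3, 480, 640)])  # stand-in for an infrared image

print(captured["roi_fc"].shape)  # (num_proposals, 1024) for this backbone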
Further, in step S4, fusing the multi-sensor parameter features and the image parameter features according to weights specifically comprises: letting the weight of the multi-sensor parameter features be $W_1$ and the weight of the image parameter features be $W_2$, multiplying each set of features by its corresponding weight, and concatenating the results. To avoid the influence of human factors, the weighting factors of the multi-sensor parameter features and the image parameter features in step S4 can be obtained through training.
Further, in step S4, performing feature extraction on the fused features and finally classifying the electrical equipment state with the multi-class neural network specifically comprises: passing the fused features through a 1 × 4096 fully connected layer and then a 1 × m fully connected layer, and finally connecting to a 1 × m multi-class neural network, where m is the number of electrical equipment state classes. In this embodiment, the feature extraction network for the multi-sensor parameters outputs features of size 1 × 125, the feature extraction network for the image outputs features of size 1 × 4096, the weight corresponding to the multi-sensor parameter features is $W_1$, and the weight corresponding to the image parameter features is $W_2$. As shown in fig. 4, the two are concatenated according to these weights into a feature of size 1 × 4221, and feature extraction is then performed on the fused feature: taking this feature as input and electrical equipment state perception as the objective, feature extraction and parameter training proceed through a 1 × 4096 fully connected layer and a 1 × m fully connected layer in turn, finally connecting to a 1 × m softmax (multi-class neural network) for state classification. The electrical equipment states are divided into m classes, so the softmax outputs a 1 × m matrix whose i-th element represents the probability that the input belongs to the i-th class.
For the softmax, let its input be $x$; the probability that the target corresponds to the $j$-th class is $p(y = j \mid x)$:

$$p(y = j \mid x) = \frac{e^{W_j^{\mathrm{T}} x}}{\sum_{c=1}^{m} e^{W_c^{\mathrm{T}} x}}$$

where $W$ is the model parameter mapping $x$ to $y$.
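The following minimal PyTorch sketch illustrates the weighted fusion and classification head of step S4. The scalar weights W1 and W2 are modeled as trainable parameters, consistent with the statement that they are obtained through training; the sizes follow the embodiment (125 + 4096 = 4221 fused features), and the ReLU between the fully connected layers is an assumption.

import torch
import torch.nn as nn

class WeightedFusionHead(nn.Module):
    def __init__(self, sensor_dim=125, image_dim=4096, num_classes=2):
        super().__init__()
        self.w1 = nn.Parameter(torch.ones(1))  # weight of sensor features, trained
        self.w2 = nn.Parameter(torch.ones(1))  # weight of image features, trained
        self.fc1 = nn.Linear(sensor_dim + image_dim, 4096)  # 1 x 4096 FC layer
        self.fc2 = nn.Linear(4096, num_classes)             # 1 x m FC layer

    def forward(self, f_sensor, f_image):
        # Multiply each feature by its weight, then concatenate (1 x 4221).
        fused = torch.cat([self.w1 * f_sensor, self.w2 * f_image], dim=1)
        logits = self.fc2(torch.relu(self.fc1(fused)))
        return torch.softmax(logits, dim=1)  # 1 x m class probabilities

probs = WeightedFusionHead()(torch.randn(4, 125), torch.randn(4, 4096))
print(probs.shape)  # torch.Size([4, 2])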
Further, in step S5, training the cross-modal data fusion model with corresponding multi-sensor time-series data, image data, and electrical equipment states specifically comprises: generating a joint label for the multi-sensor data and the image data, i.e., taking the multi-sensor data and image data at corresponding times as a data pair with a common label, and performing model training with the multi-sensor and image data pairs as model input and the electrical equipment state class as output.
Further, the objective function of the model training is $L_{cls}$:

$$L_{cls} = -\frac{1}{m} \sum_{i=1}^{m} p_i^{*} \log p_i$$

where $m$ represents the number of training samples, $p_i$ is the predicted classification result of the $i$-th sample, computed by the multi-class neural network classification formula, and $p_i^{*}$ represents the true classification result of the $i$-th sample.
Training stops once $L_{cls} \le \sigma$ is satisfied, where $\sigma$ is an error threshold that can be determined according to the actual situation.
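A minimal training-loop sketch for step S5 follows, assuming L_cls is the standard cross-entropy between the model output and the true class labels, that the model returns raw logits (the softmax is folded into the loss), and that Adam is the optimizer; the optimizer choice and learning rate are assumptions not stated in the patent. Training stops once the epoch-average loss falls below the threshold sigma.

import torch
import torch.nn as nn

def train_fusion_model(model, loader, sigma=1e-3, max_epochs=100):
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()  # cross-entropy, averaged over the batch
    for epoch in range(max_epochs):
        total, batches = 0.0, 0
        for recurrence_plot, image, label in loader:  # joint-labeled data pairs
            optimizer.zero_grad()
            loss = criterion(model(recurrence_plot, image), label)
            loss.backward()
            optimizer.step()
            total, batches = total + loss.item(), batches + 1
        if total / batches <= sigma:  # stop once L_cls <= error threshold sigma
            break
    return model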
After training is complete, the state of the electrical equipment can be perceived with the trained cross-modal data fusion model, taking corresponding multi-sensor data and image data pairs as input.
The invention also provides another embodiment, which uses multi-sensor data and infrared data to perceive the state of a transformer. The sensor data are structured and small in volume, so their transmission cost is low; they are acquired every 10 minutes, and Table 1 lists the sensor data categories. The camera collects unstructured data with a large volume and high transmission cost, generally acquired every 2 hours.
TABLE 1 Multi-sensor parameter types and sampling frequency table
[Table 1 is reproduced as an image in the original publication; it lists the multi-sensor parameter types (winding temperature, full current, capacitance value, etc.) and their sampling frequencies.]
The data set used in this embodiment contains 6432 infrared images. Following the multi-parameter recurrence plot conversion method of step 1, each infrared image corresponds to a 10 × 20 sensor parameter matrix, so the sensor parameters total 10 × 20 × 6432. As shown in Table 2, 5432 data pairs are used as the training set and 1000 data pairs as the test set. To verify the method of the present invention, this embodiment divides the transformer states into two classes: abnormal and non-abnormal.
TABLE 2 data distribution Table
[Table 2 is reproduced as an image in the original publication; it gives the data distribution: 5432 multi-sensor/image data pairs for training and 1000 for testing.]
The specific steps are as follows:
Step 1, choosing the multi-sensor time-series length n = 20, normalizing the ten parameters (winding temperature, full current, capacitance value, etc.), and generating a 20 × 20 multi-parameter recurrence plot.
Step 2, performing feature extraction on the generated multi-parameter recurrence plot and outputting a feature vector of size 1 × 125.
Step 3, performing feature extraction with Faster R-CNN on the infrared image data corresponding to the multi-sensor parameters and outputting a feature vector of size 1 × 4096.
Step 4, concatenating the two kinds of features according to their weights to generate a fused feature vector of size 1 × 4221.
Step 5, with the transformer state grade as the target, training the fused features through a 1 × 4096 fully connected layer and a 1 × 2 fully connected layer.
Step 6, repeating steps 1 to 5 to train the model.
Step 7, testing with the trained model (an illustrative end-to-end check follows below).
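Reusing the illustrative sketches above with this embodiment's dimensions (ten sensor parameters, n = 20, two state classes), the following lines check that the shapes flow through end to end; random data stand in for real transformer measurements and for the Faster R-CNN image feature.

import numpy as np
import torch

rp = multi_parameter_recurrence_plot(np.random.rand(10, 20))  # 20 x 20 plot
rp = torch.as_tensor(rp, dtype=torch.float32)[None, None]     # (1, 1, 20, 20)
f_sensor = RecurrencePlotCNN(n=20)(rp)                        # 1 x 125
f_image = torch.randn(1, 4096)            # stand-in for the image feature
probs = WeightedFusionHead(num_classes=2)(f_sensor, f_image)  # 1 x 2
print(probs)  # probabilities for the abnormal / non-abnormal classes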
In this embodiment, the model is evaluated with Precision (P), Recall (R), Average Precision (AP), and Average Recall (AR). The test results are given in Table 3. The cross-modal data fusion method for electrical equipment state perception of the present invention perceives the equipment state accurately, with an average precision of 76.3% and an average recall of 72.9%.
TABLE 3 comparison of results for each model
[Table 3 is reproduced as an image in the original publication; it compares the test results of the models, with the proposed method achieving 76.3% average precision and 72.9% average recall.]
The specific embodiments described herein are merely illustrative of the spirit of the invention. Those skilled in the art may make various modifications, additions, or substitutions to the described embodiments without departing from the spirit of the invention or the scope defined by the appended claims.

Claims (5)

1. A cross-modal data fusion method for electrical equipment state perception, characterized by comprising the following steps:
S1, normalizing multi-sensor time-series data and generating a multi-parameter recurrence plot;
wherein the multi-sensor time-series data are normalized as follows:

$$\tilde{x}_i(t_k) = \frac{x_i(t_k) - \min_{1 \le l \le n} x_i(t_l)}{\max_{1 \le l \le n} x_i(t_l) - \min_{1 \le l \le n} x_i(t_l)}, \qquad k = 1, 2, \dots, n$$

where $x_i(t_1), \dots, x_i(t_n)$ denote the sampled values of sensor parameter $i$ at times $t_1$ through $t_n$, and $n$ represents the length of the time series;
and generating the multi-parameter recurrence plot specifically comprises:
S11, for a sensor time series of length $n$, generating the recurrence matrix $R_n$:

$$R_n = \begin{bmatrix} R_{11} & R_{12} & \cdots & R_{1n} \\ R_{21} & R_{22} & \cdots & R_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ R_{n1} & R_{n2} & \cdots & R_{nn} \end{bmatrix}, \qquad R_{ij} = \left\| \vec{x}(t_i) - \vec{x}(t_j) \right\|$$

where $R_{ij}$ denotes an element of the recurrence matrix $R_n$, $\vec{x}(t_i) = [x_1(t_i), x_2(t_i), \dots, x_k(t_i)]^{\mathrm{T}}$ is the vector of sampled values of all $k$ sensor classes at time $t_i$, and $x_j(t_i)$ is the sampled value of the class-$j$ sensor at time $t_i$;
S12, multiplying every element of the recurrence matrix by a preset matching coefficient so that the matrix matches the value range of image pixels, and drawing the recurrence plot with the matched element values as pixel values;
S2, extracting features from the multi-parameter recurrence plot with a convolutional neural network to obtain multi-sensor parameter features;
S3, extracting features from the image data with a Faster R-CNN network to obtain image parameter features;
S4, constructing a cross-modal data fusion model: fusing the multi-sensor parameter features and the image parameter features according to weights, performing feature extraction on the fused features, and finally classifying the electrical equipment state with a multi-class neural network;
wherein the weighting factors of the multi-sensor parameter features and the image parameter features are obtained through training;
and fusing the multi-sensor parameter features and the image parameter features according to weights specifically comprises: letting the weight of the multi-sensor parameter features be $W_1$ and the weight of the image parameter features be $W_2$, multiplying each set of features by its corresponding weight, and concatenating the results;
S5, training the cross-modal data fusion model with corresponding multi-sensor time-series data, image data, and electrical equipment states, specifically: performing model training with corresponding multi-sensor time-series data and image data pairs as model input and the electrical equipment state class as output;
S6, perceiving the electrical equipment state with the trained cross-modal data fusion model.
2. The cross-modal data fusion method for electrical equipment state perception according to claim 1, characterized in that the structure of the convolutional neural network in step S2 is specifically:
First layer: an input layer that takes a multi-sensor recurrence plot of size n × n;
Second layer: a convolutional layer with 32 convolution kernels of size 5 × 5, stride s = 1, padding p = 2, followed by a ReLU nonlinearity;
Third layer: a max-pooling layer of size 2 × 2, stride s = 2, padding p = 0;
Fourth layer: a convolutional layer with 32 convolution kernels of size 5 × 5, stride s = 1, padding p = 2, followed by a ReLU nonlinearity;
Fifth layer: a max-pooling layer of size 2 × 2, stride s = 2, padding p = 0;
Sixth layer: a fully connected layer of size 1 × 125.
3. The cross-modal data fusion method for electrical equipment state perception according to claim 1, characterized in that in step S3, extracting features from the image data with the Faster R-CNN network specifically comprises: using the first fully connected layer after ROI Pooling in Faster R-CNN, together with the network preceding it, as the feature extraction network for the image data.
4. The cross-modal data fusion method for electrical equipment state perception according to claim 1, characterized in that in step S4, performing feature extraction on the fused features and finally classifying the electrical equipment state with the multi-class neural network specifically comprises: passing the fused features through a 1 × 4096 fully connected layer and then a 1 × m fully connected layer, and finally connecting to a 1 × m multi-class neural network, where m is the number of output nodes of the multi-class neural network, i.e., the number of electrical equipment state classes.
5. The cross-modal data fusion method for electrical equipment state perception according to claim 1, characterized in that the objective function of the model training is $L_{cls}$:

$$L_{cls} = -\frac{1}{m} \sum_{i=1}^{m} p_i^{*} \log p_i$$

where $m$ represents the number of training samples, $p_i$ is the predicted classification result of the $i$-th sample, computed by the multi-class neural network classification formula, and $p_i^{*}$ represents the true classification result of the $i$-th sample.
CN202011334424.3A 2020-11-25 2020-11-25 Cross-modal data fusion method for electrical equipment state perception Active CN112418324B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011334424.3A CN112418324B (en) 2020-11-25 2020-11-25 Cross-modal data fusion method for electrical equipment state perception

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011334424.3A CN112418324B (en) 2020-11-25 2020-11-25 Cross-modal data fusion method for electrical equipment state perception

Publications (2)

Publication Number Publication Date
CN112418324A (en) 2021-02-26
CN112418324B (en) 2022-06-24

Family

ID=74842057

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011334424.3A Active CN112418324B (en) 2020-11-25 2020-11-25 Cross-modal data fusion method for electrical equipment state perception

Country Status (1)

Country Link
CN (1) CN112418324B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113518114B (en) * 2021-05-12 2024-07-12 江苏力行电力电子科技有限公司 Artificial intelligence control method and system based on multi-mode ad hoc network
CN113344137B (en) * 2021-07-06 2022-07-19 电子科技大学成都学院 SOM-based data fusion method and device, storage medium and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108182441A (en) * 2017-12-29 2018-06-19 华中科技大学 Parallel multichannel convolutive neural network, construction method and image characteristic extracting method
CN108614548A (en) * 2018-04-03 2018-10-02 北京理工大学 A kind of intelligent failure diagnosis method based on multi-modal fusion deep learning
CN109738512A (en) * 2019-01-08 2019-05-10 重庆大学 Nondestructive detection system and method based on multiple physical field fusion
CN111507233A (en) * 2020-04-13 2020-08-07 吉林大学 Multi-mode information fusion intelligent vehicle pavement type identification method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10902343B2 (en) * 2016-09-30 2021-01-26 Disney Enterprises, Inc. Deep-learning motion priors for full-body performance capture in real-time

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108182441A (en) * 2017-12-29 2018-06-19 华中科技大学 Parallel multichannel convolutive neural network, construction method and image characteristic extracting method
CN108614548A (en) * 2018-04-03 2018-10-02 北京理工大学 A kind of intelligent failure diagnosis method based on multi-modal fusion deep learning
CN109738512A (en) * 2019-01-08 2019-05-10 重庆大学 Nondestructive detection system and method based on multiple physical field fusion
CN111507233A (en) * 2020-04-13 2020-08-07 吉林大学 Multi-mode information fusion intelligent vehicle pavement type identification method

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
中国电机工程学会, 2016, pp. 315-320. *
国网上海市电力公司, 2020, pp. 24-32. *
李永德. Research on fault diagnosis of electrical equipment based on multi-source sensor data fusion. 《信息通信》, 2018, pp. 104-106. *
李进, 张萌. A power equipment fault diagnosis method based on multi-sensor information fusion. 《电子世界》, 中国电子学会, 2016, pp. 131, 140. *
林刚, 王波, 彭辉, 王晓阳, 陈思远, 张黎明. Multi-target detection and localization in transmission line inspection images based on improved Faster-RCNN. 《电力自动化设备》, 南京电力自动化研究所有限公司, 2019, pp. 213-218. *
汪勋婷, 王波. Vulnerable community assessment of power grids considering cyber-physical fusion. 《电力自动化设备》, 南京电力自动化研究所有限公司, 2017, pp. 43-51. *
王红霞, 王波, 陈红坤, 刘畅, 马富齐, 罗鹏, 杨艳. Power data fusion: basic concepts, abstract structure, key technologies and application scenarios. 《供用电》, 英大传媒(上海)有限公司. *
魏大千, 王波, 刘涤尘, 陈得治, 唐飞, 郭珂. A WAMS/SCADA data fusion method based on correlation mining of time-series data. 《高电压技术》, 国家高电压计量站. *

Also Published As

Publication number Publication date
CN112418324A (en) 2021-02-26

Similar Documents

Publication Publication Date Title
CN110376522B (en) Motor fault diagnosis method of data fusion deep learning network
CN111914883B (en) Spindle bearing state evaluation method and device based on deep fusion network
CN112418324B (en) Cross-modal data fusion method for electrical equipment state perception
CN112507915B (en) Bolt connection structure loosening state identification method based on vibration response information
CN108435819B (en) Energy consumption abnormity detection method for aluminum profile extruder
CN112085071A (en) Power distribution room equipment fault analysis and pre-judgment method and device based on edge calculation
CN116610998A (en) Switch cabinet fault diagnosis method and system based on multi-mode data fusion
CN112180204A (en) Power grid line fault diagnosis method based on electric quantity information
CN112069930A (en) Vibration signal processing method and device for improving GIS equipment fault diagnosis accuracy
CN108562821A (en) A kind of method and system determining Single-phase Earth-fault Selection in Distribution Systems based on Softmax
CN117671396A (en) Intelligent monitoring and early warning system and method for construction progress
CN117233682B (en) Quick calibration system of balance bridge
CN117972460B (en) Operation fault discrimination method for high-voltage current transformer
CN117407770A (en) High-voltage switch cabinet fault mode classification and prediction method based on neural network
CN115017828A (en) Power cable fault identification method and system based on bidirectional long-short-time memory network
CN117332352B (en) Lightning arrester signal defect identification method based on BAM-AlexNet
CN117875778A (en) Microwave component quality control method, device and storage medium based on fusion model
CN114662976A (en) Power distribution network overhead line state evaluation method based on convolutional neural network
CN112926016A (en) Multivariable time series change point detection method
CN113740671A (en) Fault arc identification method based on VMD and ELM
CN116702005A (en) Neural network-based data anomaly diagnosis method and electronic equipment
CN116577602A (en) Cable defect positioning method based on broadband impedance spectrum and self-attention mechanism coupling
CN116738355A (en) Electric power multi-mode feature level fusion method considering data difference
CN110908365A (en) Unmanned aerial vehicle sensor fault diagnosis method and system and readable storage medium
CN116796187A (en) Power transmission line partial discharge detection method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant