CN115239989A - Distributed fault diagnosis method for oil immersed transformer based on data privacy protection - Google Patents

Distributed fault diagnosis method for oil immersed transformer based on data privacy protection

Info

Publication number
CN115239989A
CN115239989A
Authority
CN
China
Prior art keywords
data
fault diagnosis
local
transformer
oil
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210751099.3A
Other languages
Chinese (zh)
Inventor
郭方洪
刘师硕
董辉
吴祥
俞立
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University of Technology ZJUT
Original Assignee
Zhejiang University of Technology ZJUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University of Technology ZJUT filed Critical Zhejiang University of Technology ZJUT
Priority to CN202210751099.3A priority Critical patent/CN115239989A/en
Publication of CN115239989A publication Critical patent/CN115239989A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218 Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245 Protecting personal data, e.g. for financial or medical purposes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Bioethics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Security & Cryptography (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a distributed fault diagnosis method for an oil immersed transformer based on data privacy protection, which comprises the following steps: acquiring dissolved-gas-in-oil data of N transformer stations and storing the data in corresponding local databases; normalizing and labeling the data; bidirectionally tiling the labeled data and converting them into two-dimensional images; constructing a transformer fault diagnosis model based on federal learning, comprising a central server and N participant clients, wherein each participant client is provided with a local model and the central server aggregates the local model parameters; inputting the two-dimensional images into the corresponding local models for training to obtain local model parameters; performing differential privacy processing on the local model parameters; uploading them to the central server, aggregating them, sending the aggregated parameters back to the local models, and returning to training until a preset number of training rounds is completed, thereby obtaining the optimal transformer fault diagnosis model. The method can effectively solve the data-island problem in transformer diagnosis, protects data privacy, and has good generalization capability.

Description

Distributed fault diagnosis method for oil immersed transformer based on data privacy protection
Technical Field
The invention belongs to the technical field of transformer fault diagnosis, and particularly relates to a distributed fault diagnosis method for an oil immersed transformer based on data privacy protection.
Background
In recent years, with the growing demand for electric power driven by the development of national economies, the total number of power transformers in each country has increased year by year. Accurate diagnosis of whether a transformer fault has occurred, and of what kind, is therefore essential.
In the prior art, many scholars have introduced artificial intelligence methods into the field of transformer fault diagnosis: by collecting a large amount of transformer operation data and using machine learning algorithms to imitate the way humans learn, an ideal transformer fault diagnosis model can be obtained after many rounds of training. Such methods make transformer fault diagnosis efficient, digital and intelligent. However, as the privacy protection of industrial data receives growing emphasis, data sharing among power-industry departments and across industries is obstructed, the operation data of each power transformer is isolated locally, and the data-island phenomenon is common among data pools. This makes the centralized training mode of pooled data difficult to realize, so the key to breaking through the current application dilemma of machine-learning-based transformer fault diagnosis is how to break data islands for multi-party collaborative training while protecting the data privacy of users.
Disclosure of Invention
The invention aims to provide a distributed fault diagnosis method for an oil immersed transformer based on data privacy protection, which can effectively solve the data-island problem in the transformer diagnosis process, apply privacy protection to the model parameters, effectively protect user information, and has good generalization capability.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
the invention provides an oil immersed transformer distributed fault diagnosis method based on data privacy protection, which comprises the following steps:
s1, obtaining the dissolved gas data in oil of N transformer stations and storing the dissolved gas data in corresponding local databases, wherein data sharing is not performed among the local databases;
s2, carrying out normalization processing on the dissolved gas data in the oil of each local database, grouping the data after the normalization processing based on an improved three-ratio method, and labeling;
s3, performing bidirectional tiling reinforcement on the labeled data, and correspondingly converting the labeled data into a two-dimensional image;
s4, constructing a transformer fault diagnosis model based on federal learning, wherein the transformer fault diagnosis model comprises a central server and N participant clients, the participant clients correspond to transformer sites one by one, each participant client is configured with a local model, the local model is a convolutional neural network model, and the central server adopts a FedAvg algorithm to aggregate local model parameters;
s5, inputting the two-dimensional images of the transformer stations into corresponding local models respectively for training to obtain local model parameters;
s6, adding Gaussian noise to the local model parameters to perform differential privacy processing;
and S7, uploading the local model parameters subjected to the difference privacy processing to a central server side for aggregation, sending the aggregated local model parameters to the local model, returning to the step S5 until the preset training times are completed, and obtaining the optimal transformer fault diagnosis model.
Preferably, the dissolved-gas-in-oil data comprises five gases: H₂, CH₄, C₂H₆, C₂H₄ and C₂H₂.
Preferably, the dissolved-gas-in-oil data of each local database is normalized according to the following formulas:

x_H₂ = c_H₂ / S

x_CH₄ = c_CH₄ / S′

x_C₂H₆ = c_C₂H₆ / S′

x_C₂H₄ = c_C₂H₄ / S′

x_C₂H₂ = c_C₂H₂ / S′

wherein c_g denotes the measured content of gas g, S = c_H₂ + c_CH₄ + c_C₂H₆ + c_C₂H₄ + c_C₂H₂ is the total content of the five gases in each group of data, S′ = c_CH₄ + c_C₂H₆ + c_C₂H₄ + c_C₂H₂ is the total content of the CH₄, C₂H₆, C₂H₄ and C₂H₂ gases, and x_H₂, x_CH₄, x_C₂H₆, x_C₂H₄ and x_C₂H₂ are, in order, the normalized contents of dissolved H₂, CH₄, C₂H₆, C₂H₄ and C₂H₂ in the oil.
Preferably, the convolutional neural network model is a LeNet-5 network model.
Preferably, gradient clipping is performed before Gaussian noise is added to the local model parameters for differential privacy processing.
Preferably, the Gaussian noise satisfies the following formulas:

n ~ N(0, σ²), with σ = Δf · √(2 ln(1.25/δ)) / ε    (6)

Δf = 2C / |D_k|    (7)

wherein Δf is the sensitivity of differential privacy, δ is the relaxation factor, ε is the privacy budget, |D_k| is the sample content of participant client k (k = 1, 2, …, N), and C is the gradient clipping coefficient.
Preferably, in the training process, the update formula of each local model parameter is as follows:

w_k^(t+1) = w_k^t − R · ∇F_k(w_k^t) + N(0, σ²)

wherein w_k^t and w_k^(t+1) are the weight parameters of the t-th and (t+1)-th iterations of participant k respectively, F_k(w_k) is the loss function of participant k, ∇ is the gradient operator, R is the learning rate, and N(0, σ²) is random noise satisfying the Gaussian distribution.
Preferably, the local model parameters after differential privacy processing are uploaded to the central server for aggregation according to the following formula:

w^(t+1) = (1/N) · Σ_{k=1}^{N} w_k^(t+1)

wherein w_k^(t+1) is the weight parameter of the (t+1)-th iteration of participant k, and N is the number of participant clients.
Compared with the prior art, the invention has the following beneficial effects:
1) The method adopts a federal learning architecture to overcome the inability to pool large amounts of data for centralized transformer fault diagnosis, breaks the island barrier around local transformer data, realizes distributed collaborative training in which the data never leave the local site, and can avoid privacy leakage of the local model parameters;
2) By adding Gaussian noise to the local model parameters in proportion to the difference in sample content of each participant client, an adaptive noise-adding mechanism is realized, which further prevents privacy leakage during the uploading of local model parameters, effectively protects data privacy, and gives the method good generalization capability.
Drawings
FIG. 1 is a flow chart of a distributed fault diagnosis method for an oil immersed transformer based on data privacy protection according to the present invention;
FIG. 2 is a schematic diagram of a LeNet-5 network model according to the present invention;
FIG. 3 is a block diagram of differential privacy federated learning of the present invention;
FIG. 4 is a graph of accuracy variation of different privacy budget models under a balanced data set according to the present invention;
FIG. 5 is a graph of loss variation for different privacy budget models under a balanced data set in accordance with the present invention;
FIG. 6 is a graph of accuracy variation of different privacy budget models under unbalanced data sets in accordance with the present invention;
FIG. 7 is a graph of loss variation for different privacy budget models under unbalanced data sets in accordance with the present invention.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
It is to be noted that, unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
As shown in fig. 1 to 7, a distributed fault diagnosis method for an oil immersed transformer based on data privacy protection includes the following steps:
s1, obtaining data of dissolved gas in oil of N transformer stations and storing the data into corresponding local databases, wherein data sharing is not performed among the local databases.
S2, carrying out normalization processing on the dissolved gas data in the oil of each local database, grouping the data after the normalization processing based on an improved three-ratio method, and labeling.
In one embodiment, the dissolved-gas-in-oil data includes five gases: H₂, CH₄, C₂H₆, C₂H₄ and C₂H₂.
In one embodiment, the dissolved-gas-in-oil data of each local database is normalized according to the following formulas:

x_H₂ = c_H₂ / S

x_CH₄ = c_CH₄ / S′

x_C₂H₆ = c_C₂H₆ / S′

x_C₂H₄ = c_C₂H₄ / S′

x_C₂H₂ = c_C₂H₂ / S′

wherein c_g denotes the measured content of gas g, S = c_H₂ + c_CH₄ + c_C₂H₆ + c_C₂H₄ + c_C₂H₂ is the total content of the five gases in each group of data, S′ = c_CH₄ + c_C₂H₆ + c_C₂H₄ + c_C₂H₂ is the total content of the CH₄, C₂H₆, C₂H₄ and C₂H₂ gases, and x_H₂, x_CH₄, x_C₂H₆, x_C₂H₄ and x_C₂H₂ are, in order, the normalized contents of dissolved H₂, CH₄, C₂H₆, C₂H₄ and C₂H₂ in the oil.
In an actual industrial environment, each transformer station collects operation data (namely, dissolved-gas-in-oil data) and stores it in a local database, and no data is shared among the local databases. In this embodiment, H₂, CH₄, C₂H₆, C₂H₄ and C₂H₂ are selected according to the operating condition, and these five gases are analyzed to perform fault diagnosis on the transformer; x_H₂, x_CH₄, x_C₂H₆, x_C₂H₄ and x_C₂H₂ are, in order, their normalized dissolved-gas contents. It should be noted that the selection of gas types can also be adjusted according to actual requirements. The data are then grouped and labeled according to the improved three-ratio judgment rule recommended by China's current DL/T 722-2000 guide (for example, each group comprises the dissolved-gas data of the five gases), which facilitates subsequent model training; as this is prior art, details are not repeated here.
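The normalization step described above can be sketched in a few lines of Python. The assignment of denominators (H₂ divided by the five-gas total, each hydrocarbon divided by the four-hydrocarbon total) is an assumption reconstructed from the two totals defined in the description, and the gas contents below are hypothetical sample values.

```python
def normalize_dga(h2, ch4, c2h6, c2h4, c2h2):
    """Normalize one group of dissolved-gas-in-oil contents.

    Assumed scheme: H2 is divided by the total of all five gases, while
    each hydrocarbon is divided by the four-hydrocarbon total, matching
    the two totals defined in the description.
    """
    total = h2 + ch4 + c2h6 + c2h4 + c2h2   # total content of the five gases
    hydro = ch4 + c2h6 + c2h4 + c2h2        # total content of the hydrocarbons
    return {
        "H2": h2 / total,
        "CH4": ch4 / hydro,
        "C2H6": c2h6 / hydro,
        "C2H4": c2h4 / hydro,
        "C2H2": c2h2 / hydro,
    }

# Hypothetical gas contents for one sample group (e.g. in ppm)
features = normalize_dga(10.0, 20.0, 30.0, 25.0, 15.0)
```

By construction, the four hydrocarbon ratios of each group sum to one, so the normalized features are comparable across transformer stations regardless of absolute gas concentration.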
And S3, performing bidirectional tiling reinforcement on the labeled data, and correspondingly converting the labeled data into a two-dimensional image.
Bidirectional tiling reinforcement expands the original data features into 28 × 28 matrix features, which are then converted into corresponding two-dimensional images; this endows the data with spatial features and enhances their feature expression capability.
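One possible tiling can be sketched as follows. The exact bidirectional tiling pattern is not spelled out in the description, so the row-and-column repetition used here is an assumption for illustration only.

```python
def tile_to_image(features, size=28):
    """Tile a short feature vector into a size x size matrix by repeating
    it along the row direction and then stacking that row down the columns."""
    row = [features[i % len(features)] for i in range(size)]
    return [row[:] for _ in range(size)]

# Five normalized gas contents (hypothetical values) expanded to 28 x 28
image = tile_to_image([0.1, 0.2, 0.3, 0.4, 0.5])
```

The resulting matrix can be rendered as a grayscale image and fed to the convolutional network.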
And S4, constructing a transformer fault diagnosis model based on federal learning, wherein the transformer fault diagnosis model comprises a central server and N participant clients, the participant clients correspond to transformer sites one by one, each participant client is configured with a local model, the local model is a convolutional neural network model, and the central server adopts a FedAvg algorithm to aggregate local model parameters.
In one embodiment, the convolutional neural network model is a LeNet-5 network model.
The transformer fault diagnosis model is constructed based on federal learning, each independent transformer station is equivalent to one participant in the transformer fault diagnosis model, and each participant isolates data so as to solve the problem of data island and meet the requirements of user privacy protection and data safety.
The participant client is provided with a local model that uses the LeNet-5 network model as the training model; other prior-art convolutional neural network models can also be used. As shown in fig. 2, the LeNet-5 network model is prior art and consists of convolutional layers, pooling layers, fully connected layers and a classifier. Layer1 and Layer3 are convolutional layers that extract input features (from the two-dimensional images) with 5 × 5 convolution kernels. Layer2 and Layer4 are pooling layers that pool and compress the convolved images with a 2 × 2 window. Layer5 applies 5 × 5 convolution kernels to the output of the first four layers, yielding 120 neurons of size 1 × 1. Layer6 is a fully connected layer with 84 nodes, corresponding to a 7 × 12 bitmap. Layer7 outputs the picture category through the classifier.
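The layer sizes quoted above can be checked with a short shape walk. Note that reaching 1 × 1 at Layer5 from a 28 × 28 input requires padding the first convolution (pad = 2), which is an assumption not stated explicitly in the description.

```python
def conv_out(n, k, stride=1, pad=0):
    """Spatial output size of a k x k convolution on an n x n feature map."""
    return (n + 2 * pad - k) // stride + 1

def pool_out(n, k):
    """Spatial output size of non-overlapping k x k pooling."""
    return n // k

n = 28                      # input image from the tiling step
n = conv_out(n, 5, pad=2)   # Layer1: 5x5 conv -> 28
n = pool_out(n, 2)          # Layer2: 2x2 pool -> 14
n = conv_out(n, 5)          # Layer3: 5x5 conv -> 10
n = pool_out(n, 2)          # Layer4: 2x2 pool -> 5
n = conv_out(n, 5)          # Layer5: 5x5 conv -> 1 (120 such 1x1 neurons)
```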
And S5, inputting the two-dimensional images of the transformer stations into corresponding local models respectively for training to obtain local model parameters.
And S6, adding Gaussian noise to the local model parameters to perform differential privacy processing.
In one embodiment, gradient clipping is performed before Gaussian noise is added to the local model parameters for differential privacy processing.
In one embodiment, the Gaussian noise satisfies the following formulas:

n ~ N(0, σ²), with σ = Δf · √(2 ln(1.25/δ)) / ε    (6)

Δf = 2C / |D_k|    (7)

wherein Δf is the sensitivity of differential privacy, δ is the relaxation factor, ε is the privacy budget, |D_k| is the sample content of participant client k (k = 1, 2, …, N), and C is the gradient clipping coefficient.
As shown in fig. 3, to avoid the risk of privacy leakage caused by an attacker intercepting the local model parameters during uploading, noise is added during local model training, realizing differential privacy federal learning over the uploaded local model parameters that form the global model. In general, gradient clipping is performed before the noise is added, i.e., the gradient is bounded between −C and C.
In a federal learning structure, the data held by the participants rarely satisfies the independent and identically distributed (IID) assumption, and simply adding the same noise to all participants clearly cannot eliminate the influence of data imbalance on the training effect of the model. Therefore, considering the difference in data sample content of each participant, an adaptive noise-adding mechanism is realized through formulas (6) and (7).
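Formulas (6) and (7) can be sketched as follows. The per-client sample count |D_k| enters the sensitivity, so a client holding fewer samples receives proportionally more noise; the concrete ε, δ and C values below are hypothetical.

```python
import math
import random

def noise_sigma(epsilon, delta, clip_c, n_samples):
    """Standard deviation of the Gaussian mechanism:
    sigma = (delta_f / epsilon) * sqrt(2 * ln(1.25 / delta)),
    with sensitivity delta_f = 2 * C / |D_k|."""
    sensitivity = 2.0 * clip_c / n_samples
    return sensitivity * math.sqrt(2.0 * math.log(1.25 / delta)) / epsilon

def add_gaussian_noise(params, sigma, rng=random):
    """Perturb each parameter with independent N(0, sigma^2) noise."""
    return [p + rng.gauss(0.0, sigma) for p in params]

# A client holding fewer samples gets a larger sigma (more noise).
sigma_small = noise_sigma(epsilon=1.0, delta=1e-5, clip_c=1.0, n_samples=100)
sigma_large = noise_sigma(epsilon=1.0, delta=1e-5, clip_c=1.0, n_samples=1000)
```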
And S7, uploading the local model parameters subjected to the differential privacy processing to a central server side for aggregation, issuing the aggregated local model parameters to the local model, returning to the step S5 until the preset training times are finished, and obtaining an optimal transformer fault diagnosis model.
In one embodiment, during the training process, the update formula of each local model parameter is as follows:

w_k^(t+1) = w_k^t − R · ∇F_k(w_k^t) + N(0, σ²)

wherein w_k^t and w_k^(t+1) are the weight parameters of the t-th and (t+1)-th iterations of participant k respectively, F_k(w_k) is the loss function of participant k, ∇ is the gradient operator, R is the learning rate, and N(0, σ²) is random noise satisfying the Gaussian distribution.
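One noisy local update step can be sketched as below, with gradient clipping applied before the noise as described above. Clipping by the L2 norm is an assumption; the description only states that the gradient is bounded by the coefficient C.

```python
import math
import random

def clip_gradient(grad, c):
    """Scale the gradient so that its L2 norm does not exceed C."""
    norm = math.sqrt(sum(g * g for g in grad))
    if norm > c:
        grad = [g * (c / norm) for g in grad]
    return grad

def local_update(weights, grad, lr, sigma, c, rng=random):
    """One local step: w_{t+1} = w_t - R * clipped_grad + N(0, sigma^2)."""
    g = clip_gradient(grad, c)
    return [w - lr * gi + rng.gauss(0.0, sigma) for w, gi in zip(weights, g)]

# With sigma = 0 the step reduces to plain gradient descent (hypothetical values).
updated = local_update([1.0, 2.0], [0.3, 0.4], lr=1.0, sigma=0.0, c=10.0)
```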
In an embodiment, the local model parameters after differential privacy processing are uploaded to the central server for aggregation according to the following formula:

w^(t+1) = (1/N) · Σ_{k=1}^{N} w_k^(t+1)

wherein w_k^(t+1) is the weight parameter of the (t+1)-th iteration of participant k, and N is the number of participant clients.
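The equal-weight FedAvg aggregation performed at the central server can be sketched as:

```python
def fedavg(client_params):
    """Average the uploaded parameter vectors of the N participant clients:
    each aggregated parameter is the arithmetic mean across clients."""
    n = len(client_params)
    return [sum(values) / n for values in zip(*client_params)]

# Hypothetical noisy parameters uploaded by three participant clients.
global_params = fedavg([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
```

The aggregated parameters are then sent back to every local model for the next training round.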
Through multiple loop iterations, the optimal transformer fault diagnosis model is obtained; performing fault diagnosis with this model yields better diagnosis results on the local data of each federal participant.
To reflect more intuitively the privacy-protection effect and the influence of the privacy budget on model accuracy, comparison experiments on the transformer fault diagnosis model under different privacy budgets were first designed using independent and identically distributed data (a balanced data set). Referring to fig. 4, the abscissa is the number of training rounds and the ordinate is the model accuracy under different privacy budgets; as the number of training rounds increases and the privacy budget is adjusted, the model accuracy reaches 97.64%. Referring to fig. 5, the abscissa is the number of training rounds and the ordinate is the loss under different privacy budgets; as training proceeds and the privacy budget is adjusted, the loss falls to as low as 0.06.
Furthermore, considering that in an industrial environment the local data of the participants rarely satisfies the IID assumption (an unbalanced data set) and generally differs in fault types and sample content, the experimental data were partitioned so that each participant received two or three mutually non-overlapping fault types, and the privacy budget was adjusted appropriately. Referring to fig. 6, the abscissa is the number of training rounds and the ordinate is the model accuracy under different privacy budgets; as training proceeds and the privacy budget is adjusted, the model accuracy reaches 97.02%. Referring to fig. 7, the abscissa is the number of training rounds and the ordinate is the loss under different privacy budgets; as training proceeds and the privacy budget is adjusted, the loss falls to as low as 0.09.
In summary, the method adopts a federal learning architecture to overcome the inability to pool large amounts of data for centralized transformer fault diagnosis, breaks the island barrier around local transformer data, realizes distributed collaborative training in which the data never leave the local site, and can avoid privacy leakage of the local model parameters. By adding Gaussian noise to the local model parameters in proportion to the difference in sample content of each participant client, an adaptive noise-adding mechanism is realized, which further prevents privacy leakage during parameter uploading, effectively protects data privacy, and gives the method good generalization capability.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several specific and detailed implementations of the present application and should not be construed as limiting the claims. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent application shall be subject to the appended claims.

Claims (8)

1. A distributed fault diagnosis method for an oil immersed transformer based on data privacy protection, characterized by comprising the following steps:
s1, obtaining the dissolved gas data in oil of N transformer stations and storing the dissolved gas data in corresponding local databases, wherein data sharing is not performed among the local databases;
s2, carrying out normalization processing on the data of the dissolved gas in the oil of each local database, grouping the data after the normalization processing based on an improved three-ratio method, and labeling;
s3, performing bidirectional tiling reinforcement on the labeled data, and correspondingly converting the labeled data into a two-dimensional image;
s4, constructing a transformer fault diagnosis model based on federal learning, wherein the transformer fault diagnosis model comprises a central server and N participant clients, the participant clients correspond to transformer sites one by one, each participant client is configured with a local model, the local model is a convolutional neural network model, and the central server adopts a FedAvg algorithm to aggregate local model parameters;
s5, inputting the two-dimensional images of each transformer station into corresponding local models for training to obtain local model parameters;
s6, adding Gaussian noise to the local model parameters to perform differential privacy processing;
and S7, uploading the local model parameters subjected to the difference privacy processing to a central server side for aggregation, sending the aggregated local model parameters to the local model, returning to the step S5 until the preset training times are completed, and obtaining the optimal transformer fault diagnosis model.
2. The oil immersed transformer distributed fault diagnosis method based on data privacy protection as claimed in claim 1, wherein: the dissolved-gas-in-oil data comprises five gases: H₂, CH₄, C₂H₆, C₂H₄ and C₂H₂.
3. The oil immersed transformer distributed fault diagnosis method based on data privacy protection as claimed in claim 2, characterized in that: the dissolved-gas-in-oil data of each local database is normalized according to the following formulas:

x_H₂ = c_H₂ / S

x_CH₄ = c_CH₄ / S′

x_C₂H₆ = c_C₂H₆ / S′

x_C₂H₄ = c_C₂H₄ / S′

x_C₂H₂ = c_C₂H₂ / S′

wherein c_g denotes the measured content of gas g, S = c_H₂ + c_CH₄ + c_C₂H₆ + c_C₂H₄ + c_C₂H₂ is the total content of the five gases in each group of data, S′ = c_CH₄ + c_C₂H₆ + c_C₂H₄ + c_C₂H₂ is the total content of the CH₄, C₂H₆, C₂H₄ and C₂H₂ gases, and x_H₂, x_CH₄, x_C₂H₆, x_C₂H₄ and x_C₂H₂ are, in order, the normalized contents of dissolved H₂, CH₄, C₂H₆, C₂H₄ and C₂H₂ in the oil.
4. The oil immersed transformer distributed fault diagnosis method based on data privacy protection as claimed in claim 1, wherein: the convolutional neural network model is a LeNet-5 network model.
5. The oil immersed transformer distributed fault diagnosis method based on data privacy protection as claimed in claim 1, wherein: gradient clipping is performed before Gaussian noise is added to the local model parameters for differential privacy processing.
6. The oil immersed transformer distributed fault diagnosis method based on data privacy protection as claimed in claim 5, characterized in that: the Gaussian noise satisfies the following formulas:

n ~ N(0, σ²), with σ = Δf · √(2 ln(1.25/δ)) / ε

Δf = 2C / |D_k|

wherein Δf is the sensitivity of differential privacy, δ is the relaxation factor, ε is the privacy budget, |D_k| is the sample content of participant client k (k = 1, 2, …, N), and C is the gradient clipping coefficient.
7. The oil immersed transformer distributed fault diagnosis method based on data privacy protection as claimed in claim 6, characterized in that: in the training process, the update formula of each local model parameter is as follows:

w_k^(t+1) = w_k^t − R · ∇F_k(w_k^t) + N(0, σ²)

wherein w_k^t and w_k^(t+1) are the weight parameters of the t-th and (t+1)-th iterations of participant k respectively, F_k(w_k) is the loss function of participant k, ∇ is the gradient operator, R is the learning rate, and N(0, σ²) is random noise satisfying the Gaussian distribution.
8. The oil immersed transformer distributed fault diagnosis method based on data privacy protection as claimed in claim 7, wherein: the local model parameters after differential privacy processing are uploaded to the central server for aggregation according to the following formula:

w^(t+1) = (1/N) · Σ_{k=1}^{N} w_k^(t+1)

wherein w_k^(t+1) is the weight parameter of the (t+1)-th iteration of participant k, and N is the number of participant clients.
CN202210751099.3A 2022-06-28 2022-06-28 Distributed fault diagnosis method for oil immersed transformer based on data privacy protection Pending CN115239989A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210751099.3A CN115239989A (en) 2022-06-28 2022-06-28 Distributed fault diagnosis method for oil immersed transformer based on data privacy protection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210751099.3A CN115239989A (en) 2022-06-28 2022-06-28 Distributed fault diagnosis method for oil immersed transformer based on data privacy protection

Publications (1)

Publication Number Publication Date
CN115239989A true CN115239989A (en) 2022-10-25

Family

ID=83670574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210751099.3A Pending CN115239989A (en) 2022-06-28 2022-06-28 Distributed fault diagnosis method for oil immersed transformer based on data privacy protection

Country Status (1)

Country Link
CN (1) CN115239989A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115952442A (en) * 2023-03-09 2023-04-11 山东大学 Global robust weighting-based federal domain generalized fault diagnosis method and system
CN117993481A (en) * 2024-04-03 2024-05-07 南京凯奥思数据技术有限公司 Transformer fault diagnosis method, device and equipment based on federal split learning
CN117993481B (en) * 2024-04-03 2024-06-04 南京凯奥思数据技术有限公司 Transformer fault diagnosis method, device and equipment based on federal split learning

Similar Documents

Publication Publication Date Title
CN115239989A (en) Distributed fault diagnosis method for oil immersed transformer based on data privacy protection
CN110212528B (en) Power distribution network measurement data missing reconstruction method
CN112035746A (en) Session recommendation method based on space-time sequence diagram convolutional network
CN110689162B (en) Bus load prediction method, device and system based on user side classification
CN111159638A (en) Power distribution network load missing data recovery method based on approximate low-rank matrix completion
CN114006370B (en) Power system transient stability analysis and evaluation method and system
CN109034511A (en) Based on the power distribution network investment decision analysis model for improving Topsis method
CN115019510B (en) Traffic data restoration method based on dynamic self-adaptive generation countermeasure network
CN112801185B (en) Network security situation understanding and evaluating method based on improved neural network
CN115272776B (en) Hyperspectral image classification method based on double-path convolution and double attention and storage medium
CN107679539A (en) A kind of single convolutional neural networks local message wild based on local sensing and global information integration method
Hao et al. Gear fault detection in a planetary gearbox using deep belief network
CN115983374A (en) Cable partial discharge database sample expansion method based on optimized SA-CACGAN
Li et al. An optimized GRNN‐enabled approach for power transformer fault diagnosis
AU2021106177A4 (en) Method for predicting spatial-temporal dynamic distribution of electric vehicle charging loads
CN114679372A (en) Node similarity-based attention network link prediction method
AU2021102006A4 (en) A system and method for identifying online rumors based on propagation influence
CN110033034A (en) A kind of image processing method, device and the computer equipment of non-homogeneous texture
CN114936703A (en) Marketing company financial violation prediction method based on improved Transformer model
CN107317866A (en) A kind of intelligent communication server and its construction method based on finite-state automata framework
Lin et al. A method of satellite network fault synthetic diagnosis based on C4.5 algorithm and expert knowledge database
CN114638421A (en) Method for predicting requirement of generator set spare parts
CN113158088A (en) Position recommendation method based on graph neural network
Yuan et al. Multi-style transfer generative adversarial network for text images
CN113656919B (en) Asymmetric rotor displacement field reconstruction method based on deep convolutional neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination