CN112151249A - Active noise reduction method and system for transformer and storage medium - Google Patents

Active noise reduction method and system for transformer and storage medium

Info

Publication number
CN112151249A
Authority
CN
China
Prior art keywords: noise, transformer, feature set, noise reduction, reduction method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010871206.7A
Other languages
Chinese (zh)
Other versions
CN112151249B (en)
Inventor
汪运
焦坤
郭龙刚
黄石磊
顾新行
魏治成
刘巍
刘洁
沈风
李增聚
刘鑫
李培亮
赵婉玲
韦伟
张晓刚
夏兵
钱东峰
张晓琼
赵大富
戚胜军
魏岩岩
戴佳琳
何春光
王义永
杜鹏
汪太平
朱元付
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
State Grid Corp of China SGCC
Overhaul Branch of State Grid Anhui Electric Power Co Ltd
Original Assignee
State Grid Corp of China SGCC
Overhaul Branch of State Grid Anhui Electric Power Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by State Grid Corp of China SGCC, Overhaul Branch of State Grid Anhui Electric Power Co Ltd filed Critical State Grid Corp of China SGCC
Priority to CN202010871206.7A priority Critical patent/CN112151249B/en
Publication of CN112151249A publication Critical patent/CN112151249A/en
Application granted granted Critical
Publication of CN112151249B publication Critical patent/CN112151249B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01FMAGNETS; INDUCTANCES; TRANSFORMERS; SELECTION OF MATERIALS FOR THEIR MAGNETIC PROPERTIES
    • H01F27/00Details of transformers or inductances, in general
    • H01F27/33Arrangements for noise damping
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2137Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on criteria of topology preservation, e.g. multidimensional scaling or self-organising maps
    • G06F18/21375Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on criteria of topology preservation, e.g. multidimensional scaling or self-organising maps involving differential geometry, e.g. embedding of pattern manifold
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/243Classification techniques relating to the number of classes
    • G06F18/24323Tree-organised classifiers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/254Fusion techniques of classification results, e.g. of results related to same input data

Abstract

The embodiment of the invention provides an active noise reduction method and system for a transformer and a storage medium, belonging to the technical field of transformer noise reduction. The active noise reduction method comprises the following steps: acquiring the noise of the transformer; judging whether the noise comprises an abnormal noise component; and issuing an early warning when the noise comprises the abnormal noise component. With this technical scheme, the active noise reduction method and system for the transformer and the storage medium further separate the abnormal noise component of the transformer on the basis of cancelling the normal noise component of the transformer, thereby realizing real-time monitoring of transformer faults.

Description

Active noise reduction method and system for transformer and storage medium
Technical Field
The invention relates to the technical field of noise reduction of transformers, in particular to an active noise reduction method and system for a transformer and a storage medium.
Background
Transformer noise is mostly low-frequency noise, which penetrates strongly and propagates over long distances; it can affect places far from the transformer and causes relatively serious noise pollution for residents of nearby residential areas.
In the prior art, one method of handling the noise is to predict its frequency variation by training a neural network, and then to drive corresponding noise cancellation equipment to counteract the noise. However, this cancellation approach only works while the transformer is in a normal state; once the transformer fails, the noise frequency becomes unpredictable and the cancellation effect can no longer be achieved.
Disclosure of Invention
The invention aims to provide an active noise reduction method, an active noise reduction system and a storage medium for a transformer.
In order to achieve the above object, an embodiment of the present invention provides an active noise reduction method for a transformer, where the active noise reduction method includes:
acquiring the noise of the transformer;
judging whether the noise comprises an abnormal noise component;
and sending out early warning under the condition that the noise comprises the abnormal noise component.
Optionally, the determining whether the noise includes an abnormal noise component specifically includes:
extracting each identified noise feature of the transformer in a historical time period to form a feature set, wherein the feature set comprises a plurality of noise features and influence factors corresponding to each noise feature;
preprocessing the noise features in the feature set;
performing a filtering operation for each of the noise features in the feature set;
inputting the feature set into a quaternary classifier to train the quaternary classifier;
inputting the acquired noise into the quaternary classifier to determine whether an abnormal noise component can be separated.
Optionally, the influencing factors include a current covariance matrix and/or a subband energy variation ratio.
Optionally, the preprocessing the noise features in the feature set specifically includes:
performing data cleaning on the feature set by adopting a compressed neighbor rule algorithm;
carrying out normalization processing on the feature set by adopting a characterization standard method;
and performing data dimension reduction on the feature set by adopting a local linear embedding LLE algorithm.
Optionally, the performing a screening operation on each noise feature in the feature set specifically includes:
and sequentially executing feature deletion, feature selection and feature derivation operations on the noise features.
Optionally, the quaternary classifier comprises:
an input layer for receiving the feature set of an input;
a decision tree connected to the input layer for performing a first classification operation on the feature set;
a gradient boosting iterative decision tree connected to the input layer for performing a second classification operation on the feature set;
a multi-tier perceptron connected to the input tier for performing a third classification operation on the feature set;
a Gaussian kernel connected to the input layer for performing a fourth classification operation on the feature set;
and a fusion layer connected to the decision tree, the gradient boosting iterative decision tree, the multilayer perceptron and the Gaussian kernel, for fusing the results of the first classification operation, the second classification operation, the third classification operation and the fourth classification operation by Choquet fuzzy integration and outputting a final classification result.
Optionally, the active noise reduction method further includes:
generating a corresponding digital noise reduction signal according to the normal noise component of the noise;
inputting a digital noise reduction signal to a noise cancellation module to cancel the noise.
In another aspect, the present invention also provides an active noise reduction system for a transformer, the active noise reduction system comprising:
the noise acquisition module is used for acquiring real-time noise of the transformer;
a noise cancellation module configured to be activated to cancel the noise; and
a processor connected to the noise acquisition module and the noise cancellation module, for executing the active noise reduction method described above.
In yet another aspect, the present invention further provides a storage medium storing instructions to be read by a machine to cause the machine to perform any one of the above active noise reduction methods.
By the technical scheme, the active noise reduction method, the active noise reduction system and the storage medium for the transformer further separate the abnormal noise component of the transformer on the basis of eliminating the normal noise component of the transformer, so that the real-time monitoring of the transformer fault is realized.
Additional features and advantages of embodiments of the invention will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the embodiments of the invention without limiting the embodiments of the invention. In the drawings:
FIG. 1 is a factor distribution diagram of a noise source of a transformer;
FIG. 2 is a flow diagram of a noise reduction method for a transformer according to an embodiment of the present invention;
FIG. 3 is a partial flow diagram of a noise reduction method for a transformer according to an embodiment of the present invention;
FIG. 4 is a partial flow diagram of a noise reduction method for a transformer according to an embodiment of the present invention;
FIG. 5 is a partial flow diagram of a noise reduction method for a transformer according to an embodiment of the present invention;
FIG. 6 is a partial flow diagram of a noise reduction method for a transformer according to an embodiment of the present invention;
FIG. 7 is a partial flow diagram of a noise reduction method for a transformer according to an embodiment of the present invention; and
FIG. 8 is a block diagram of a quaternary classifier according to an embodiment of the present invention.
Detailed Description
The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating embodiments of the invention, are given by way of illustration and explanation only, not limitation.
In the embodiments of the present invention, unless otherwise specified, directional terms such as "upper", "lower", "top", and "bottom" are generally used with respect to the orientations shown in the drawings or to the relative positions of the components in the vertical or gravitational direction.
In addition, where descriptions such as "first" and "second" appear in the embodiments of the present invention, they are for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features concerned. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. Furthermore, the technical solutions of the various embodiments can be combined with one another, but only insofar as a person skilled in the art can realize the combination; when technical solutions are contradictory or cannot be realized, the combination should be considered absent and outside the protection scope of the present invention.
In the prior art, a conventional noise reduction measure against the noise of the transformer body is to install shock absorbers and spring dampers at the bottom of the transformer. The main body of the shock absorber consists of stainless-steel damping and vibration-isolation elements; it offers high elasticity, high damping, fatigue resistance, long service life and similar advantages, and can well meet the hardware requirements of the transformer. The spring damper is mounted at the bottom of the transformer in contact with the ground, which reduces the noise caused by vibration. However, the noise of the transformer, whether low-frequency noise or electromagnetic noise, also propagates through the air. For this type of noise, the common practice is to use a dedicated sound-insulation enclosure or to apply sound-insulation treatment to the transformer room, attenuating the noise along its transmission path and thereby reducing the noise pollution it causes to the surroundings.
For the noise of the transformer itself, there are mainly two noise sources. One is the noise generated by the transformer body, produced mainly by the vibration of the iron core, windings, magnetic shields and the like; the other is noise from the transformer accessories, such as the noise of the cooling devices (fans or water pumps), whose distribution is shown in fig. 1. As for the origin of the noise, the voltage and current inside the transformer vary periodically in time, so the iron core and related parts move periodically with this variation, causing mechanical and air vibrations that generate noise. The loudness of this noise depends on the magnitude of the expansion and contraction. In the ideal case, if the frequency of the alternating current is 50 Hz, the corresponding noise frequency is 100 Hz. In actual operation, however, the noise has this ideal frequency only as its fundamental, and the harmonic components at multiples of the fundamental, together with their respective contributions, often vary, so the noise cannot be predicted simply by taking the ideal frequency as a reference. Nevertheless, when the external environment does not change and the transformer itself does not malfunction, the transformer noise converges. Therefore, as long as the transformer is fault-free, its noise can be predicted by a neural network algorithm.
Accordingly, the present invention provides an active noise reduction method for a transformer, as shown in fig. 2, the active noise reduction method may include:
in step S10, noise of the transformer is acquired. The manner of acquiring the noise may be in various forms known to those skilled in the art, such as presetting a noise acquisition module for acquiring real-time noise of the transformer on the transformer.
In step S11, it is determined whether the noise includes an abnormal noise component. Specifically, the step S11 may include steps as shown in fig. 3. In fig. 3, the step S11 may further include:
in step S20, each identified noise feature of the transformer over the historical period of time is extracted to form a feature set. Wherein the feature set may include a plurality of noise features and a corresponding influencing factor for each noise feature.
In this embodiment, an influencing factor may be a physical quantity that causes noise, such as the voltage and current described above. In actually designing the influencing factors, however, the inventors first performed a time-domain analysis of the physical quantities causing noise. In the time-domain analysis, the inventors extracted abnormal and normal noise data from a sample database of noise, compared them in the time domain, and analyzed the characteristics of the abnormal noise data, finding that these characteristics are mainly reflected in physical quantities such as current, frequency and zero-crossing points. The inventors then performed a frequency-domain analysis of the noise. In the frequency-domain analysis, the inventors applied a Fourier transform to the extracted abnormal and normal noise data and found that the frequency-domain characteristics of the abnormal noise data are mainly reflected in physical quantities such as energy, frequency band and the current covariance matrix. In addition, considering that the classification and identification of noise sample data shares the common characteristics of general signal pattern recognition while also differing from other signal-recognition tasks, the inventors abstracted the data step by step, from the lowest sampling level through the intermediate physical-sample level to the highest signal-feature level, progressively summarizing the representation of the signal. Thus, in a preferred example of the present invention, the inventors chose the current covariance matrix and/or the subband energy variation ratio as influencing factors.
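As a rough illustration of how such influencing factors might be computed, a minimal Python sketch is given below; the band edges, sampling rate and the framing of the current signal are illustrative assumptions, not values specified by the patent.

```python
import numpy as np

def subband_energy_ratio(signal, fs, band_edges):
    """Sub-band energy variation ratio (illustrative): the share of total
    spectral energy falling into each frequency band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    total = spectrum.sum()
    ratios = [spectrum[(freqs >= lo) & (freqs < hi)].sum() / total
              for lo, hi in zip(band_edges[:-1], band_edges[1:])]
    return np.asarray(ratios)

def current_covariance(current_frames):
    """Covariance matrix of framed (or multi-channel) current samples;
    rows are observations, columns are frames/channels."""
    return np.cov(np.asarray(current_frames), rowvar=False)

# Example: energy ratios around the 100 Hz fundamental and its harmonics.
# noise = ...  # one recorded noise frame, sampled at fs = 8000 Hz
# subband_energy_ratio(noise, fs=8000, band_edges=[50, 150, 250, 350, 450])
```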
In step S21, the noise features in the feature set are preprocessed. The preprocessing may include, but is not limited to, various operations known to those skilled in the art. In a preferred example of the invention, the preprocessing may specifically comprise the steps illustrated in fig. 4. In fig. 4, the preprocessing may include:
in step S30, a compressed Neighbor rule algorithm (CondensedNearest Neighbor, CNN) is used to perform data cleansing on the feature set. In particular, the data cleansing may further include missing value cleansingAt least one of format content scrubbing, logic error scrubbing, non-required data scrubbing, and correlation verification. The CNN algorithm is mainly used for extracting samples of sample boundaries, so that the training speed of the neural network is improved. Further, in this example, considering that the CNN algorithm randomly selects a sample subset, the selected sample subset may not be in the boundary domain, which results in redundant items in the sample subset. This redundancy can affect the efficiency of subsequent training. Therefore, in the step S30, the method may further include selecting the sample subset by using a boundary ratio method. In particular, the boundary ratio may be defined as
a(x) = ‖x′ − y‖ / ‖x − y‖
where ‖x − y‖ denotes the distance from sample x to the sample y that is closest to x and belongs to a different class than x, and ‖x′ − y‖ denotes the distance from sample y to the sample x′ that is closest to y and belongs to a different class than y. Since ‖x′ − y‖ is generally smaller than ‖x − y‖, the boundary ratio a(x) lies between 0 and 1. After the boundary ratios of the samples are calculated, samples can be selected in descending order of boundary ratio, so that the samples close to the boundary are chosen.
In addition, the CNN algorithm is sensitive to noise in the data. To prevent such data noise from affecting the CNN algorithm, step S30 may further include identifying and rejecting noisy samples with a K-nearest-neighbor (KNN) algorithm. Specifically, the KNN algorithm may be used to check whether an extracted sample can be correctly classified; if the sample is misclassified, it is treated as a noise value and removed, otherwise it is returned to the training set.
The CNN algorithm itself may comprise the steps shown in fig. 5; a minimal code sketch following these steps is given after the list. In fig. 5, the CNN algorithm may include:
in step S40, traverse the original training set (feature set) U, and select each sample in the original training set U one by one;
in step S41, it is determined whether the sample subset V includes the selected sample;
in step S42, in the case where it is determined that the sample is not included in the sample subset V, the sample is added to V;
in step S43, in the case where it is determined that the sample subset V includes the sample, it is determined that the sample is a redundant item.
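A sketch of the boundary-ratio computation and the subset-construction loop of fig. 5, assuming the samples are NumPy feature vectors with integer class labels. Euclidean distance, the seeding of the subset, and the use of a 1-NN check to decide whether the subset already "includes" (i.e. correctly represents) a sample are assumptions made for this illustration.

```python
import numpy as np

def boundary_ratio(x, label_x, samples, labels):
    """a(x) = ||x' - y|| / ||x - y||: y is the nearest sample to x of a
    different class; x' is the nearest sample to y of a different class."""
    diff = labels != label_x
    d_xy = np.linalg.norm(samples[diff] - x, axis=1)
    y_idx = np.flatnonzero(diff)[np.argmin(d_xy)]
    y, label_y = samples[y_idx], labels[y_idx]
    d_xpy = np.linalg.norm(samples[labels != label_y] - y, axis=1)
    return d_xpy.min() / d_xy.min()

def condense(samples, labels):
    """Fig. 5 loop: traverse U and add a sample to the subset V unless V
    already represents it (redundant item).  Samples are visited in
    descending boundary-ratio order so boundary samples enter V first."""
    order = np.argsort([-boundary_ratio(x, c, samples, labels)
                        for x, c in zip(samples, labels)])
    v_idx = [order[0]]                                    # step S40: seed V
    for i in order[1:]:                                   # step S41: check V
        V, V_lab = samples[v_idx], labels[v_idx]
        nearest = np.argmin(np.linalg.norm(V - samples[i], axis=1))
        if V_lab[nearest] != labels[i]:                   # step S42: add to V
            v_idx.append(i)
        # step S43: otherwise the sample is a redundant item and is skipped
    return samples[v_idx], labels[v_idx]
```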
In step S31, the feature set is normalized using a standardization (z-score) method. The specific normalization steps can take various forms known to those skilled in the art. In a preferred example of the present invention, the normalization may, for example, first compute the mean of each dimension over the whole feature set and subtract that mean from the data of the corresponding dimension, then compute the standard deviation of each dimension, and finally divide the data of each dimension by the corresponding standard deviation, thereby completing the normalization.
In step S32, a locally linear embedding (LLE) algorithm is used to reduce the dimensionality of the feature set. In particular, the LLE algorithm may comprise the steps shown in fig. 6; a minimal sketch covering steps S31 and S32 follows the list. In fig. 6, the LLE algorithm may include:
in step S50, k neighbor points of each sample point are found;
in step S51, a local reconstruction weight matrix of each sample point is calculated from its neighboring points;
in step S52, an output value of the sample point is calculated from the local reconstruction weight matrix of the sample point and its neighboring points.
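A minimal sketch of steps S31–S32 using scikit-learn, assuming the feature set is a matrix with one row per noise sample; the number of neighbors and the target dimensionality are illustrative choices, not values taken from the patent.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.manifold import LocallyLinearEmbedding

def standardize_and_embed(features, n_neighbors=10, n_components=5):
    # Step S31: z-score standardization (subtract the per-dimension mean,
    # divide by the per-dimension standard deviation).
    standardized = StandardScaler().fit_transform(np.asarray(features))

    # Step S32: locally linear embedding -- find the k neighbors of each
    # point (S50), compute the local reconstruction weights (S51), and solve
    # for the low-dimensional output (S52); scikit-learn does all three.
    lle = LocallyLinearEmbedding(n_neighbors=n_neighbors,
                                 n_components=n_components)
    return lle.fit_transform(standardized)
```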
In step S22, a filtering operation is performed for each noise feature in the feature set. In particular, the screening operations may include feature deletion, feature selection, and feature derivation operations.
For the feature deletion operation, a variance inflation factor (VIF, also called variance expansion factor) method can be adopted: the variance inflation factor of each influencing factor in the feature set is calculated, and the influencing factors whose variance inflation factor falls outside a preset threshold range are deleted, thereby implementing the first round of screening. The preset threshold range may be (0, 1).
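A sketch of this first screening round, computing each factor's VIF as 1/(1 − R²) from a regression of that factor on the remaining ones. The regression-based formula and the literal use of the (0, 1) range are my reading of the text; note that mathematically VIF is never below 1, so in practice the threshold range would likely need adjusting.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def vif_filter(X, low=0.0, high=1.0):
    """First screening round: drop the columns of X whose variance inflation
    factor VIF_j = 1 / (1 - R_j^2) falls outside the preset range (low, high)."""
    keep = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        r2 = LinearRegression().fit(others, X[:, j]).score(others, X[:, j])
        vif = np.inf if r2 >= 1.0 else 1.0 / (1.0 - r2)
        if low < vif < high:
            keep.append(j)
    return X[:, keep], keep
```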
For the feature selection operation, the Pearson correlation coefficient method may be employed. Specifically, the Pearson correlation coefficient of each identified fingerprint feature (influencing factor) in the feature set may be computed, the identified fingerprint features whose correlation exceeds a preset correlation threshold are retained, and all identified fingerprint features in the feature set are replaced by these retained features, thereby implementing the second round of screening of the feature set.
For the feature derivation operation, the first-level identified fingerprint features of each noise feature in the feature set may first be derived to obtain corresponding second-level identified fingerprint features, which are then added to the feature set as identified fingerprint features, thereby implementing the third round of screening of the feature set. A sketch of the second and third rounds is given below.
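In this sketch of the second and third rounds, the target the correlation is measured against (a label marking the sample as normal or abnormal) and the particular second-level features derived (squares and pairwise products) are assumptions, since the patent does not specify them.

```python
import numpy as np

def pearson_select(X, y, threshold=0.3):
    """Second round: keep the features whose absolute Pearson correlation
    with the target y exceeds the preset correlation threshold."""
    keep = [j for j in range(X.shape[1])
            if abs(np.corrcoef(X[:, j], y)[0, 1]) > threshold]
    return X[:, keep], keep

def derive_features(X):
    """Third round: derive second-level features from the first-level ones
    (illustratively: squares and pairwise products) and append them."""
    squares = X ** 2
    products = [X[:, i] * X[:, j]
                for i in range(X.shape[1]) for j in range(i + 1, X.shape[1])]
    return np.column_stack([X, squares] + products)
```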
In step S23, the feature set is input into a quaternary classifier to train the quaternary classifier. In this embodiment, the quaternary classifier may have the structure shown in fig. 7. In fig. 7, the quaternary classifier may include an input layer 10, a decision tree 21, a gradient boosting iterative decision tree 22, a multilayer perceptron 23, a Gaussian kernel 24, and a fusion layer 30. The input layer 10 may be used to receive the input feature set. The decision tree 21 may be connected to the input layer 10 for performing a first classification operation on the feature set. The gradient boosting iterative decision tree 22 may be connected to the input layer 10 for performing a second classification operation on the feature set. The multilayer perceptron 23 may be connected to the input layer 10 for performing a third classification operation on the feature set. The Gaussian kernel 24 may be connected to the input layer 10 for performing a fourth classification operation on the feature set. The fusion layer 30 may be connected to the decision tree 21, the gradient boosting iterative decision tree 22, the multilayer perceptron 23, and the Gaussian kernel 24, and is configured to fuse the results of the first, second, third, and fourth classification operations using Choquet fuzzy integration and to output the final classification result.
In this embodiment, the training of the decision tree 21 may include feature selection, decision-tree generation, and decision-tree pruning. The gradient boosting iterative decision tree 22 (GBDT) is trained following the boosting idea, since GBDT is one of the boosting algorithms: at each step of the GBDT algorithm, a decision tree is fitted to the residual of the model learned so far, producing a new weak learner, and the decision trees of all steps are finally combined into the final strong learner, i.e. the trained GBDT. The multilayer perceptron 23 (MLP) is a classifier based on a feedforward artificial neural network (ANN) consisting of multiple layers of nodes, each layer fully connected to the next. The nodes of the first, input layer represent the input data, i.e. the feature set; the remaining layers map the input feature set to the output layer by linearly combining the nodes of the preceding layer with weights w and biases b and applying an activation function. The Gaussian kernel 24, here a support vector machine (SVM) with a Gaussian kernel, is a binary classification model whose basic form is a maximum-margin linear classifier defined on the feature space; the maximum-margin property distinguishes it from the multilayer perceptron 23. In addition, the SVM can use the kernel trick, which makes it essentially a non-linear classifier. The learning strategy of the SVM is margin maximization, which can be formulated as a convex quadratic programming problem and is also equivalent to minimizing a regularized hinge loss function.
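A minimal sketch of how the four branches and the Choquet-integral fusion could be wired together with scikit-learn. The hyper-parameters, the per-branch densities, the fuzzy measure chosen (min(1, sum of densities)), the use of the "abnormal" class probability as each branch's score, and the 0.5 decision threshold are all assumptions; the patent only states that the four classifiers are fused by Choquet fuzzy integration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

def choquet_fuse(scores, densities):
    """Discrete Choquet integral of the per-branch scores.  The fuzzy measure
    of a subset A is taken as min(1, sum of its densities): monotone and
    normalised, but purely illustrative."""
    order = np.argsort(scores)[::-1]          # branches sorted by score, descending
    fused, prev_g = 0.0, 0.0
    for i in range(len(order)):
        g = min(1.0, sum(densities[j] for j in order[:i + 1]))
        fused += scores[order[i]] * (g - prev_g)
        prev_g = g
    return fused

branches = [
    DecisionTreeClassifier(),                                  # decision tree 21
    GradientBoostingClassifier(),                              # GBDT 22
    MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500),  # multilayer perceptron 23
    SVC(kernel="rbf", probability=True),                       # Gaussian-kernel SVM 24
]
densities = [0.3, 0.3, 0.3, 0.3]   # per-branch importances (assumed)

def train(X, y):
    for clf in branches:
        clf.fit(X, y)

def is_abnormal(x):
    """Fuse the four branches' probabilities of the 'abnormal' class with the
    Choquet integral and threshold the fused score."""
    scores = np.array([clf.predict_proba(x.reshape(1, -1))[0, 1]
                       for clf in branches])
    return choquet_fuse(scores, densities) > 0.5
```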
In step S24, the acquired noise is input to the quaternary classifier to determine whether or not an abnormal noise component can be separated.
In step S12, when it is determined that the noise includes an abnormal noise component, an early warning is issued, so that through the warning the operating staff can take timely measures against the transformer fault.
In addition to issuing an early warning for the abnormal noise component, for the separated normal noise component the active noise reduction method further includes generating a corresponding digital noise reduction signal according to the normal noise component of the noise, and inputting the digital noise reduction signal to the noise cancellation module to cancel the noise.
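The patent does not specify how the digital noise reduction signal is synthesized; a common approach, sketched here purely as an assumption, is to emit a phase-inverted copy of the predicted normal noise component.

```python
import numpy as np

def noise_reduction_signal(normal_component):
    """Anti-noise: a 180-degree phase-inverted copy of the predicted normal
    noise component, to be sent to the noise cancellation module."""
    return -np.asarray(normal_component, dtype=float)
```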
In another aspect, the present invention further provides an active noise reduction system for a transformer, as shown in fig. 8, which may include a noise acquisition module 40, a noise cancellation module 41, and a processor 43. The noise acquisition module 40 is used to acquire the real-time noise of the transformer; the noise cancellation module 41 may be activated to cancel the noise; and the processor 43 may be connected to the noise acquisition module 40 and the noise cancellation module 41 for executing the active noise reduction method described above. The noise acquisition module 40 may have any structure known to those skilled in the art. In a preferred example of the present invention, the noise acquisition module 40 may include a microphone array for collecting field noise, a current collector for collecting real-time current data of the transformer, and a vibration sensor for collecting the vibration frequency of the transformer, and the noise cancellation module 41 may include a rare-earth giant magnetostrictive ultrasonic transducer serving as an active vibration device.
In yet another aspect, the present invention also provides a storage medium that may store instructions readable by a machine to cause the machine to perform the active noise reduction method as described above.
Although the embodiments of the present invention have been described in detail with reference to the accompanying drawings, the embodiments of the present invention are not limited to the details of the above embodiments, and various simple modifications can be made to the technical solution of the embodiments of the present invention within the technical idea of the embodiments of the present invention, and the simple modifications all belong to the protection scope of the embodiments of the present invention.
It should be noted that the various features described in the above embodiments may be combined in any suitable manner without departing from the scope of the invention. In order to avoid unnecessary repetition, the embodiments of the present invention will not be described separately for the various possible combinations.
Those skilled in the art will understand that all or part of the steps of the methods of the above embodiments may be implemented by a program instructing the related hardware, the program being stored in a storage medium and including several instructions that enable a single-chip microcomputer, a chip, or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In addition, the various embodiments of the present invention may be combined with one another arbitrarily; as long as such combinations do not depart from the spirit of the embodiments of the present invention, they should likewise be regarded as part of the disclosure of the embodiments of the present invention.

Claims (9)

1. An active noise reduction method for a transformer, the active noise reduction method comprising:
acquiring the noise of the transformer;
judging whether the noise comprises an abnormal noise component;
and sending out early warning under the condition that the noise comprises the abnormal noise component.
2. The active noise reduction method according to claim 1, wherein determining whether the noise includes an abnormal noise component specifically comprises:
extracting each identified noise feature of the transformer in a historical time period to form a feature set, wherein the feature set comprises a plurality of noise features and influence factors corresponding to each noise feature;
preprocessing the noise features in the feature set;
performing a filtering operation for each of the noise features in the feature set;
inputting the feature set into a quaternary classifier to train the quaternary classifier;
inputting the acquired noise into the quaternary classifier to determine whether an abnormal noise component can be separated.
3. The active noise reduction method according to claim 2, wherein the influencing factors comprise a current covariance matrix and/or a subband energy variation ratio.
4. The active noise reduction method according to claim 2, wherein preprocessing the noise features in the feature set specifically comprises:
performing data cleaning on the feature set by adopting a compressed neighbor rule algorithm;
carrying out normalization processing on the feature set by adopting a characterization standard method;
and performing data dimension reduction on the feature set by adopting a local linear embedding LLE algorithm.
5. The active noise reduction method according to claim 2, wherein performing a filtering operation for each of the noise features in the feature set specifically comprises:
and sequentially executing feature deletion, feature selection and feature derivation operations on the noise features.
6. The active noise reduction method of claim 2, wherein the quaternary classifier comprises:
an input layer for receiving the feature set of an input;
a decision tree connected to the input layer for performing a first classification operation on the feature set;
a gradient boosting iterative decision tree connected to the input layer for performing a second classification operation on the feature set;
a multi-tier perceptron connected to the input tier for performing a third classification operation on the feature set;
the Gaussian kernel is connected with the input layer and is used for executing a fourth classification operation on the feature set;
and the fusion layer is connected with the decision tree, the gradient boosting iterative decision tree, the multilayer perceptron and the Gaussian kernel and is used for fusing results of the first classification operation, the second classification operation, the third classification operation and the fourth classification operation by using Choquet fuzzy integration and outputting a final classification result.
7. The active noise reduction method of claim 1, further comprising:
generating a corresponding digital noise reduction signal according to the normal noise component of the noise;
inputting a digital noise reduction signal to a noise cancellation module to cancel the noise.
8. An active noise reduction system for a transformer, the active noise reduction system comprising:
the noise acquisition module is used for acquiring real-time noise of the transformer;
a noise cancellation module configured to be activated to cancel the noise; and
a processor connected to the noise acquisition module and the noise cancellation module, for executing the active noise reduction method according to any one of claims 1 to 7.
9. A storage medium storing instructions for reading by a machine to cause the machine to perform the active noise reduction method of any of claims 1 to 7.
CN202010871206.7A 2020-08-26 2020-08-26 Active noise reduction method and system for transformer and storage medium Active CN112151249B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010871206.7A CN112151249B (en) 2020-08-26 2020-08-26 Active noise reduction method and system for transformer and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010871206.7A CN112151249B (en) 2020-08-26 2020-08-26 Active noise reduction method and system for transformer and storage medium

Publications (2)

Publication Number Publication Date
CN112151249A true CN112151249A (en) 2020-12-29
CN112151249B CN112151249B (en) 2024-04-02

Family

ID=73888971

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010871206.7A Active CN112151249B (en) 2020-08-26 2020-08-26 Active noise reduction method and system for transformer and storage medium

Country Status (1)

Country Link
CN (1) CN112151249B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9008329B1 (en) * 2010-01-26 2015-04-14 Audience, Inc. Noise reduction using multi-feature cluster tracker
CN103971908A (en) * 2014-05-06 2014-08-06 国家电网公司 Transformer noise suppression method
CN107402064A (en) * 2017-07-25 2017-11-28 上海控创信息技术股份有限公司 noise detecting method and system
CN108414079A (en) * 2018-03-21 2018-08-17 广东电网有限责任公司电力科学研究院 A kind of transformer noise monitoring system
CN109374119A (en) * 2018-09-29 2019-02-22 国网山西省电力公司阳泉供电公司 Transformer vibration signal Characteristic Extraction method
CN111063528A (en) * 2019-11-25 2020-04-24 深圳供电局有限公司 Active noise reduction equipment of transformer and transformer noise on-line monitoring system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
汪泽州等 (Wang Zezhou et al.): "小区配电变压器主动降噪装置的研制探析" [Analysis of the Development of an Active Noise Reduction Device for Residential-District Distribution Transformers], 《现代工业经济和信息化》 (Modern Industrial Economy and Informationization), No. 12, pages 73-75 *

Also Published As

Publication number Publication date
CN112151249B (en) 2024-04-02


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant