CN117520818A - Power fingerprint identification method and device, electronic equipment and storage medium - Google Patents
Power fingerprint identification method and device, electronic equipment and storage medium
- Publication number
- CN117520818A (application number CN202311479044.2A)
- Authority
- CN
- China
- Prior art keywords
- network
- fine tuning
- data
- training
- tuning network
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING; G06F18/00—Pattern recognition
- G06F18/213—Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
- G06F18/10—Pre-processing; Data cleansing
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS; G06N3/00—Computing arrangements based on biological models; G06N3/02—Neural networks
- G06N3/044—Recurrent networks, e.g. Hopfield networks
- G06N3/0464—Convolutional networks [CNN, ConvNet]
- G06N3/084—Backpropagation, e.g. using gradient descent
- G06N3/0985—Hyperparameter optimisation; Meta-learning; Learning-to-learn
Abstract
The invention discloses a power fingerprint identification method and device, an electronic device and a storage medium, which are used for solving the problems of low power fingerprint identification efficiency and a large amount of calculation. The method comprises the following steps: collecting electrical characteristic quantities of various preset electrical devices to construct an operation database; extracting a training data set from the operation database according to a downstream task; building a fine tuning network for the downstream task and combining it with a pre-training network to generate a power fingerprint large model to be trained; inputting the training data set into the power fingerprint large model to be trained and training the network parameters of the fine tuning network to obtain adjustment parameters; adjusting the fine tuning network with the adjustment parameters; judging whether the fine tuning network converges; if not, returning to the step of inputting the training data set into the power fingerprint large model to be trained and training the network parameters of the fine tuning network to obtain adjustment parameters; if yes, generating the power fingerprint large model from the adjusted fine tuning network and the pre-training network; and executing the downstream task with the power fingerprint large model to obtain the power fingerprint of the downstream task.
Description
Technical Field
The present invention relates to the field of fingerprint identification technologies, and in particular, to a method and apparatus for power fingerprint identification, an electronic device, and a storage medium.
Background
In recent years, with the rapid development of artificial intelligence in the power industry, power grids have become increasingly intelligent, and load identification and monitoring have become research hotspots. Electrical equipment fingerprint identification installs a power fingerprint identification terminal at the user's power inlet, so that detailed information about the user's electricity consumption can be obtained. For household users, this technology reveals the energy consumption of each electrical device, so that corresponding energy-saving measures can be taken to reduce energy consumption and electricity costs; on the grid side, it supports functions such as intelligent scheduling, self-healing and energy optimization in the smart grid, enabling intelligent and efficient grid management.
Power fingerprint identification has many application scenarios, such as load prediction, load identification and power quality monitoring. The conventional approach on the market is to train a separate artificial intelligence model for each task, but this consumes huge computing and storage resources and suffers from complex algorithms, large data requirements and low model updating efficiency, resulting in low power fingerprint identification efficiency and a large amount of calculation.
Disclosure of Invention
The invention provides a power fingerprint identification method and device, an electronic device and a storage medium, which are used for solving the technical problems of low power fingerprint identification efficiency and a large amount of calculation.
The invention provides a power fingerprint identification method, which comprises the following steps:
collecting various preset electrical characteristic quantities of electrical equipment, and constructing an operation database by adopting the electrical characteristic quantities;
receiving a downstream task, and extracting a training data set from the operation database according to the downstream task;
building a fine tuning network of the downstream task, and generating a large power fingerprint model to be trained by combining a pre-training network;
inputting the training data set into the electric fingerprint large model to be trained, and training network parameters of the fine tuning network to obtain adjustment parameters;
adjusting the fine tuning network by adopting the adjustment parameters;
judging whether the fine tuning network converges or not;
if not, returning to input the training data set into the electric fingerprint large model to be trained, and training the network parameters of the fine tuning network to obtain adjustment parameters;
if yes, generating a large electric fingerprint model by adopting the adjusted fine tuning network and the pre-training network;
and executing the downstream task by adopting the power fingerprint large model to obtain the power fingerprint of the downstream task.
Optionally, the step of collecting electrical characteristic quantities of a plurality of preset electrical devices and constructing an operation database by adopting the electrical characteristic quantities includes:
collecting various preset electrical characteristic quantities of electrical equipment through a preset sensor;
data cleaning is carried out on the electrical characteristic quantity to obtain cleaned data;
carrying out data marking on the cleaned data to obtain labels of all the cleaned data;
and constructing an operation database by adopting the cleaned data and the corresponding labels.
Optionally, the step of performing data cleaning on the electrical feature quantity to obtain cleaned data includes:
normalizing the electrical characteristic quantity to obtain normalized data;
and performing outlier processing on the normalized data to obtain cleaned data.
Optionally, the step of determining whether the fine-tuning network converges includes:
calculating a current loss function value of the fine tuning network;
acquiring a last loss function value of the fine tuning network;
calculating the difference value between the current loss function value and the last loss function value;
and judging whether the fine-tuning network converges or not by adopting the difference value and a preset convergence threshold value.
The invention also provides a device for identifying the electric fingerprint, which comprises:
the operation database construction module is used for collecting various preset electrical characteristic quantities of the electrical equipment and constructing an operation database by adopting the electrical characteristic quantities;
the training data set extraction module is used for receiving a downstream task and extracting a training data set from the operation database according to the downstream task;
the electric fingerprint large model construction module is used for constructing a fine tuning network of the downstream task and generating an electric fingerprint large model to be trained by combining a pre-training network;
the adjustment parameter generation module is used for inputting the training data set into the electric fingerprint large model to be trained, and training the network parameters of the fine tuning network to obtain adjustment parameters;
the fine tuning network adjusting module is used for adjusting the fine tuning network by adopting the adjusting parameters;
the convergence judging module is used for judging whether the fine tuning network converges or not;
the return module is used for, if the fine tuning network does not converge, returning to the step of inputting the training data set into the electric fingerprint large model to be trained, and training the network parameters of the fine tuning network to obtain adjustment parameters;
the power fingerprint large model generation module is used for, if the fine tuning network converges, generating a power fingerprint large model by adopting the adjusted fine tuning network and the pre-training network;
and the power fingerprint identification module is used for executing the downstream task by adopting the power fingerprint large model to obtain the power fingerprint of the downstream task.
Optionally, the operation database construction module includes:
the electrical characteristic quantity acquisition sub-module is used for acquiring electrical characteristic quantities of various preset electrical equipment through a preset sensor;
the data cleaning sub-module is used for cleaning the electrical characteristic quantity data to obtain cleaned data;
the data marking sub-module is used for carrying out data marking on the cleaned data to obtain labels of all the cleaned data;
and the operation database construction sub-module is used for constructing an operation database by adopting the cleaned data and the corresponding labels.
Optionally, the data cleansing submodule includes:
the normalization unit is used for carrying out normalization processing on the electrical characteristic quantity to obtain normalized data;
and the abnormal value processing unit is used for carrying out abnormal value processing on the normalized data to obtain cleaned data.
Optionally, the convergence judging module includes:
a current loss function value calculation sub-module for calculating a current loss function value of the fine tuning network;
a last loss function value obtaining sub-module, configured to obtain a last loss function value of the fine tuning network;
a difference value calculation sub-module, configured to calculate a difference value between the current loss function value and the last loss function value;
and the convergence judging sub-module is used for judging whether the fine-tuning network converges or not by adopting the difference value and a preset convergence threshold value.
The invention also provides an electronic device comprising a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to perform the power fingerprint identification method described above according to instructions in the program code.
The present invention also provides a computer readable storage medium for storing program code for performing the power fingerprint identification method as described in any one of the above.
From the above technical scheme, the invention has the following advantages: the invention discloses a power fingerprint identification method, which comprises the following steps: collecting electrical characteristic quantities of various preset electrical devices, and constructing an operation database with the electrical characteristic quantities; receiving a downstream task, and extracting a training data set from the operation database according to the downstream task; building a fine tuning network for the downstream task, and combining it with a pre-training network to generate a power fingerprint large model to be trained; inputting the training data set into the power fingerprint large model to be trained, and training the network parameters of the fine tuning network to obtain adjustment parameters; adjusting the fine tuning network with the adjustment parameters; judging whether the fine tuning network converges; if not, returning to the step of inputting the training data set into the power fingerprint large model to be trained and training the network parameters of the fine tuning network to obtain adjustment parameters; if yes, generating the power fingerprint large model from the adjusted fine tuning network and the pre-training network; and executing the downstream task with the power fingerprint large model to obtain the power fingerprint of the downstream task. With the parameters of the pre-training network kept unchanged, the invention quickly obtains an optimal power fingerprint large model matched with the downstream task by updating only the fine tuning network, which reduces the training overhead of the model, improves the efficiency of power fingerprint identification, and provides powerful support for intelligent multi-modal analysis of users' electricity consumption information.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, it being obvious that the drawings in the description below are only some embodiments of the invention, and that other drawings can be obtained from these drawings without inventive faculty for a person skilled in the art.
Fig. 1 is a flowchart of steps of a power fingerprint identification method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a power fingerprint large model training process according to an embodiment of the present invention;
fig. 3 is a block diagram of a power fingerprint identification device according to an embodiment of the present invention.
Detailed Description
The embodiment of the invention provides a power fingerprint identification method, a device, electronic equipment and a storage medium, which are used for solving the technical problems of low power fingerprint identification efficiency and large calculation amount.
In order to make the objects, features and advantages of the present invention more comprehensible, the technical solutions in the embodiments of the present invention are described in detail below with reference to the accompanying drawings, and it is apparent that the embodiments described below are only some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, fig. 1 is a flowchart illustrating steps of a power fingerprint identification method according to an embodiment of the present invention.
The invention provides a power fingerprint identification method, which specifically comprises the following steps:
step 101, collecting various preset electrical characteristic quantities of electrical equipment, and constructing an operation database by adopting the electrical characteristic quantities;
in the embodiment of the invention, the preset electrical equipment can be common electrical equipment such as a washing machine, a thermos, a fan, an air conditioner and the like in a household.
The electrical characteristic quantity may include an operating current value, an operating voltage value, an operating temperature, and the like of the electrical device. By collecting the electrical characteristic quantity of the electrical equipment, the operation data of the electrical equipment can be generated, so that an operation database of the electrical equipment is formed.
The operation data of the electrical equipment refer to characteristics displayed when the electrical equipment is in an operation state, and the characteristics comprise a current value, a voltage value, a temperature value, active power, reactive power, a power factor, current harmonic waves, current abrupt change quantity and power abrupt change quantity of the electrical equipment in a specific working mode, and the like. And respectively storing the operation data and the labels of the electrical equipment acquired by the sensor as a group of training samples into an operation database.
In one example, the step of collecting a plurality of electrical characteristic quantities of the preset electrical equipment and constructing an operation database by adopting the electrical characteristic quantities may include the following substeps:
s11, acquiring electrical characteristic quantities of various preset electrical equipment through a preset sensor;
s12, data cleaning is carried out on the electrical characteristic quantity to obtain cleaned data;
in the embodiment of the invention, the working current of the electrical equipment can be acquired by adopting the current sensor, the working voltage of the electrical equipment can be acquired by adopting the voltage sensor, and the working temperature of the electrical equipment can be acquired by adopting the temperature sensor. And then data preprocessing is carried out on the acquired data.
Wherein the data preprocessing may include data cleansing and data tagging. And (3) carrying out data cleaning and data marking on the acquired data to obtain labels of cleaned data which are cleaned, so that an operation database is built by combining the labels.
In one example, the step of data cleaning the electrical feature quantities to obtain cleaned data may include the sub-steps of:
s121, carrying out normalization processing on the electrical characteristic quantity to obtain normalized data;
s122, performing outlier processing on the normalized data to obtain cleaned data.
In a specific implementation, data cleaning may include normalization processing and outlier processing. Normalization refers to scaling each input vector to the interval [-1, 1] with min-max scaling:
x' = 2·(x − x_min)/(x_max − x_min) − 1
wherein x is the current input vector, x' is the normalized value of the current input vector, x_max is the maximum value over all input vectors, and x_min is the minimum value over all input vectors.
Outlier processing refers to interpolating missing values or discontinuous data that occur during sensor acquisition and transmission, for example by linear interpolation between adjacent known samples:
x = x_1 + (x_2 − x_1)·(t − t_1)/(t_2 − t_1)
wherein t is the sampling instant to be interpolated, t_1 and t_2 are the known sampling instants, x_1 and x_2 are the corresponding known input vectors, and x is the input vector to be interpolated.
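As an illustrative aid rather than part of the disclosure, the following minimal Python sketch implements the two cleaning operations above, assuming the samples are held in NumPy arrays; the function names are hypothetical.

```python
import numpy as np

def min_max_normalize(x: np.ndarray) -> np.ndarray:
    """Scale an input vector to the interval [-1, 1] using min-max scaling."""
    x_min, x_max = x.min(), x.max()
    return 2.0 * (x - x_min) / (x_max - x_min) - 1.0

def interpolate_missing(t: float, t1: float, t2: float, x1: float, x2: float) -> float:
    """Linearly interpolate the value at instant t from known samples (t1, x1) and (t2, x2)."""
    return x1 + (x2 - x1) * (t - t1) / (t2 - t1)

# Example: repair one missing current sample at t = 2, then normalize the waveform.
current = np.array([0.1, 0.4, np.nan, 1.0, 0.7])
current[2] = interpolate_missing(2.0, 1.0, 3.0, current[1], current[3])
cleaned = min_max_normalize(current)
```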
S13, carrying out data marking on the cleaned data to obtain labels of all the cleaned data;
after the data cleaning is completed, all cleaned data can be subjected to data marking to obtain labels of all cleaned data.
Data marking refers to attaching a label to each input vector according to the given downstream task. For a classification task, the label can be the type of each electrical device; for a regression task, the label can be time-series data; for a multi-modal task, the labels can be associated text, time series and other data.
S14, constructing an operation database by adopting the cleaned data and the corresponding labels.
In a specific implementation, the cleaned data may be used to generate operation data of the electrical device, where the operation data may include a current value, a voltage value, a temperature value, active power, reactive power, a power factor, a current harmonic, a current abrupt change and a power abrupt change when the electrical device switches in a specific working mode. And respectively taking the operation data and the labels as a group of training samples, and constructing an operation database of the electrical equipment.
The operation database of the electrical equipment can store both self-acquired private data and public data sets such as REDD and REFIT.
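Purely as an illustrative sketch (not part of the patent disclosure), one possible in-memory organization of such an operation database is shown below; the record fields follow the feature list above, and the class and field names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class OperationSample:
    """One training sample: cleaned operation data of an electrical device plus its label."""
    current: float
    voltage: float
    temperature: float
    active_power: float
    reactive_power: float
    power_factor: float
    label: str  # e.g. the device type for a classification task

operation_database: list[OperationSample] = []

# Append a cleaned, labelled record for a washing machine in one working mode.
operation_database.append(OperationSample(
    current=2.1, voltage=220.0, temperature=31.5,
    active_power=450.0, reactive_power=60.0,
    power_factor=0.98, label="washing_machine",
))
```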
Step 102, receiving a downstream task, and extracting a training data set from an operation database according to the downstream task;
In the embodiment of the invention, corresponding operation data can be extracted from the operation database according to the requirements of the downstream task, and the extracted operation data is then divided into a training data set, a verification data set and a test data set by random sampling in a given ratio. For example, the division may be made in a ratio of 3:1:1 or 7:2:1. Those skilled in the art can select the ratio according to actual needs, and the embodiment of the present invention is not particularly limited in this respect.
Wherein the training data set is used for power fingerprint large model tuning parameters to minimize the loss function. The verification data set is used for tuning and selecting the optimal super parameters of the power fingerprint large model. The test dataset was used to evaluate the generalization ability and performance of the trained power fingerprint large model.
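For illustration only, a minimal sketch of the random, proportional split described above (here 3:1:1) is given below; the helper name is hypothetical and any equivalent shuffling routine could be used.

```python
import random

def split_dataset(samples, ratios=(3, 1, 1), seed=42):
    """Randomly split samples into training/verification/test sets in the given ratio."""
    rng = random.Random(seed)
    shuffled = list(samples)
    rng.shuffle(shuffled)
    total = sum(ratios)
    n_train = len(shuffled) * ratios[0] // total
    n_val = len(shuffled) * ratios[1] // total
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_val],
            shuffled[n_train + n_val:])

train_set, val_set, test_set = split_dataset(list(range(100)))  # placeholder samples
```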
Step 103, constructing a fine tuning network of a downstream task, and generating a large power fingerprint model to be trained by combining a pre-training network;
in the embodiment of the invention, the pre-training network can be formed by stacking convolutional neural networks, can be formed by stacking cyclic neural networks, or is any neural network model based on deep learning.
For example, a gated recurrent neural network may be employed to construct a pre-trained model, the principle of which is as follows:
z_t = σ(W_z · [h_{t−1}, x_t] + b_z)
r_t = σ(W_r · [h_{t−1}, x_t] + b_r)
h̃_t = tanh(W_h · [r_t ⊙ h_{t−1}, x_t] + b_h)
h_t = (1 − z_t) ⊙ h_{t−1} + z_t ⊙ h̃_t
wherein z_t and r_t are the output values of the update gate and the forget (reset) gate at time t; h̃_t and h_t are the candidate and final hidden-layer state values at time t; h_{t−1} is the hidden-layer state value at time t−1; x_t is the input value at time t; W is a weight matrix; b is a bias matrix; and ⊙ denotes element-wise multiplication.
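For illustration only, a pre-training network of this kind might be assembled from stacked GRU layers as in the following sketch (assuming PyTorch, which the patent does not mandate; class and parameter names are assumptions).

```python
import torch
import torch.nn as nn

class PretrainNetwork(nn.Module):
    """Stacked GRU encoder used as the pre-training network for power fingerprint tasks."""
    def __init__(self, n_features: int, hidden_size: int = 64,
                 num_layers: int = 2, n_outputs: int = 10):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden_size,
                          num_layers=num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, n_outputs)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_features) sequences of electrical characteristic quantities
        out, _ = self.gru(x)          # applies the z_t / r_t gating described above
        return self.head(out[:, -1])  # last hidden state feeds the task head

model = PretrainNetwork(n_features=8)
logits = model(torch.randn(4, 128, 8))  # 4 sequences, 128 time steps, 8 features
```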
The fine tuning network suited to the downstream task is a network with the same structure as the part of the pre-training network to be fine-tuned; it can correspond to the weight matrix W of a certain layer of the pre-training network, a bias matrix b, or any other parameter matrix in the neural network. The fine tuning network stores the incremental weights, that is, the weight update generated during fine tuning:
h=Wx+ΔWx
where x is the network input, h is the network output, W is the pre-training weight, and ΔW is the incremental weight.
Step 104, inputting a training data set into the power fingerprint large model to be trained, and training network parameters of the fine tuning network to obtain adjustment parameters;
step 105, adopting adjustment parameters to adjust the fine tuning network;
in the embodiment of the invention, the training data set can be input into the power fingerprint large model to be trained so as to update the parameters of the fine tuning network in the power fingerprint large model to be trained by using the training data set, and then the super parameters of the model are adjusted by using the verification data set. The specific process is shown in fig. 2.
Wherein, training the network parameters of the fine tuning network first requires using matrices A and B to approximate ΔW:
ΔW=BA
where A and B are low-rank matrices. In one example, the low-rank matrix A is initialized with a Gaussian distribution and the low-rank matrix B is initialized to zero; alternatively, zero initialization is used for the low-rank matrix A and Gaussian initialization for the low-rank matrix B. The purpose is that, at the start of training, the fine tuning network does not alter the original network output. The number of fine-tuning parameters is reduced by the low-rank approximation of the incremental weights ΔW.
Further, super parameters alpha and beta are introduced in training to adjust the performance of the model in the training process, so that:
h=Wx+αBAx
wherein α is a super parameter of the power fingerprint large model to be trained and is used for controlling the learning rate of the power fingerprint large model to be trained.
Then, fixing a pre-training network of the electric fingerprint large model to be trained, and updating parameters of the fine-tuning network by using a gradient descent method:
θ ← θ − μ·∇_θ L
wherein θ denotes the network parameters of the fine tuning network, μ is the learning rate of the gradient descent, and ∇_θ L is the gradient of the loss function with respect to the network parameters of the fine tuning network.
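The sketch below illustrates this fine-tuning scheme end to end under the assumption of PyTorch and a single linear layer: the pre-trained weight W is frozen, ΔW is approximated by the low-rank product BA (A Gaussian-initialized, B zero-initialized) scaled by α, and only A and B are updated by gradient descent. The class name LoRALinear and all numeric values are illustrative, not taken from the patent.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen pre-trained weight W plus trainable low-rank update: h = Wx + alpha*(BA)x."""
    def __init__(self, in_features: int, out_features: int, rank: int = 4, alpha: float = 1.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features), requires_grad=False)
        self.A = nn.Parameter(torch.randn(rank, in_features) * 0.01)  # Gaussian initialization
        self.B = nn.Parameter(torch.zeros(out_features, rank))        # zero init, so ΔW = 0 at start
        self.alpha = alpha

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        delta_w = self.B @ self.A                          # ΔW = BA (low-rank)
        return x @ (self.weight + self.alpha * delta_w).T  # h = Wx + αBAx

layer = LoRALinear(in_features=8, out_features=8)
optimizer = torch.optim.SGD([layer.A, layer.B], lr=1e-3)  # θ ← θ − μ∇θL over fine-tuning params only

x, target = torch.randn(16, 8), torch.randn(16, 8)
loss = nn.functional.mse_loss(layer(x), target)
loss.backward()
optimizer.step()
```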
Step 106, judging whether the fine tuning network is converged;
after the network parameters of the fine tuning network are updated, whether the fine tuning network is converged or not can be judged to judge whether the fine tuning network is updated iteratively or not.
In one example, the step of determining whether the fine-tuning network converges may include the sub-steps of:
s61, calculating a current loss function value of the fine tuning network;
s62, obtaining a last loss function value of the fine tuning network;
s63, calculating the difference value between the current loss function value and the last loss function value;
s64, judging whether the fine tuning network converges or not by adopting the difference value and a preset convergence threshold value.
In a specific implementation, when the change between two iterations of the loss function of the verification data set is smaller than the preset convergence threshold, the power fingerprint large model to be trained can be considered to be converged, namely:
|L^(t) − L^(t+1)| ≤ ε
wherein L^(t) is the loss function value of the t-th iteration and L^(t+1) is the loss function value of the (t+1)-th iteration; ε is a convergence threshold set in advance and represents the allowable range of change of the loss function.
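As a minimal sketch (assuming the loss is computed on the verification data set at each iteration; the threshold value is illustrative), the convergence test can be written as:

```python
def has_converged(prev_loss: float, curr_loss: float, epsilon: float = 1e-4) -> bool:
    """Converged when the change of the loss between two iterations is within epsilon."""
    return abs(prev_loss - curr_loss) <= epsilon

# Training-loop skeleton: keep updating the fine tuning network until convergence.
prev = float("inf")
for step in range(1000):
    curr = 0.1 / (step + 1)  # placeholder for the verification-set loss L(t)
    if has_converged(prev, curr):
        break
    prev = curr
```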
Step 107, if not, returning to the step of inputting the training data set into the power fingerprint large model to be trained and training the network parameters of the fine tuning network to obtain the adjustment parameters;
If the change of the loss function on the verification data set between two iterations is not smaller than the preset convergence threshold, the process returns to step 104 and the iterative update of the network parameters of the fine tuning network continues.
Step 108, if yes, generating a large electric fingerprint model by adopting the adjusted fine tuning network and the pre-training network;
and after the fine tuning network is updated, combining the fine tuning network and the pre-training network to form a trained power fingerprint large model. And then evaluating the recognition effect of the trained power fingerprint large model on the test data set, namely completing the whole power fingerprint recognition flow.
And step 109, executing the downstream task by adopting the power fingerprint large model to obtain the power fingerprint of the downstream task.
After the electric fingerprint large model is obtained through training, a given downstream task can be executed by using the trained electric fingerprint large model, and electric fingerprint identification of the downstream task is completed.
According to the embodiment of the invention, under the condition that parameters of the pre-training network are kept unchanged, only the fine-tuning network is updated, the optimal power fingerprint large model matched with the downstream task is rapidly obtained, and the training expense of the large model is reduced, so that powerful support is provided for intelligent multi-mode analysis of the power utilization information of the user. In addition, the electric fingerprint large model training method provided by the embodiment of the invention has strong generalization capability, is convenient to implement, can be used for conveniently carrying out model training and deployment on various electric fingerprint identification scenes, and has certain practical application value.
Referring to fig. 3, fig. 3 is a block diagram illustrating a power fingerprint identification apparatus according to an embodiment of the present invention.
The embodiment of the invention also provides a device for identifying the electric fingerprint, which comprises the following components:
an operation database construction module 301, configured to collect electrical characteristic quantities of a plurality of preset electrical devices, and construct an operation database by using the electrical characteristic quantities;
a training data set extraction module 302, configured to receive a downstream task, and extract a training data set from the operation database according to the downstream task;
the power fingerprint large model construction module 303 is configured to construct a fine tuning network of a downstream task, and combine with a pre-training network to generate a power fingerprint large model to be trained;
the adjustment parameter generation module 304 is configured to input a training data set into the large power fingerprint model to be trained, and train network parameters of the fine tuning network to obtain adjustment parameters;
a fine tuning network adjustment module 305 for adjusting the fine tuning network using the adjustment parameters;
a convergence judging module 306, configured to judge whether the fine tuning network converges;
a return module 307, configured to, if the fine tuning network does not converge, return to the step of inputting the training data set into the power fingerprint large model to be trained and train the network parameters of the fine tuning network to obtain the adjustment parameters;
a power fingerprint large model generation module 308, configured to, if the fine tuning network converges, generate the power fingerprint large model by adopting the adjusted fine tuning network and the pre-training network;
the power fingerprint identification module 309 is configured to perform a downstream task by using the power fingerprint large model, and obtain a power fingerprint of the downstream task.
In an embodiment of the present invention, the operation database construction module 301 includes:
the electrical characteristic quantity acquisition sub-module is used for acquiring electrical characteristic quantities of various preset electrical equipment through a preset sensor;
the data cleaning sub-module is used for cleaning the electrical characteristic quantity data to obtain cleaned data;
the data marking sub-module is used for carrying out data marking on the cleaned data to obtain labels of all the cleaned data;
and the operation database construction sub-module is used for constructing an operation database by adopting the cleaned data and the corresponding labels.
In an embodiment of the present invention, a data cleansing sub-module includes:
the normalization unit is used for carrying out normalization processing on the electrical characteristic quantity to obtain normalized data;
and the abnormal value processing unit is used for carrying out abnormal value processing on the normalized data to obtain cleaned data.
In the embodiment of the present invention, the convergence judging module 306 includes:
a current loss function value calculation sub-module for calculating a current loss function value of the fine tuning network;
the last loss function value acquisition sub-module is used for acquiring a last loss function value of the fine tuning network;
the difference value calculation sub-module is used for calculating the difference value between the current loss function value and the last loss function value;
and the convergence judging sub-module is used for judging whether the fine tuning network converges or not by adopting the difference value and a preset convergence threshold value.
The embodiment of the invention also provides electronic equipment, which comprises a processor and a memory:
the memory is used for storing the program codes and transmitting the program codes to the processor;
the processor is used for executing the power fingerprint identification method according to the embodiment of the invention according to the instructions in the program code.
The embodiment of the invention also provides a computer readable storage medium, which is used for storing program codes, and the program codes are used for executing the electric fingerprint identification method of the embodiment of the invention.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described by differences from other embodiments, and identical and similar parts between the embodiments are all enough to be referred to each other.
It will be apparent to those skilled in the art that embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the invention may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it is further noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article or terminal device comprising the element.
The above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.
Claims (10)
1. A method of power fingerprinting, comprising:
collecting various preset electrical characteristic quantities of electrical equipment, and constructing an operation database by adopting the electrical characteristic quantities;
receiving a downstream task, and extracting a training data set from the operation database according to the downstream task;
building a fine tuning network of the downstream task, and generating a large power fingerprint model to be trained by combining a pre-training network;
inputting the training data set into the electric fingerprint large model to be trained, and training network parameters of the fine tuning network to obtain adjustment parameters;
adjusting the fine tuning network by adopting the adjustment parameters;
judging whether the fine tuning network converges or not;
if not, returning to input the training data set into the electric fingerprint large model to be trained, and training the network parameters of the fine tuning network to obtain adjustment parameters;
if yes, generating a large electric fingerprint model by adopting the adjusted fine tuning network and the pre-training network;
and executing the downstream task by adopting the power fingerprint large model to obtain the power fingerprint of the downstream task.
2. The method of claim 1, wherein the step of collecting a plurality of preset electrical characteristic quantities of the electrical equipment and constructing an operation database using the electrical characteristic quantities comprises:
collecting various preset electrical characteristic quantities of electrical equipment through a preset sensor;
data cleaning is carried out on the electrical characteristic quantity to obtain cleaned data;
carrying out data marking on the cleaned data to obtain labels of all the cleaned data;
and constructing an operation database by adopting the cleaned data and the corresponding labels.
3. The method of claim 1, wherein the step of data cleaning the electrical characteristic quantities to obtain cleaned data comprises:
normalizing the electrical characteristic quantity to obtain normalized data;
and performing outlier processing on the normalized data to obtain cleaned data.
4. The method of claim 1, wherein the step of determining whether the fine-tuning network converges comprises:
calculating a current loss function value of the fine tuning network;
acquiring a last loss function value of the fine tuning network;
calculating the difference value between the current loss function value and the last loss function value;
and judging whether the fine-tuning network converges or not by adopting the difference value and a preset convergence threshold value.
5. A power fingerprint recognition device, comprising:
the operation database construction module is used for collecting various preset electrical characteristic quantities of the electrical equipment and constructing an operation database by adopting the electrical characteristic quantities;
the training data set extraction module is used for receiving a downstream task and extracting a training data set from the operation database according to the downstream task;
the electric fingerprint large model construction module is used for constructing a fine tuning network of the downstream task and generating an electric fingerprint large model to be trained by combining a pre-training network;
the adjustment parameter generation module is used for inputting the training data set into the electric fingerprint large model to be trained, and training the network parameters of the fine tuning network to obtain adjustment parameters;
the fine tuning network adjusting module is used for adjusting the fine tuning network by adopting the adjusting parameters;
the convergence judging module is used for judging whether the fine tuning network converges or not;
the return module is used for, if the fine tuning network does not converge, returning to the step of inputting the training data set into the electric fingerprint large model to be trained, and training the network parameters of the fine tuning network to obtain adjustment parameters;
the power fingerprint large model generation module is used for, if the fine tuning network converges, generating a power fingerprint large model by adopting the adjusted fine tuning network and the pre-training network;
and the power fingerprint identification module is used for executing the downstream task by adopting the power fingerprint large model to obtain the power fingerprint of the downstream task.
6. The apparatus of claim 5, wherein the operation database construction module comprises:
the electrical characteristic quantity acquisition sub-module is used for acquiring electrical characteristic quantities of various preset electrical equipment through a preset sensor;
the data cleaning sub-module is used for cleaning the electrical characteristic quantity data to obtain cleaned data;
the data marking sub-module is used for carrying out data marking on the cleaned data to obtain labels of all the cleaned data;
and the operation database construction sub-module is used for constructing an operation database by adopting the cleaned data and the corresponding labels.
7. The apparatus of claim 5, wherein the data cleansing submodule comprises:
the normalization unit is used for carrying out normalization processing on the electrical characteristic quantity to obtain normalized data;
and the abnormal value processing unit is used for carrying out abnormal value processing on the normalized data to obtain cleaned data.
8. The apparatus of claim 5, wherein the convergence determination module comprises:
a current loss function value calculation sub-module for calculating a current loss function value of the fine tuning network;
a last loss function value obtaining sub-module, configured to obtain a last loss function value of the fine tuning network;
a difference value calculation sub-module, configured to calculate a difference value between the current loss function value and the last loss function value;
and the convergence judging sub-module is used for judging whether the fine-tuning network converges or not by adopting the difference value and a preset convergence threshold value.
9. An electronic device, the device comprising a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to perform the power fingerprinting method of any of claims 1-4 according to instructions in the program code.
10. A computer readable storage medium for storing program code for performing the power fingerprinting method of any one of claims 1-4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311479044.2A CN117520818A (en) | 2023-11-08 | 2023-11-08 | Power fingerprint identification method and device, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311479044.2A CN117520818A (en) | 2023-11-08 | 2023-11-08 | Power fingerprint identification method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117520818A true CN117520818A (en) | 2024-02-06 |
Family
ID=89744956
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311479044.2A Pending CN117520818A (en) | 2023-11-08 | 2023-11-08 | Power fingerprint identification method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117520818A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118336877A (en) * | 2024-06-13 | 2024-07-12 | 江苏达盈智慧科技股份有限公司 | Battery charging large model anomaly identification method based on electric power fingerprint |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019141040A1 (en) * | 2018-01-22 | 2019-07-25 | 佛山科学技术学院 | Short term electrical load predication method |
CN111079861A (en) * | 2019-12-31 | 2020-04-28 | 国网北京市电力公司 | Power distribution network voltage abnormity diagnosis method based on image rapid processing technology |
CN114372979A (en) * | 2022-02-23 | 2022-04-19 | 吉林化工学院 | Transferable electric power fingerprint depth identification method |
CN114519293A (en) * | 2021-12-27 | 2022-05-20 | 国网山西省电力公司阳泉供电公司 | Cable body fault identification method based on hand sample machine learning model |
CN116028595A (en) * | 2023-01-17 | 2023-04-28 | 国网甘肃省电力公司信息通信公司 | Automatic identification method based on unstructured document content |
CN116363457A (en) * | 2023-03-17 | 2023-06-30 | 阿里云计算有限公司 | Task processing, image classification and data processing method of task processing model |
CN116432023A (en) * | 2023-03-10 | 2023-07-14 | 广东电网有限责任公司广州供电局 | Novel power system fault classification method based on sample transfer learning |
- 2023-11-08: CN application CN202311479044.2A — publication CN117520818A (status: active, Pending)
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019141040A1 (en) * | 2018-01-22 | 2019-07-25 | 佛山科学技术学院 | Short term electrical load predication method |
CN111079861A (en) * | 2019-12-31 | 2020-04-28 | 国网北京市电力公司 | Power distribution network voltage abnormity diagnosis method based on image rapid processing technology |
CN114519293A (en) * | 2021-12-27 | 2022-05-20 | 国网山西省电力公司阳泉供电公司 | Cable body fault identification method based on hand sample machine learning model |
CN114372979A (en) * | 2022-02-23 | 2022-04-19 | 吉林化工学院 | Transferable electric power fingerprint depth identification method |
CN116028595A (en) * | 2023-01-17 | 2023-04-28 | 国网甘肃省电力公司信息通信公司 | Automatic identification method based on unstructured document content |
CN116432023A (en) * | 2023-03-10 | 2023-07-14 | 广东电网有限责任公司广州供电局 | Novel power system fault classification method based on sample transfer learning |
CN116363457A (en) * | 2023-03-17 | 2023-06-30 | 阿里云计算有限公司 | Task processing, image classification and data processing method of task processing model |
Non-Patent Citations (1)
Title |
---|
KONG Xiangyu; ZHENG Feng; E Zhijun; CAO Jing; WANG Xin: "Short-term load forecasting method based on deep belief network" (基于深度信念网络的短期负荷预测方法), Automation of Electric Power Systems (电力系统自动化), no. 05, 24 January 2018 (2018-01-24) *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118336877A (en) * | 2024-06-13 | 2024-07-12 | 江苏达盈智慧科技股份有限公司 | Battery charging large model anomaly identification method based on electric power fingerprint |
CN118336877B (en) * | 2024-06-13 | 2024-08-20 | 江苏达盈智慧科技股份有限公司 | Battery charging large model anomaly identification method based on electric power fingerprint |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Kaselimi et al. | Context aware energy disaggregation using adaptive bidirectional LSTM models | |
Deka et al. | Learning for DC-OPF: Classifying active sets using neural nets | |
CN108416695B (en) | Power load probability density prediction method, system and medium based on deep learning | |
CN111860982B (en) | VMD-FCM-GRU-based wind power plant short-term wind power prediction method | |
CN110309874B (en) | Negative sample screening model training method, data screening method and data matching method | |
Roubos et al. | Compact and transparent fuzzy models and classifiers through iterative complexity reduction | |
Dou et al. | Hybrid model for renewable energy and loads prediction based on data mining and variational mode decomposition | |
CN109635928A (en) | A kind of voltage sag reason recognition methods based on deep learning Model Fusion | |
CN117520818A (en) | Power fingerprint identification method and device, electronic equipment and storage medium | |
Raghavendra et al. | Artificial humming bird with data science enabled stability prediction model for smart grids | |
Xu et al. | A mixture of HMM, GA, and Elman network for load prediction in cloud-oriented data centers | |
CN109391515A (en) | Network failure prediction technique and system based on dove group's algorithm optimization support vector machines | |
CN116363452B (en) | Task model training method and device | |
Mei-Ying et al. | Chaotic time series prediction using least squares support vector machines | |
Mohammed et al. | GA-optimized fuzzy-based MPPT technique for abruptly varying environmental conditions | |
CN110781595A (en) | Energy use efficiency PUE prediction method, device, terminal and medium | |
Tornai et al. | Classification for consumption data in smart grid based on forecasting time series | |
CN116245019A (en) | Load prediction method, system, device and storage medium based on Bagging sampling and improved random forest algorithm | |
Berberidis et al. | Data-adaptive active sampling for efficient graph-cognizant classification | |
CN107844872B (en) | Short-term wind speed forecasting method for wind power generation | |
CN116307111A (en) | Reactive load prediction method based on K-means clustering and random forest algorithm | |
CN115907000A (en) | Small sample learning method for optimal power flow prediction of power system | |
CN115271198A (en) | Net load prediction method and device of photovoltaic equipment | |
CN113158446B (en) | Non-invasive electrical load identification method | |
CN115528684A (en) | Ultra-short-term load prediction method and device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||