CN115186771A - DBN-ELM-based equipment power consumption feature classification method and device - Google Patents
- Publication number
- CN115186771A (application number CN202211102149.1A)
- Authority
- CN
- China
- Prior art keywords
- elm
- dbn
- power consumption
- model
- layer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06N3/04 — Neural networks; architecture, e.g. interconnection topology
- G06N3/084 — Learning methods; backpropagation, e.g. using gradient descent
- G06Q50/06 — ICT specially adapted for business processes of specific sectors; energy or water supply
- H02J3/00 — Circuit arrangements for AC mains or AC distribution networks
- H02J2203/10 — Power transmission or distribution systems management focusing at grid level, e.g. load flow analysis, meshed network optimisation
- H02J2203/20 — Simulating, e.g. planning, reliability check, modelling or computer-assisted design (CAD)
- Y02D10/00 — Energy-efficient computing, e.g. low-power processors, power management or thermal management
Abstract
The application discloses a DBN-ELM-based equipment power consumption feature classification method and device. The method comprises the following steps: acquiring power consumption data of equipment to be processed; and inputting the power consumption data into a pre-trained DBN-ELM model and outputting a classification result of the equipment's power consumption features, where the DBN-ELM model is obtained by improving a DBN model with an ELM algorithm. The method and device improve the accuracy of the power consumption feature classification result, allow equipment with abnormal power consumption behavior to be identified effectively, and reduce the operating cost of the power company.
Description
Technical Field
The application relates to the field of power operation and maintenance, in particular to a DBN-ELM-based equipment power consumption feature classification method and device.
Background
The power loss caused by non-technical loss (NTL) during power system operation brings significant economic losses to power supply companies, and its concrete form is abnormal electricity use by power equipment. Accurately classifying the power consumption behavior of power equipment is therefore the technical basis for controlling grid NTL reasonably and efficiently.
In the related art, methods for classifying the electricity-use behavior of power equipment rely too heavily on the quality of feature extraction, so the classification accuracy for abnormal electricity use fluctuates widely across different periods, which limits how much a power company's operating cost can be reduced.
Disclosure of Invention
The present application is directed to solving, at least to some extent, one of the technical problems in the related art.
Therefore, an objective of the application is to address the accuracy fluctuation of existing classification methods for the electricity-use behavior of power equipment by providing an equipment power consumption feature classification method based on a DBN-ELM (Deep Belief Network-Extreme Learning Machine).
Another objective of the present application is to provide a device power consumption feature classification apparatus based on DBN-ELM.
In order to achieve the above object, a first aspect of the present application provides a DBN-ELM-based device power consumption feature classification method, including the following steps:
acquiring power consumption data of equipment to be processed;
and inputting the power consumption data of the equipment to be processed into a pre-trained DBN-ELM model, and outputting a classification result of the power consumption characteristics of the equipment, wherein the DBN-ELM model is obtained by improving the DBN model according to an ELM algorithm.
In a possible embodiment, before inputting the power consumption data of the device to be processed into the pre-trained DBN-ELM model, the method includes:
acquiring equipment power consumption training data;
training an RBM part model in the DBN-ELM initial model based on the equipment power consumption training data to obtain a DBN-ELM intermediate model;
and determining the weight and the bias of an ELM partial model in the DBN-ELM intermediate model according to an ELM algorithm to obtain the pre-trained DBN-ELM model.
In one possible implementation, the energy function of the RBM is:

$$E(v,h\mid\theta) = -\sum_{i=1}^{n} a_i v_i - \sum_{j=1}^{m} b_j h_j - \sum_{i=1}^{n}\sum_{j=1}^{m} v_i w_{ij} h_j$$

where $n$ is the number of nodes of the visible layer of the RBM, $m$ is the number of nodes of the hidden layer, $v_i$ is the state of the $i$-th visible-layer node, $h_j$ is the state of the $j$-th hidden-layer node, $\theta=\{w,a,b\}$ denotes the distribution parameters, $w_{ij}$ is the weight between the $i$-th visible-layer node and the $j$-th hidden-layer node, and $a_i$ and $b_j$ are the interlayer biases of the $i$-th visible-layer node and the $j$-th hidden-layer node, respectively.
In one possible embodiment, the ELM partial model is:

$$\sum_{j=1}^{L} \beta_j\, g(w_j \cdot x + b_j) = o$$

where $L$ is the number of nodes of the hidden layer, $Q$ is the number of nodes of the output layer, $\beta_j$ is the link weight from hidden-layer node $j$ to the output-layer nodes, $g(\cdot)$ is the activation function of the hidden layer, $w_j$ is the link weight from the input-layer nodes to hidden-layer node $j$, $x$ is the output matrix of the input layer, $b_j$ is the bias from the input-layer nodes to hidden-layer node $j$, and $o$ is the matrix corresponding to the output-layer nodes.
In order to achieve the above object, a second aspect of the present application provides a device power consumption feature classification apparatus based on a DBN-ELM, including:
the first acquisition module is used for acquiring power consumption data of the equipment to be processed;
and the output module is used for inputting the power consumption data of the equipment to be processed into a pre-trained DBN-ELM model and outputting the classification result of the power consumption characteristics of the equipment, wherein the DBN-ELM model is obtained by improving the DBN model according to an ELM algorithm.
In a possible implementation, the apparatus further includes, before the output module:
the second acquisition module is used for acquiring the power consumption training data of the equipment;
the training module is used for training an RBM partial model in the DBN-ELM initial model based on the power consumption training data of the equipment to obtain a DBN-ELM intermediate model;
and the determining module is used for determining the weight and the bias of an ELM part model in the DBN-ELM intermediate model according to an ELM algorithm to obtain the pre-trained DBN-ELM model.
A third aspect of the present application provides an electronic device comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the DBN-ELM based device power consumption feature classification method according to any of the first aspect.
A fourth aspect of the present application provides a computer-readable storage medium, wherein instructions, when executed by a processor of an electronic device, enable the electronic device to perform the DBN-ELM based device power consumption feature classification method according to any one of the first aspect.
A fifth aspect of the present application provides a computer program product comprising a computer program which, when being executed by a processor, implements the DBN-ELM based device power consumption feature classification method according to any of the first aspect.
Beneficial effects of the application:
In the embodiments of the application, the power consumption data of the equipment to be processed is acquired, the data is then input into a pre-trained DBN-ELM model, and the classification result of the equipment's power consumption features is output, where the DBN-ELM model is obtained by improving the DBN model with an ELM algorithm. The method and device improve the accuracy of the classification result, effectively identify equipment with abnormal power consumption behavior, and reduce the operating cost of the power company.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a flowchart of a DBN-ELM-based device power consumption feature classification method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a DBN model of the prior art;
FIG. 3 is a schematic diagram of a prior art RBM;
FIG. 4 is a schematic diagram of a DBN-ELM model according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of a DBN-ELM-based device power consumption feature classification apparatus according to an embodiment of the present application;
fig. 6 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
In order to make the technical solutions of the present application better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The DBN-ELM-based device power consumption feature classification method, apparatus, electronic device, and computer-readable storage medium according to embodiments of the present application are described below with reference to the accompanying drawings, and first, the DBN-ELM-based device power consumption feature classification method according to embodiments of the present application will be described with reference to the accompanying drawings.
Fig. 1 is a flowchart of a DBN-ELM-based device power consumption feature classification method according to an embodiment of the present application.
As shown in fig. 1, the DBN-ELM-based device power consumption feature classification method includes:
and step S110, acquiring the power consumption data of the equipment to be processed.
In this embodiment of the application, the power consumption data of the device to be processed can be acquired. The data may be the metered power consumption of a user, collected per time period, for example user A's power consumption for each day of May.
And step S120, inputting the power consumption data of the equipment to be processed into a pre-trained DBN-ELM model, and outputting the classification result of the power consumption characteristics of the equipment.
Wherein, the DBN-ELM model is obtained by improving the DBN model according to an ELM algorithm.
In the embodiment of the application, after the power consumption data of the device to be processed is obtained, the power consumption data of the device to be processed can be input into a pre-trained DBN-ELM model, and the classification result of the power consumption characteristics of the device can be output after the power consumption data of the device to be processed is processed by the DBN-ELM model.
It should be noted that the purpose of improving the DBN model with the ELM algorithm is to overcome the slow convergence of the DBN model's original BP-layer algorithm and its tendency to fall into local optima.
In order to make the DBN-ELM model more clear, the DBN model, the ELM model, and the DBN-ELM model are further described below.
The core structure of the DBN model consists of multiple stacked Restricted Boltzmann Machines (RBMs) and a classification layer based on back propagation (BP). Fig. 2 is a schematic structural diagram of a DBN model in the prior art. As shown in fig. 2, the DBN model includes an input layer, hidden layer 1, hidden layer 2, and an output layer, where the input layer and hidden layer 1 form the RBM1 partial model, hidden layer 1 and hidden layer 2 form the RBM2 partial model, and hidden layer 2 and the output layer form the BP partial model. Layer-by-layer mapping through the pre-trained RBMs (RBM1 and RBM2) automatically extracts features from the input data, yielding latent features of the data samples. The BP neural network is trained by error back propagation: a sample data set is input into the network and propagated forward through the hidden layers to the output layer; the error between the computed output and the true result is taken as the loss function; according to this loss function, the weights of each network layer are corrected step by step backwards from the output layer until the input layer is reached; this cycle is repeated, continuously updating the network weights, until the error between the computed output and the true value falls within an allowed range, at which point the loop terminates. In the DBN model, the BP partial model performs classification on the data features extracted by the RBMs. During training of the DBN model, the RBM parameters can be fine-tuned according to the set loss function of the classification result so as to satisfy its requirements.
In the feature extraction process, the DBN model relies mainly on the RBMs to complete layer-by-layer training. Fig. 3 is a schematic structural diagram of an RBM in the prior art. As shown in fig. 3, the RBM comprises a visible layer and a hidden layer: $v_i$ denotes a visible-layer node, $h_j$ denotes a hidden-layer node, $w_{ij}$ denotes the weight between a visible-layer node and a hidden-layer node, in one-to-one correspondence with the visible-hidden link relations, $a_i$ denotes the interlayer bias of a visible-layer node, and $b_j$ denotes the interlayer bias of a hidden-layer node.
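The layer-by-layer RBM training described above is commonly implemented with one-step contrastive divergence (CD-1). The following is a minimal illustrative sketch, assuming binary units and NumPy; it is not the patent's implementation, and all names are placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, a, b, lr=0.1):
    """One contrastive-divergence (CD-1) update for a binary RBM.

    v0 : (batch, n_visible) visible-layer states
    W  : (n_visible, n_hidden) weights
    a  : (n_visible,) visible biases; b : (n_hidden,) hidden biases
    """
    # Positive phase: hidden probabilities and samples given the data.
    ph0 = sigmoid(v0 @ W + b)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step back to the visible layer and up again.
    pv1 = sigmoid(h0 @ W.T + a)
    ph1 = sigmoid(pv1 @ W + b)
    # Update parameters from the difference of data and model statistics.
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
    a += lr * (v0 - pv1).mean(axis=0)
    b += lr * (ph0 - ph1).mean(axis=0)
    return W, a, b

# Toy usage: pretrain on a small binary batch.
v = rng.integers(0, 2, size=(8, 6)).astype(float)
W = 0.01 * rng.standard_normal((6, 4))
a, b = np.zeros(6), np.zeros(4)
for _ in range(100):
    W, a, b = cd1_step(v, W, a, b)
```

In a DBN, the hidden probabilities of one trained RBM would serve as the visible input of the next, which is the layer-by-layer scheme the paragraph describes.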
The ELM model is a feedforward neural network in which the hidden-layer node biases and the weights between the input layer and the hidden layer are never updated; only the number of hidden-layer nodes needs to be set, and the optimal output weights are determined in a single pass of learning.
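The single-pass learning described above can be sketched as follows. This is a hypothetical NumPy illustration, not the patent's code: the input-to-hidden weights and biases are fixed at random, and only the hidden-to-output weights are solved in closed form via the pseudoinverse.

```python
import numpy as np

def elm_fit(X, T, n_hidden=32, seed=0):
    """Fit a single-hidden-layer ELM: fix random W, b; solve beta = pinv(H) @ T."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # input->hidden weights, never updated
    b = rng.standard_normal(n_hidden)                # hidden biases, never updated
    H = np.tanh(X @ W + b)                           # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                     # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy two-class usage with one-hot targets.
rng = np.random.default_rng(1)
X = rng.standard_normal((40, 5))
labels = (X[:, 0] > 0).astype(int)   # synthetic stand-in labels
T = np.eye(2)[labels]
W, b, beta = elm_fit(X, T)
pred = elm_predict(X, W, b, beta).argmax(axis=1)
```

Because the only learned parameters are `beta`, training reduces to one matrix pseudoinverse, which is why the ELM needs no iterative weight updates.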
In the DBN-ELM model, the DBN completes feature extraction and the ELM completes the classification task, combining the DBN's automatic feature extraction with the ELM's efficient learning. Fig. 4 is a schematic structural diagram of a DBN-ELM model according to an embodiment of the present application. As shown in fig. 4, compared with the DBN model of fig. 2, the ELM partial model of the DBN-ELM model comprises hidden layer 1, hidden layer 2, and the output layer, while the RBM part is kept unchanged. Hidden layer 1 then serves as the input layer of the ELM, so fig. 4 shows a DBN-ELM model containing a single-hidden-layer ELM.
In the embodiment of the application, the power consumption data of the equipment to be processed is obtained, then the power consumption data of the equipment to be processed is input into a pre-trained DBN-ELM model, and the classification result of the power consumption characteristics of the equipment is output, wherein the DBN-ELM model is obtained by improving the DBN model according to an ELM algorithm. The method and the device can improve the accuracy of the classification result of the power consumption characteristics of the equipment, further effectively identify the equipment with abnormal power consumption behaviors, and reduce the operation cost of a power company.
In one possible implementation, before inputting the power consumption data of the device to be processed into the pre-trained DBN-ELM model, the method includes:
acquiring device power consumption training data;
training an RBM partial model in the DBN-ELM initial model based on the power consumption training data of the equipment to obtain a DBN-ELM intermediate model;
and determining the weight and the bias of an ELM part model in the DBN-ELM intermediate model according to an ELM algorithm to obtain a pre-trained DBN-ELM model.
Wherein, the DBN-ELM initial model can be an untrained DBN-ELM model, and the DBN-ELM intermediate model can be a DBN-ELM model which completes the training of the RBM partial model.
In this embodiment of the application, the device power consumption training data can be acquired together with the DBN-ELM initial model. The training data is then preprocessed into standardized data; a portion of the standardized data is set aside and labelled, and the RBM partial model in the DBN-ELM initial model is trained without supervision on the remaining unlabelled data to obtain the DBN-ELM intermediate model. Finally, the labelled portion is input into the DBN-ELM intermediate model for feature extraction, and classification is completed through the efficient learning of the ELM algorithm, thereby determining the weights and biases of the ELM partial model. The resulting pre-trained DBN-ELM model can then be used to complete the classification task on the power consumption data of the device to be processed.
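The training flow above (standardize, pretrain feature layers without labels, then solve the ELM head on the labelled subset) can be sketched end to end. This is a simplified stand-in, not the patent's pipeline: the RBM pretraining step is replaced by fixed random sigmoid layers purely to keep the sketch self-contained, and all data and labels are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Step 1: standardize the raw consumption data (per-feature z-score).
X_raw = rng.standard_normal((60, 10)) * 3 + 5     # stand-in for daily kWh readings
X = (X_raw - X_raw.mean(0)) / X_raw.std(0)

# Step 2: unsupervised feature layers (stand-in for pretrained RBM1/RBM2).
# Real code would learn W1, W2 with contrastive divergence; random weights
# are used here only so the sketch runs on its own.
W1 = rng.standard_normal((10, 16))
W2 = rng.standard_normal((16, 8))
features = sigmoid(sigmoid(X @ W1) @ W2)

# Step 3: solve the ELM head on the labelled subset in closed form.
labels = (X[:, 0] > 0).astype(int)                # stand-in normal/abnormal labels
T = np.eye(2)[labels]
Wh = rng.standard_normal((8, 24))
bh = rng.standard_normal(24)
H = np.tanh(features @ Wh + bh)                   # hidden-layer output matrix
beta = np.linalg.pinv(H) @ T                      # single-shot ELM solution

pred = (np.tanh(features @ Wh + bh) @ beta).argmax(1)
```

The division of labour mirrors the embodiment: steps 1-2 need no labels, and the only supervised computation is the closed-form solve in step 3.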
In one possible embodiment, the energy function of the RBM is:

$$E(v,h\mid\theta) = -\sum_{i=1}^{n} a_i v_i - \sum_{j=1}^{m} b_j h_j - \sum_{i=1}^{n}\sum_{j=1}^{m} v_i w_{ij} h_j$$

where $n$ is the number of nodes of the visible layer of the RBM, $m$ is the number of nodes of the hidden layer, $v_i$ is the state of the $i$-th visible-layer node, $h_j$ is the state of the $j$-th hidden-layer node, $\theta=\{w,a,b\}$ denotes the distribution parameters, $w_{ij}$ is the weight between the $i$-th visible-layer node and the $j$-th hidden-layer node, and $a_i$ and $b_j$ are the interlayer biases of the $i$-th visible-layer node and the $j$-th hidden-layer node, respectively.
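Evaluated numerically, the energy function is a pair of bias terms plus a bilinear term. A small NumPy check with illustrative values (not from the patent):

```python
import numpy as np

def rbm_energy(v, h, W, a, b):
    """E(v, h | theta) = -sum_i a_i*v_i - sum_j b_j*h_j - sum_ij v_i*W_ij*h_j."""
    return -a @ v - b @ h - v @ W @ h

v = np.array([1.0, 0.0, 1.0])      # visible-layer states
h = np.array([1.0, 1.0])           # hidden-layer states
W = np.full((3, 2), 0.5)           # visible-to-hidden weights w_ij
a = np.array([0.1, 0.2, 0.3])      # visible interlayer biases
b = np.array([0.4, 0.5])           # hidden interlayer biases
E = rbm_energy(v, h, W, a, b)      # = -(0.4) - (0.9) - (2.0) = -3.3
```

Lower energy corresponds to higher joint probability under the Boltzmann distribution used in the next formula.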
It should be noted that the joint probability distribution of the visible-layer nodes and the hidden-layer nodes under the distribution parameters can be represented by the following formula:

$$P(v,h\mid\theta) = \frac{e^{-E(v,h\mid\theta)}}{Z(\theta)}$$

where $Z(\theta)$ is the partition function used for normalization, which can be represented by the following equation:

$$Z(\theta) = \sum_{v}\sum_{h} e^{-E(v,h\mid\theta)}$$
further, the essence of layer-by-layer training of the RBM partial model is to solve distribution parameters and fit samples. Under the condition that the capacity of the training sample is T, the distribution parameters can be solved according to a maximum likelihood method, and the formula is as follows:
wherein,a mathematical expectation representing a joint probability distribution function,hiding layer nodes when representing visible layer nodes as samplesThe distribution function of (2).
In one possible implementation, the ELM partial model is:

$$\sum_{j=1}^{L} \beta_j\, g(w_j \cdot x + b_j) = o$$

where $L$ is the number of nodes of the hidden layer, $Q$ is the number of nodes of the output layer, $\beta_j$ is the link weight from hidden-layer node $j$ to the output-layer nodes, $g(\cdot)$ is the activation function of the hidden layer, $w_j$ is the link weight from the input-layer nodes to hidden-layer node $j$, $x$ is the output matrix of the input layer, $b_j$ is the bias from the input-layer nodes to hidden-layer node $j$, and $o$ is the matrix corresponding to the output-layer nodes.
It should be noted that, in the case of $n$ samples, the ELM model network may be defined as follows:

$$\sum_{j=1}^{L} \beta_j\, g(w_j \cdot x_i + b_j) = o_i, \qquad i = 1, 2, \ldots, n$$

where $\beta_j$ is the link weight from hidden-layer node $j$ to the output-layer nodes, $g(\cdot)$ is the activation function of the hidden layer, $w_j$ is the link weight from the input-layer nodes to hidden-layer node $j$, $b_j$ is the bias from the input-layer nodes to hidden-layer node $j$, $x_i$ is the input matrix corresponding to input-layer sample $i$, and $o_i$ is the corresponding result matrix.
In the case where the single-hidden-layer feedforward neural network fits the $n$ samples without error, this can be represented by:

$$H\beta = T$$

where $H$ is the hidden-layer output matrix, $\beta$ is the link weight matrix from the hidden-layer nodes to the output-layer nodes, and $T$ is the target matrix; the zero-error condition is $\sum_{i=1}^{n}\|o_i - t_i\| = 0$. Since $w_j$ and $b_j$ are obtained randomly before training begins, the model can be solved once the number of ELM hidden-layer nodes and the activation function are determined. Thus, the ELM model training process may be as follows:
(1) Set the number of ELM hidden-layer nodes, and randomly set the bias matrix $b$ from the input-layer nodes to the hidden-layer nodes and the link weight matrix $w$ from the input-layer nodes to the hidden-layer nodes;
(2) Compute the hidden-layer output matrix $H$ using the chosen activation function;
(3) Solve the link weights $\beta$ from the hidden layer to the output layer.
It can be understood that, with $\beta$ denoting the hidden-to-output weights in the formula of the ELM partial model, the minimized output error can be represented by the following equation:

$$\|H\hat{\beta} - T\| = \min_{\beta}\|H\beta - T\|$$

From the above analysis, the solution problem for the weights and biases of the ELM partial model can be converted into finding $\hat{w}_j$, $\hat{b}_j$, and $\hat{\beta}$ ($j = 1, \ldots, L$) such that:

$$\big\|H(\hat{w}_1,\ldots,\hat{w}_L,\hat{b}_1,\ldots,\hat{b}_L)\,\hat{\beta} - T\big\| = \min_{w,b,\beta}\big\|H(w_1,\ldots,w_L,b_1,\ldots,b_L)\,\beta - T\big\|$$

Since $w$ and $b$ in the ELM are fixed after random initialization, $H$ is uniquely determined; with $T$ as the expected output of the model, the solution can be represented by the following formula:

$$\hat{\beta} = H^{+}T$$

where $H^{+}$ is the Moore-Penrose generalized inverse of $H$.
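The closed-form solution via the generalized inverse can be checked directly with NumPy's pseudoinverse; the matrices below are illustrative random values, not patent data. Since the pseudoinverse gives the global least-squares minimizer, any other choice of weights yields an equal or larger residual:

```python
import numpy as np

rng = np.random.default_rng(0)
H = rng.standard_normal((20, 6))   # hidden-layer output matrix (n x L)
T = rng.standard_normal((20, 3))   # target matrix (n x Q)

beta = np.linalg.pinv(H) @ T       # beta_hat = H^+ T, the least-squares solution

# The optimal residual is no larger than that of any perturbed weight matrix.
res_opt = np.linalg.norm(H @ beta - T)
res_other = np.linalg.norm(H @ (beta + 0.1) - T)
```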
in order to implement the foregoing embodiment, as shown in fig. 5, in this embodiment, there is further provided an apparatus 500 for classifying power consumption characteristics of devices based on DBN-ELM, where the apparatus 500 includes: a first obtaining module 510 and an outputting module 520.
A first obtaining module 510, configured to obtain power consumption data of a device to be processed;
and an output module 520, configured to input the power consumption data of the device to be processed into a pre-trained DBN-ELM model, and output a device power consumption feature classification result, where the DBN-ELM model is obtained by improving the DBN model according to an ELM algorithm.
In a possible implementation, the apparatus 500 further includes, before the output module 520:
the second acquisition module is used for acquiring the power consumption training data of the equipment;
the training module is used for training an RBM partial model in the DBN-ELM initial model based on the equipment power consumption training data to obtain a DBN-ELM intermediate model;
and the determining module is used for determining the weight and the bias of an ELM part model in the DBN-ELM intermediate model according to an ELM algorithm to obtain a pre-trained DBN-ELM model.
In one possible embodiment, the energy function of the RBM is:

$$E(v,h\mid\theta) = -\sum_{i=1}^{n} a_i v_i - \sum_{j=1}^{m} b_j h_j - \sum_{i=1}^{n}\sum_{j=1}^{m} v_i w_{ij} h_j$$

where $n$ is the number of nodes of the visible layer of the RBM, $m$ is the number of nodes of the hidden layer, $v_i$ is the state of the $i$-th visible-layer node, $h_j$ is the state of the $j$-th hidden-layer node, $\theta=\{w,a,b\}$ denotes the distribution parameters, $w_{ij}$ is the weight between the $i$-th visible-layer node and the $j$-th hidden-layer node, and $a_i$ and $b_j$ are the interlayer biases of the $i$-th visible-layer node and the $j$-th hidden-layer node, respectively.
In one possible embodiment, the ELM partial model is:

$$\sum_{j=1}^{L} \beta_j\, g(w_j \cdot x + b_j) = o$$

where $L$ is the number of nodes of the hidden layer, $Q$ is the number of nodes of the output layer, $\beta_j$ is the link weight from hidden-layer node $j$ to the output-layer nodes, $g(\cdot)$ is the activation function of the hidden layer, $w_j$ is the link weight from the input-layer nodes to hidden-layer node $j$, $x$ is the output matrix of the input layer, $b_j$ is the bias from the input-layer nodes to hidden-layer node $j$, and $o$ is the matrix corresponding to the output-layer nodes.
In the embodiment of the application, the first obtaining module obtains the power consumption data of the device to be processed, and the output module inputs that data into the pre-trained DBN-ELM model and outputs the device power consumption feature classification result, where the DBN-ELM model is obtained by improving a DBN model with an ELM algorithm. The method and the device can improve the accuracy of the device power consumption feature classification result, effectively identify devices with abnormal power consumption behavior, and reduce the operating costs of the power company.
It should be noted that the foregoing explanation of the DBN-ELM-based device power consumption feature classification method embodiment is also applicable to the DBN-ELM-based device power consumption feature classification apparatus of this embodiment, and details are not repeated here.
There is also provided, in accordance with an embodiment of the present application, an electronic device, a computer-readable storage medium, and a computer program product.
FIG. 6 illustrates a schematic block diagram of an example electronic device 600 that can be used to implement embodiments of the present disclosure. The electronic device 600 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not intended to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 6, the electronic device 600 includes a computing unit 601, which can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 602 or a computer program loaded from a storage unit 608 into a Random Access Memory (RAM) 603. The RAM 603 can also store various programs and data required for the operation of the device 600. The computing unit 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
Various components in the electronic device 600 are connected to the I/O interface 605, including: an input unit 606 such as a keyboard, a mouse, and the like; an output unit 607 such as various types of displays, speakers, and the like; a storage unit 608, such as a magnetic disk, optical disk, or the like; and a communication unit 609 such as a network card, modem, wireless communication transceiver, etc. The communication unit 609 allows the electronic device 600 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The computing unit 601 may be any of various general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 601 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, or microcontroller. The computing unit 601 performs the methods and processes described above, such as the DBN-ELM-based device power consumption feature classification method. For example, in some embodiments, the DBN-ELM-based device power consumption feature classification method can be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 600 via the ROM 602 and/or the communication unit 609. When the computer program is loaded into the RAM 603 and executed by the computing unit 601, one or more steps of the DBN-ELM-based device power consumption feature classification method described above may be performed. Alternatively, in other embodiments, the computing unit 601 may be configured to perform the DBN-ELM-based device power consumption feature classification method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/acts specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), the internet, and blockchain networks.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the defects of difficult management and weak service scalability found in traditional physical hosts and Virtual Private Server (VPS) services. The server may also be a server of a distributed system, or a server incorporating a blockchain.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are exemplary and should not be construed as limiting the present application and that changes, modifications, substitutions and alterations in the above embodiments may be made by those of ordinary skill in the art within the scope of the present application.
Claims (8)
1. A DBN-ELM-based device power consumption feature classification method is characterized by comprising the following steps:
acquiring power consumption data of equipment to be processed;
and inputting the power consumption data of the equipment to be processed into a pre-trained DBN-ELM model, and outputting a classification result of the power consumption characteristics of the equipment, wherein the DBN-ELM model is obtained by improving the DBN model according to an ELM algorithm.
2. The method of claim 1, wherein before the inputting of the device power consumption data to be processed into the pre-trained DBN-ELM model, the method further comprises:
acquiring device power consumption training data;
training an RBM partial model in the DBN-ELM initial model based on the power consumption training data of the equipment to obtain a DBN-ELM intermediate model;
and determining the weight and the bias of an ELM partial model in the DBN-ELM intermediate model according to an ELM algorithm to obtain the pre-trained DBN-ELM model.
3. The method of claim 2,
the RBM energy function is:
wherein,a number of nodes representing a visible layer of the RBM,a number of nodes representing a hidden layer of the RBM,is shown asThe state of the nodes of the individual visual layers,is shown asThe state of each hidden layer node is then,the parameters of the distribution are represented by,is shown asA visual layer node andthe weight between the nodes of the hidden layer,andrespectively representA visual layer node and the firstAn inter-layer bias of hidden layer nodes.
4. The method of claim 2,
the ELM partial model is as follows:
wherein,the number of nodes representing the hidden layer,indicates the number of nodes of the output layer,node representing hidden layerThe link weight to the output layer node,an activation function representing a hidden layer,representing nodes of an input layer to nodes of a hidden layerThe weight of the link of (a) is,an output matrix representing the input layer is then generated,representing nodes of an input layer to nodes of a hidden layerIn the above-described embodiment, the bias of (c),node representing output layerThe corresponding zero matrix.
5. An apparatus for classifying power consumption characteristics of a device based on a DBN-ELM, comprising:
the first acquisition module is used for acquiring power consumption data of the equipment to be processed;
and the output module is used for inputting the power consumption data of the equipment to be processed into a pre-trained DBN-ELM model and outputting the classification result of the power consumption characteristics of the equipment, wherein the DBN-ELM model is obtained by improving the DBN model according to an ELM algorithm.
6. The apparatus of claim 5, wherein the apparatus further comprises:
the second acquisition module is used for acquiring the power consumption training data of the equipment;
the training module is used for training an RBM partial model in the DBN-ELM initial model based on the power consumption training data of the equipment to obtain a DBN-ELM intermediate model;
and the determining module is used for determining the weight and the bias of an ELM part model in the DBN-ELM middle model according to an ELM algorithm to obtain the pre-trained DBN-ELM model.
7. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the DBN-ELM based device power consumption feature classification method of any one of claims 1 to 4.
8. A computer-readable storage medium, wherein instructions, when executed by a processor of an electronic device, enable the electronic device to perform the DBN-ELM based device power consumption feature classification method of any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211102149.1A CN115186771A (en) | 2022-09-09 | 2022-09-09 | DBN-ELM-based equipment power consumption feature classification method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115186771A true CN115186771A (en) | 2022-10-14 |
Family
ID=83524239
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211102149.1A Pending CN115186771A (en) | 2022-09-09 | 2022-09-09 | DBN-ELM-based equipment power consumption feature classification method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115186771A (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3098762A1 (en) * | 2015-05-29 | 2016-11-30 | Samsung Electronics Co., Ltd. | Data-optimized neural network traversal |
CN110119755A (en) * | 2019-03-22 | 2019-08-13 | 国网浙江省电力有限公司信息通信分公司 | Electricity method for detecting abnormality based on Ensemble learning model |
CN110378286A (en) * | 2019-07-19 | 2019-10-25 | 东北大学 | A kind of Power Quality Disturbance classification method based on DBN-ELM |
CN111131237A (en) * | 2019-12-23 | 2020-05-08 | 深圳供电局有限公司 | Microgrid attack identification method based on BP neural network and grid-connected interface device |
CN112598114A (en) * | 2020-12-17 | 2021-04-02 | 海光信息技术股份有限公司 | Power consumption model construction method, power consumption measurement method and device and electronic equipment |
CN113033801A (en) * | 2021-03-04 | 2021-06-25 | 北京百度网讯科技有限公司 | Pre-training method and device of neural network model, electronic equipment and medium |
CN113393034A (en) * | 2021-06-16 | 2021-09-14 | 国网山东省电力公司泰安供电公司 | Electric quantity prediction method of online self-adaptive OSELM-GARCH model |
Non-Patent Citations (1)
Title |
---|
LI QIUSHUO et al.: "Application of BP Neural Network in Classification of Electricity Users", Modern Electronics Technique * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20221014 |