WO2022267960A1 - Federated attention DBN collaborative detection system based on client selection - Google Patents

Federated attention DBN collaborative detection system based on client selection

Info

Publication number
WO2022267960A1
WO2022267960A1 (PCT application PCT/CN2022/098981)
Authority
WO
WIPO (PCT)
Prior art keywords
dbn
training
concentrator
data
federated
Prior art date
Application number
PCT/CN2022/098981
Other languages
English (en)
Chinese (zh)
Inventor
夏卓群
陈亚玲
尹波
廖曙光
邢利
文琴
Original Assignee
长沙理工大学
长沙麦融高科股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 长沙理工大学, 长沙麦融高科股份有限公司 filed Critical 长沙理工大学
Publication of WO2022267960A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218 Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245 Protecting personal data, e.g. for financial or medical purposes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y04 INFORMATION OR COMMUNICATION TECHNOLOGIES HAVING AN IMPACT ON OTHER TECHNOLOGY AREAS
    • Y04S SYSTEMS INTEGRATING TECHNOLOGIES RELATED TO POWER NETWORK OPERATION, COMMUNICATION OR INFORMATION TECHNOLOGIES FOR IMPROVING THE ELECTRICAL POWER GENERATION, TRANSMISSION, DISTRIBUTION, MANAGEMENT OR USAGE, i.e. SMART GRIDS
    • Y04S10/00 Systems supporting electrical power generation, transmission or distribution
    • Y04S10/50 Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications

Definitions

  • the invention relates to the technical field of data processing, in particular to a federated attention DBN collaborative detection system based on client selection.
  • the information security of the smart grid is becoming more and more important.
  • the Advanced Metering Infrastructure (AMI) is an important part of the smart grid.
  • securing the advanced metering system is therefore an urgent problem for the security of the smart grid. Since AMI is a key system of the smart grid, it is vulnerable to network attacks, and the data collected by smart meters is sensitive, so a breach may cause privacy leakage.
  • detection models in the related art typically suffer from technical problems such as low learning efficiency, low model training accuracy, and privacy leakage.
  • the present invention aims to solve at least the technical problems existing in the prior art.
  • the present invention proposes a federated attention DBN collaborative detection system based on client selection, which can effectively improve the efficiency of federated learning, reduce the number of concentrators that need to be trained, reduce the communication and computing overhead between the concentrators and the data center, improve the accuracy of model training, and enhance the security of data privacy.
  • the invention also proposes a DBN detection method with an attention mechanism.
  • the present invention also proposes a federated attention DBN cooperative detection device based on client selection and having the DBN detection method of the above-mentioned attention mechanism.
  • the invention also proposes a computer-readable storage medium.
  • this embodiment provides a federated attention DBN collaborative detection system based on client selection, including:
  • a plurality of concentrators, each of which is communicatively connected with a plurality of the smart meters under its jurisdiction, and is used to obtain the power data from the corresponding smart meters and to train on that power data with a DBN training model having an attention mechanism to obtain training parameters;
  • a data center, which is communicatively connected with all the concentrators, allocates the DBN training model to the concentrators according to the resources of the concentrators, and is used to obtain the training parameters from each of the concentrators, perform federated averaging aggregation on the training parameters, and send the aggregation result to the concentrators selected in the next round, so that the concentrators train the DBN training model to convergence.
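  • the aggregation rule is not written out at this point; as a minimal sketch, assuming the standard federated averaging (FedAvg) weighting that the phrase "federated average aggregation" suggests, each selected concentrator's parameters would be weighted by its local sample count:

```latex
% assumed FedAvg-style rule, not quoted from the patent:
% S_t : set of concentrators selected in round t
% n_k : number of local samples on concentrator k
% w_k^{t+1} : parameters uploaded by concentrator k after local training
w^{t+1} \;=\; \sum_{k \in S_t} \frac{n_k}{\sum_{j \in S_t} n_j}\, w_k^{t+1}
```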
  • the federated attention DBN collaborative detection system based on client selection includes multiple smart meters, multiple concentrators, and a data center.
  • a large amount of power data is collected by the smart meters and then stored on the concentrators. Because the Advanced Metering Infrastructure (AMI) of the smart grid is frequently subject to network attacks and the power data carries high privacy and security requirements, the DBN training model with an attention mechanism is used on each concentrator to train on the power data and obtain training parameters; the concentrators do not communicate with one another directly, which better protects the security of the power data.
  • the data center is connected to the concentrators, obtains the training parameters, and performs federated averaging aggregation on them.
  • the concentrator resources include communication quality and idle CPU and GPU capacity; the data center sends the aggregation result to the concentrators selected in the next round, so that those concentrators continue training on the power data until the DBN training model converges.
  • the federated attention DBN collaborative detection system based on client selection provided by this embodiment can greatly shorten the training time and achieve accuracy close to that of the centralized method; it also outperforms a distributed detection model trained on each device without aggregation, while protecting data privacy well.
  • the concentrator includes a data collection module, a data processing module, and an attack detection module; the data collection module is communicatively connected to the data processing module, and the data collection module is communicatively connected to the attack detection module.
  • this embodiment provides a DBN detection method with an attention mechanism, which is applied to a data center and includes the following steps:
  • the training parameters are obtained by the concentrator using the DBN training model with attention mechanism to train the power data;
  • the power data is trained to obtain the training parameters, and the data center then performs federated averaging aggregation on the training parameters and sends the aggregation result to the concentrators selected in the next round, so that those concentrators continue training on the power data until the DBN training model converges.
  • the DBN detection method with the attention mechanism provided by this embodiment can effectively shorten the training time and achieves accuracy close to that of the centralized method; it also outperforms a distributed detection model trained on each device without aggregation, while protecting data privacy well.
  • the input matrix of the attention mechanism includes a key matrix, a value matrix and a query matrix
  • the output matrix of the attention mechanism includes a context matrix
  • this embodiment provides a DBN detection method with an attention mechanism, which is applied to a concentrator and includes the following steps:
  • the federated averaging aggregation result from the data center is received, and, according to that result, training on the power data continues until the DBN training model converges.
  • the concentrator receives the DBN training model with the attention mechanism from the data center, obtains the power data collected by the smart meters and keeps it locally in the concentrator, and uses the DBN training model to train on the power data to obtain training parameters; the training parameters are sent to the data center, the federated averaging aggregation result is received from the data center, and, according to that result, training on the power data continues in the concentrator until the DBN training model converges.
  • the DBN detection method with the attention mechanism provided by this embodiment can effectively shorten the training time and achieves accuracy close to that of the centralized method; it also outperforms a distributed detection model trained on each device without aggregation, while protecting data privacy well.
  • said using said DBN training model to train said power data to obtain training parameters comprises the following steps:
  • Step S1: inputting the power data into the first-layer RBM for pre-training to obtain a training result;
  • Step S2: inputting the training result into the second-layer RBM for training;
  • Step S3: repeating step S1 and step S2 until the maximum number of iterations is reached;
  • Step S4: using the softmax layer to perform backpropagation and fine-tune the weights of the entire DBN network.
  • the RBM includes a visible layer and a hidden layer, and further includes a step of: performing layer-by-layer training on the RBM, and calculating activation probabilities of the visible layer and the hidden layer using an activation function.
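  • the activation function itself is not reproduced at this point; for a standard binary RBM with weight matrix W, visible biases a and hidden biases b, the commonly used sigmoid activation probabilities (assumed here for reference, not quoted from the patent) are:

```latex
P(h_j = 1 \mid v) = \sigma\Big(b_j + \sum_i v_i\, w_{ij}\Big), \qquad
P(v_i = 1 \mid h) = \sigma\Big(a_i + \sum_j h_j\, w_{ij}\Big), \qquad
\sigma(x) = \frac{1}{1 + e^{-x}}
```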
  • this embodiment provides a federated attention DBN cooperative detection device based on client selection, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the DBN detection method with the attention mechanism described in the second aspect and the third aspect is implemented.
  • this embodiment provides a computer-readable storage medium, the computer-readable storage medium stores computer-executable instructions, and the computer-executable instructions are used to make a computer execute the DBN detection method with the attention mechanism described in the second aspect and the third aspect.
  • Fig. 1 is a flowchart of the DBN detection method with an attention mechanism provided by an embodiment of the present invention;
  • Fig. 2 is a flowchart of the DBN detection method with an attention mechanism provided by another embodiment of the present invention;
  • Fig. 3 is a diagram of the attention-mechanism-based DBN model used in the DBN detection method provided by another embodiment of the present invention.
  • AMI: Advanced Metering Infrastructure.
  • Deep learning is widely used to solve the security problems of smart grids due to its powerful feature extraction capabilities.
  • however, such detection models are usually built from centrally collected training data, which may leak data privacy.
  • federated learning has therefore been proposed: users train their own models locally, the users' training parameters are integrated under the premise of protecting user privacy, and the local user models are then updated with the parameters returned by the cloud model.
  • the performance of the detection model is related to the amount of data. The larger the amount of data, the better the performance of the trained model; the privacy leakage of AMI will have a serious impact.
  • Most traditional machine-learning detection algorithms for AMI rely on gathering local data for training. During this process, a large amount of data is transmitted to the data center, and the private content contained in each party's data is exposed to a greater or lesser extent.
  • the invention proposes a federated learning framework based on concentrator selection to improve federated learning efficiency, and then deploys a DBN algorithm based on an attention mechanism to train a detection model, focusing on more important features and improving the detection accuracy of the detection model.
  • the original data is neither exchanged nor transmitted, yet the data of all parties is integrated, which improves the performance of the detection model compared with a single partial data set and reduces the risk of user data privacy leakage.
  • Federated learning is an emerging foundational technology of artificial intelligence. It was first proposed by Google in 2016 and was originally used to update models locally for end users of Android phones. Its design goal is to carry out efficient machine learning among multiple parties or computing nodes while exchanging big data, under the premise of information security, protection of terminal data and personal privacy, and compliance with laws and regulations; federated learning can thus improve the privacy protection of local data.
  • A Deep Belief Network is a probabilistic generative model formed by stacking multiple Restricted Boltzmann Machines (RBMs). As the number of RBM hidden-layer nodes increases, it can, under certain conditions, fit any data distribution.
  • an RBM consists of a visible layer and a hidden layer; units within the same layer are not connected to each other, while units in different layers are connected to each other. Except for the first layer and the last layer, each layer of the DBN plays two roles: it serves as the hidden layer of the previous RBM and as the input (visible layer) of the next RBM.
  • DBN training includes two steps, pre-training and weight fine-tuning.
  • the original data is input into the RBM of the first layer for training, and the training result is used as the input of the next layer of RBM for training, and the training is repeated until the maximum number of iterations is reached.
  • the BP algorithm is then used for backpropagation to fine-tune the DBN network and avoid falling into a local optimum.
  • the invention provides a federated attention DBN collaborative detection system and method based on client selection, which can greatly shorten the training time and achieve accuracy close to that of the centralized method; the system also outperforms a distributed detection model trained on each device without aggregation, while protecting data privacy well.
  • the federated attention DBN collaborative detection system based on client selection includes multiple smart meters, concentrators, and a data center. A large amount of power data is collected by the smart meters and then stored on the concentrators. Because the Advanced Metering Infrastructure (AMI) of the smart grid is frequently subject to network attacks and the power data carries high privacy and security requirements, the DBN training model with an attention mechanism is used on the concentrators to train on the power data and obtain the training parameters, and the security of the power data is better protected because the concentrators do not communicate with one another directly.
  • the data center is connected to the concentrators to obtain the training parameters, performs federated averaging aggregation on the training parameters, allocates the DBN training model to the concentrators according to the resources of the concentrators, and sends the aggregation result to the concentrators selected in the next round, so that those concentrators continue training on the power data until the DBN training model converges.
  • the federated attention DBN collaborative detection system based on client selection provided by this embodiment can greatly shorten the training time and achieve accuracy close to that of the centralized method; it also outperforms a distributed detection model trained on each device without aggregation, while protecting data privacy well.
  • the concentrator includes a data collection module, a data processing module, and an attack detection module, the data collection module is connected in communication with the data processing module, and the data collection module is connected in communication with the attack detection module.
  • FIG. 1 is a flowchart of a DBN detection method of an attention mechanism provided by an embodiment of the present invention.
  • the DBN detection method of an attention mechanism includes but is not limited to steps S110 to S140.
  • Step S110: initializing the DBN training model with the attention mechanism;
  • Step S120: selecting concentrators to participate in training according to the concentrator resources, and sending the DBN training model to the selected concentrators;
  • Step S130: receiving the training parameters obtained from concentrator training, where the training parameters are obtained by each concentrator using the DBN training model with the attention mechanism to train on its power data;
  • Step S140: performing federated averaging aggregation on the training parameters, and sending the aggregation result to the concentrators selected in the next round, so that those concentrators continue training on the power data until the DBN training model converges (a sketch of this data-center loop follows the steps).
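  • a minimal Python sketch of steps S110 to S140 is given below, assuming a FedAvg-style aggregation; the class and function names (ToyConcentrator, select_concentrators, federated_average, run_data_center) and all shapes and hyperparameters are illustrative assumptions, not taken from the patent:

```python
from typing import Dict, List

import numpy as np


def federated_average(updates: List[Dict[str, np.ndarray]],
                      sizes: List[int]) -> Dict[str, np.ndarray]:
    """Weight each concentrator's parameters by its local sample count (FedAvg-style)."""
    total = float(sum(sizes))
    return {name: sum(u[name] * (n / total) for u, n in zip(updates, sizes))
            for name in updates[0]}


def select_concentrators(resources: Dict[str, float], m: int) -> List[str]:
    """Step S120: pick the m concentrators with the best resource scores
    (communication quality, idle CPU/GPU)."""
    return sorted(resources, key=resources.get, reverse=True)[:m]


class ToyConcentrator:
    """Stand-in for a real concentrator: pretends to train the DBN locally and
    returns slightly perturbed parameters plus its local sample count."""

    def __init__(self, n_samples: int, seed: int):
        self.n_samples = n_samples
        self.rng = np.random.default_rng(seed)

    def local_train(self, global_params):
        return ({k: v + 0.01 * self.rng.standard_normal(v.shape)
                 for k, v in global_params.items()},
                self.n_samples)


def run_data_center(concentrators, resources, rounds=3, m=2):
    # Step S110: initialize the global model parameters (shapes are illustrative).
    global_params = {"W_in": np.zeros((41, 64)), "b_in": np.zeros(64)}
    for _ in range(rounds):
        selected = select_concentrators(resources, m)            # Step S120
        updates, sizes = [], []
        for cid in selected:                                     # Step S130
            params, n = concentrators[cid].local_train(global_params)
            updates.append(params)
            sizes.append(n)
        global_params = federated_average(updates, sizes)        # Step S140
    return global_params


if __name__ == "__main__":
    concs = {f"c{i}": ToyConcentrator(n_samples=100 * (i + 1), seed=i) for i in range(4)}
    scores = {"c0": 0.9, "c1": 0.4, "c2": 0.7, "c3": 0.8}        # hypothetical resource scores
    print({k: v.shape for k, v in run_data_center(concs, scores).items()})
```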
  • the DBN detection method with the attention mechanism provided by this embodiment can effectively shorten the training time and achieves accuracy close to that of the centralized method; it also outperforms a distributed detection model trained on each device without aggregation, while protecting data privacy well.
  • the attention model is widely used in various deep learning tasks such as natural language processing, image recognition, and speech recognition, and it is one of the core technologies in deep learning that deserves attention and in-depth understanding. Its goal is to select the information relevant to the current task from cluttered information and to reduce the influence of noise on the results.
  • the attention mechanism in deep learning is essentially similar to the selective visual attention mechanism of human beings; its core goal is to select, from a large amount of information, the information that is most critical to the current task.
  • the input matrix of the attention mechanism includes a key matrix, a value matrix and a query matrix
  • the output matrix of the attention mechanism includes a context matrix
  • the attention module has three input matrices: the key matrix K^{T×h}, the value matrix V^{T×h} and the query matrix Q^{T×h}; the output is the context matrix C^{T×h}, which is calculated as follows:
  • FIG. 2 is a flow chart of a DBN detection method with an attention mechanism provided by another embodiment of the present invention.
  • the DBN detection method with an attention mechanism includes but is not limited to steps S210 to S240.
  • Step S210: receiving the DBN training model with the attention mechanism from the data center, and obtaining the power data collected by the smart meters;
  • Step S220: using the DBN training model to train on the power data to obtain training parameters;
  • Step S230: sending the training parameters to the data center, so that the data center performs federated averaging aggregation on the training parameters;
  • Step S240: receiving the federated averaging aggregation result from the data center, and continuing to train on the power data according to that result until the DBN training model converges (a concentrator-side sketch is given below).
  • the concentrator receives a DBN training model with an attention mechanism from the data center, obtains the power data collected by the smart meter and keeps it in the concentrator, and uses the DBN training model to train the power data Obtain training parameters, send the training parameters to the data center, receive the result of federated average aggregation from the data center, and continue training on the power data in the concentrator according to the result of the federated average aggregation until the The DBN training model converges.
  • the DBN detection method with the attention mechanism provided by this embodiment can effectively shorten the training time and achieves accuracy close to that of the centralized method; it also outperforms a distributed detection model trained on each device without aggregation, while protecting data privacy well.
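  • the concentrator-side counterpart of steps S210 to S240 can be sketched as follows; a toy softmax classifier stands in for the attention DBN, and all names, shapes, and hyperparameters are illustrative assumptions rather than the patent's implementation:

```python
import numpy as np


def concentrator_round(global_params, power_data, labels, lr=0.01, epochs=1):
    """Sketch of steps S210-S240 from the concentrator's side. The raw power data
    never leaves the concentrator; only the trained parameters are returned."""
    # S210: receive the current global model from the data center.
    W, b = global_params["W"].copy(), global_params["b"].copy()
    for _ in range(epochs):
        # S220: local training on the power data kept in the concentrator
        # (a cross-entropy gradient step on a toy softmax model stands in for DBN training).
        logits = power_data @ W + b
        probs = np.exp(logits - logits.max(axis=1, keepdims=True))
        probs /= probs.sum(axis=1, keepdims=True)
        grad = probs
        grad[np.arange(len(labels)), labels] -= 1.0
        W -= lr * power_data.T @ grad / len(power_data)
        b -= lr * grad.mean(axis=0)
    # S230: only the training parameters and the local sample count are uploaded.
    # S240: the aggregation result sent back by the data center becomes the next
    #       round's global_params, and this function is called again until convergence.
    return {"W": W, "b": b}, len(power_data)


# hypothetical usage with random power-data features and five attack categories
rng = np.random.default_rng(0)
x = rng.standard_normal((32, 41))
y = rng.integers(0, 5, size=32)
params = {"W": np.zeros((41, 5)), "b": np.zeros(5)}
new_params, n_local = concentrator_round(params, x, y)
print(n_local, new_params["W"].shape)
```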
  • using the DBN training model to train the electric power data to obtain training parameters includes the following steps:
  • Step S1: input the power data into the first-layer RBM for pre-training to obtain a training result;
  • Step S2: input the training result into the second-layer RBM for training;
  • Step S3: repeat step S1 and step S2 until the maximum number of iterations is reached;
  • Step S4: use the softmax layer to perform backpropagation and fine-tune the weights of the entire DBN network.
  • the RBM includes a visible layer and a hidden layer, and further includes a step of: performing layer-by-layer training on the RBM, and calculating activation probabilities of the visible layer and the hidden layer using an activation function.
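  • the layer-by-layer training and activation-probability computation just described can be sketched with one-step contrastive divergence (CD-1); this is a minimal NumPy illustration under standard RBM assumptions, not code from the patent, and every function name, shape, and hyperparameter is illustrative:

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def train_rbm(data, n_hidden, epochs=10, lr=0.05, seed=0):
    """Train one RBM with CD-1 and return its parameters plus the hidden activation
    probabilities, which become the input of the next RBM layer (steps S1 and S2)."""
    rng = np.random.default_rng(seed)
    n_visible = data.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    a = np.zeros(n_visible)                        # visible-layer bias
    b = np.zeros(n_hidden)                         # hidden-layer bias
    for _ in range(epochs):                        # step S3: repeat up to the iteration limit
        # positive phase: activation probability of the hidden layer given the data
        ph = sigmoid(data @ W + b)
        h = (rng.random(ph.shape) < ph).astype(float)
        # negative phase: reconstruct the visible layer, then the hidden layer again
        pv = sigmoid(h @ W.T + a)
        ph2 = sigmoid(pv @ W + b)
        # CD-1 parameter updates
        W += lr * (data.T @ ph - pv.T @ ph2) / len(data)
        a += lr * (data - pv).mean(axis=0)
        b += lr * (ph - ph2).mean(axis=0)
    return W, a, b, sigmoid(data @ W + b)


def pretrain_dbn(data, layer_sizes=(64, 32)):
    """Greedy layer-by-layer pre-training: the hidden state of each RBM becomes the
    input (visible layer) of the next one."""
    layers, x = [], data
    for i, n_hidden in enumerate(layer_sizes):
        W, a, b, x = train_rbm(x, n_hidden, seed=i)
        layers.append((W, a, b))
    # the final hidden state would feed the softmax layer for BP fine-tuning (step S4)
    return layers, x


# hypothetical usage on random binary data with 41 features
demo = (np.random.default_rng(1).random((128, 41)) > 0.5).astype(float)
stack, hidden = pretrain_dbn(demo)
print(hidden.shape)
```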
  • FIG. 3 is a DBN model diagram based on the attention mechanism of the DBN detection method of the attention mechanism provided by another embodiment of the present invention.
  • using a larger proportion of concentrators can improve the performance of the model and enhance its detection accuracy. A purely greedy scheme can minimize the objective function of each round, but concentrators that score poorly on security risk, computing power, or communication quality then have fewer chances to be selected for training, which means that the contribution of their local data to the global model is small. As the concentrator selection becomes biased, the generalization ability of the global model decreases, so fairness is also a factor to be considered in concentrator selection. The goal is to select as many concentrators as possible for model training while comprehensively considering security risk, computing power, communication quality, and fairness.
  • Cyber attack risk refers to the possibility of cyber attacks and the consequences of cyber attacks.
  • the formula is as follows:
  • P refers to the probability of a successful cyber attack
  • C refers to the consequences of a cyber attack.
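  • the formula itself is not reproduced at this point in the text; the classical risk definition that the two quantities above suggest (an assumption for illustration, not a quotation from the patent) is the product of attack probability and attack consequence:

```latex
R = P \times C
```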
  • the concentrator is responsible for model training and attack detection, so attacks can occur along the transmission path from data generation to the concentrator.
  • the smart meter is vulnerable to network attacks, and the data transmission from the smart meter to the concentrator is also vulnerable to attacks, so both the attack risk of the device and the attack risk of the communication link need to be considered.
  • the attacker selects the attack target at random. The probability of successfully attacking a device is related to its degree of defense: the weaker the defense, the easier it is for the attacker to compromise it. Assuming that concentrator k is in charge of M_k smart meters and that all smart meters under one concentrator have the same degree of defense, let the defense resources of these smart meters (referring to protective measures such as firewalls, personnel security, encryption, etc.) be given; the defense effect of the smart meters is then expressed as follows:
  • the attack probability of the smart meter on the concentrator k is:
  • the attack probability of the communication link on concentrator k is:
  • the weight of the concentrator k participating in the training for t rounds is as follows:
  • represents the expected rate of selecting a concentrator.
  • the performance of the model generated in this way is better than that obtained by using the same number of concentrators in every round or by decreasing the number of concentrators in each round.
  • the purpose is to make the average weight of the concentrators in each round as large as possible, under the constraints that the long-term average selection rate of each concentrator is greater than a given threshold and that the number of concentrators selected in each round is at least m.
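  • the patent does not print the selection algorithm at this point; the following greedy sketch only illustrates the stated constraints (at least m concentrators per round and a long-term selection rate above a threshold, here called beta), and every name, score, and rule in it is a hypothetical stand-in:

```python
from typing import Dict, List


def select_round(weights: Dict[str, float],
                 selected_counts: Dict[str, int],
                 round_idx: int,
                 m: int,
                 beta: float) -> List[str]:
    """Greedy sketch of one round of concentrator selection.

    `weights` are per-concentrator scores that already combine security risk,
    computing power, and communication quality (the patent's exact weighting
    formula is not reproduced here). Fairness is enforced by first taking any
    concentrator whose long-term selection rate would otherwise fall below
    `beta`, then filling up to at least `m` picks by descending weight."""
    # fairness pass: concentrators that are falling behind the target rate beta
    must_pick = [c for c, n in selected_counts.items()
                 if round_idx > 0 and n / round_idx < beta]
    chosen = set(must_pick)
    # greedy pass: highest combined weight first, until at least m are selected
    for c in sorted(weights, key=weights.get, reverse=True):
        if len(chosen) >= max(m, len(must_pick)):
            break
        chosen.add(c)
    for c in chosen:
        selected_counts[c] += 1
    return sorted(chosen)


# hypothetical usage: four concentrators, at least two per round, target rate 0.5
counts = {c: 0 for c in ("c0", "c1", "c2", "c3")}
w = {"c0": 0.9, "c1": 0.2, "c2": 0.6, "c3": 0.7}
for t in range(4):
    print(t, select_round(w, counts, t, m=2, beta=0.5))
```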
  • this embodiment uses a DBN neural network together with a dot-product attention module and a DBN module, and does not encode the position of the original data. These modules are the same as the Transformer modules, but they are combined differently.
  • the input data is converted by input encoding, then fed into the SDA and DBN modules, and finally output; each output of the model is the predicted value of one time slot.
  • the DBN model of the attention mechanism first encodes the input data.
  • each row of the input data is the feature vector of one time point, and the input encoding applies a linear transformation to the original data:
  • W_in is the linear transformation matrix
  • b_in is the bias
  • W_in is randomly initialized and updated together with the other parameters during training.
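  • the transformation itself is not printed in this text; writing X for the raw input matrix (one row per time point), the linear transform described by the definitions above would read (an assumed reconstruction, not a quotation):

```latex
X_{\mathrm{enc}} = X\,W_{\mathrm{in}} + b_{\mathrm{in}}
```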
  • this embodiment does not perform position encoding on the input data; position encoding is mentioned only because, for machine translation, the context of a sentence has an important impact on the translation.
  • the attributes of the NSL data are not so closely interrelated, and the order of the attributes has little effect on the category of a record, so position encoding of the input data is not performed.
  • the data after input encoding is fed into the RBMs of the DBN; here the RBMs of the DBN are used only to train on the input data and output the hidden state.
  • a traditional RBM can only accept binary input, which easily causes data loss.
  • the input data in this embodiment includes continuous data, so in order to process these continuously distributed real-valued data, the binary input nodes of the traditional RBM are extended to real-valued variable nodes for continuous input, while the hidden nodes remain binary.
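  • the corresponding energy function is not reproduced here; in a common Gaussian-Bernoulli RBM formulation with unit-variance real-valued visible units (assumed for illustration, not quoted from the patent), only the visible-unit conditional changes, while the hidden units keep the sigmoid activation probability:

```latex
p(v_i \mid h) = \mathcal{N}\!\Big(v_i \;\Big|\; a_i + \sum_j w_{ij}\, h_j,\; 1\Big)
```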
  • the attention module has three input matrices: the key matrix K^{T×h}, the value matrix V^{T×h} and the query matrix Q^{T×h}; the output is the context matrix C^{T×h}, which is calculated as follows:
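  • the computation is omitted from this text; the standard scaled dot-product attention formula, which the dot-product attention (SDA) module described above suggests, is assumed here for reference:

```latex
C = \mathrm{softmax}\!\left(\frac{Q K^{\mathsf{T}}}{\sqrt{h}}\right) V
```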
  • the data output by the DBN module passes through a fully connected layer and softmax activation function, and this layer outputs the classification result.
  • the output layer outputs five probability values, and the category corresponding to the maximum probability value is the predicted category.
  • the loss function used here is the mean square error function.
  • the global model uses the Adam optimizer to optimize the network structure.
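  • the output head described in the last few bullets (a fully connected layer with a softmax over five categories, a mean square error loss, and the Adam optimizer) can be sketched as follows; the use of PyTorch and all layer sizes are assumptions for illustration, not the patent's implementation:

```python
import torch
import torch.nn as nn

hidden_dim, n_classes = 64, 5            # illustrative sizes; the patent does not fix them here

head = nn.Sequential(                    # fully connected layer + softmax output
    nn.Linear(hidden_dim, n_classes),
    nn.Softmax(dim=-1),
)
criterion = nn.MSELoss()                 # mean square error loss, as stated above
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)   # Adam optimizer

# toy batch: DBN-module outputs and one-hot targets for the five categories
features = torch.randn(8, hidden_dim)
targets = nn.functional.one_hot(torch.randint(0, n_classes, (8,)), n_classes).float()

probs = head(features)                   # five probability values per sample
loss = criterion(probs, targets)
loss.backward()
optimizer.step()
pred = probs.argmax(dim=-1)              # the category with the maximum probability
```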
  • the present invention provides a federated attention DBN cooperative detection device based on client selection, comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the DBN detection method with the above-mentioned attention mechanism is implemented.
  • an embodiment of the present invention also provides a computer-readable storage medium, the computer-readable storage medium stores computer-executable instructions, and the computer-executable instructions are executed by one or more control processors; for example, a control processor can execute method steps S110 to S140 in FIG. 1 and method steps S210 to S240 in FIG. 2.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cartridges, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer.
  • communication media typically embodies computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery media.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Software Systems (AREA)
  • Bioethics (AREA)
  • Computer Security & Cryptography (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Computer Hardware Design (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Telephonic Communication Services (AREA)

Abstract

Disclosed in the present invention is a federated attention DBN collaborative detection system based on client selections. The system comprises: a plurality of smart electricity meters, which are used to collect power data; a plurality of concentrators, each of which is in communication connection with a plurality of smart meters under its control, and is used to acquire the power data from the corresponding smart meters and to train on the power data using a DBN training model having an attention mechanism, so as to obtain training parameters; and a data center, which is in communication connection with all of the concentrators, and is used to acquire the training parameters from each concentrator, perform federated average aggregation on the training parameters, allocate the DBN training model to the concentrators according to the resources of the concentrators, and send a federated average aggregation result to the concentrators of the next round, such that the concentrators train the DBN training model to convergence. In this way, federated learning efficiency can be effectively improved, model training accuracy can be improved, and data privacy security can be enhanced.
PCT/CN2022/098981 2021-06-24 2022-06-15 Federated attention DBN collaborative detection system based on client selections WO2022267960A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110703207.5A CN113392919B (zh) 2021-06-24 2021-06-24 一种注意力机制的深度信念网络dbn检测方法
CN202110703207.5 2021-06-24

Publications (1)

Publication Number Publication Date
WO2022267960A1 (fr)

Family

ID=77623687

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/098981 WO2022267960A1 (fr) 2021-06-24 2022-06-15 Federated attention DBN collaborative detection system based on client selections

Country Status (2)

Country Link
CN (1) CN113392919B (fr)
WO (1) WO2022267960A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116074844A (zh) * 2023-04-06 2023-05-05 广东电力交易中心有限责任公司 一种基于全流量自适应检测的5g切片逃逸攻击检测方法
CN116561696A (zh) * 2023-01-11 2023-08-08 上海合煌能源科技有限公司 基于多维度的用户可调节负荷快速聚合方法及其系统
CN116977272A (zh) * 2023-05-05 2023-10-31 深圳市第二人民医院(深圳市转化医学研究院) 一种基于联邦图注意力学习的结构磁共振图像处理方法

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113392919B (zh) * 2021-06-24 2023-04-28 长沙理工大学 一种注意力机制的深度信念网络dbn检测方法
CN115208604B (zh) * 2022-02-22 2024-03-15 长沙理工大学 一种ami网络入侵检测的方法、装置及介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109192199A (zh) * 2018-06-30 2019-01-11 中国人民解放军战略支援部队信息工程大学 一种结合瓶颈特征声学模型的数据处理方法
CN110211574A (zh) * 2019-06-03 2019-09-06 哈尔滨工业大学 基于瓶颈特征和多尺度多头注意力机制的语音识别模型建立方法
CN112800461A (zh) * 2021-01-28 2021-05-14 深圳供电局有限公司 一种基于联邦学习框架的电力计量系统网络入侵检测方法
CN113392919A (zh) * 2021-06-24 2021-09-14 长沙理工大学 基于客户端选择的联邦注意力dbn协同检测系统

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3032852A1 (fr) * 2015-02-13 2016-08-19 Orange Procede de selection de concentrateurs de connexions reseau
CN106295323A (zh) * 2016-07-27 2017-01-04 苏盛 基于云安全的高级计量体系恶意软件检测方法
US11853891B2 (en) * 2019-03-11 2023-12-26 Sharecare AI, Inc. System and method with federated learning model for medical research applications
CN111537945B (zh) * 2020-06-28 2021-05-11 南方电网科学研究院有限责任公司 基于联邦学习的智能电表故障诊断方法及设备
CN111723942B (zh) * 2020-06-29 2024-02-02 南方电网科学研究院有限责任公司 一种企业用电负荷预测方法、电网业务子系统及预测系统
CN112181666B (zh) * 2020-10-26 2023-09-01 华侨大学 一种基于边缘智能的设备评估和联邦学习重要性聚合方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109192199A (zh) * 2018-06-30 2019-01-11 中国人民解放军战略支援部队信息工程大学 一种结合瓶颈特征声学模型的数据处理方法
CN110211574A (zh) * 2019-06-03 2019-09-06 哈尔滨工业大学 基于瓶颈特征和多尺度多头注意力机制的语音识别模型建立方法
CN112800461A (zh) * 2021-01-28 2021-05-14 深圳供电局有限公司 一种基于联邦学习框架的电力计量系统网络入侵检测方法
CN113392919A (zh) * 2021-06-24 2021-09-14 长沙理工大学 基于客户端选择的联邦注意力dbn协同检测系统

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NI GAO, GAO LING;HE YIYUE;GAO QUANLI;REN JIE: "Intrusion detection model based on deep belief nets", JOURNAL OF SOUTHEAST UNIVERSITY (ENGLISH EDITION), SOUTHEAST UNIVERSITY; DONGNAN DAXUE (CHINESE ELECTRONIC PERIODICAL SERVICES), CHINA, vol. 31, no. 3, 15 September 2015 (2015-09-15), China , pages 339 - 346, XP093016816, ISSN: 1003-7985 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116561696A (zh) * 2023-01-11 2023-08-08 上海合煌能源科技有限公司 基于多维度的用户可调节负荷快速聚合方法及其系统
CN116561696B (zh) * 2023-01-11 2024-04-16 上海合煌能源科技有限公司 基于多维度的用户可调节负荷快速聚合方法及其系统
CN116074844A (zh) * 2023-04-06 2023-05-05 广东电力交易中心有限责任公司 一种基于全流量自适应检测的5g切片逃逸攻击检测方法
CN116977272A (zh) * 2023-05-05 2023-10-31 深圳市第二人民医院(深圳市转化医学研究院) 一种基于联邦图注意力学习的结构磁共振图像处理方法

Also Published As

Publication number Publication date
CN113392919B (zh) 2023-04-28
CN113392919A (zh) 2021-09-14

Similar Documents

Publication Publication Date Title
WO2022267960A1 (fr) Federated attention DBN collaborative detection system based on client selections
Li et al. Wind power forecasting considering data privacy protection: A federated deep reinforcement learning approach
Li et al. Privacy-preserving spatiotemporal scenario generation of renewable energies: A federated deep generative learning approach
Wang et al. Improving fairness in graph neural networks via mitigating sensitive attribute leakage
CN113408743B (zh) 联邦模型的生成方法、装置、电子设备和存储介质
WO2021128805A1 (fr) Procédé d'attribution de ressources de réseau sans fil utilisant un apprentissage par renforcement antagoniste génératif
CN112668044B (zh) 面向联邦学习的隐私保护方法及装置
US20210374617A1 (en) Methods and systems for horizontal federated learning using non-iid data
Liu et al. Keep your data locally: Federated-learning-based data privacy preservation in edge computing
Zhang et al. Energy theft detection in an edge data center using threshold-based abnormality detector
CN112926747B (zh) 优化业务模型的方法及装置
Zhou et al. Network traffic prediction method based on echo state network with adaptive reservoir
CN115879542A (zh) 一种面向非独立同分布异构数据的联邦学习方法
WO2022095246A1 (fr) Procédé de prise de décision coopérative de réseau intelligent de périphérie basé sur un mécanisme de confidentialité différentielle
Hao et al. Producing more with less: a GAN-based network attack detection approach for imbalanced data
CN113850399A (zh) 一种基于预测置信度序列的联邦学习成员推断方法
Qu et al. Personalized federated learning for heterogeneous residential load forecasting
CN112910865B (zh) 一种基于因子图的推断攻击阶段最大似然估计方法及系统
Cheng et al. GFL: Federated learning on non-IID data via privacy-preserving synthetic data
Sun et al. Communication-efficient vertical federated learning with limited overlapping samples
CN114003957A (zh) 一种基于联邦学习社交媒体用户隐私信息保护方法和系统
Xing et al. N-fedavg: Novel federated average algorithm based on fedavg
WO2024113947A1 (fr) Procédé et appareil d'apprentissage pour réseau neuronal à graphe tenant compte de la protection de la confidentialité et de l'équité
CN117078259A (zh) 基于图随机神经网络的跨链异常交易检测方法及系统
CN115208604B (zh) 一种ami网络入侵检测的方法、装置及介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22827455

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22827455

Country of ref document: EP

Kind code of ref document: A1