CN116205311A - Federated learning method based on Shapley values - Google Patents


Info

Publication number
CN116205311A
CN116205311A (application CN202310124072.6A)
Authority
CN
China
Prior art keywords: model parameters, client, clients, graph, round
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310124072.6A
Other languages
Chinese (zh)
Inventor
朱亚萍
赵生捷
Current Assignee
Tongji University
Original Assignee
Tongji University
Priority date
Filing date
Publication date
Application filed by Tongji University
Priority to CN202310124072.6A
Publication of CN116205311A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00: Machine learning
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/01: Protocols
    • H04L 67/10: Protocols in which an application is distributed across nodes in the network
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Medical Informatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a federated learning method based on Shapley values. The method accounts for the differences in data distribution across clients in federated learning: when the global model parameters are computed, the clients' local model parameters are weighted and aggregated according to each local model's contribution to the global training objective. After each training iteration, a weighted graph is constructed from the cosine similarities between the clients' local model parameters, and the Shapley value of each client vertex in the graph is calculated. The server sets a weight coefficient for each client's model parameters based on its Shapley value and aggregates the clients' model parameters accordingly to obtain the global model parameters for the next round, until the training objective is reached.

Description

A federated learning method based on Shapley values

Technical Field

The present invention belongs to the field of machine learning, and in particular relates to federated machine learning methods.

Background Art

With the rapid growth of smart terminals and interconnected IoT devices, processing massive amounts of data has become an essential capability in the era of digital transformation, and federated machine learning has emerged as a key technology for large-scale application scenarios. As a new form of distributed learning, federated learning alleviates the heavy computational burden a single server faces when processing large-scale data by delegating training to multiple clients, which share the cost. At the same time, by letting different clients build a model jointly without sharing their raw data, it protects privacy and keeps data secure.

In federated learning, multiple clients train local models simultaneously, and the local models from the different clients are aggregated into a global model. However, because the data in federated learning is usually distributed unevenly across clients and is generally not independent and identically distributed (non-IID), data heterogeneity arises. If the clients' local models are aggregated without distinction, this heterogeneity degrades the overall training result. Reasonable and effective methods are therefore needed to cope with data heterogeneity in federated learning and improve the overall training quality.

Summary of the Invention

Technical problem: in federated learning, the data held by the participating clients is generally non-IID, and the models trained by different clients in different rounds often affect the overall training objective differently. When aggregating the local parameter models from different clients, they therefore need to be distinguished, taking full account of the differences between the clients' local models, so as to mitigate the adverse effect of data heterogeneity on the overall federated learning result.

Technical solution: to solve the above problem, the present invention provides a federated learning method based on Shapley values, characterized in that, when aggregating the clients' local model parameters in federated learning, the method sets a weight coefficient for each client based on its Shapley value and then performs a weighted aggregation of the local model parameters according to these coefficients to obtain the global model parameters.

Furthermore, at the end of each round of iterative training, a weighted graph is constructed: the vertices of the graph are all clients participating in that round, every pair of clients is connected by an edge, and the weight of an edge is the cosine similarity between the local model parameters of the two clients it connects.

The Shapley value of each client is computed from the constructed graph. For a vertex (i.e., client) i in the graph, its Shapley value, denoted $\varphi_i^t$, is computed as

$$\varphi_i^t = \sum_{s \in S_i} \frac{(|s|-1)!\,(n-|s|)!}{n!} \sum_{j \in s,\, j \neq i} e_{ij}$$

where $S_i$ is the set of all subsets of vertices containing vertex i, $|s|$ is the number of elements in subset s, n is the total number of vertices in the graph, j is any vertex of s other than i, and $e_{ij}$ is the weight of the edge connecting vertices i and j.

The global model parameters after each round are the weighted sum of the local model parameters of all clients participating in that round, where the weight coefficient of client i's local model parameters is the normalized value of a function of $\varphi_i^t$:

$$w^t = \sum_{i \in L_t} \frac{f(\varphi_i^t)}{\sum_{j \in L_t} f(\varphi_j^t)}\, w_i^t$$

where $w^t$ denotes the global model parameters after round t of training, $L_t$ is the set of all clients participating in round t, $f(\varphi_i^t)$ is a function of $\varphi_i^t$, and $w_i^t$ denotes the local model parameters obtained by client i in round t.

Beneficial effects: the present invention takes full account of the data differences that may exist between clients in federated learning. When aggregating the local model parameters, the contribution of each client's local model to the overall training objective in each iteration is computed via its Shapley value, and different weighting coefficients are set according to these contributions, reducing the adverse effect of data heterogeneity on federated learning.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a flow chart of the Shapley-value-based federated learning method proposed by the present invention.

DETAILED DESCRIPTION

A federated learning method based on Shapley values, characterized in that, when aggregating the clients' local model parameters in federated learning, the method sets a weight coefficient for each client based on its Shapley value and then performs a weighted aggregation of the local model parameters according to these coefficients to obtain the global model parameters.

The design of the present invention is described in further detail below with reference to FIG. 1 and the related formulas.

At the start of federated learning, the central server randomly selects the clients that will participate in the next round of iterative training and sends an initial set of global model parameters to all selected clients. Each client then trains locally on top of the global model parameters to obtain a new round of local model parameters.

Assume that in round t of federated learning, n clients participate in training, denoted as the set $L_t$, and that client i ($i \in L_t$) obtains the local model parameters $w_i^t$ in this round. After the round ends, each client uploads its trained local model parameters to the server, and the server constructs a graph from them: the vertices are the clients participating in this round, every pair of clients is connected by an edge, and the weight of an edge is the cosine similarity between the local model parameters of the two clients it connects. Specifically, the weight $e_{ij}$ of the edge between vertices i and j is

$$e_{ij} = \frac{w_i^t \cdot w_j^t}{\|w_i^t\|\,\|w_j^t\|}$$

where the numerator is the dot product of the vectors $w_i^t$ and $w_j^t$, and $\|w_i^t\|$ and $\|w_j^t\|$ in the denominator are their norms.

The server computes the Shapley value of every vertex in the graph constructed above. Specifically, for a vertex (i.e., client) i in the graph, its Shapley value, denoted $\varphi_i^t$, is computed as

$$\varphi_i^t = \sum_{s \in S_i} \frac{(|s|-1)!\,(n-|s|)!}{n!} \sum_{j \in s,\, j \neq i} e_{ij}$$

where $S_i$ is the set of all subsets of vertices containing vertex i, $|s|$ is the number of elements in subset s, n is the total number of vertices in the graph, j is any vertex of s other than i, and $e_{ij}$ is the weight of the edge connecting vertices i and j.
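The formula above can be evaluated exactly by enumerating all coalitions that contain i; the sketch below (function names are ours) does so and is practical only for small n, since the number of subsets grows as 2^n:

```python
from itertools import combinations
from math import factorial

def shapley_values(client_ids, edges):
    """Exact Shapley value of each vertex for the graph game above:
    the marginal contribution of vertex i to a coalition s is the sum
    of edge weights from i to the other members of s."""
    n = len(client_ids)
    phi = {i: 0.0 for i in client_ids}
    for i in client_ids:
        others = [j for j in client_ids if j != i]
        for k in range(len(others) + 1):
            for rest in combinations(others, k):
                size = k + 1  # |s|, with s = rest ∪ {i}
                coeff = factorial(size - 1) * factorial(n - size) / factorial(n)
                marginal = sum(edges.get((i, j), 0.0) for j in rest)
                phi[i] += coeff * marginal
    return phi
```

Because the coalition value here is simply the total edge weight inside the coalition, linearity of the Shapley value collapses the sum to φ_i = ½ Σ_{j≠i} e_ij; this closed form is a useful sanity check, and an O(n²) shortcut when n is large.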

After computing the Shapley value of every client vertex, the server obtains the new global model parameters as a weighted sum of the local model parameters of all clients participating in this round:

$$w^t = \sum_{i \in L_t} \frac{f(\varphi_i^t)}{\sum_{j \in L_t} f(\varphi_j^t)}\, w_i^t$$

where $w^t$ is the global model parameter after round t of training and $f(\varphi_i^t)$ is a function of $\varphi_i^t$.
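A sketch of the aggregation step follows; the text leaves the function f of the Shapley value unspecified, so the exponential used as a default here is purely an illustrative assumption (it keeps all weights positive):

```python
import numpy as np

def aggregate(local_params, phi, f=np.exp):
    """Weighted sum of local parameters with coefficients
    f(phi_i) / sum_j f(phi_j), as in the formula above."""
    ids = sorted(local_params)
    scores = np.array([f(phi[i]) for i in ids])
    weights = scores / scores.sum()  # normalization over the round's clients
    return sum(w * local_params[i] for w, i in zip(weights, ids))
```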

After each round, the server sends the aggregated global model parameters to the clients selected for the next round, and those clients run a new round of training on top of these global parameters, until the overall training convergence target is reached.
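Putting the steps together, one training round might look like the sketch below; it uses the ½Σe_ij closed form of the graph Shapley value, `local_train` is a hypothetical stand-in for client-side training, and the choice f = exp is again only an assumption:

```python
import numpy as np

def federated_round(global_w, clients, local_train, f=np.exp):
    """One round: local training, cosine-similarity graph, Shapley
    weighting, weighted aggregation of the local parameters."""
    # 1. each selected client trains locally on the broadcast parameters
    local = {c: local_train(c, global_w) for c in clients}
    ids = sorted(local)

    def cos(a, b):  # edge weight e_ij
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

    # 2.+3. Shapley value of each vertex; for this edge-weight game the
    # general formula collapses to phi_i = 0.5 * sum_j e_ij
    phi = {i: 0.5 * sum(cos(local[i], local[j]) for j in ids if j != i)
           for i in ids}
    # 4. normalized weights f(phi_i) / sum_j f(phi_j), then aggregation
    scores = np.array([f(phi[i]) for i in ids])
    weights = scores / scores.sum()
    return sum(w * local[i] for w, i in zip(weights, ids))
```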

The above is only a preferred embodiment of the present invention, and the scope of protection of the present invention is not limited to this embodiment; any equivalent modification or variation made by a person of ordinary skill in the art based on the disclosure of the present invention shall fall within the scope of protection recited in the claims.

Claims (5)

1. A federated learning method based on Shapley values, characterized in that, when aggregating the clients' local model parameters in federated learning, the method sets a weight coefficient for each client based on its Shapley value and then performs a weighted aggregation of the local model parameters according to these coefficients to obtain the global model parameters.

2. The federated learning method based on Shapley values according to claim 1, characterized in that a weighted graph is constructed at the end of each round of iterative training: the vertices of the graph are all clients participating in that round, every pair of clients is connected by an edge, and the weight of an edge is the cosine similarity between the local model parameters of the two clients it connects.

3. The federated learning method based on Shapley values according to claim 1, characterized in that the weight $e_{ij}$ of the edge between vertices i and j is

$$e_{ij} = \frac{w_i^t \cdot w_j^t}{\|w_i^t\|\,\|w_j^t\|}$$

where the numerator is the dot product of the vectors $w_i^t$ and $w_j^t$, and $\|w_i^t\|$ and $\|w_j^t\|$ in the denominator are their norms.

4. The federated learning method based on Shapley values according to claim 2, characterized in that the Shapley value of each client is computed from the constructed graph; for a vertex (i.e., client) i in the graph, its Shapley value, denoted $\varphi_i^t$, is computed as

$$\varphi_i^t = \sum_{s \in S_i} \frac{(|s|-1)!\,(n-|s|)!}{n!} \sum_{j \in s,\, j \neq i} e_{ij}$$

where $S_i$ is the set of all subsets of vertices containing vertex i, $|s|$ is the number of elements in subset s, n is the total number of vertices in the graph, j is any vertex of s other than i, and $e_{ij}$ is the weight of the edge connecting vertices i and j.

5. The federated learning method based on Shapley values according to claim 1, characterized in that the global model parameters after each round are the weighted sum of the local model parameters of all clients participating in that round, where the weight coefficient of client i's local model parameters is the normalized value of a function of its Shapley value $\varphi_i^t$:

$$w^t = \sum_{i \in L_t} \frac{f(\varphi_i^t)}{\sum_{j \in L_t} f(\varphi_j^t)}\, w_i^t$$

where $w^t$ denotes the global model parameters after round t of training, $L_t$ is the set of all clients participating in round t, $f(\varphi_i^t)$ is a function of $\varphi_i^t$, and $w_i^t$ denotes the local model parameters obtained by client i in round t.
CN202310124072.6A, filed 2023-02-16 (priority 2023-02-16): Federated learning method based on Shapley value, Pending, CN116205311A

Priority Applications (1)

Application Number: CN202310124072.6A; Priority Date / Filing Date: 2023-02-16; Title: Federated learning method based on Shapley value

Publications (1)

Publication Number: CN116205311A; Publication Date: 2023-06-02

Family

ID=86508965

Family Applications (1)

CN202310124072.6A (Pending): CN116205311A, Federated learning method based on Shapley value

Country Status (1)

Country Link
CN (1) CN116205311A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116522988A (en) * 2023-07-03 2023-08-01 粤港澳大湾区数字经济研究院(福田) Federal learning method, system, terminal and medium based on graph structure learning
CN116522988B (en) * 2023-07-03 2023-10-31 粤港澳大湾区数字经济研究院(福田) Federal learning method, system, terminal and medium based on graph structure learning
CN117057442A (en) * 2023-10-09 2023-11-14 之江实验室 Model training method, device and equipment based on federal multitask learning
CN119884760A (en) * 2025-03-26 2025-04-25 中国能源建设集团湖南省电力设计院有限公司 Hydropower production simulation data generation method and system based on overseas new energy consumption

Similar Documents

Publication Publication Date Title
CN116205311A (en) Federal learning method based on Shapley value
CN114510652B (en) Social collaborative filtering recommendation method based on federal learning
CN115271099A (en) Self-adaptive personalized federal learning method supporting heterogeneous model
CN115552429A (en) Method and system for horizontal federal learning using non-IID data
CN114330125A (en) Knowledge distillation-based joint learning training method, device, equipment and medium
CN113850272A (en) A Federated Learning Image Classification Method Based on Local Differential Privacy
CN110969250A (en) Neural network training method and device
CN114841364A (en) Federal learning method capable of meeting personalized local differential privacy requirements
CN115481431A (en) Dual-disturbance-based privacy protection method for federated learning counterreasoning attack
CN111723947A (en) A training method and device for a federated learning model
CN113691594B (en) A method to solve the data imbalance problem in federated learning based on the second derivative
CN110175286A (en) It is combined into the Products Show method and system to optimization and matrix decomposition
CN114116707B (en) Method and device for determining contribution of participants in joint learning
CN114327889A (en) A Model Training Node Selection Method for Hierarchical Federated Edge Learning
CN117788949A (en) Image classification method based on personalized federated learning and conditional generative adversarial network
CN115131605A (en) Structure perception graph comparison learning method based on self-adaptive sub-graph
CN115115021A (en) Personalized Federated Learning Method Based on Asynchronous Update of Model Parameters
CN116629376A (en) A federated learning aggregation method and system based on data-free distillation
CN116611535A (en) An edge federated learning training method and system for heterogeneous data
Groeneboom et al. Estimation in monotone single‐index models
CN113377990B (en) Video/picture-text cross-modal matching training method based on meta-self-paced learning
CN118228841B (en) Personalized federal learning training method, system and equipment based on consistency modeling
CN118070926B (en) Multi-task federation learning method based on client resource self-adaption
CN118246009A (en) A federated learning poisoning attack defense method
CN118627588A (en) A self-supervised adversarial defense framework for federated learning based on unsupervised perturbations

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination