CN113836556A - Federated learning-oriented decentralized function encryption privacy protection method and system

Publication number
CN113836556A
Authority
CN
China
Prior art keywords
encryption
model
weight
parameter
server
Legal status
Granted
Application number
CN202111134122.6A
Other languages
Chinese (zh)
Other versions
CN113836556B (en)
Inventor
冯纪元
殷丽华
孙哲
操志强
胡宇
李超
李然
李丹
Current Assignee
Guangzhou University
Original Assignee
Guangzhou University
Application filed by Guangzhou University
Priority to CN202111134122.6A
Publication of CN113836556A
Application granted
Publication of CN113836556B
Legal status: Active

Classifications

    • G06F21/602 Providing cryptographic facilities or services
    • G06F21/6218 Protecting access to data via a platform, e.g. using keys or access control rules, to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/64 Protecting data integrity, e.g. using checksums, certificates or signatures

Abstract

The invention provides a federated learning-oriented decentralized function encryption privacy protection method and system. The method comprises: obtaining an initial model, a public data set, an encryption label, an encryption prime number, an encryption weight and a weight vector parameter sent by a server; training the initial model on a local data set to obtain a local model, and testing the local model on the public data set to obtain the model accuracy; generating an encryption private key and a partial decryption key from the encryption prime number, and performing function encryption on the local model with the encryption private key and the encryption label to obtain an encryption model; and sending the encryption model, the partial decryption key and the model accuracy to the server, so that the server decrypts and aggregates the encryption models according to the partial decryption keys, the encryption label, the encryption weights and the model accuracies to obtain a global model. The invention ensures that the server cannot obtain a user's local model, effectively prevents collusion attacks between a third party and the server, and improves both the degree of privacy protection and the quality of service.

Description

Federated learning-oriented decentralized function encryption privacy protection method and system
Technical Field
The invention relates to the technical field of federated learning privacy protection, and in particular to a federated learning-oriented decentralized function encryption privacy protection method and system.
Background
With the wide application of federated learning in fields such as digital image processing, natural language processing and speech processing, federated learning breaks down data silos and enables more accurate services; further solving its privacy-leakage problem has therefore gradually become a key concern in its implementation and application.
Existing privacy protection methods applied to federated learning mainly comprise homomorphic encryption, secure multi-party computation and function encryption, for example distributed selective stochastic gradient descent combined with homomorphic encryption, aggregation of client model updates using the secret-sharing technique of secure multi-party computation, and schemes that add a trusted third-party entity responsible for generating, managing and distributing keys and perform secure federated aggregation with function encryption. Although the prior art provides a certain degree of privacy protection for federated learning, each approach has its own defects. Homomorphic encryption imposes computational pressure and communication overhead on devices with weak computing capability, and reducing the computational cost of model encryption inevitably weakens the security of the parameter aggregation process and thus the privacy protection effect. Introducing a trusted third-party entity responsible for generating, managing and distributing keys means that a malicious server and the third-party entity can collude to obtain the encryption keys, so the user model is exposed to the risk of being stolen.
Therefore, it is desirable to provide a privacy protection method that overcomes the problem of relying on a trusted third party entity in the prior art, and effectively prevents a trusted third party and a server from performing collusion attack while protecting the privacy of a client model.
Disclosure of Invention
The invention aims to provide a federated learning-oriented decentralized function encryption privacy protection method that ensures the server cannot obtain the specific gradient parameters of each user's locally trained model while, by means of interactive key generation between the server and the clients, overcoming the reliance of existing privacy protection methods on a trusted third-party entity for generating, managing and distributing keys, effectively preventing collusion attacks between a trusted third party and the server, and thereby further improving both the degree of privacy protection of client models in federated learning and the quality of the model service.
In view of the above technical problems, it is necessary to provide a federated learning-oriented decentralized function encryption privacy protection method, system, computer device and storage medium.
In a first aspect, an embodiment of the present invention provides a federated learning-oriented decentralized function encryption privacy protection method, where the method includes the following steps:
acquiring an initial model, a public data set, an encryption label, an encryption prime number, an encryption weight and a weight vector parameter sent by a server; the initial model is obtained by the server through training according to the public data set;
training the initial model according to a local data set to obtain a local model, and testing the local model according to the public data set to obtain corresponding model accuracy;
generating an encryption private key and a partial decryption key according to the encryption prime number, the encryption weight and the weight vector parameter, and performing function encryption on the local model according to the encryption private key and the encryption label to obtain an encryption model;
and sending the encryption model, the partial decryption key and the model accuracy to the server so that the server decrypts and aggregates the encryption model according to the partial decryption key, the encryption label, the encryption weight and the model accuracy to obtain a global model.
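For orientation only, the following Python sketch outlines how the four steps above map onto one client-side round; the server interface and the helper callables (train_local, test_accuracy, keygen, encrypt_model) are hypothetical placeholders introduced for illustration and are not part of the claimed method.

```python
def client_round(server, local_dataset, train_local, test_accuracy, keygen, encrypt_model):
    """Hypothetical sketch of one client-side round of the four steps above."""
    # Step 1: receive the material distributed by the server.
    init_model, public_set, label, prime, weight, weight_vec_param = server.broadcast()

    # Step 2: local training, then accuracy evaluation on the shared public data set.
    local_model = train_local(init_model, local_dataset)
    accuracy = test_accuracy(local_model, public_set)

    # Step 3: derive the encryption private key and the partial decryption key,
    # then functionally encrypt the local model under the round's encryption label.
    enc_sk, partial_dk = keygen(prime, weight, weight_vec_param)
    enc_model = encrypt_model(local_model, enc_sk, label)

    # Step 4: upload only encrypted material plus the accuracy score.
    server.upload(enc_model, partial_dk, accuracy)
```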
Further, the step of training the initial model according to the local data set to obtain the local model includes:
setting a privacy budget and a noise parameter according to the distribution characteristics and the privacy protection requirements of the local model;
and according to the privacy budget and the noise parameters, performing localized differential privacy to add noise to the local model.
Further, the step of generating an encryption private key and a partial decryption key according to the encryption prime number, the encryption weight and the weight vector parameter includes:
generating a key parameter according to the encryption prime number, and taking the key parameter as the encryption private key; the key parameter is represented as:
$$\vec{s}_i \leftarrow \mathbb{Z}_p^{2}$$

where $p$ and $\mathbb{Z}_p$ respectively denote the encryption prime number and the corresponding finite field, and $\vec{s}_i$ denotes the key parameter of the $i$-th client, a 2-dimensional vector sampled from $\mathbb{Z}_p^{2}$;
generating a decryption parameter according to the encryption prime number, and generating the partial decryption key according to the decryption parameter, the key parameter, the encryption weight and the weight vector parameter; the partial decryption key is represented as:
$$[\vec{d}_i]_2 = \left[\, y_i\,\vec{s}_i + T_i\,\vec{v}_{\vec{y}} \,\right]_2$$

where

$$[\vec{v}_{\vec{y}}]_2 = \mathcal{H}_2(\vec{y}) \in \mathbb{G}_2^{2}$$

in which $[\vec{d}_i]_2$ and $y_i$ respectively denote the partial decryption key and the encryption weight of the $i$-th client; $T_i$ denotes the decryption parameter generated by the $i$-th client, with $\sum_{i\in[n]} T_i = 0$, where $n$ is the total number of clients participating in training; $\mathcal{H}_2(\cdot)$ denotes a hash function; $T_i$ is a $2\times 2$ matrix over $\mathbb{Z}_p$; $\mathbb{G}_2$ denotes a multiplicative cyclic group of order $p$ associated with a bilinear pairing, and $\vec{v}_{\vec{y}}$ is a 2-dimensional vector of elements of $\mathbb{G}_2$; $\vec{y}$ and $\vec{v}_{\vec{y}}$ respectively denote the weight vector composed of all client encryption weights and the corresponding weight vector parameter.
Further, the step of generating a decryption parameter according to the encrypted prime number includes:
initializing a parameter matrix; the parameter matrix is a matrix with all elements being zero;
according to the encrypted prime numbers, negotiating with other clients respectively to determine a corresponding random matrix; the random matrix is represented as:
$$T_{ij} \leftarrow \mathbb{Z}_p^{2\times 2}$$

where $T_{ij}$ denotes the random matrix determined by negotiation between the $i$-th client and the $j$-th client, and $\mathbb{Z}_p^{2\times 2}$ denotes a $2\times 2$ matrix over the finite field $\mathbb{Z}_p$;
generating the decryption parameters according to the parameter matrix and the random matrix; the decryption parameters are expressed as:
$$T_i = T_0 + \sum_{j\in[n],\, i<j} T_{ij} \;-\; \sum_{j\in[n],\, j<i} T_{ij}$$

where $T_i$ denotes the decryption parameter corresponding to the $i$-th client, and $T_0$ denotes the client's parameter matrix.
Further, the step of performing function encryption on the local model according to the encryption private key and the encryption label to obtain an encryption model includes:
according to the encryption private key and the encryption label, performing function encryption on the local model by adopting the following encryption algorithm to obtain the encryption model:
$$[c_i]_1 = \left[\, \vec{u}_\ell^{\top}\vec{s}_i + x_i \,\right]_1$$

where

$$[\vec{u}_\ell]_1 = \mathcal{H}_1(\ell) \in \mathbb{G}_1^{2}$$

in which $x_i$ and $[c_i]_1$ respectively denote the local model and the encryption model of the $i$-th client; $\vec{s}_i$ denotes the key parameter of the $i$-th client; $\ell$ denotes the encryption label; $\mathbb{G}_1$ denotes a multiplicative cyclic group whose order is the encryption prime number, associated with a bilinear pairing, and $\vec{u}_\ell$ is a 2-dimensional vector of elements of $\mathbb{G}_1$; $\mathcal{H}_1(\cdot)$ denotes a hash function.
Further, the server decrypts and aggregates the encryption model according to the partial decryption key, the encryption label, the encryption weight and the model accuracy to obtain a global model, and the step of obtaining the global model includes:
obtaining a decryption key by adopting a key combination algorithm according to the encryption weight and part of the decryption key; the decryption key is represented as:
$$[\vec{d}]_2 = \sum_{i\in[n]} [\vec{d}_i]_2$$

where $[\vec{d}]_2$ and $\vec{y}$ respectively denote the decryption key and the weight vector composed of all client encryption weights, and $[\vec{d}_i]_2$ denotes the partial decryption key of the $i$-th client;
decrypting and aggregating encryption models of all clients according to the decryption key and the encryption label to obtain the global model; the global model is represented as:
$$[\alpha]_T = \sum_{i\in[n]} y_i \cdot e\!\left([c_i]_1,\,[1]_2\right) \;-\; e\!\left([\vec{u}_\ell]_1^{\top},\,[\vec{d}]_2\right)$$

where

$$[\vec{u}_\ell]_1 = \mathcal{H}_1(\ell) \in \mathbb{G}_1^{2}$$

in which $[\alpha]_T$ denotes the global model, i.e. the weighted aggregate $\alpha=\sum_{i\in[n]} y_i x_i$ of the local models; $y_i$ and $[c_i]_1$ respectively denote the encryption weight and the encryption model of the $i$-th client; $[\vec{d}]_2$ denotes the decryption key; $\ell$ denotes the encryption label; $[1]_2$ denotes the generator of $\mathbb{G}_2$; $\mathbb{G}_1$ denotes a multiplicative cyclic group whose order is the encryption prime number, associated with a bilinear pairing, and $\vec{u}_\ell$ is a 2-dimensional vector of elements of $\mathbb{G}_1$; $\mathcal{H}_1(\cdot)$ denotes a hash function; $e(\cdot)$ denotes the bilinear pairing map.
Further, the step of the server performing decryption aggregation on the encryption model according to the partial decryption key, the encryption tag, the encryption weight and the model accuracy to obtain a global model further includes:
testing the global model according to the public data set to obtain the accuracy of the global model;
and judging whether the accuracy of the global model reaches a preset accuracy, if so, stopping iterative training, otherwise, updating the encryption weight and weight vector parameter of each client according to the model accuracy of all clients, sending the global model and the updated encryption weight and weight vector parameter to each client, and continuing iterative training.
In a second aspect, an embodiment of the present invention provides a federated learning-oriented decentralized function encryption privacy protection system, where the system includes:
the acquisition module is used for acquiring the initial model, the public data set, the encryption label, the encryption prime number, the encryption weight and the weight vector parameter sent by the server; the initial model is obtained by the server through training according to the public data set;
the training module is used for training the initial model according to a local data set to obtain a local model, and testing the local model according to the public data set to obtain corresponding model accuracy;
the encryption module is used for generating an encryption private key and a part of decryption keys according to the encryption prime number, the encryption weight and the weight vector parameters, and performing function encryption on the local model according to the encryption private key and the encryption label to obtain an encryption model;
and the aggregation module is used for sending the encryption model, the partial decryption key and the model accuracy to the server so that the server decrypts and aggregates the encryption model according to the partial decryption key, the encryption label, the encryption weight and the model accuracy to obtain a global model.
In a third aspect, an embodiment of the present invention further provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method when executing the computer program.
In a fourth aspect, the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the steps of the above method.
The above application provides a federated learning-oriented decentralized function encryption privacy protection method, system, computer device and storage medium. The server generates the encryption label, encryption prime number and encryption weights, trains an initial model on a public data set, and sends the initial model, public data set, encryption label, encryption prime number, encryption weights and weight vector parameter to each client. Each client trains the initial model on its local data set to obtain a local model, tests it on the public data set to obtain the model accuracy, generates an encryption private key and a partial decryption key from the encryption prime number, functionally encrypts the local model with the encryption private key and the encryption label to obtain an encryption model, and sends the encryption model, partial decryption key and model accuracy to the server. The server then decrypts and aggregates the clients' encryption models into a global model, tests its accuracy on the public data set, and decides whether to continue iterative training until a satisfactory global model is obtained. Compared with the prior art, this federated learning-oriented decentralized function encryption privacy protection method ensures that the server cannot obtain the specific gradient parameters of each user's locally trained model, replaces the trusted third-party entity that existing privacy protection methods rely on for generating, managing and distributing keys with interactive key generation between the server and the clients, effectively prevents collusion attacks between a trusted third party and the server, and thereby improves both the degree of privacy protection of client models in federated learning and the quality of the model service.
Drawings
FIG. 1 is a schematic diagram of the federated learning model framework to which the federated learning-oriented decentralized function encryption privacy protection method is applied in an embodiment of the present invention;
FIG. 2 is a schematic diagram of a collusion attack between a server and a trusted third party in existing privacy protection for federated learning;
FIG. 3 is a schematic flow chart of the federated learning-oriented decentralized function encryption privacy protection method in an embodiment of the present invention;
FIG. 4 is a schematic flow chart of the local model obtained by training the initial model in step S12 in FIG. 3;
fig. 5 is a schematic flowchart of the server decrypting and aggregating the encryption models of all the clients to obtain a global model in step S14 in fig. 3;
fig. 6 is another schematic flowchart of the server performing decryption and aggregation on the encryption models of all the clients to obtain a global model in step S14 in fig. 3;
FIG. 7 is a schematic structural diagram of the federated learning-oriented decentralized function encryption privacy protection system in an embodiment of the present invention;
FIG. 8 is a schematic diagram of the federated learning-oriented decentralized function encryption privacy protection system applied to a smart medical scenario in an embodiment of the present invention;
fig. 9 is an internal structural diagram of a computer device in the embodiment of the present invention.
Detailed Description
In order to make the purpose, technical solution and advantages of the present invention more clearly apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments, and it is obvious that the embodiments described below are part of the embodiments of the present invention, and are used for illustrating the present invention only, but not for limiting the scope of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The federated learning-oriented decentralized function encryption privacy protection method provided by the invention is applied to the federated learning model framework shown in figure 1. While ensuring that the server cannot obtain the specific gradient parameters of each user's locally trained model, it effectively solves the problem of collusion attacks between the server and a trusted third party in the federated learning process shown in figure 2, thereby further improving both the degree of privacy protection of client models in federated learning and the quality of the model service.
As shown in fig. 2, a malicious server can obtain, through private collusion or even profit exchange, an encryption key and encryption algorithm it does not own from the trusted third-party entity, and then use them to recover the plaintext model of a user client from that client's encryption model, thereby obtaining the client's local private information. Removing the dependence on a third party for key generation, management and distribution from the privacy protection method of federated learning is therefore essential for further strengthening client privacy protection and improving its effect.
In one embodiment, as shown in fig. 3, there is provided a federated learning-oriented decentralized function encryption privacy protection method, including the following steps:
s11, acquiring an initial model, a public data set, an encryption label, an encryption prime number, an encryption weight and a weight vector parameter sent by a server; the initial model is obtained by the server through training according to the public data set;
the initial model can be selected according to an actual federal learning task, and the public data set is a data set which is collected by the server and can be shared by all client sides for training, and is not particularly limited. Wherein, the encryption label, the encryption prime number, the encryption weight and the weight vector parameter are the encryption parameters which are necessary to be used for the interactive encryption and decryption of the client and the server, the encryption label is a group of character strings which are composed of any characters and have any length, the encryption prime number is any prime number, the encryption weight is a one-to-one corresponding weight value which is distributed to each client by the server according to the application scene and the requirement, the weight vector parameter is an encryption parameter which is obtained by the server based on the weight of each client by adopting a Hash algorithm, for example, the server can distribute the corresponding encryption weight to each client according to the data volume of each client and other conditions during the initial iteration, based on the encryption weights of all clients, the weight vector parameter which fuses the encryption weights of all clients is obtained by adopting the Hash algorithm, and the encryption weight is updated according to the accuracy of each round of the client training model during the subsequent iteration, meanwhile, the corresponding weight vector parameters are updated to ensure the model aggregation effect.
S12, training the initial model according to a local data set to obtain a local model, and testing the local model according to the public data set to obtain corresponding model accuracy;
the specific loss function and the training mode adopted by the local model are determined based on the actual application requirements and the selected initial model type, and this embodiment is not particularly limited. After each client side obtains a local model through local data set training, the client side uses a public data set sent by the server to carry out testing to obtain corresponding model accuracy, and the model accuracy is sent to the server together when the local model is sent to the server for aggregation in the follow-up iteration. In addition, in order to enhance the privacy of the client model, after each client obtains a local model by training with a local data set, each client may perform localized differential privacy, add noise to the model, and then encrypt the model, as shown in fig. 4, specifically including:
s121, setting a privacy budget and a noise parameter according to the distribution characteristics and privacy protection requirements of the local model;
and S122, according to the privacy budget and the noise parameter, performing localized differential privacy to add noise to the local model.
The setting of the privacy budget and the noise parameter can be adjusted according to the actual application scenario and the application requirement, which is not limited herein. Meanwhile, in order to improve the efficiency of the whole federal learning, the existing model compression technology is preferably adopted to perform model compression before or after the local model is subjected to noise addition by executing the localized differential privacy, and details are not repeated here.
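As an illustration of steps S121 and S122, the sketch below clips a model parameter vector and adds Laplace noise under a chosen privacy budget; the sensitivity and clipping values are assumptions made for the example, since the concrete noise mechanism is left to the practitioner.

```python
import numpy as np

def add_local_dp_noise(params, epsilon=1.0, sensitivity=1.0, clip=1.0):
    """Localized differential privacy sketch: clip the parameters, then add Laplace noise."""
    params = np.clip(np.asarray(params, dtype=float), -clip, clip)
    scale = sensitivity / epsilon               # Laplace scale b = sensitivity / privacy budget
    noise = np.random.laplace(loc=0.0, scale=scale, size=params.shape)
    return params + noise
```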
S13, generating an encryption private key and a partial decryption key according to the encryption prime number, the encryption weight and the weight vector parameters, and performing function encryption on the local model according to the encryption private key and the encryption label to obtain an encryption model;
the method comprises the following steps that an encryption private key is used for each client to encrypt a local model to obtain an encryption model, part of decryption keys are sent to a server to be used for decrypting and aggregating the encryption models of all the clients, and the generation of the encryption private key and the part of decryption keys specifically comprises the following steps of:
generating a key parameter according to the encryption prime number, and taking the key parameter as the encryption private key; the key parameter is represented as:
$$\vec{s}_i \leftarrow \mathbb{Z}_p^{2}$$

where $p$ and $\mathbb{Z}_p$ respectively denote the encryption prime number and the corresponding finite field, and $\vec{s}_i$ denotes the key parameter of the $i$-th client, a 2-dimensional vector sampled from $\mathbb{Z}_p^{2}$;
generating a decryption parameter according to the encryption prime number, and generating the partial decryption key according to the decryption parameter, the key parameter, the encryption weight and the weight vector parameter; the partial decryption key is represented as:
$$[\vec{d}_i]_2 = \left[\, y_i\,\vec{s}_i + T_i\,\vec{v}_{\vec{y}} \,\right]_2$$

where

$$[\vec{v}_{\vec{y}}]_2 = \mathcal{H}_2(\vec{y}) \in \mathbb{G}_2^{2}$$

in which $[\vec{d}_i]_2$ and $y_i$ respectively denote the partial decryption key and the encryption weight of the $i$-th client; $T_i$ denotes the decryption parameter generated by the $i$-th client, with $\sum_{i\in[n]} T_i = 0$, where $n$ is the total number of clients participating in training; $\mathcal{H}_2(\cdot)$ denotes a hash function; $T_i$ is a $2\times 2$ matrix over $\mathbb{Z}_p$; $\mathbb{G}_2$ denotes a multiplicative cyclic group of order $p$ associated with a bilinear pairing, and $\vec{v}_{\vec{y}}$ is a 2-dimensional vector of elements of $\mathbb{G}_2$; $\vec{y}$ and $\vec{v}_{\vec{y}}$ respectively denote the weight vector composed of all client encryption weights and the corresponding weight vector parameter.
The step of generating the decryption parameter by each client according to the encryption prime number comprises the following steps:
initializing a parameter matrix; the parameter matrix is a matrix with all elements being zero;
according to the encrypted prime numbers, negotiating with other clients respectively to determine a corresponding random matrix; the random matrix is represented as:
$$T_{ij} \leftarrow \mathbb{Z}_p^{2\times 2}$$

where $T_{ij}$ denotes the random matrix determined by negotiation between the $i$-th client and the $j$-th client, and $\mathbb{Z}_p^{2\times 2}$ denotes a $2\times 2$ matrix over the finite field $\mathbb{Z}_p$;
generating the decryption parameters according to the parameter matrix and the random matrix; the decryption parameters are expressed as:
$$T_i = T_0 + \sum_{j\in[n],\, i<j} T_{ij} \;-\; \sum_{j\in[n],\, j<i} T_{ij}$$

where $T_i$ denotes the decryption parameter corresponding to the $i$-th client, and $T_0$ denotes the client's parameter matrix.
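The cancellation property of this construction, $\sum_{i\in[n]} T_i = 0$, can be checked with the short numpy sketch below; for simplicity the pairwise matrices are drawn from a local random generator rather than agreed through a Diffie-Hellman exchange, which is an illustrative shortcut.

```python
import numpy as np

def make_decryption_params(n, p, seed=0):
    """Sketch of the T_i construction: every unordered pair {i, j} shares one random 2x2
    matrix over Z_p, and T_i = T_0 + sum_{j>i} T_ij - sum_{j<i} T_ij (with T_0 = 0 here)."""
    rng = np.random.default_rng(seed)
    T_pair = {(i, j): rng.integers(0, p, size=(2, 2), dtype=np.int64)
              for i in range(n) for j in range(i + 1, n)}
    T = []
    for i in range(n):
        Ti = np.zeros((2, 2), dtype=np.int64)          # parameter matrix T_0 initialised to zero
        for j in range(i + 1, n):
            Ti = (Ti + T_pair[(i, j)]) % p             # partners with a larger index: add
        for j in range(i):
            Ti = (Ti - T_pair[(j, i)]) % p             # partners with a smaller index: subtract
        T.append(Ti)
    return T

p = 2**31 - 1                                          # toy prime; the scheme uses the pairing-group order
T = make_decryption_params(n=5, p=p)
assert (sum(T) % p == 0).all()                         # the decryption parameters cancel modulo p
```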
It should be noted that the parameter matrix initialized by each client may also be a non-zero matrix, and for convenience of calculation, all the parameter matrices are set as zero matrices in this embodiment. In addition, the method for mutually negotiating and determining the corresponding random matrix between the clients may be implemented by using Diffie-Hellman key exchange protocol, or may also use other similar negotiation techniques, which is not limited herein. After the client side obtains the encryption private key by adopting the steps of the method, the local model can be subjected to function encryption by adopting the following encryption algorithm according to the encryption private key and the encryption label to obtain a corresponding encryption model:
$$[c_i]_1 = \left[\, \vec{u}_\ell^{\top}\vec{s}_i + x_i \,\right]_1$$

where

$$[\vec{u}_\ell]_1 = \mathcal{H}_1(\ell) \in \mathbb{G}_1^{2}$$

in which $x_i$ and $[c_i]_1$ respectively denote the local model and the encryption model of the $i$-th client; $\vec{s}_i$ denotes the key parameter of the $i$-th client; $\ell$ denotes the encryption label; $\mathbb{G}_1$ denotes a multiplicative cyclic group whose order is the encryption prime number, associated with a bilinear pairing, and $\vec{u}_\ell$ is a 2-dimensional vector of elements of $\mathbb{G}_1$; $\mathcal{H}_1(\cdot)$ denotes a hash function.
In the embodiment, after the client adds noise to the local model (or further adds model compression), the noise-added model is encrypted by the encryption parameter sent by the server according to the encryption algorithm, so that the problem of generating, managing and distributing keys depending on a trusted third party entity is effectively solved, and the reliable guarantee is provided for the privacy protection of the client model.
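To make the encryption step concrete, the following simplified sketch evaluates $c_i = \vec{u}_\ell^{\top}\vec{s}_i + x_i \bmod p$ for each coordinate of the (noised) model vector. It works directly over the integers modulo $p$, models the hash $\mathcal{H}_1$ with SHA-256, and derives a separate tag per coordinate (an illustrative choice); the actual scheme evaluates the same expression inside the pairing group $\mathbb{G}_1$, so this is a correctness illustration rather than a secure implementation.

```python
import hashlib

def hash_label(label, p):
    """Model of u_l = H_1(label): a 2-dimensional vector derived from the label."""
    d = hashlib.sha256(label.encode()).digest()
    return (int.from_bytes(d[:16], "big") % p, int.from_bytes(d[16:], "big") % p)

def encrypt_model(model_coords, s_i, label, p):
    """c_i[k] = <u_{l,k}, s_i> + x_i[k] (mod p); model coordinates are assumed to be
    encoded as non-negative residues smaller than p."""
    ciphertext = []
    for k, x in enumerate(model_coords):
        u = hash_label(f"{label}|{k}", p)              # per-coordinate tag derived from the label
        ciphertext.append((u[0] * s_i[0] + u[1] * s_i[1] + x) % p)
    return ciphertext
```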
And S14, sending the encryption model, the partial decryption key and the model accuracy to the server, so that the server decrypts and aggregates the encryption model according to the partial decryption key, the encryption label, the encryption weight and the model accuracy to obtain a global model.
The global model is obtained by the server decrypting and aggregating the locally trained encryption models uploaded by all clients, and the aggregation method can be chosen according to actual requirements. In this embodiment, the global model is a weighted average of the local models uploaded by all clients, weighted by the client weights, and serves as the model sent to all clients for training in subsequent iterations. To aggregate the encryption models of all clients from the partial decryption keys, the server combines the partial decryption keys with the weight vector formed by the encryption weights of all clients, computes the corresponding decryption key with the key combination algorithm, and then uses the decryption key and the encryption label to decrypt and aggregate the encryption models of all clients with the decryption algorithm. As shown in fig. 5, this specifically includes the following steps:
s141, obtaining a decryption key by adopting a key combination algorithm according to the encryption weight and part of the decryption key; the decryption key is represented as:
$$[\vec{d}]_2 = \sum_{i\in[n]} [\vec{d}_i]_2$$

where $[\vec{d}]_2$ and $\vec{y}$ respectively denote the decryption key and the weight vector composed of all client encryption weights, and $[\vec{d}_i]_2$ denotes the partial decryption key of the $i$-th client;
s142, carrying out decryption aggregation on the encryption models of all the clients according to the decryption key and the encryption label to obtain the global model; the global model is represented as:
$$[\alpha]_T = \sum_{i\in[n]} y_i \cdot e\!\left([c_i]_1,\,[1]_2\right) \;-\; e\!\left([\vec{u}_\ell]_1^{\top},\,[\vec{d}]_2\right)$$

where

$$[\vec{u}_\ell]_1 = \mathcal{H}_1(\ell) \in \mathbb{G}_1^{2}$$

in which $[\alpha]_T$ denotes the global model, i.e. the weighted aggregate $\alpha=\sum_{i\in[n]} y_i x_i$ of the local models; $y_i$ and $[c_i]_1$ respectively denote the encryption weight and the encryption model of the $i$-th client; $[\vec{d}]_2$ denotes the decryption key; $\ell$ denotes the encryption label; $[1]_2$ denotes the generator of $\mathbb{G}_2$; $\mathbb{G}_1$ denotes a multiplicative cyclic group whose order is the encryption prime number, associated with a bilinear pairing, and $\vec{u}_\ell$ is a 2-dimensional vector of elements of $\mathbb{G}_1$; $\mathcal{H}_1(\cdot)$ denotes a hash function; $e(\cdot)$ denotes the bilinear pairing map.
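The algebra behind steps S141 and S142 can be checked in the same simplified integers-modulo-$p$ model used in the earlier sketches: because $\sum_{i\in[n]} T_i = 0$, the combined key collapses to $\sum_i y_i\vec{s}_i$, every mask $\vec{u}_\ell^{\top}\vec{s}_i$ cancels, and only the weighted sum of the local models remains. The function and variable names below are illustrative, integer weights and small model values are assumed, and the pairing layer of the real scheme is omitted.

```python
import hashlib

def hash_label(label, p):
    """Model of the hash to a 2-dimensional vector (the real scheme hashes into G_1)."""
    d = hashlib.sha256(label.encode()).digest()
    return (int.from_bytes(d[:16], "big") % p, int.from_bytes(d[16:], "big") % p)

def partial_key(y_i, s_i, T_i, v_y, p):
    """d_i = y_i * s_i + T_i * v_y (mod p): the client-side partial decryption key."""
    return [(y_i * s_i[k] + int(T_i[k][0]) * v_y[0] + int(T_i[k][1]) * v_y[1]) % p
            for k in range(2)]

def combine_keys(partial_keys, p):
    """Key combination: d = sum_i d_i (mod p); the T_i * v_y masks cancel out."""
    return [sum(d[k] for d in partial_keys) % p for k in range(2)]

def decrypt_sum(ciphertexts, weights, d, label, p):
    """alpha[k] = sum_i y_i * c_i[k] - <u_{l,k}, d> (mod p) = sum_i y_i * x_i[k]."""
    alpha = []
    for k in range(len(ciphertexts[0])):
        u = hash_label(f"{label}|{k}", p)              # same per-coordinate tag as in the encryption sketch
        acc = sum(y * c[k] for y, c in zip(weights, ciphertexts)) % p
        alpha.append((acc - u[0] * d[0] - u[1] * d[1]) % p)
    return alpha
```

In this toy model the individual local models stay hidden only algebraically; the actual scheme additionally hides all intermediate values inside the pairing groups, so the server learns nothing beyond the weighted aggregate.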
After the server obtains the global model of the current round of iterative training through the decryption and aggregation method, the server can directly issue the global model to all clients to continue subsequent iterative training in principle under the condition that the preset iteration times or other preset iteration convergence standards are not met. In order to ensure that the encryption weights subsequently distributed to the clients are more reasonable and effective, in this embodiment, after the server obtains the global model, the server uses a public data set to test the global model, and when the accuracy rate does not meet the requirement, the server re-adjusts the weights of all the clients according to the accuracy rate of the local model obtained by the current round of iteration sent by each client, that is, the server distributes higher encryption weights to the clients with high accuracy rate of the current round of training model for encryption and aggregation of the next round of training model, and each round of iterative training executes the same method steps. Specifically, in addition to the above S141-S142, as shown in fig. 6, the step of performing decryption aggregation on the encryption model by the server according to the partial decryption key, the encryption tag, the encryption weight, and the model accuracy to obtain a global model further includes:
s143, testing the global model according to the public data set to obtain the accuracy of the global model;
s144, judging whether the accuracy of the global model reaches a preset accuracy, if so, stopping iterative training, otherwise, updating the encryption weight and the weight vector parameter of each client according to the model accuracy of all the clients, sending the global model and the updated encryption weight and weight vector parameter to each client, and continuing iterative training.
The weight vector parameter is generated by the server according to the encryption weight of each client, is updated along with the change of the encryption weight of each client, and can be expressed as:
$$[\vec{v}_{\vec{y}}]_2 = \mathcal{H}_2(\vec{y}) \in \mathbb{G}_2^{2}$$

where $\vec{v}_{\vec{y}}$ denotes the weight vector parameter; $\mathcal{H}_2(\cdot)$ denotes a hash function; $\mathbb{G}_2$ denotes a multiplicative cyclic group of order $p$ associated with a bilinear pairing; and $\vec{v}_{\vec{y}}$ is a 2-dimensional vector of elements of $\mathbb{G}_2$.
In the embodiment of the present application, considering that existing federated learning privacy protection relies on a trusted third-party entity to generate, manage and distribute keys and therefore readily exposes the risk of collusion between the trusted third party and the server, a federated learning-oriented decentralized function encryption privacy protection method is designed. The server first generates the encryption label, encryption prime number and encryption weights, trains an initial model on a public data set, and sends the initial model, public data set, encryption label, encryption prime number, encryption weights and weight vector parameter to each client. Each client trains the initial model on its local data set to obtain a local model, tests it on the public data set to obtain the model accuracy, generates an encryption private key and a partial decryption key from the encryption prime number, performs localized differential privacy to add noise to the local model, functionally encrypts the noised local model with the encryption private key and the encryption label to obtain an encryption model, and sends the encryption model, partial decryption key and model accuracy to the server. The server decrypts and aggregates the clients' encryption models into a global model, tests its accuracy on the public data set, and decides whether to continue iterative training until a satisfactory global model is obtained. Applied to practical federated learning training, this combination of interactive key generation between server and clients with localized differential privacy ensures that the server cannot obtain the specific gradient parameters of each user's locally trained model, removes the dependence of existing privacy protection methods on a trusted third-party entity, effectively prevents collusion attacks between a trusted third party and the server, and further improves both the degree of privacy protection of client models in federated learning and the quality of the model service.
It should be noted that, although the steps in the above-described flowcharts are shown in sequence as indicated by arrows, the steps are not necessarily executed in sequence as indicated by the arrows. The steps are not performed in the exact order shown and described, and may be performed in other orders, unless explicitly stated otherwise.
In one embodiment, as shown in fig. 7, there is provided a federated learning-oriented decentralized function encryption privacy protection system, the system comprising:
the acquisition module 1 is used for acquiring an initial model, a public data set, an encryption tag, an encryption prime number, an encryption weight and a weight vector parameter which are sent by a server; the initial model is obtained by the server through training according to the public data set;
the training module 2 is used for training the initial model according to a local data set to obtain a local model, and testing the local model according to the public data set to obtain the corresponding model accuracy;
the encryption module 3 is used for generating an encryption private key and a part of decryption keys according to the encryption prime number, the encryption weight and the weight vector parameters, and performing function encryption on the local model according to the encryption private key and the encryption label to obtain an encryption model;
and the aggregation module 4 is configured to send the encryption model, the partial decryption key, and the model accuracy to the server, so that the server decrypts and aggregates the encryption model according to the partial decryption key, the encryption tag, the encryption weight, and the model accuracy, to obtain a global model.
For the specific limitations of the federated learning-oriented decentralized function encryption privacy protection system, reference may be made to the above limitations of the federated learning-oriented decentralized function encryption privacy protection method, which are not repeated here. The modules in the federated learning-oriented decentralized function encryption privacy protection system may be implemented wholly or partially by software, hardware or a combination thereof. The modules may be embedded in or independent of a processor in the computer device in hardware form, or stored in the memory of the computer device in software form, so that the processor can invoke and execute the operations corresponding to the modules.
As shown in fig. 8, in a smart medical scenario comprising several hospitals and an online diagnosis server, the system is used to carry out secure federated learning over the hospitals' databases so as to realize accurate online disease diagnosis. The specific application is as follows. The online diagnosis server trains an initial model on a public data set and generates an encryption label $\ell$, an encryption prime number $p$, the encryption weights $y_i$ and the weight vector parameter $\vec{v}_{\vec{y}}$, and sends them together with the initial model to each hospital. After receiving the initial model, the public data set, the encryption label $\ell$, the encryption prime number $p$, the encryption weight $y_i$ and the weight vector parameter $\vec{v}_{\vec{y}}$ from the server, each hospital trains the initial model on its local data set to obtain a local model, tests the local model on the public data set to obtain the corresponding model accuracy, performs localized differential privacy to add noise to the local model according to the local model's distribution characteristics and the privacy protection requirements, generates an encryption private key $\vec{s}_i$ and a partial decryption key $[\vec{d}_i]_2$ from the encryption prime number $p$, functionally encrypts the local model with the encryption private key $\vec{s}_i$ and the encryption label $\ell$ to obtain an encryption model, and sends the encryption model, the partial decryption key $[\vec{d}_i]_2$ and the model accuracy to the online diagnosis server. The online diagnosis server obtains the decryption key $[\vec{d}]_2$ from the partial decryption keys $[\vec{d}_i]_2$ sent by the hospitals and the encryption weights $y_i$ using the key combination algorithm, decrypts and aggregates the encryption models of all clients with the decryption key $[\vec{d}]_2$ and the encryption label $\ell$ to obtain a global model, and tests the global model on the public data set to obtain its accuracy. When the accuracy of the global model has not reached the preset accuracy, the server updates the encryption weight and weight vector parameter of each client according to the model accuracies of all clients, sends the global model and the updated encryption weights and weight vector parameter to each client, and continues iterative training until a satisfactory global model is obtained. The online diagnosis server then provides its service with the model trained by federated learning: after an individual user uploads personal health data to the online diagnosis server, the server feeds the user's data into the global model for disease matching and returns the online diagnosis result to the user in a timely manner.
Fig. 9 shows an internal structure diagram of a computer device in one embodiment, and the computer device may be specifically a terminal or a server. As shown in fig. 9, the computer apparatus includes a processor, a memory, a network interface, a display, and an input device, which are connected through a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a federated learning-oriented decentralized function privacy protection method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
It will be appreciated by those of ordinary skill in the art that the architecture shown in FIG. 9 is merely a block diagram of part of the structure related to the present solution and does not limit the computer devices to which the present solution may be applied; a particular computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the steps of the above method being performed when the computer program is executed by the processor.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, carries out the steps of the above-mentioned method.
To sum up, in the federated learning-oriented decentralized function encryption privacy protection method and system provided by the embodiments of the present invention, the server generates the encryption label, encryption prime number and encryption weights, trains an initial model on a public data set, and sends the initial model, public data set, encryption label, encryption prime number, encryption weights and weight vector parameter to each client; each client trains the initial model on its local data set to obtain a local model, tests it on the public data set to obtain the model accuracy, generates an encryption private key and a partial decryption key from the encryption prime number, functionally encrypts the local model with the encryption private key and the encryption label, and sends the encryption model, partial decryption key and model accuracy to the server; the server decrypts and aggregates the clients' encryption models into a global model, tests its accuracy on the public data set, and decides whether to continue iterative training until a global model meeting the service requirements is obtained. This federated learning-oriented decentralized function encryption privacy protection method ensures that the server cannot obtain the specific gradient parameters of each user's locally trained model, replaces the trusted third-party entity that existing methods rely on for generating, managing and distributing keys with interactive key generation between the server and the clients, effectively prevents collusion attacks between a trusted third party and the server, and further improves both the degree of privacy protection of client models in federated learning and the quality of the model service.
The embodiments in this specification are described in a progressive manner, and all the same or similar parts of the embodiments are directly referred to each other, and each embodiment is described with emphasis on differences from other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment. It should be noted that, the technical features of the embodiments may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express some preferred embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for those skilled in the art, various modifications and substitutions can be made without departing from the technical principle of the present invention, and these should be construed as the protection scope of the present application. Therefore, the protection scope of the present patent shall be subject to the protection scope of the claims.

Claims (10)

1. A federated learning-oriented decentralized function encryption privacy protection method, characterized by comprising the following steps:
acquiring an initial model, a public data set, an encryption label, an encryption prime number, an encryption weight and a weight vector parameter sent by a server; the initial model is obtained by the server through training according to the public data set;
training the initial model according to a local data set to obtain a local model, and testing the local model according to the public data set to obtain corresponding model accuracy;
generating an encryption private key and a partial decryption key according to the encryption prime number, the encryption weight and the weight vector parameter, and performing function encryption on the local model according to the encryption private key and the encryption label to obtain an encryption model;
and sending the encryption model, the partial decryption key and the model accuracy to the server so that the server decrypts and aggregates the encryption model according to the partial decryption key, the encryption label, the encryption weight and the model accuracy to obtain a global model.
2. The decentralized function encryption privacy protection method according to claim 1, wherein the step of training the initial model according to the local data set to obtain the local model comprises:
setting a privacy budget and a noise parameter according to the distribution characteristics and the privacy protection requirements of the local model;
and according to the privacy budget and the noise parameters, performing localized differential privacy to add noise to the local model.
3. The method of claim 1, wherein the step of generating an encryption private key and a partial decryption key based on the encryption prime number, the encryption weight, and the weight vector parameter comprises:
generating a key parameter according to the encryption prime number, and taking the key parameter as the encryption private key; the key parameter is represented as:
$$\vec{s}_i \leftarrow \mathbb{Z}_p^{2}$$

where $p$ and $\mathbb{Z}_p$ respectively denote the encryption prime number and the corresponding finite field, and $\vec{s}_i$ denotes the key parameter of the $i$-th client, a 2-dimensional vector;
generating a decryption parameter according to the encryption prime number, and generating the partial decryption key according to the decryption parameter, the key parameter, the encryption weight and the weight vector parameter; the partial decryption key is represented as:
$$[\vec{d}_i]_2 = \left[\, y_i\,\vec{s}_i + T_i\,\vec{v}_{\vec{y}} \,\right]_2$$

where

$$[\vec{v}_{\vec{y}}]_2 = \mathcal{H}_2(\vec{y}) \in \mathbb{G}_2^{2}$$

in which $[\vec{d}_i]_2$ and $y_i$ respectively denote the partial decryption key and the encryption weight of the $i$-th client; $T_i$ denotes the decryption parameter generated by the $i$-th client, with $\sum_{i\in[n]} T_i = 0$, where $n$ is the total number of clients participating in training; $\mathcal{H}_2(\cdot)$ denotes a hash function; $T_i$ is a $2\times 2$ matrix over $\mathbb{Z}_p$; $\mathbb{G}_2$ denotes a multiplicative cyclic group of order $p$ associated with a bilinear pairing, and $\vec{v}_{\vec{y}}$ is a 2-dimensional vector of elements of $\mathbb{G}_2$; $\vec{y}$ and $\vec{v}_{\vec{y}}$ respectively denote the weight vector composed of all client encryption weights and the corresponding weight vector parameter.
4. The method of claim 3, wherein the step of generating decryption parameters based on the cryptographic prime number comprises:
initializing a parameter matrix; the parameter matrix is a matrix with all elements being zero;
according to the encrypted prime numbers, negotiating with other clients respectively to determine a corresponding random matrix; the random matrix is represented as:
$$T_{ij} \leftarrow \mathbb{Z}_p^{2\times 2}$$

where $T_{ij}$ denotes the random matrix determined by negotiation between the $i$-th client and the $j$-th client, and $\mathbb{Z}_p^{2\times 2}$ denotes a $2\times 2$ matrix over the finite field $\mathbb{Z}_p$;
generating the decryption parameters according to the parameter matrix and the random matrix; the decryption parameters are expressed as:
$$T_i = T_0 + \sum_{j\in[n],\, i<j} T_{ij} \;-\; \sum_{j\in[n],\, j<i} T_{ij}$$

where $T_i$ denotes the decryption parameter corresponding to the $i$-th client, and $T_0$ denotes the client's parameter matrix.
5. The decentralized function encryption privacy protection method according to claim 1, wherein the step of performing function encryption on the local model according to the encryption private key and the encryption tag to obtain the encryption model comprises:
according to the encryption private key and the encryption label, performing function encryption on the local model by adopting the following encryption algorithm to obtain the encryption model:
[c_i]_1 = [ u_ℓ^T·s_i + x_i ]_1 ,   where [u_ℓ]_1 = H_1(ℓ) ∈ G_1^2
in the formula, x_i and [c_i]_1 respectively represent the local model and the encryption model of the i-th client; s_i represents the key parameter of the i-th client; ℓ represents the encryption tag; G_1 represents a multiplicative cyclic group whose order is the encryption prime number and which is associated with the bilinear pairing, and G_1^2 represents a 2-dimensional vector over the multiplicative cyclic group G_1; H_1 represents a hash function.
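Continuing the exponent-level sketch (same assumptions and stand-in hash as above), functional encryption of one model coordinate under the encryption tag looks as follows; treating the local model as a list of integers is an assumption, since real model weights would first have to be quantized into Z_p.

```python
import hashlib

def hash_to_vec(data: bytes, prime: int):
    # Same stand-in hash as in the key-generation sketch above.
    digest = lambda tag: int.from_bytes(hashlib.sha256(tag + data).digest(), "big")
    return (digest(b"0") % prime, digest(b"1") % prime)

def encrypt(x_i: int, s_i, label: bytes, prime: int):
    """c_i = <u_label, s_i> + x_i  with  u_label = H1(label).

    Exponent-level view of encrypting one quantized model coordinate x_i;
    in the pairing-based scheme [c_i] is published as an element of G_1.
    """
    u = hash_to_vec(label, prime)
    return (u[0] * s_i[0] + u[1] * s_i[1] + x_i) % prime

def encrypt_model(model_coords, s_i, label: bytes, prime: int):
    # A local model is encrypted coordinate-by-coordinate under the same tag.
    return [encrypt(x, s_i, label, prime) for x in model_coords]
```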
6. The decentralized function encryption privacy protection method according to claim 1, wherein the step of the server performing decryption aggregation on the encryption model according to the partial decryption key, the encryption tag, the encryption weight and the model accuracy to obtain a global model comprises:
obtaining the decryption key by adopting a key combination algorithm according to the encryption weights and the partial decryption keys; the decryption key is represented as:
dk_y = ( y , [d]_2 ) ,   where [d]_2 = ∑_{i∈[n]} [d_i]_2
in the formula, [d]_2 and y respectively represent the decryption key and the weight vector composed of all client encryption weights; [d_i]_2 represents the partial decryption key of the i-th client;
decrypting and aggregating encryption models of all clients according to the decryption key and the encryption label to obtain the global model; the global model is represented as:
[α]_T = ∑_{i∈[n]} e([c_i]_1, [y_i]_2) − e([u_ℓ]_1, [d]_2) ,   where [u_ℓ]_1 = H_1(ℓ) ∈ G_1^2
in the formula, [α]_T represents the global model; y_i and [c_i]_1 respectively represent the encryption weight and the encryption model of the i-th client; [d]_2 represents the decryption key; ℓ represents the encryption tag; G_1 represents a multiplicative cyclic group whose order is the encryption prime number and which is associated with the bilinear pairing, and G_1^2 represents a 2-dimensional vector over the multiplicative cyclic group G_1; H_1 represents a hash function; e(·,·) represents the bilinear pairing map.
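To make the decryption-aggregation step concrete, the sketch below combines the partial decryption keys and recovers the weighted sum of the plaintext coordinates in the exponent-level model used in the previous sketches; in the pairing-based scheme the same computation happens inside G_T via e(·,·) and is followed by a small discrete-logarithm step, which is omitted here.

```python
import hashlib

def hash_to_vec(data: bytes, prime: int):
    # Same stand-in hash as in the encryption sketch above.
    digest = lambda tag: int.from_bytes(hashlib.sha256(tag + data).digest(), "big")
    return (digest(b"0") % prime, digest(b"1") % prime)

def combine_keys(partial_keys, prime):
    # d = sum_i d_i ; the T_i * H2(...) masks cancel because sum_i T_i = 0.
    return (
        sum(d[0] for d in partial_keys) % prime,
        sum(d[1] for d in partial_keys) % prime,
    )

def decrypt_aggregate(ciphertexts, weights, d, label: bytes, prime: int):
    # alpha = sum_i y_i * c_i - <u_label, d>  =  sum_i y_i * x_i  (mod p)
    u = hash_to_vec(label, prime)
    weighted = sum(y * c for y, c in zip(weights, ciphertexts)) % prime
    return (weighted - u[0] * d[0] - u[1] * d[1]) % prime
```

Wired together with the earlier sketches, decrypt_aggregate returns ∑_i y_i·x_i mod p: the server obtains only the weighted aggregate and never an individual client's x_i, which is the property the claims rely on.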
7. The decentralized function encryption privacy protection method according to claim 1, wherein the step of the server performing decryption aggregation on the encryption model according to the partial decryption key, the encryption tag, the encryption weight and the model accuracy to obtain the global model further comprises:
testing the global model according to the public data set to obtain the accuracy of the global model;
and judging whether the accuracy of the global model reaches a preset accuracy; if so, stopping the iterative training; otherwise, updating the encryption weight and the weight vector parameter of each client according to the model accuracies of all clients, sending the global model and the updated encryption weights and weight vector parameters to each client, and continuing the iterative training.
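Claim 7 leaves the exact stopping test and weight-update rule to the implementation; the helpers below show one plausible, purely illustrative choice in which the new encryption weights are made proportional to the reported client accuracies.

```python
def should_stop(global_accuracy: float, target_accuracy: float) -> bool:
    # Stop the federated iteration once the global model reaches the preset accuracy.
    return global_accuracy >= target_accuracy

def update_encryption_weights(client_accuracies, scale: int = 1000):
    """Derive new integer encryption weights proportional to each client's reported
    model accuracy; this proportional rule is an assumption, not the claimed method."""
    total = sum(client_accuracies)
    if total <= 0:
        return [1] * len(client_accuracies)
    return [max(1, round(scale * acc / total)) for acc in client_accuracies]

# Example: three clients reporting 0.91, 0.85 and 0.40 accuracy on the public data set.
new_weights = update_encryption_weights([0.91, 0.85, 0.40])
```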
8. A federal learning-oriented decentralized function encryption privacy protection system, the system comprising:
the acquisition module is used for acquiring the initial model, the public data set, the encryption label, the encryption prime number, the encryption weight and the weight vector parameter sent by the server; the initial model is obtained by the server through training according to the public data set; the encryption parameters comprise the encryption label, the encryption prime number and the encryption weight;
the training module is used for training the initial model according to a local data set to obtain a local model, and testing the local model according to the public data set to obtain corresponding model accuracy;
the encryption module is used for generating an encryption private key and a part of decryption keys according to the encryption prime number, the encryption weight and the weight vector parameters, and performing function encryption on the local model according to the encryption private key and the encryption label to obtain an encryption model;
and the aggregation module is used for sending the encryption model, the partial decryption key and the model accuracy to the server so that the server decrypts and aggregates the encryption model according to the partial decryption key, the encryption label, the encryption weight and the model accuracy to obtain a global model.
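For orientation, the skeleton below groups the four client-side modules of claim 8 into one class; all function and dictionary-key names are placeholders invented for this sketch, not identifiers from the patent.

```python
class FLClient:
    """Illustrative grouping of the client-side acquisition, training, encryption
    and aggregation modules; the injected callables stand in for claims 1-7."""

    def __init__(self, train_fn, test_fn, keygen_fn, encrypt_fn, send_fn):
        self.train_fn = train_fn      # training module: local training
        self.test_fn = test_fn        # training module: accuracy on the public set
        self.keygen_fn = keygen_fn    # encryption module: private / partial keys
        self.encrypt_fn = encrypt_fn  # encryption module: functional encryption
        self.send_fn = send_fn        # aggregation module: upload to the server

    def run_round(self, server_msg, local_dataset):
        # Acquisition module: unpack what the server distributed for this round.
        model = server_msg["initial_model"]
        public_set = server_msg["public_dataset"]
        tag, prime = server_msg["encryption_tag"], server_msg["encryption_prime"]
        weight = server_msg["encryption_weight"]
        weight_param = server_msg["weight_vector_parameter"]

        # Training module.
        local_model = self.train_fn(model, local_dataset)
        accuracy = self.test_fn(local_model, public_set)

        # Encryption module.
        private_key, partial_key = self.keygen_fn(prime, weight, weight_param)
        encrypted_model = self.encrypt_fn(local_model, private_key, tag)

        # Aggregation module: hand over everything the server needs to aggregate.
        return self.send_fn(encrypted_model, partial_key, accuracy)
```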
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method as claimed in any one of claims 1 to 7 are implemented by the processor when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202111134122.6A 2021-09-26 2021-09-26 Federal learning-oriented decentralized function encryption privacy protection method and system Active CN113836556B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111134122.6A CN113836556B (en) 2021-09-26 2021-09-26 Federal learning-oriented decentralized function encryption privacy protection method and system

Publications (2)

Publication Number Publication Date
CN113836556A true CN113836556A (en) 2021-12-24
CN113836556B CN113836556B (en) 2022-11-04

Family

ID=78970549

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111134122.6A Active CN113836556B (en) 2021-09-26 2021-09-26 Federal learning-oriented decentralized function encryption privacy protection method and system

Country Status (1)

Country Link
CN (1) CN113836556B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110572253A (en) * 2019-09-16 2019-12-13 济南大学 Method and system for enhancing privacy of federated learning training data
CN111600707A (en) * 2020-05-15 2020-08-28 华南师范大学 Decentralized federal machine learning method under privacy protection
CN112199702A (en) * 2020-10-16 2021-01-08 鹏城实验室 Privacy protection method, storage medium and system based on federal learning
CN112862001A (en) * 2021-03-18 2021-05-28 中山大学 Decentralized data modeling method under privacy protection
CN113434873A (en) * 2021-06-01 2021-09-24 内蒙古大学 Federal learning privacy protection method based on homomorphic encryption

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114362940A (en) * 2021-12-29 2022-04-15 华东师范大学 Server-free asynchronous federated learning method for data privacy protection
CN114362940B (en) * 2021-12-29 2023-09-29 华东师范大学 Server-free asynchronous federation learning method for protecting data privacy
CN114726496A (en) * 2022-03-07 2022-07-08 电子科技大学 Safe feature selection method applied to longitudinal federal learning
CN114726496B (en) * 2022-03-07 2023-10-03 电子科技大学 Safe feature selection method applied to longitudinal federal learning
CN115130814A (en) * 2022-05-10 2022-09-30 中南大学 Privacy calculation method and system for longitudinal data fusion
CN115130814B (en) * 2022-05-10 2023-05-02 中南大学 Privacy computing method and system for longitudinal data fusion
CN115828287A (en) * 2023-01-10 2023-03-21 湖州丽天智能科技有限公司 Model encryption method, model decryption method, computer and integrated chip
CN115828287B (en) * 2023-01-10 2023-05-23 湖州丽天智能科技有限公司 Model encryption method, model decryption method, computer and integrated chip
CN116010944A (en) * 2023-03-24 2023-04-25 北京邮电大学 Federal computing network protection method and related equipment
CN116168789A (en) * 2023-04-26 2023-05-26 之江实验室 Multi-center medical data generation system and method

Also Published As

Publication number Publication date
CN113836556B (en) 2022-11-04

Similar Documents

Publication Publication Date Title
CN113836556B (en) Federal learning-oriented decentralized function encryption privacy protection method and system
CN110245510B (en) Method and apparatus for predicting information
CN108712260B (en) Multi-party deep learning computing agent method for protecting privacy in cloud environment
US20210143987A1 (en) Privacy-preserving federated learning
Wang et al. Privacy-preserving federated learning for internet of medical things under edge computing
Liu et al. Privacy-preserving aggregation in federated learning: A survey
WO2020177392A1 (en) Federated learning-based model parameter training method, apparatus and device, and medium
CN112966298B (en) Composite privacy protection method, system, computer equipment and storage medium
JP2020515087A5 (en)
CN112347500B (en) Machine learning method, device, system, equipment and storage medium of distributed system
CN113505882B (en) Data processing method based on federal neural network model, related equipment and medium
Li et al. A verifiable privacy-preserving machine learning prediction scheme for edge-enhanced HCPSs
CN115811402B (en) Medical data analysis method based on privacy protection federal learning and storage medium
CN114003950A (en) Federal machine learning method, device, equipment and medium based on safety calculation
CN113747426B (en) Data auditing method and system, electronic equipment and storage medium
CN111027981A (en) Method and device for multi-party joint training of risk assessment model for IoT (Internet of things) machine
Xu et al. Data tag replacement algorithm for data integrity verification in cloud storage
CN113849828B (en) Anonymous generation and attestation of processed data
CN108259180B (en) Method for quantum specifying verifier signature
CN116502732B (en) Federal learning method and system based on trusted execution environment
CN111740959A (en) Verifiable privacy protection method in mobile crowd sensing system
CN113792282B (en) Identity data verification method and device, computer equipment and storage medium
Liu et al. Efficient and Privacy-Preserving Logistic Regression Scheme based on Leveled Fully Homomorphic Encryption
Zhou et al. VDFChain: Secure and verifiable decentralized federated learning via committee-based blockchain
CN114547684A (en) Method and device for protecting multi-party joint training tree model of private data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant