CN112732297A - Method and device for updating federal learning model, electronic equipment and storage medium - Google Patents
- Publication number
- CN112732297A (application CN202011640520.0A)
- Authority
- CN
- China
- Prior art keywords
- learning model
- data
- gradient
- updating
- matrix
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/60—Software deployment
- G06F8/65—Updates
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
Landscapes
- Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- Medical Informatics (AREA)
- Computing Systems (AREA)
- Data Mining & Analysis (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Mathematical Physics (AREA)
- Artificial Intelligence (AREA)
- Computer Security & Cryptography (AREA)
- Computer And Data Communications (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
The invention relates to intelligent decision technology and discloses a method for updating a federated learning model, comprising the following steps: a server receives a data removal request sent by a target client participating in federated learning; after removing the data according to the request, the server sends a data removal message to the target client and the other clients participating in federated learning; the target client calculates gradient parameters for the removed data according to a preset gradient formula, and the other clients calculate Hessian matrices according to a preset matrix formula; the parameters of the federated learning model are then updated according to the gradient parameters, the Hessian matrices, and a preset model parameter formula. The invention also provides an updating device for the federated learning model, an electronic device, and a computer-readable storage medium. The method can solve the problem of low update efficiency of the federated learning model.
Description
Technical Field
The invention relates to the field of intelligent decision making, in particular to a method and a device for updating a federated learning model, electronic equipment and a computer-readable storage medium.
Background
Nowadays, people pay increasing attention to the security of data information. To guarantee information security during big-data exchange and to protect terminal data and personal data privacy, a method has emerged for carrying out efficient machine learning among multiple participants or computing nodes: federated learning.
In the prior art, if a participant wants to quit the federation after the federated model has been trained, or wants to remove the influence of part of its private data on the federated model, the model must be retrained on the remaining data to update its parameters. However, retraining is very costly in computation and communication, and coordinating every participant is time-consuming and labor-intensive. Therefore, in the prior art, updating the parameters of the federated learning model after data removal is inefficient.
Disclosure of Invention
The invention provides a method and device for updating a federated learning model, and a computer-readable storage medium, and mainly aims to solve the problem of low update efficiency of the federated learning model.
In order to achieve the above object, the present invention provides an updating method of a federal learning model applied to a server, including:
receiving a data removal request sent by a target client participating in federal learning;
after deleting the removed data from the data used for the federated learning model, sending a data removal message to all clients participating in federated learning, so that the target client calculates gradient parameters of the removed data with respect to the federated learning model according to a gradient formula, and each client calculates, according to a matrix formula, a Hessian matrix of the loss function of the federated learning model over the full data it contains after the removed data is deleted;
receiving the gradient parameters sent by the target client and the Hessian matrices sent by each client;
calculating update parameters according to the gradient parameters, the Hessian matrices and a preset model parameter formula;
and updating the federated learning model by using the update parameters.
Optionally, the preset model parameter formula includes:
w^(-m) = w + (Σ_{i=1}^{k} H_i^(-m))^(-1) · Δ
where w^(-m) is the updated parameter, w is the parameter vector of the federated learning model, H_i^(-m) is the Hessian matrix of the i-th client, Δ is the gradient parameter, m is the number of removed data, and k is the number of all clients participating in federated learning.
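Assuming the update takes the Newton-step form suggested by the symbol definitions above — w^(-m) = w + (Σᵢ H_i^(-m))^(-1) · Δ, a common influence-function-style unlearning update — a minimal numpy sketch (function name and toy values are illustrative, not from the patent):

```python
import numpy as np

def unlearning_update(w, hessians, delta):
    """Newton-style update: w_new = w + (sum of client Hessians)^-1 @ delta."""
    H = np.sum(hessians, axis=0)          # aggregate the k client Hessians
    return w + np.linalg.solve(H, delta)  # solve instead of explicit inversion

# toy example: 2 clients, 3-dimensional parameter vector
w = np.zeros(3)
hessians = [np.eye(3) * 2.0, np.eye(3) * 2.0]   # H_1 + H_2 = 4 I
delta = np.array([4.0, 8.0, 0.0])
w_new = unlearning_update(w, hessians, delta)
print(w_new)  # [1. 2. 0.]
```

Using `np.linalg.solve` rather than `np.linalg.inv` is the standard numerically stable choice for this kind of Newton step.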
Optionally, after the updating the federal learning model by using the update parameters, the method further includes:
and sending the updated federal learning model to all clients participating in federal learning.
Optionally, before calculating the update parameters according to the gradient parameters, the Hessian matrices and a preset model parameter formula, the method further includes:
judging whether the gradient parameters or the Hessian matrices are encrypted data;
and if the gradient parameters or the Hessian matrices are encrypted data, decrypting the encrypted data according to a preset decryption formula to obtain the decrypted gradient parameters and Hessian matrices.
Optionally, the decrypting of the encrypted data according to a preset decryption formula to obtain decrypted gradient parameters and Hessian matrices includes:
the preset decryption formula is as follows:
m = L(c^λ mod n²) · μ mod n
where m is the decrypted gradient parameter or Hessian matrix, c is the encrypted gradient parameter or Hessian matrix, mod is the modulo operator, n = p × q, where p and q are large primes such that the greatest common divisor of pq and (p−1)(q−1) is 1, λ is the Carmichael function λ(n), μ is a preset parameter, and L(u) = (u − 1)/n.
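The formula above is the decryption step of the standard Paillier cryptosystem. A minimal runnable sketch with toy primes (the tiny primes, the choice g = n + 1, and the fixed blinding factor r are illustrative, not from the patent):

```python
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

# toy Paillier key material (illustration only; real use requires large primes)
p, q = 5, 7
n = p * q
n2 = n * n
lam = lcm(p - 1, q - 1)                # Carmichael function lambda(n)
g = n + 1                              # a standard valid choice of g

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)    # mu = (L(g^lambda mod n^2))^-1 mod n

def encrypt(m, r=2):                   # r must satisfy gcd(r, n) == 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # the preset decryption formula: m = L(c^lambda mod n^2) * mu mod n
    return (L(pow(c, lam, n2)) * mu) % n

print(decrypt(encrypt(9)))  # 9
```

The three-argument `pow` performs modular exponentiation, and `pow(x, -1, n)` (Python 3.8+) computes the modular inverse needed for μ.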
Optionally, after receiving a data removal request sent by a target client participating in federal learning, the method further includes:
and opening the monitoring ports according to the number of the clients.
Optionally, the receiving of the gradient parameters sent by the target client and the Hessian matrices sent by each client includes:
accessing the listening ports;
and receiving, through the listening ports and using a preset request-response protocol, the gradient parameters sent by the target client and the Hessian matrices sent by each client.
In order to solve the above problem, the present invention further provides an updating apparatus for a federated learning model, applied to a server, where the apparatus includes:
the data request module is used for receiving a data removal request sent by a target client participating in federal learning;
a message sending module, configured to send a data removal message to all clients participating in federated learning after deleting the removed data from the data used for the federated learning model, so that the target client calculates gradient parameters of the removed data with respect to the federated learning model according to a gradient formula, and each client calculates, according to a matrix formula, a Hessian matrix of the loss function of the federated learning model over the full data it contains after the removed data is deleted;
the parameter receiving module is used for receiving the gradient parameters sent by the target client and the Hessian matrices sent by each client;
the update parameter calculation module is used for calculating update parameters according to the gradient parameters, the Hessian matrices and a preset model parameter formula;
and the model updating module is used for updating the federal learning model by using the updating parameters.
In order to solve the above problem, the present invention also provides an electronic device, including:
a memory storing at least one instruction; and
a processor that executes the instructions stored in the memory to implement the above method for updating the federated learning model.
In order to solve the above problem, the present invention further provides a computer-readable storage medium, where at least one instruction is stored, and the at least one instruction is executed by a processor in an electronic device to implement the above update method of the federal learning model.
The gradient parameters sent by the target client and the Hessian matrices sent by each client are received; update parameters are calculated according to the gradient parameters, the Hessian matrices, and a preset model parameter formula; and the federated learning model is updated by using the update parameters. After data used for federated learning is deleted, the federated learning model does not need to be retrained, so its usability and accuracy are maintained and its update efficiency is improved. Meanwhile, because the gradient parameters and the Hessian matrices are calculated in parallel by the clients, the federated learning model can be updated rapidly, the computational pressure on the server is reduced, and the overall operational efficiency is improved. Therefore, the method, device, electronic equipment and computer-readable storage medium for updating the federated learning model provided by the invention can solve the problem of low update efficiency of the federated learning model.
Drawings
Fig. 1 is a schematic flow chart of an updating method of a federal learning model according to a first method embodiment of the present invention;
FIG. 2 is a schematic flow chart of a method for updating a federated learning model according to a second method embodiment of the present invention;
fig. 3 is a block diagram of an updating apparatus of the federal learning model according to an embodiment of the present invention;
fig. 4 is a schematic internal structural diagram of an electronic device implementing an update method of a federal learning model according to an embodiment of the present invention;
the implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The embodiment of the application provides a method for updating a federated learning model. The execution subject of the method includes, but is not limited to, at least one electronic device, such as a server or a terminal, that can be configured to execute the method provided by the embodiments of this application. In other words, the method may be performed by software or hardware installed in a terminal device or a server device, and the software may be a blockchain platform. The server includes, but is not limited to: a single server, a server cluster, a cloud server, a cloud server cluster, and the like.
Referring to fig. 1, a flow chart of a method for updating a federal learning model according to a first method embodiment of the present invention is schematically shown. In this embodiment, the method for updating the federal learning model is applied to a server, and includes:
and S11, receiving a data removal request sent by the target client participating in the federal learning.
The server is a server participating in federal learning. One server may have multiple clients.
In the embodiment of the invention, the server opens listening ports according to the number of clients, so as to transmit data with each client.
For example, if the number of clients is K, the server opens K listening ports.
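A minimal sketch of opening one listening socket per client. The helper name is hypothetical, and since the patent specifies neither ports nor protocol, OS-assigned ports (port 0) are used here:

```python
import socket

def open_listening_ports(k, host="127.0.0.1"):
    """Open one listening TCP socket per client; port 0 lets the OS pick a free port."""
    sockets = []
    for _ in range(k):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.bind((host, 0))      # OS assigns an unused port
        s.listen(1)
        sockets.append(s)
    return sockets

# K = 3 clients -> 3 listening ports
socks = open_listening_ports(3)
ports = [s.getsockname()[1] for s in socks]
print(ports)  # three distinct OS-assigned port numbers
```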
S12: after deleting the removed data from the data used for the federated learning model, sending a data removal message to all clients participating in federated learning, so that the target client calculates gradient parameters of the removed data with respect to the federated learning model according to a gradient formula, and each client calculates, according to a matrix formula, a Hessian matrix of the loss function of the federated learning model over the full data it contains after the removed data is deleted.
In the embodiment of the invention, if a client holds data to be removed, that client calculates the gradient parameters; meanwhile, every client participating in federated learning calculates a Hessian matrix, so that a plurality of Hessian matrices are obtained. That is, the other clients, together with the target client, perform the calculation to obtain the Hessian matrices.
Specifically, please refer to the description of the second embodiment of the method of the present invention for the description of the client computing.
S13: receiving the gradient parameters sent by the target client and the Hessian matrices sent by each client.
In this embodiment of the present invention, the receiving of the gradient parameters sent by the target client and the Hessian matrices sent by each client includes:
accessing the listening ports;
and receiving, through the listening ports and using a preset request-response protocol, the gradient parameters sent by the target client and the Hessian matrices sent by each client.
S14: calculating update parameters according to the gradient parameters, the Hessian matrices, and a preset model parameter formula.
Preferably, the preset model parameter formula includes:
w^(-m) = w + (Σ_{i=1}^{k} H_i^(-m))^(-1) · Δ
where w^(-m) is the updated parameter, w is the parameter vector of the federated learning model, H_i^(-m) is the Hessian matrix of the i-th client, Δ is the gradient parameter, m is the number of removed data, and k is the number of all clients participating in federated learning.
In an embodiment of the present invention, before calculating the update parameters according to the gradient parameters, the Hessian matrices, and a preset model parameter formula, the method further includes:
judging whether the gradient parameters or the Hessian matrices are encrypted data;
and if the gradient parameters or the Hessian matrices are encrypted data, decrypting the encrypted data according to a preset decryption formula to obtain the decrypted gradient parameters and Hessian matrices.
Specifically, the decrypting of the encrypted data according to a preset decryption formula to obtain decrypted gradient parameters and Hessian matrices includes:
the preset decryption formula is as follows:
m = L(c^λ mod n²) · μ mod n
where m is the decrypted gradient parameter or Hessian matrix, c is the encrypted gradient parameter or Hessian matrix, mod is the modulo operator, n = p × q, where p and q are large primes such that the greatest common divisor of pq and (p−1)(q−1) is 1, λ is the Carmichael function λ(n), μ is a preset parameter, and L(u) = (u − 1)/n.
In detail, the encrypted gradient parameters or Hessian matrices are decrypted by using the private key (λ, μ), so that the decrypted gradient parameters or Hessian matrices are obtained.
In detail, after updating the federated learning model with the update parameters, the method further includes:
and sending the updated federal learning model to all clients participating in federal learning.
The gradient parameters sent by the target client and the Hessian matrices sent by each client are received; update parameters are calculated according to the gradient parameters, the Hessian matrices, and a preset model parameter formula; and the federated learning model is updated by using the update parameters. After data used for federated learning is deleted, the federated learning model does not need to be retrained, so its usability and accuracy are maintained and its update efficiency is improved. Meanwhile, because the gradient parameters and the Hessian matrices are calculated in parallel by the clients, the federated learning model can be updated rapidly, the computational pressure on the server is reduced, and the overall operational efficiency is improved. Therefore, the method for updating the federated learning model provided by the invention can solve the problem of low update efficiency of the federated learning model.
Fig. 2 is a flow chart illustrating a method for updating the federal learning model according to a second embodiment of the present invention. In this embodiment, the method for updating the federal learning model is applied to a client, and includes:
S21: when data to be removed from the federated learning model exists in the client, obtain the federated learning model.
In an embodiment of the present invention, the federated learning model may be a federated linear model or a federated logistic regression model.
S22: calculating gradient parameters of the removed data with respect to the federated learning model according to a gradient formula, and calculating, according to a matrix formula, a Hessian matrix of the loss function of the federated learning model over the full data contained in the client after the removed data is deleted.
Further, the gradient formula includes a first gradient sub-formula and a second gradient sub-formula, the matrix formula includes a first matrix sub-formula and a second matrix sub-formula, and after the federated learning model is obtained, the method further includes:
judging the type of the federated learning model;
if the federated learning model is a federated linear model, calculating gradient parameters of the removed data with respect to the federated learning model according to the first gradient sub-formula, and calculating, according to the first matrix sub-formula, the Hessian matrix of the loss function over the client's full data after the removed data is deleted;
if the federated learning model is a federated logistic regression model, calculating gradient parameters of the removed data with respect to the federated learning model according to the second gradient sub-formula, and calculating, according to the second matrix sub-formula, the Hessian matrix of the loss function over the client's full data after the removed data is deleted.
Specifically, the first gradient sub-formula includes:
Δ = λ·m·w + m·b + Σ_{j=1}^{m} (wᵀx_j − y_j)·x_j
where Δ is the gradient parameter, m is the number of removed data, λ is the regularization factor, w is the parameter vector of the federated learning model, b is the preset loss perturbation factor, x_j and y_j are the input data of the model, and ᵀ denotes transposition.
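Assuming the removed-data gradient for the regularized linear model takes the standard form Δ = λ·m·w + m·b + Σⱼ (wᵀxⱼ − yⱼ)·xⱼ (the displayed formula is not legible in this text, so this form is an assumption), the computation can be sketched in numpy (names and toy values are illustrative):

```python
import numpy as np

def linear_removed_gradient(w, b, lam, X_rm, y_rm):
    """Gradient contribution of the m removed samples under an assumed
    regularized linear loss with perturbation term b."""
    m = len(y_rm)
    residual = X_rm @ w - y_rm          # w^T x_j - y_j for each removed sample
    return lam * m * w + m * b + X_rm.T @ residual

w = np.array([1.0, 0.0])
b = np.zeros(2)                          # loss perturbation factor, here zero
X_rm = np.array([[1.0, 0.0], [0.0, 1.0]])  # two removed samples
y_rm = np.array([0.0, 0.0])
delta = linear_removed_gradient(w, b, lam=0.0, X_rm=X_rm, y_rm=y_rm)
print(delta)  # [1. 0.]
```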
Specifically, the second gradient sub-formula includes:
Δ = Σ_{j=1}^{m} (σ(wᵀx_j) − y_j)·x_j
where Δ is the gradient parameter, m is the number of removed data, w is the parameter vector of the federated learning model, σ(z) = 1/(1 + e^(−z)) is the sigmoid function, x_j and y_j are the input data of the model, and ᵀ denotes transposition.
Specifically, the first matrix sub-formula includes:
H_i^(-m) = ∇²_w L(w; D_i \ D_i^(m)) = λ·I + Σ_{x_j ∈ D_i \ D_i^(m)} x_j·x_jᵀ
where H_i^(-m) is the Hessian matrix, L is the loss function, D_i is the full data contained by client i, D_i^(m) is the removed data of client i, x_j and y_j are the input data of the model, m is the number of removed data, and ᵀ denotes transposition.
In this embodiment, when the federated learning model is a federated linear model, the loss function of the federated learning model includes:
L(w) = Σ_{i=1}^{n} Σ_{(x,y)∈D_i} (wᵀx − y)² + (λ/2)·‖w‖² + bᵀw
where L is the loss function, λ is the regularization factor, w is the parameter vector of the federated learning model, b is the preset loss perturbation factor, (x, y) are the input data of the federated learning model, and n is the number of clients participating in federated learning.
Specifically, the second matrix sub-formula includes:
H_i^(-m) = Σ_{x_j ∈ D_i \ D_i^(m)} σ(wᵀx_j)·(1 − σ(wᵀx_j))·x_j·x_jᵀ
where H_i^(-m) is the Hessian matrix, L is the loss function, σ is the sigmoid function, and ᵀ denotes transposition.
In this embodiment, when the federated learning model is a federated logistic regression model, the loss function of the federated learning model includes:
L(w) = Σ_{(x,y)} log(1 + exp(−y·wᵀx))
where L is the loss function, x and y are the input data of the model, and ᵀ denotes transposition.
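For the logistic case, the per-client Hessian has the standard form H = Σⱼ sⱼ(1 − sⱼ)·xⱼxⱼᵀ with sⱼ = σ(wᵀxⱼ); the formula rendering in this text is damaged, so that form is an assumption. A short numpy sketch (names and toy data are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_hessian(w, X):
    """H = sum_j s_j (1 - s_j) x_j x_j^T with s_j = sigmoid(w^T x_j)."""
    s = sigmoid(X @ w)
    weights = s * (1.0 - s)              # per-sample curvature weights
    return (X * weights[:, None]).T @ X  # weighted sum of outer products

X = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
H = logistic_hessian(np.zeros(2), X)     # w = 0 gives s = 0.5 everywhere
print(H)  # 0.25 * X^T X
```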
S23: sending the gradient parameters and the Hessian matrix to the server, so that the server updates the parameters of the federated learning model according to the gradient parameters and the Hessian matrix.
In the embodiment of the invention, when the client and the server are in a connected state, the gradient parameters and the Hessian matrix are sent to the server, so that the server adjusts the parameters of the federated learning model according to the gradient parameters and the Hessian matrix.
Specifically, in this embodiment of the present invention, before sending the gradient parameters and the Hessian matrix to the server, the method further includes:
performing encryption calculation on the gradient parameters and the Hessian matrix.
Further, the performing of encryption calculation on the gradient parameters and the Hessian matrix includes:
randomly selecting two large primes p and q satisfying a preset condition, namely that the greatest common divisor of pq and (p−1)(q−1) is 1;
calculating n = p × q and λ(n) = lcm(p − 1, q − 1), where lcm is the least common multiple and λ is the Carmichael function;
randomly selecting an integer g less than n², and calculating μ = (L(g^λ mod n²))^(−1) mod n, where L(u) = (u − 1)/n;
obtaining the public key (n, g) and the private key (λ, μ) from n, g, λ and μ;
and encrypting the gradient parameters and the Hessian matrices by using the public key (n, g) to obtain the encrypted gradient parameters and Hessian matrices.
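The key generation above matches the standard Paillier scheme. The sketch below (toy primes and helper names are illustrative) also demonstrates Paillier's additive homomorphism — the product of two ciphertexts decrypts to the sum of the plaintexts — a property often exploited for aggregation in federated learning, though not claimed in this text:

```python
import random
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

p, q = 11, 13                      # toy primes; real deployments need large primes
n, n2 = p * q, (p * q) ** 2
lam = lcm(p - 1, q - 1)            # Carmichael function lambda(n)
g = n + 1

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m):
    r = random.randrange(2, n)     # random blinding factor coprime with n
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# additive homomorphism: E(5) * E(7) decrypts to 12
c_sum = (encrypt(5) * encrypt(7)) % n2
print(decrypt(c_sum))  # 12
```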
A prime number is a natural number greater than 1 that has no factors other than 1 and itself; the large primes are primes chosen large enough that factoring n is infeasible.
Further, in the embodiment of the present invention, the client transmits the private key (λ, μ) to the server, and encrypts the gradient parameters and the Hessian matrices by using the public key (n, g) to obtain the encrypted gradient parameters and Hessian matrices.
The embodiment of the invention encrypts the gradient parameters and the Hessian matrices before transmission, thereby improving the security of data transmission.
According to the embodiment of the invention, the gradient parameters and the Hessian matrices are calculated in parallel at the clients, which reduces the computational pressure on the server and speeds up obtaining the gradient parameters and Hessian matrices; the calculated gradient parameters and Hessian matrices are then transmitted to the server, so that the server can rapidly update the model. Therefore, the method for updating the federated learning model provided by the invention can solve the problem of low update efficiency of the federated learning model.
Fig. 3 is a functional block diagram of an updating apparatus of the federal learning model according to an embodiment of the present invention.
The update apparatus of the federal learning model according to the present invention may be divided into an update apparatus 100 of a first federal learning model and an update apparatus 200 of a second federal learning model. Wherein, the updating apparatus 100 of the first federated learning model may be installed in a server side and the updating apparatus 200 of the second federated learning model may be installed in a client side.
According to the implemented functions, the updating apparatus 100 of the first federated learning model may include a data request module 101, a message sending module 102, a parameter receiving module 103, an update parameter module 104, and a model updating module 105; and the updating apparatus 200 of the second federated learning model may include a model obtaining module 201, a gradient and matrix calculating module 202, and a parameter sending module 203.
The module of the present invention, which may also be referred to as a unit, refers to a series of computer program segments that can be executed by a processor of an electronic device and that can perform a fixed function, and that are stored in a memory of the electronic device.
In this embodiment, the functions of the modules in the updating apparatus 100 of the first federated learning model and the updating apparatus 200 of the second federated learning model are as follows:
the data request module 101 is configured to receive a data removal request sent by a target client participating in federal learning;
the message sending module 102 is configured to send a data removal message to all clients participating in federated learning after deleting the removed data from the data used for the federated learning model, so that the target client calculates gradient parameters of the removed data with respect to the federated learning model according to a gradient formula, and each client calculates, according to a matrix formula, a Hessian matrix of the loss function of the federated learning model over the full data it contains after the removed data is deleted;
the parameter receiving module 103 is configured to receive the gradient parameters sent by the target client and the Hessian matrices sent by each client;
the update parameter calculation module 104 is configured to calculate update parameters according to the gradient parameters, the Hessian matrices, and a preset model parameter formula;
the model updating module 105 is configured to update the federal learning model by using the update parameters.
The model obtaining module 201 is configured to obtain a federated learning model when the client has removal data for removing the federated learning model;
the gradient and matrix calculation module 202 is configured to calculate gradient parameters of the removed data with respect to the federated learning model according to a gradient formula, and to calculate, according to a matrix formula, a Hessian matrix of the loss function of the federated learning model over the full data contained in the client after the removed data is deleted;
the parameter sending module 203 is configured to send the gradient parameters and the Hessian matrix to the server, so that the server updates the parameters of the federated learning model according to the gradient parameters and the Hessian matrix.
In detail, the specific implementation of each module of the updating apparatus 100 of the first federal learning model is as follows:
the data request module 101 is configured to receive a data removal request sent by a target client participating in federal learning.
The server is a server participating in federal learning. One server may have multiple clients.
In the embodiment of the invention, the server opens listening ports according to the number of clients, so as to transmit data with each client.
For example, if the number of clients is K, the server opens K listening ports.
The message sending module 102 is configured to send a data removal message to all clients participating in federated learning after deleting the removed data from the data used for the federated learning model, so that the target client calculates gradient parameters of the removed data with respect to the federated learning model according to a gradient formula, and each client calculates, according to a matrix formula, a Hessian matrix of the loss function of the federated learning model over the full data it contains after the removed data is deleted.
In the embodiment of the invention, if a client holds data to be removed, that client calculates the gradient parameters; meanwhile, every client participating in federated learning calculates a Hessian matrix, so that a plurality of Hessian matrices are obtained. That is, the other clients, together with the target client, perform the calculation to obtain the Hessian matrices.
Specifically, please refer to the description of the second embodiment of the method of the present invention for the description of the client computing.
The parameter receiving module 103 is configured to receive the gradient parameters sent by the target client and the Hessian matrices sent by each client.
In this embodiment of the present invention, the receiving of the gradient parameters sent by the target client and the Hessian matrices sent by each client includes:
accessing the listening ports;
and receiving, through the listening ports and using a preset request-response protocol, the gradient parameters sent by the target client and the Hessian matrices sent by each client.
The update parameter calculation module 104 is configured to calculate update parameters according to the gradient parameters, the Hessian matrices, and a preset model parameter formula.
Preferably, the preset model parameter formula includes:
w^(-m) = w + (Σ_{i=1}^{k} H_i^(-m))^(-1) · Δ
where w^(-m) is the updated parameter, w is the parameter vector of the federated learning model, H_i^(-m) is the Hessian matrix of the i-th client, Δ is the gradient parameter, m is the number of removed data, and k is the number of all clients participating in federated learning.
In the embodiment of the present invention, before calculating the update parameters according to the gradient parameters, the Hessian matrices, and the preset model parameter formula, the apparatus further includes a decryption module, where the decryption module is configured to judge whether the gradient parameters or the Hessian matrices are encrypted data; and if so, to decrypt the encrypted data according to a preset decryption formula to obtain the decrypted gradient parameters and Hessian matrices.
Specifically, the decrypting of the encrypted data according to a preset decryption formula to obtain decrypted gradient parameters and Hessian matrices includes:
the preset decryption formula is as follows:
m = L(c^λ mod n²) · μ mod n
where m is the decrypted gradient parameter or Hessian matrix, c is the encrypted gradient parameter or Hessian matrix, mod is the modulo operator, n = p × q, where p and q are large primes such that the greatest common divisor of pq and (p−1)(q−1) is 1, λ is the Carmichael function λ(n), μ is a preset parameter, and L(u) = (u − 1)/n.
In detail, the encrypted gradient parameters or Hessian matrices are decrypted by using the private key (λ, μ), so that the decrypted gradient parameters or Hessian matrices are obtained.
In detail, after the federated learning model is updated with the update parameters, the apparatus further includes a model update module 105, which is configured to send the updated federated learning model to all clients participating in the federated learning.
The gradient parameters sent by the target client and the Hessian matrices sent by each client are received; an update parameter is calculated according to the gradient parameters, the Hessian matrices, and a preset model parameter formula; and the federated learning model is updated with the update parameter. After data used for the federated learning is deleted, the federated learning model does not need to be retrained, so the usability and accuracy of the model are maintained and its update efficiency is improved. Meanwhile, because the gradient parameters and the Hessian matrices are calculated in parallel on the client side, the federated learning model can be updated rapidly, the computational load on the server side is reduced, and the overall efficiency is improved. Therefore, the method for updating the federated learning model provided by the invention can solve the problem of low update efficiency of the federated learning model.
In detail, the modules of the second updating apparatus 200 of the federated learning model are implemented as follows:
the model obtaining module 201 is configured to obtain the federated learning model when the client has removal data for removing the federated learning model.
In an embodiment of the present invention, the federated learning model may be a federated linear model or a federated logistic regression model.
The gradient and matrix calculation module 202 is configured to calculate, according to a gradient formula, the gradient parameters of the removal data with respect to the federated learning model, and to calculate, according to a matrix formula, the Hessian matrix of the loss function of the federated learning model over the full data contained in the client after the removal data has been removed.
Further, the gradient formula includes a first gradient sub-formula and a second gradient sub-formula, and the matrix formula includes a first matrix sub-formula and a second matrix sub-formula. After the federated learning model is obtained, the apparatus further includes the gradient and matrix calculation module 202, which is configured to determine the type of the federated learning model; if the federated learning model is a federated linear model, to calculate the gradient parameters of the removal data with respect to the federated learning model according to the first gradient sub-formula, and to calculate, according to the first matrix sub-formula, the Hessian matrix of the loss function of the federated learning model over the full data contained in the client after the removal data has been removed; and if the federated learning model is a federated logistic regression model, to calculate the gradient parameters according to the second gradient sub-formula and the Hessian matrix according to the second matrix sub-formula in the same manner.
Specifically, the first gradient sub-formula includes:
wherein Δ is the gradient parameter, m is the number of removed data records, λ is the regularization factor, w is the parameter vector of the federated learning model, b is a preset loss perturbation factor, x_j and y_j are input data of the model, and T is a preset parameter.
Specifically, the second gradient sub-formula includes:
wherein Δ is the gradient parameter, m is the number of removed data records, w is the parameter vector of the federated learning model, x_j and y_j are input data of the model, and T is a preset parameter.
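Since the sub-formula image itself is not reproduced, the following sketch assumes a standard logistic loss Σ_j log(1 + exp(−y_j · w·x_j)) with labels in {−1, +1}; under that assumption the gradient parameter Δ is the gradient of this loss restricted to the removed records:

```python
import numpy as np

def removed_data_gradient_logistic(w, X_removed, y_removed):
    """Gradient contribution of the removed records under the logistic loss
    l(w) = sum_j log(1 + exp(-y_j * w @ x_j)), labels y_j in {-1, +1}.
    """
    margins = y_removed * (X_removed @ w)
    # d/dw log(1 + exp(-m_j)) = -y_j * x_j / (1 + exp(m_j))
    coeffs = -y_removed / (1.0 + np.exp(margins))
    return X_removed.T @ coeffs
```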
Specifically, the first matrix sub-formula includes:
wherein H_i is the Hessian matrix, ℓ is the loss function, D_i is the full data contained in client i, R_i is the removal data of client i, x_j and y_j are input data of the model, and m and T are preset parameters.
In this embodiment, when the federated learning model is a federated linear model, the loss function of the federated learning model includes:
wherein ℓ is the loss function, λ is the regularization factor, w is the parameter vector of the federated learning model, b is a preset loss perturbation factor, (x, y) is the input data of the federated learning model, and n is the number of clients participating in the federated learning.
Specifically, the second matrix sub-formula includes:
wherein H is the Hessian matrix, ℓ is the loss function, and the superscript T denotes the transpose.
In this embodiment, when the federated learning model is a federated logistic regression model, the loss function of the federated learning model includes:
wherein ℓ is the loss function, x and y are input data of the model, and T is a preset parameter.
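The matrix sub-formula images are likewise not reproduced; under the common assumptions of a ridge-regularized squared loss for the federated linear model and the logistic loss above for the federated logistic regression model, the Hessians take the well-known closed forms sketched below (function names are illustrative):

```python
import numpy as np

def hessian_linear(X, lam):
    """Hessian of a ridge-regularized squared loss
    0.5 * ||Xw - y||^2 + 0.5 * lam * ||w||^2, which is H = X^T X + lam * I
    (constant in w, so removing records just removes their rows of X)."""
    d = X.shape[1]
    return X.T @ X + lam * np.eye(d)

def hessian_logistic(X, w):
    """Hessian of the logistic loss sum_j log(1 + exp(-y_j * w @ x_j)):
    H = X^T S X with S = diag(p_j * (1 - p_j)), p_j = sigmoid(w @ x_j).
    Labels in {-1, +1} cancel in the second derivative."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    S = p * (1.0 - p)
    return (X.T * S) @ X   # X^T diag(S) X via column-wise scaling
```

Both matrices are symmetric positive semi-definite, which is what makes the server-side Newton-style update well behaved.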
The parameter sending module 203 is configured to send the gradient parameters and the Hessian matrix to the server side, so that the server side updates the parameters of the federated learning model according to the gradient parameters and the Hessian matrix.
In the embodiment of the invention, when the client and the server side are in a connected state, the gradient parameters and the Hessian matrix are sent to the server side, so that the server side adjusts the parameters of the federated learning model according to the gradient parameters and the Hessian matrix.
Specifically, in the embodiment of the present invention, before the gradient parameters and the Hessian matrix are sent to the server side, the apparatus further includes an encryption module 204, where the encryption module 204 is configured to:
perform encryption calculation on the gradient parameters and the Hessian matrix.
Further, performing the encryption calculation on the gradient parameters and the Hessian matrix includes:
randomly selecting two large primes p and q satisfying the preset condition that the greatest common divisor of pq and (p−1)(q−1) is 1;
calculating n = p × q, with λ(n) = lcm(p−1, q−1), where lcm is the least common multiple and λ is the Carmichael function;
randomly selecting a positive integer g less than n², and calculating μ = (L(g^λ mod n²))^(−1) mod n;
obtaining the public key (n, g) and the private key (λ, μ) from n, g, λ, and μ;
and encrypting the gradient parameters and the Hessian matrix with the public key (n, g) to obtain the encrypted gradient parameters and the encrypted Hessian matrix.
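The five steps above match standard Paillier key generation and encryption; a toy sketch, assuming the common choice g = n + 1 and illustrative function names (real deployments require primes of at least 1024 bits):

```python
import math
import secrets

def paillier_keygen(p, q):
    """Derive a Paillier key pair from primes p and q (toy sizes only)."""
    n = p * q
    assert math.gcd(n, (p - 1) * (q - 1)) == 1   # condition from step 1
    lam = math.lcm(p - 1, q - 1)                 # Carmichael function lambda(n)
    g = n + 1                                    # a standard valid choice of g < n^2
    L = lambda u: (u - 1) // n
    mu = pow(L(pow(g, lam, n * n)), -1, n)       # mu = (L(g^lam mod n^2))^-1 mod n
    return (n, g), (lam, mu)                     # public key, private key

def paillier_encrypt(m, public_key):
    """Encrypt one integer (a gradient or Hessian entry) with the public key."""
    n, g = public_key
    r = secrets.randbelow(n - 2) + 2             # random r with 1 < r < n
    while math.gcd(r, n) != 1:
        r = secrets.randbelow(n - 2) + 2
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)
```

A useful side effect of this scheme is additive homomorphism: the product of two ciphertexts decrypts to the sum of the plaintexts, which is why Paillier is popular for aggregating gradients in federated learning.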
A prime number is a natural number greater than 1 that has no factors other than 1 and itself; the "large" primes here are primes of a bit length sufficient for cryptographic security.
Further, in the embodiment of the present invention, the client transmits the private key (λ, μ) to the server side and encrypts the gradient parameters and the Hessian matrix with the public key (n, g) to obtain the encrypted gradient parameters and the encrypted Hessian matrix.
The embodiment of the invention encrypts the gradient parameters and the Hessian matrix before transmission, thereby improving the security of data transmission.
According to the embodiment of the invention, the gradient parameters and the Hessian matrices are calculated in parallel at the clients, which reduces the computational load on the server side and speeds up their computation; the calculated gradient parameters and Hessian matrices are then transmitted to the server side so that the server side can rapidly update the model. Therefore, the method for updating the federated learning model provided by the invention can solve the problem of low update efficiency of the federated learning model.
Fig. 4 is a schematic structural diagram of an electronic device implementing the method for updating a federated learning model according to an embodiment of the present invention.
The electronic device 1 may comprise a processor 10, a memory 11 and a bus, and may further comprise a computer program stored in the memory 11 and executable on the processor 10, such as the update program 12 of the federated learning model.
The memory 11 includes at least one type of readable storage medium, which includes flash memory, removable hard disk, multimedia card, card-type memory (e.g., SD or DX memory, etc.), magnetic memory, magnetic disk, optical disk, etc. The memory 11 may in some embodiments be an internal storage unit of the electronic device 1, such as a hard disk of the electronic device 1. The memory 11 may also be an external storage device of the electronic device 1 in other embodiments, such as a plug-in mobile hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the electronic device 1. Further, the memory 11 may also include both an internal storage unit and an external storage device of the electronic device 1. The memory 11 may be used not only to store application software installed in the electronic device 1 and various types of data, such as the code of the update program 12 of the federated learning model, but also to temporarily store data that has been output or is to be output.
The processor 10 may be composed of an integrated circuit in some embodiments, for example, a single packaged integrated circuit, or may be composed of a plurality of integrated circuits packaged with the same or different functions, including one or more Central Processing Units (CPUs), microprocessors, digital Processing chips, graphics processors, and combinations of various control chips. The processor 10 is a Control Unit (Control Unit) of the electronic device, connects various components of the whole electronic device by using various interfaces and lines, and executes various functions and processes data of the electronic device 1 by running or executing programs or modules (e.g., update programs of the federal learning model, etc.) stored in the memory 11 and calling data stored in the memory 11.
The bus may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. The bus is arranged to enable connection communication between the memory 11 and at least one processor 10 or the like.
Fig. 4 shows only an electronic device with certain components. Those skilled in the art will understand that the structure shown in Fig. 4 does not limit the electronic device 1, which may include fewer or more components than shown, combine certain components, or arrange the components differently.
For example, although not shown, the electronic device 1 may further include a power supply (such as a battery) for supplying power to each component, and preferably, the power supply may be logically connected to the at least one processor 10 through a power management device, so as to implement functions of charge management, discharge management, power consumption management, and the like through the power management device. The power supply may also include any component of one or more dc or ac power sources, recharging devices, power failure detection circuitry, power converters or inverters, power status indicators, and the like. The electronic device 1 may further include various sensors, a bluetooth module, a Wi-Fi module, and the like, which are not described herein again.
Further, the electronic device 1 may further include a network interface, and optionally, the network interface may include a wired interface and/or a wireless interface (such as a WI-FI interface, a bluetooth interface, etc.), which are generally used for establishing a communication connection between the electronic device 1 and other electronic devices.
Optionally, the electronic device 1 may further comprise a user interface, which may be a Display (Display), an input unit (such as a Keyboard), and optionally a standard wired interface, a wireless interface. Alternatively, in some embodiments, the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch device, or the like. The display, which may also be referred to as a display screen or display unit, is suitable for displaying information processed in the electronic device 1 and for displaying a visualized user interface, among other things.
It is to be understood that the described embodiments are for purposes of illustration only and that the scope of the appended claims is not limited to such structures.
The update program 12 of the federated learning model stored in the memory 11 of the electronic device 1 is a combination of a plurality of instructions that, when executed by the processor 10, can implement the following:
in detail, when the electronic device is a server-side electronic device, the method for updating the federal learning model includes:
receiving a data removal request sent by a target client participating in federal learning;
after deleting the removed data from the data for the federated learning model, sending a data removal message to all the clients participating in the federated learning so that the target client calculates gradient parameters of the removed data about the federated learning model according to a gradient formula and each client calculates a blackplug matrix of a loss function of the federated learning model after the removed data of the total data contained in each client is removed according to a matrix formula;
receiving gradient parameters sent by the target client and the black plug matrix sent by each client;
calculating an updating parameter according to the gradient parameter, the black plug matrix and a preset model parameter formula;
and updating the federal learning model by using the updating parameters.
Further, when the electronic device is a client-side electronic device, the method for updating the federated learning model includes:
obtaining the federated learning model when the client holds removal data to be removed from the federated learning model;
calculating, according to a gradient formula, the gradient parameters of the removal data with respect to the federated learning model, and calculating, according to a matrix formula, the Hessian matrix of the loss function of the federated learning model over the full data contained in the client after the removal data has been removed;
and sending the gradient parameters and the Hessian matrix to the server side, so that the server side updates the parameters of the federated learning model according to the gradient parameters and the Hessian matrix.
Specifically, the specific implementation method of the processor 10 for the instruction may refer to the description of the relevant steps in the embodiments corresponding to fig. 1 to fig. 4, which is not repeated herein.
Further, the integrated modules/units of the electronic device 1, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. The computer readable storage medium may be volatile or non-volatile. For example, the computer-readable medium may include: any entity or device capable of carrying said computer program code, recording medium, U-disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM).
The present invention also provides a computer-readable storage medium, storing a computer program which, when executed by a processor of an electronic device, may implement:
in detail, when the electronic device is a server-side electronic device, the method for updating the federal learning model includes:
receiving a data removal request sent by a target client participating in federal learning;
after deleting the removed data from the data for the federated learning model, sending a data removal message to all the clients participating in the federated learning so that the target client calculates gradient parameters of the removed data about the federated learning model according to a gradient formula and each client calculates a blackplug matrix of a loss function of the federated learning model after the removed data of the total data contained in each client is removed according to a matrix formula;
receiving gradient parameters sent by the target client and the black plug matrix sent by each client;
calculating an updating parameter according to the gradient parameter, the black plug matrix and a preset model parameter formula;
and updating the federal learning model by using the updating parameters.
Further, when the electronic device is a client-side electronic device, the method for updating the federated learning model includes:
obtaining the federated learning model when the client holds removal data to be removed from the federated learning model;
calculating, according to a gradient formula, the gradient parameters of the removal data with respect to the federated learning model, and calculating, according to a matrix formula, the Hessian matrix of the loss function of the federated learning model over the full data contained in the client after the removal data has been removed;
and sending the gradient parameters and the Hessian matrix to the server side, so that the server side updates the parameters of the federated learning model according to the gradient parameters and the Hessian matrix.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus, device and method can be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional module.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof.
The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference signs in the claims shall not be construed as limiting the claim concerned.
The block chain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, a consensus mechanism, an encryption algorithm and the like. A block chain (Blockchain), which is essentially a decentralized database, is a series of data blocks associated by using a cryptographic method, and each data block contains information of a batch of network transactions, so as to verify the validity (anti-counterfeiting) of the information and generate a next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or devices recited in the system claims may also be implemented by one unit or device in software or hardware. Terms such as first and second are used to denote names, not any particular order.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention.
Claims (10)
1. A method for updating a federated learning model, applied to a server side, the method comprising:
receiving a data removal request sent by a target client participating in the federated learning;
after deleting the removal data from the data used for the federated learning model, sending a data removal message to all clients participating in the federated learning, so that the target client calculates, according to a gradient formula, the gradient parameters of the removal data with respect to the federated learning model, and each client calculates, according to a matrix formula, the Hessian matrix of the loss function of the federated learning model over the full data contained in that client after the removal data has been removed;
receiving the gradient parameters sent by the target client and the Hessian matrices sent by each client;
calculating an update parameter according to the gradient parameters, the Hessian matrices, and a preset model parameter formula;
and updating the federated learning model with the update parameter.
2. The method for updating a federated learning model as claimed in claim 1, wherein the preset model parameter formula comprises:
3. The method for updating a federated learning model as claimed in claim 1, wherein after the updating of the federated learning model with the update parameter, the method further comprises:
and sending the updated federal learning model to all clients participating in federal learning.
4. The method for updating a federated learning model as claimed in claim 1, wherein before the calculating of the update parameter according to the gradient parameters, the Hessian matrix, and the preset model parameter formula, the method further comprises:
judging whether the gradient parameters or the Hessian matrix are encrypted data;
and if the gradient parameters or the Hessian matrix are encrypted data, decrypting the encrypted data according to a preset decryption formula to obtain the decrypted gradient parameters and Hessian matrix.
5. The method for updating a federated learning model as claimed in claim 4, wherein the decrypting of the encrypted data according to the preset decryption formula to obtain the decrypted gradient parameters and Hessian matrix comprises:
the preset decryption formula being:
m = L(c^λ mod n²) · μ mod n
where m is the decrypted gradient parameter or Hessian matrix entry, c is the corresponding encrypted value, mod is the modulo operator, and n = p × q, where p and q are large primes such that the greatest common divisor of pq and (p−1)(q−1) is 1, λ is the Carmichael function of n, and μ is a preset parameter.
6. The method for updating a federated learning model as claimed in claim 1, wherein after the receiving of the data removal request sent by the target client participating in the federated learning, the method further comprises:
opening listening ports according to the number of clients.
7. The method for updating a federated learning model as claimed in claim 1, wherein the receiving of the gradient parameters sent by the target client and the Hessian matrices sent by each client comprises:
acquiring the listening port;
and receiving, through the listening port and using a preset request-response protocol, the gradient parameters sent by the target client and the Hessian matrices sent by each client.
8. An updating apparatus of a federated learning model, wherein the apparatus is applied to a server side, and the apparatus comprises:
a data request module, configured to receive a data removal request sent by a target client participating in the federated learning;
a message sending module, configured to send a data removal message to all clients participating in the federated learning after deleting the removal data from the data used for the federated learning model, so that the target client calculates, according to a gradient formula, the gradient parameters of the removal data with respect to the federated learning model, and each client calculates, according to a matrix formula, the Hessian matrix of the loss function of the federated learning model over the full data contained in that client after the removal data has been removed;
a parameter receiving module, configured to receive the gradient parameters sent by the target client and the Hessian matrices sent by each client;
an update parameter calculation module, configured to calculate an update parameter according to the gradient parameters, the Hessian matrices, and a preset model parameter formula;
and a model updating module, configured to update the federated learning model with the update parameter.
9. An electronic device, characterized in that the electronic device comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method for updating a federated learning model as claimed in any one of claims 1 to 6.
10. A computer-readable storage medium storing a computer program which, when executed by a processor, implements a method for updating a federated learning model as defined in any one of claims 1 to 6.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011640520.0A CN112732297B (en) | 2020-12-31 | 2020-12-31 | Method and device for updating federal learning model, electronic equipment and storage medium |
PCT/CN2021/083180 WO2022141839A1 (en) | 2020-12-31 | 2021-03-26 | Method and apparatus for updating federated learning model, and electronic device and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112732297A true CN112732297A (en) | 2021-04-30 |
CN112732297B CN112732297B (en) | 2022-09-27 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110851786A (en) * | 2019-11-14 | 2020-02-28 | 深圳前海微众银行股份有限公司 | Longitudinal federated learning optimization method, device, equipment and storage medium |
CN113887743A (en) * | 2021-09-29 | 2022-01-04 | 浙江大学 | Platform for forgetting and verifying data in federated learning |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115225405B (en) * | 2022-07-28 | 2023-04-21 | 上海光之树科技有限公司 | Matrix decomposition method based on security aggregation and key exchange under federal learning framework |
CN115329985B (en) * | 2022-09-07 | 2023-10-27 | 北京邮电大学 | Unmanned cluster intelligent model training method and device and electronic equipment |
CN117094410B (en) * | 2023-07-10 | 2024-02-13 | 西安电子科技大学 | Model repairing method for poisoning damage federal learning |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109189825A (en) * | 2018-08-10 | 2019-01-11 | 深圳前海微众银行股份有限公司 | Lateral data cutting federation learning model building method, server and medium |
CN110378488A (en) * | 2019-07-22 | 2019-10-25 | 深圳前海微众银行股份有限公司 | Federal training method, device, training terminal and the storage medium of client variation |
CN110610242A (en) * | 2019-09-02 | 2019-12-24 | 深圳前海微众银行股份有限公司 | Method and device for setting participant weight in federated learning |
US20200050443A1 (en) * | 2018-08-10 | 2020-02-13 | Nvidia Corporation | Optimization and update system for deep learning models |
CN111553483A (en) * | 2020-04-30 | 2020-08-18 | 同盾控股有限公司 | Gradient compression-based federated learning method, device and system |
CN111814985A (en) * | 2020-06-30 | 2020-10-23 | 平安科技(深圳)有限公司 | Model training method under federated learning network and related equipment thereof |
Also Published As
Publication number | Publication date |
---|---|
WO2022141839A1 (en) | 2022-07-07 |
CN112732297B (en) | 2022-09-27 |
CN112182598A (en) | Public sample ID identification method, device, server and readable storage medium | |
CN112446765A (en) | Product recommendation method and device, electronic equipment and computer-readable storage medium | |
CN111934882A (en) | Identity authentication method and device based on block chain, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||