WO2021056760A1 - Federated learning data encryption method, apparatus, device and readable storage medium - Google Patents

Federated learning data encryption method, apparatus, device and readable storage medium

Info

Publication number
WO2021056760A1
WO2021056760A1 PCT/CN2019/118845 CN2019118845W WO2021056760A1 WO 2021056760 A1 WO2021056760 A1 WO 2021056760A1 CN 2019118845 W CN2019118845 W CN 2019118845W WO 2021056760 A1 WO2021056760 A1 WO 2021056760A1
Authority
WO
WIPO (PCT)
Prior art keywords
encryption
encrypted
parameters
model parameter
update
Prior art date
Application number
PCT/CN2019/118845
Other languages
English (en)
Chinese (zh)
Inventor
程勇
刘洋
陈天健
Original Assignee
深圳前海微众银行股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
2019-11-15
Publication date
2021-04-01
Application filed by 深圳前海微众银行股份有限公司 filed Critical 深圳前海微众银行股份有限公司
Publication of WO2021056760A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/008 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols involving homomorphic encryption
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/06 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols the encryption apparatus using shift registers or memories for block-wise or stream coding, e.g. DES systems or RC4; Hash functions; Pseudorandom sequence generators
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/08 Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
    • H04L9/0816 Key establishment, i.e. cryptographic processes or cryptographic protocols whereby a shared secret becomes available to two or more parties, for subsequent use
    • H04L9/0819 Key transport or distribution, i.e. key establishment techniques where one party creates or otherwise obtains a secret value, and securely transfers it to the other(s)
    • H04L9/0825 Key transport or distribution, i.e. key establishment techniques where one party creates or otherwise obtains a secret value, and securely transfers it to the other(s) using asymmetric-key encryption or public key infrastructure [PKI], e.g. key signature or public key certificates
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/08 Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
    • H04L9/0891 Revocation or update of secret information, e.g. encryption key update or rekeying

Definitions

  • This application relates to the field of data processing technology, and in particular to a federated learning data encryption method, apparatus, device, and readable storage medium.
  • In the practical application of horizontal federated learning, the local model parameter updates (for example, neural network model weights or gradient information) sent by the participants to the coordination device will be obtained by the coordinator.
  • However, the reliability of the coordinator cannot be guaranteed, and the participants' privacy, data information, and the trained machine learning model may be leaked to the coordinator.
  • To prevent this, the participants can encrypt the model parameter updates before sending them, for example using homomorphic encryption, secret sharing, or differential privacy technology; if the coordinator cannot decrypt them, it cannot obtain the model weights or gradient information, thereby ensuring that no information is leaked to the coordinator.
  • However, the use of encryption significantly increases the length of the information that needs to be transmitted.
  • For example, the length of the ciphertext obtained (measured in bits) is at least twice the length of the plaintext; that is, encryption at least doubles the communication bandwidth requirement compared with no encryption.
  • In some application scenarios, communication bandwidth is severely limited, and the additional communication bandwidth required by the participants' encryption operations may not be available, or will at least significantly increase the communication delay.
  • Therefore, the main purpose of this application is to provide a federated learning data encryption method, apparatus, device, and readable storage medium, aiming to implement a security mechanism such that the participants' information is not leaked to the coordinator and the communication bandwidth requirement is not significantly increased.
  • this application provides a method for encrypting federated learning data.
  • the method for encrypting federated learning data is applied to a participating device, and the participating device is in communication connection with a coordinating device.
  • the method for encrypting federated learning data includes the following steps:
  • In addition, the present application also provides a federated learning data encryption apparatus. The federated learning data encryption apparatus is deployed on a participating device that is in communication connection with a coordinating device, and the federated learning data encryption apparatus includes: an acquisition module, configured to obtain the encryption parameters of the current model update during the federated learning process; a determining module, configured to determine, according to the encryption parameters, the to-be-encrypted part and the plaintext part of the model parameter update obtained by local training; an encryption module, configured to encrypt the to-be-encrypted part using a preset encryption algorithm to obtain the ciphertext part of the model parameter update; and a sending module, configured to send the ciphertext part and the plaintext part of the model parameter update to the coordinating device.
  • In addition, the present application also provides a federated learning data encryption device. The federated learning data encryption device includes a memory, a processor, and a federated learning data encryption program stored in the memory and runnable on the processor; when the federated learning data encryption program is executed by the processor, the steps of the federated learning data encryption method described above are implemented.
  • In addition, this application also provides a computer-readable storage medium on which a federated learning data encryption program is stored; when the federated learning data encryption program is executed by a processor, the steps of the federated learning data encryption method described above are implemented.
  • In this application, the encryption parameters of the current model update are obtained during the federated learning process; the to-be-encrypted part and the plaintext part of the model parameter update obtained by local training are determined according to the encryption parameters; the to-be-encrypted part is encrypted using a preset encryption algorithm to obtain the ciphertext part of the model parameter update; and the ciphertext part and the plaintext part of the model parameter update are sent to the coordinating device.
  • This application thus ensures that, during federated learning, the private data of the participating devices is not leaked to the coordinating device and the data security of the participating devices is guaranteed, while reducing the computational complexity of encryption, the power consumption of the participating devices, and the communication bandwidth requirements of the participating devices, so that the method can adapt to application scenarios, such as IoT devices, commercial satellites, or remote sensing satellites, in which power and computing resources are limited and communication bandwidth is severely constrained.
  • FIG. 1 is a schematic structural diagram of the hardware operating environment involved in the solutions of the embodiments of the present application;
  • FIG. 2 is a schematic flowchart of a first embodiment of the federated learning data encryption method of this application;
  • FIG. 3 is a schematic diagram of a processing flow of federated learning data encryption involved in an embodiment of this application;
  • FIG. 4 is a schematic diagram of the functional modules of a preferred embodiment of the federated learning data encryption apparatus of this application.
  • FIG. 1 is a schematic diagram of the device structure of the hardware operating environment involved in the solution of the embodiment of the present application.
  • the federated learning data encryption device in the embodiment of the present application may be a smart phone, a personal computer, a server, and other devices, which are not specifically limited here.
  • The federated learning data encryption device may include: a processor 1001 such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002.
  • the communication bus 1002 is used to implement connection and communication between these components.
  • the user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard), and the optional user interface 1003 may also include a standard wired interface and a wireless interface.
  • the network interface 1004 may optionally include a standard wired interface and a wireless interface (such as a WI-FI interface).
  • The memory 1005 can be a high-speed RAM or a non-volatile memory, such as a disk storage.
  • the memory 1005 may also be a storage device independent of the aforementioned processor 1001.
  • The structure shown in FIG. 1 does not constitute a limitation on the federated learning data encryption device, which may include more or fewer components than shown in the figure, combine certain components, or use a different component layout.
  • As a computer storage medium, the memory 1005 may include an operating system, a network communication module, a user interface module, and a federated learning data encryption program.
  • The operating system is a program that manages and controls the hardware and software resources of the device and supports the running of the federated learning data encryption program and other software or programs.
  • In the device shown in FIG. 1, the user interface 1003 is mainly used to communicate with the client; the network interface 1004 is mainly used to establish communication connections with the coordinating device of the federated learning and with other participating devices; and the processor 1001 can be used to call the federated learning data encryption program stored in the memory 1005 and execute the steps of the federated learning data encryption method described below.
  • FIG. 2 is a schematic flowchart of a first embodiment of the federated learning data encryption method according to this application.
  • The embodiments of this application provide an embodiment of the federated learning data encryption method. It should be noted that although a logical sequence is shown in the flowchart, in some cases the steps shown or described may be executed in an order different from the one shown here.
  • The federated learning data encryption method of this application is applied to the participating devices in the federated learning; the coordinating device in the federated learning is in communication connection with at least one participating device.
  • The federated learning data encryption method includes:
  • Step S10: In the process of federated learning, obtain the encryption parameters of the current model update;
  • the coordination device and each participating device may establish a communication connection in advance through handshake and identity authentication, and determine the model to be trained for this federated learning.
  • the model to be trained may be a machine learning model, such as a neural network model.
  • In the federated learning, the coordinating device and the participating devices cooperate to perform multiple iterations of training the model to be trained until a final converged model is obtained, at which point the training process can be ended.
  • In each iteration, each participating device performs local training on its locally owned training data and obtains its local model parameter update. The model parameter update can be an update of the parameters of the model to be trained, such as a weight update of a neural network model, or it can be gradient information.
  • The encryption parameter can be an encryption ratio, a number of parameters to be encrypted, or a number of bits to be encrypted. The model parameter update may include multiple parameters, for example Q parameters, and the number of parameters to be encrypted, for example P, refers to how many of these parameters need to be encrypted. The model parameter update can also be converted into bits; the number of bits of the model parameter update is then, for example, N, and the number of bits to be encrypted, for example M, refers to how many bits need to be encrypted. The encryption ratio can be the ratio P/Q of the number of parameters to be encrypted to the number of parameters of the model parameter update, or the ratio M/N of the number of bits to be encrypted to the number of bits of the model parameter update.
  • The encryption parameters can be set uniformly in the coordinating device and each participating device in advance, to suit the communication bandwidth, computing resources, and power of each participating device. When the model parameter update consists of floating-point numbers, the encryption parameter can be set as the number of parameters to be encrypted; when the encryption algorithm encrypts the model parameter update according to bit positions, the encryption parameter can be set as the number of bits to be encrypted or as the ratio of encrypted bits. It should be noted that the encryption parameters used by each participating device and the coordinating device within one model update are the same.
  • Step S20: Determine the to-be-encrypted part and the plaintext part of the model parameter update obtained by local training according to the encryption parameters;
  • The participating device determines the to-be-encrypted part and the plaintext part of the model parameter update obtained by local training according to the acquired encryption parameters. For example, when the encryption parameter is the number of parameters to be encrypted or the number of bits to be encrypted, the participating device can select, from the parameters of the model parameter update, a leading group of parameters as the to-be-encrypted part, the size of that group being the number of parameters to be encrypted, with the remaining parameters forming the plaintext part; or it can select, from the bits of the model parameter update, a leading group of bits as the to-be-encrypted part, the size of that group being the number of bits to be encrypted, with the remaining bits forming the plaintext part.
  • Step S30: Encrypt the to-be-encrypted part using a preset encryption algorithm to obtain the ciphertext part of the model parameter update;
  • The participating device uses a preset encryption algorithm to encrypt the to-be-encrypted part and obtains the ciphertext part of the model parameter update.
  • The preset encryption algorithm can be a preset homomorphic encryption algorithm, such as the Paillier algorithm.
  • The principle of homomorphic encryption is that processing homomorphically encrypted data produces an output which, when decrypted, is identical to the result obtained by processing the unencrypted original data in the same way.
  • Therefore, the coordinating device of the federated learning can directly fuse the model parameter updates encrypted with the homomorphic encryption algorithm without decrypting them, and the result the participating device obtains by decrypting the fused global model parameter update is the same as the result the coordinating device would obtain by directly fusing the unencrypted model parameter updates.
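  • As a concrete illustration of this property (not part of the application itself), the following minimal sketch uses the open-source python-paillier (phe) package, assumed to be installed; the sum is computed entirely on ciphertexts and only then decrypted.

```python
# Minimal sketch of additive homomorphism, assuming the python-paillier (phe) package.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

w_a, w_b = 0.75, -0.25               # two plaintext model weights
c_a = public_key.encrypt(w_a)        # participant-side encryption
c_b = public_key.encrypt(w_b)

c_sum = c_a + c_b                    # coordinator-side: add ciphertexts without decrypting

assert abs(private_key.decrypt(c_sum) - (w_a + w_b)) < 1e-9
```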
  • The preset encryption algorithm can also be an algorithm that uses a mask, also called a perturbation. For example, a mask is generated randomly and added to or subtracted from the original data to obtain the masked data; subtracting or adding the mask back to the masked data restores the original data. With such an algorithm, the participating device adds a mask to its model parameter update, the coordinating device can directly fuse the masked model parameter updates, and the result the participating device obtains by removing the mask from the fused global model parameter update is the same as the result the coordinating device would obtain by directly fusing the model parameter updates without masks.
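  • The mask idea can be sketched as follows; this is an illustration of the add-then-remove principle only, and coordinating mask removal across several participants and fusion weights is omitted here.

```python
# Minimal sketch of masking (perturbation): add a random mask before sending,
# remove the same mask to restore the original values.
import numpy as np

rng = np.random.default_rng(seed=0)

update = np.array([0.2, -0.5, 1.0, 0.3])   # local model parameter update
mask = rng.normal(size=update.shape)        # random mask known only to the participant

masked = update + mask                       # what would be sent instead of the raw update
recovered = masked - mask                    # removing the mask restores the original data

assert np.allclose(recovered, update)
```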
  • Step S40: Send the ciphertext part and the plaintext part of the model parameter update to the coordinating device.
  • The participating device sends the ciphertext part and the plaintext part of the model parameter update to the coordinating device. For example, if the model parameter update of the participating device includes N parameters, the participating device uses a homomorphic encryption algorithm to encrypt M of them to obtain the ciphertext part, and the remaining N-M unencrypted parameters form the plaintext part; the participating device then sends the ciphertext part and the plaintext part to the coordinating device. The participating device may combine the ciphertext part and the plaintext part and send them together, or send them separately; this is not specifically limited here.
  • Since the coordinating device cannot obtain the complete plaintext of the model parameter update, it cannot obtain all of the parameter information or gradient information of the model to be trained, which ensures that the private data of the participating device is not leaked to the coordinating device and thereby guarantees the data security of the participating devices.
  • The participating device encrypts the model parameter update only partially, according to the encryption parameters, instead of encrypting the entire model parameter update, which reduces the computational complexity of encryption and the power consumption of the participating device and also reduces the length of the encrypted model parameter update. This lowers the communication bandwidth requirement and allows the method to adapt to application scenarios, such as IoT devices, commercial satellites, or remote sensing satellites, in which power and computing resources are limited and communication bandwidth is severely constrained.
  • After receiving the model parameter updates, each consisting of a ciphertext part and a plaintext part, sent by the participating devices, the coordinating device can directly perform fusion processing on them, for example a weighted average, to obtain the global model parameter update. Since part of each model parameter update is ciphertext, the resulting global model parameter update is also encrypted. The coordinating device sends the encrypted global model parameter update to each participating device. After receiving it, the participating device first decrypts the encrypted global model parameter update to obtain the global model parameter update and then performs the next model update based on it, until the coordinating device detects that the model to be trained has converged or the number of training iterations has reached the maximum number, at which point training of the model to be trained can be stopped.
  • In this embodiment, the encryption parameters of the current model update are obtained during the federated learning process; the to-be-encrypted part and the plaintext part of the model parameter update obtained by local training are determined according to the encryption parameters; the to-be-encrypted part is encrypted with a preset encryption algorithm to obtain the ciphertext part of the model parameter update; and the ciphertext part and the plaintext part of the model parameter update are sent to the coordinating device. This ensures that the private data of the participating device is not leaked to the coordinating device during federated learning, guaranteeing the data security of the participating devices, while reducing the computational complexity of encryption, the power consumption of the participating devices, and their communication bandwidth requirements, so that the method can adapt to application scenarios, such as IoT devices, commercial satellites, or remote sensing satellites, in which power and computing resources are limited and communication bandwidth is severely constrained.
  • Further, after step S40, the method further includes:
  • Step S50: Receive the encrypted global model parameter update sent by the coordinating device, where the coordinating device performs ciphertext fusion processing on the ciphertext part of the model parameter update of each participating device to obtain the ciphertext part of the encrypted global model parameter update, and performs plaintext fusion processing on the plaintext part of the model parameter update of each participating device to obtain the plaintext part of the encrypted global model parameter update;
  • The coordinating device receives the model parameter updates, each including a ciphertext part and a plaintext part, sent by the participating devices, performs ciphertext fusion processing on the ciphertext parts of the model parameter updates to obtain the ciphertext part of the encrypted global model parameter update, and performs plaintext fusion processing on their plaintext parts to obtain the plaintext part of the encrypted global model parameter update.
  • The ciphertext fusion processing and the plaintext fusion processing may be the same or different. Specifically, when the encryption algorithm adopted by the participating devices maps arithmetic operations on the plaintext to different operations on the ciphertext, the two fusion processes differ. For example, with a homomorphic encryption algorithm such as Paillier, addition of plaintexts corresponds to multiplication of the ciphertexts, and scalar multiplication of a plaintext corresponds to raising the ciphertext to a power. If the fusion performed by the coordinating device is a weighted average, the plaintext fusion of the plaintext parts of the model parameter updates uses an ordinary weighted average, while the ciphertext fusion of the ciphertext parts starts from the ordinary weighted-average operation but converts additions into multiplications and multiplications into exponentiations.
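  • For reference, these correspondences are the standard textbook Paillier relations, written below with E and D denoting encryption and decryption, n the public modulus, and integer (or fixed-point encoded) fusion weights w_i; this is a summary assumed here, not a derivation taken from the application.

```latex
\begin{aligned}
D\bigl(E(m_1)\cdot E(m_2) \bmod n^2\bigr) &= m_1 + m_2 \bmod n,\\
D\bigl(E(m)^{k} \bmod n^2\bigr) &= k\,m \bmod n,\\
\text{hence}\quad D\Bigl(\prod\nolimits_i E(m_i)^{w_i} \bmod n^2\Bigr) &= \sum\nolimits_i w_i\,m_i \bmod n .
\end{aligned}
```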
  • the coordination device sends the encrypted global model parameter update to each participating device, that is, the ciphertext part and the plaintext part of the encrypted global model parameter update are sent to each participating device.
  • the coordinating device may combine the ciphertext part and the plaintext part and send it to the participating device, or separately send it to the participating device, and there is no specific limitation.
  • Each participating device receives the encrypted global model parameter update sent by the coordination device.
  • Step S60: Decrypt the ciphertext part of the encrypted global model parameter update, and combine the decryption result with the plaintext part of the encrypted global model parameter update to obtain the global model parameter update of the current model update.
  • the participating equipment decrypts the ciphertext part of the encrypted global model parameter update, and combines the decryption result with the plaintext part of the encrypted global model parameter update to obtain the global model parameter update of this model update.
  • When the encryption algorithm adopted by the participating device is a symmetric encryption algorithm, the participating device uses a key to encrypt the to-be-encrypted part of the model parameter update and uses the same key to decrypt the ciphertext part of the encrypted global model parameter update. When the encryption algorithm adopted by the participating device is an asymmetric encryption algorithm, the participating device uses the public key to encrypt the to-be-encrypted part of the model parameter update and uses the private key to decrypt the ciphertext part of the encrypted global model parameter update.
  • Each participating device can obtain the key required by the encryption algorithm from a third-party key server; the coordinating device cannot obtain the key and therefore cannot decrypt the ciphertext part of the model parameter updates.
  • the manner in which the participating device combines the decryption result with the plaintext part of the encrypted global model parameter update corresponds to the manner in which the participating device selects the to-be-encrypted part and the plaintext part of the model parameter update.
  • For example, when the participating device selects a leading group of parameters of the model parameter update as the to-be-encrypted part, the size of that group being the number of parameters to be encrypted, and the remaining parameters as the plaintext part, the participating device places the parameters of the decryption result first and the parameters of the plaintext part after them to obtain the parameters of the global model parameter update.
  • In this embodiment, the participating device receives the encrypted global model parameter update sent by the coordinating device, which is obtained by the coordinating device performing ciphertext fusion processing on the ciphertext parts of the model parameter updates of the participating devices and plaintext fusion processing on their plaintext parts; the participating device then decrypts the ciphertext part of the encrypted global model parameter update and combines the decryption result with its plaintext part to obtain the global model parameter update of this model update. In this way the participating devices can decrypt and obtain an accurate global model parameter update, ensuring that federated learning proceeds normally, while the coordinating device cannot obtain the complete model parameter update of any participating device, ensuring that the private data of the participating devices is not leaked to the coordinating device.
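  • Putting steps S30 through S60 together, the following sketch shows one aggregation round with two participants, again using the python-paillier (phe) package as a stand-in for the preset homomorphic encryption algorithm; the participant count, the equal fusion weights, and the choice of encrypting the first m parameters are illustrative assumptions rather than requirements of the application.

```python
# Sketch of one round: participants encrypt part of their updates, the coordinator
# fuses ciphertext and plaintext parts separately, and a participant decrypts and combines.
import numpy as np
from phe import paillier

pub, priv = paillier.generate_paillier_keypair(n_length=1024)   # keys held by participants

def split_and_encrypt(update, m):
    """Encrypt the first m parameters (to-be-encrypted part); keep the rest as plaintext."""
    return [pub.encrypt(float(x)) for x in update[:m]], list(update[m:])

updates = [np.array([0.2, -0.5, 1.0, 0.3]),
           np.array([0.4,  0.1, -1.0, 0.5])]
weights = [0.5, 0.5]        # illustrative fusion weights (equal weighting)
m = 2                       # number of parameters to encrypt

parts = [split_and_encrypt(u, m) for u in updates]

# Coordinator: weighted average on ciphertexts (homomorphically) and on plaintexts (directly).
# The public key is used only to form an encrypted zero as the accumulator.
fused_cipher = [sum((c[i] * w for (c, _), w in zip(parts, weights)), start=pub.encrypt(0.0))
                for i in range(m)]
fused_plain = [sum(p[i] * w for (_, p), w in zip(parts, weights))
               for i in range(len(updates[0]) - m)]

# Participant: decrypt the ciphertext part and append the plaintext part (step S60).
global_update = [priv.decrypt(c) for c in fused_cipher] + fused_plain
print(np.round(global_update, 3))    # ~ element-wise weighted average of both updates
```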
  • Further, the encryption parameter includes at least an encryption ratio, and step S20 includes:
  • Step S201: Determine the number of parameters to be encrypted or the number of bits to be encrypted according to the encryption ratio and the number of parameters or the number of bits of the model parameter update obtained by local training;
  • The participating device determines the number of parameters to be encrypted according to the encryption ratio and the number of parameters of the model parameter update obtained by local training, or determines the number of bits to be encrypted according to the encryption ratio and the number of bits of that model parameter update. Specifically, the participating device multiplies the encryption ratio by the number of parameters of the model parameter update to obtain the number of parameters to be encrypted, or multiplies the encryption ratio by the number of bits of the model parameter update to obtain the number of bits to be encrypted. For example, if the model parameter update of the participating device includes 100 parameters and the encryption ratio is 50%, the participating device determines that the number of parameters to be encrypted is 50.
  • Step S202: Determine the to-be-encrypted part and the plaintext part of the model parameter update according to the number of parameters to be encrypted or the number of bits to be encrypted.
  • The participating device determines the to-be-encrypted part and the plaintext part of the model parameter update based on the number of parameters to be encrypted or the number of bits to be encrypted. Specifically, the participating device can select, from the parameters of the model parameter update, a leading group of parameters as the to-be-encrypted part, the size of that group being the number of parameters to be encrypted, with the remaining parameters forming the plaintext part; or it can select, from the bits of the model parameter update, a leading group of bits as the to-be-encrypted part, the size of that group being the number of bits to be encrypted, with the remaining bits forming the plaintext part.
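  • A small sketch of steps S201 and S202 for the parameter-level case, assuming the to-be-encrypted part is simply taken from the front of the update as described above (the function name is illustrative):

```python
# Sketch of S201/S202: derive the number of parameters to encrypt from the encryption
# ratio, then split the update into a to-be-encrypted part and a plaintext part.
def split_by_ratio(update, encryption_ratio):
    num_to_encrypt = int(len(update) * encryption_ratio)   # e.g. 100 parameters * 0.5 -> 50
    return update[:num_to_encrypt], update[num_to_encrypt:]

to_encrypt, plaintext = split_by_ratio(list(range(100)), 0.5)
assert len(to_encrypt) == 50 and len(plaintext) == 50
```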
  • In this embodiment, the number of parameters to be encrypted or the number of bits to be encrypted is determined from the encryption ratio and the number of parameters or bits of the model parameter update obtained by local training, and the to-be-encrypted part and the plaintext part of the model parameter update are then determined from that number. This allows the participating device to encrypt only a part of the model parameter update, which not only ensures the security of the participating device's data but also reduces the computational complexity and power consumption of encryption and the length of the encrypted model parameter update, thereby reducing the communication bandwidth requirement.
  • Further, the encryption parameter also includes an encryption-part selection method, and step S202 includes:
  • Step S2021: Select the to-be-encrypted part and the plaintext part from the model parameter update according to the encryption-part selection method, where the number of parameters of the to-be-encrypted part is the number of parameters to be encrypted, or the number of bits of the to-be-encrypted part is the number of bits to be encrypted.
  • The participating device selects the to-be-encrypted part and the plaintext part from the model parameter update according to the encryption-part selection method in the encryption parameters and the number of parameters to be encrypted or the number of bits to be encrypted; the number of parameters of the selected to-be-encrypted part is the number of parameters to be encrypted, or the number of bits of the to-be-encrypted part is the number of bits to be encrypted.
  • The encryption-part selection method specifies how the required number of parameters to be encrypted is selected from the parameters of the model parameter update, or how the required number of bits to be encrypted is selected from the bits of the model parameter update.
  • For example, the encryption-part selection method can be to select M bits at M fixed positions of the N-bit model parameter update, with the M fixed positions uniformly distributed; or to select M consecutive bits of the N-bit model parameter update, for example the first M bits, the last M bits, or M consecutive bits at any position; or, when the model to be trained is a neural network model with multiple layers, to select the parameters of one or more important layers as the to-be-encrypted part and the remaining parameters as the plaintext part.
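  • The selection methods named above can be sketched at the index level as follows; the function names and the particular spacing rule are illustrative assumptions, not definitions taken from the application.

```python
# Sketch of encryption-part selection methods: uniformly spaced positions,
# a consecutive block, or all positions belonging to chosen "important" layers.
def uniform_positions(n, m):
    """m positions spread (approximately) uniformly over indices 0..n-1."""
    step = n / m
    return [int(i * step) for i in range(m)]

def consecutive_positions(m, start=0):
    """m consecutive positions beginning at `start` (e.g. the first or last m)."""
    return list(range(start, start + m))

def layer_positions(layer_slices, important_layers):
    """All positions of the chosen important layers; layer_slices maps name -> range."""
    return [i for name in important_layers for i in layer_slices[name]]

print(uniform_positions(16, 4))                 # [0, 4, 8, 12]
print(consecutive_positions(4, start=12))       # [12, 13, 14, 15]
print(layer_positions({"layer1": range(0, 8), "layer2": range(8, 16)}, ["layer2"]))
```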
  • the encryption parameters of each participating device and coordinating device in a model update are unified.
  • For example, participating devices A and B each encrypt, according to the same encryption ratio and encryption-part selection method, the first 50 parameters of their model parameter updates, each of which contains 100 parameters. The coordinating device performs ciphertext fusion processing on the first 50 ciphertext parameters of A's and B's model parameter updates and plaintext fusion processing on the last 50 plaintext parameters, obtaining an encrypted global model parameter update consisting of 50 ciphertext fusion parameters followed by 50 plaintext fusion parameters. A and B each decrypt the first 50 ciphertext fusion parameters of the encrypted global model parameter update to obtain 50 plaintext fusion parameters, and combine them with the last 50 plaintext fusion parameters of the encrypted global model parameter update to obtain the global model parameter update of this model update.
  • It should be noted that the encryption parameters of each model update can be the same or different. Specifically, when the power, communication bandwidth, computing resources, or other information of the participating devices changes during the federated learning process, the encryption parameters can be adjusted dynamically.
  • In addition, since the participating device can choose which part to encrypt, the difficulty of recovering the model parameter update after cracking the encryption is increased, which further improves the security of the data of the participating devices.
  • Further, step S10 includes:
  • Step A10: Obtain local device information, where the device information includes one or more of communication bandwidth information, computing resource information, and power information;
  • The participating device can obtain its current local device information at every model update; alternatively, it can obtain the current local device information once after joining the federated learning and before training of the model to be trained begins, and the local device information is not obtained again during the subsequent training process.
  • the local device information includes at least one or more of the local communication bandwidth information, computing resource information, and power information of the participating devices.
  • In different application scenarios of federated learning, the device information that is obtained may differ; for example, when the communication bandwidth and computing resources of each participating device are limited but power is not limited, each participating device may obtain only its local communication bandwidth information and computing resource information.
  • Step A20: Determine local encryption parameters according to the device information;
  • The participating device determines its local encryption parameters according to its local device information. Specifically, when the encryption parameter is the encryption ratio, the number of parameters to be encrypted, or the number of bits to be encrypted, a calculation formula for deriving the encryption parameter from the device information can be preset in the participating device, for example a formula in which the encryption ratio is positively correlated with the communication bandwidth, so that substituting the communication bandwidth into the formula yields the encryption ratio. Alternatively, different encryption parameters corresponding to different device information can be preset in the participating device; for example, a power level of 0%-20% corresponds to an encryption ratio of 0.1, a power level of 21%-40% corresponds to 0.3, a power level of 41%-60% corresponds to 0.5, and so on.
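  • A minimal sketch of such a local rule, using the illustrative battery-level thresholds mentioned above (the thresholds, the returned values, and the fallback for higher levels are examples, not values fixed by the application):

```python
# Sketch: derive a local encryption ratio from the current battery level,
# using the example thresholds from the passage above (0-20% -> 0.1, 21-40% -> 0.3, ...).
def encryption_ratio_from_power(power_percent):
    if power_percent <= 20:
        return 0.1
    if power_percent <= 40:
        return 0.3
    if power_percent <= 60:
        return 0.5
    return 0.7   # illustrative value for higher battery levels (not given in the text)

assert encryption_ratio_from_power(15) == 0.1
assert encryption_ratio_from_power(55) == 0.5
```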
  • Step A30: Send the local encryption parameters to the coordinating device, so that the coordinating device can determine the encryption parameters of this model update according to the local encryption parameters of each participating device;
  • The participating device sends its local encryption parameters to the coordinating device. If the participating devices send their local encryption parameters before training, the coordinating device can determine, from the local encryption parameters sent by each participating device, the encryption parameters for the entire federated learning process and use them in every subsequent model update. If the participating devices send their encryption parameters at every model update, the coordinating device determines the encryption parameters of the current model update from the local encryption parameters sent by each participating device, and these are used only for the current model update. Of course, a participating device may also send its local encryption parameters once every preset number of model updates, that is, the encryption parameters are adjusted once every preset number of model updates.
  • The coordinating device determines the encryption parameters according to the local encryption parameters of each participating device. For example, when the encryption parameter is the encryption ratio, the number of parameters to be encrypted, or the number of bits to be encrypted, the coordinating device can, according to the specific situation, use the minimum, the median, or the average of the values reported by the participating devices as the encryption parameter.
  • For example, the coordinating device can choose the smallest of the local encryption parameters as the encryption parameter; when the number of participating devices in the federated learning is relatively large, the data volume of each participating device is evenly distributed, and the absence of some participating devices has little impact on training, the coordinating device can select the median of the local encryption parameters, or calculate their average, as the encryption parameter; and when the local encryption parameters of the participating devices do not differ much, the coordinating device can randomly select one of them as the encryption parameter.
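  • A sketch of the coordinator-side choice among these strategies (the strategy names and the interface are illustrative assumptions):

```python
# Sketch: the coordinator turns the locally proposed encryption ratios into one
# global value, using the minimum, the median, the mean, or a random pick.
import random
import statistics

def decide_encryption_ratio(local_ratios, strategy="min"):
    if strategy == "min":        # every participant can afford the chosen setting
        return min(local_ratios)
    if strategy == "median":
        return statistics.median(local_ratios)
    if strategy == "mean":
        return statistics.mean(local_ratios)
    if strategy == "random":     # when the proposals are close to one another
        return random.choice(local_ratios)
    raise ValueError(f"unknown strategy: {strategy}")

print(decide_encryption_ratio([0.1, 0.3, 0.5], "min"))      # 0.1
print(decide_encryption_ratio([0.1, 0.3, 0.5], "median"))   # 0.3
```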
  • Each participating device receives the encryption parameters of this model update sent by the coordinating device, and determines the to-be-encrypted part and the plaintext part of the model parameter update of this model update according to the encryption parameters. At each model update, each participating device uses the encryption parameters to determine the to-be-encrypted part and the plaintext part of the model parameter update it obtains.
  • In this embodiment, the participating devices determine their local encryption parameters according to their local device information and send them to the coordinating device, so that the coordinating device can determine the encryption parameters according to the local encryption parameters of each participating device and send them to each participating device. In this way, during federated learning the encryption parameters are dynamically adjusted according to the communication bandwidth, power, and computing resources of each participating device, which ensures the data security of the participating devices while adapting to scenarios in which the communication bandwidth, power, or computing resources of the participating devices are limited.
  • If the encryption ratio of each participating device is controlled to a value between 0 and 1, for example 0.3, the trade-off between confidentiality on the one hand and communication bandwidth, computational complexity, and power consumption on the other can easily be controlled, making the method suitable for different application scenarios.
  • Further, step S10 includes:
  • Step B10: Obtain local device information and the amount of training data, where the device information includes one or more of communication bandwidth information, computing resource information, and power information;
  • The participating device obtains its local device information and the data volume of its training data, where the local device information includes one or more of the participating device's local communication bandwidth information, computing resource information, and power information. The training data is the data used to train the model to be trained. The participating device can obtain the current local device information and data volume at every model update, or obtain them once after joining the federated learning and before training of the model to be trained begins, and not obtain them again during the subsequent training process.
  • Step B20: Send the device information and data volume to the coordinating device, so that the coordinating device can determine the encryption parameters for this model update according to the device information and data volume of each participating device;
  • the participating device sends the local device information and data volume to the coordinating device, and the coordinating device determines the encryption parameters for this model update according to the device information and data volume of each participating device.
  • The coordinating device determines the encryption parameters according to the device information and the data volumes. For example, the coordinating device compares the data volumes of the participating devices and determines the encryption parameters based on the device information of the participating devices with larger data volumes; the specific determination can be similar to the way a participating device determines its local encryption parameters in step A20. When the data volumes of the participating devices are comparable, the coordinating device can instead consider the average of the device information of the participating devices, such as the average power, and determine the encryption parameters from that average.
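  • A sketch of this second variant, in which the coordinating device decides from the reported device information and data volumes; the field names, the comparability threshold, and the power-to-ratio rule are illustrative assumptions.

```python
# Sketch: decide the encryption ratio from (data_volume, power) pairs reported by the
# participants: follow the data-rich device, or, if data volumes are comparable,
# use the average power level.
def decide_from_device_info(reports, comparable_factor=2.0):
    """reports: list of dicts like {"data_volume": 10_000, "power": 35} (illustrative fields)."""
    volumes = [r["data_volume"] for r in reports]
    if max(volumes) <= comparable_factor * min(volumes):       # data volumes comparable
        power = sum(r["power"] for r in reports) / len(reports)
    else:                                                      # follow the data-rich device
        power = max(reports, key=lambda r: r["data_volume"])["power"]
    return round(min(1.0, power / 100), 2)   # illustrative power -> encryption-ratio rule

print(decide_from_device_info([{"data_volume": 500, "power": 80},
                               {"data_volume": 50_000, "power": 30}]))   # 0.3
```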
  • If the participating devices send their device information and data volumes before training, the coordinating device can determine the encryption parameters for the entire federated learning process from them and use those encryption parameters in every subsequent model update.
  • Step B30: Receive the encryption parameters of this model update sent by the coordinating device.
  • Each participating device receives the encryption parameters of this model update sent by the coordinating device, and determines the to-be-encrypted part and the plaintext part of the model parameter update of this model update according to the encryption parameters. At each model update, each participating device uses the encryption parameters to determine the to-be-encrypted part and the plaintext part of the model parameter update it obtains.
  • In this embodiment, the participating device sends its local device information and the data volume of its training data to the coordinating device, so that the coordinating device determines the encryption parameters based on the device information and data volume of each participating device and sends them to each participating device. In this way the encryption parameters are dynamically adjusted, which ensures the data security of the participating devices while adapting to scenarios in which the communication bandwidth, power, or computing resources of the participating devices are limited. If the coordinating device controls the encryption ratio to a value between 0 and 1, for example 0.3, the trade-off between confidentiality on the one hand and communication bandwidth, computational complexity, and power consumption on the other can easily be controlled, making the method suitable for different application scenarios.
  • Further, the coordinating device determines the encryption parameters for the entire federated learning process, or the encryption parameters of each model update, and sends the encryption parameters to each participating device. Each participating device combines the encryption parameters sent by the coordinating device with its own local device information to determine whether it can participate in the federated learning, and feeds the result back to the coordinating device; the coordinating device then determines whether to adjust the encryption parameters according to the feedback from each participating device.
  • The participating device encrypts its private data during the federated learning process according to the processing flow shown in FIG. 3:
  • the participating device can obtain the key to encrypt the model parameter update through key distribution, such as obtaining the key from a third-party server;
  • The participating device or the coordinating device determines the encryption parameters, such as the encryption ratio P, and then determines the number of bits to be encrypted M according to the encryption ratio;
  • the participating device or coordinating device determines the encryption part selection method, such as selecting M bits from M fixed positions in the N-bit model parameter update;
  • The participating device determines the to-be-encrypted part (M bits) according to the encryption parameters and the encryption-part selection method, encrypts the M bits with the key to obtain an L-bit ciphertext, and sends the L-bit ciphertext and the (N-M)-bit plaintext to the coordinating device;
  • The coordinating device merges the plaintext and ciphertext received from all participating devices and sends the merged result (the encrypted global model parameter update) back to each participating device.
  • In addition, an embodiment of the present application also provides a federated learning data encryption apparatus. Referring to FIG. 4, the federated learning data encryption apparatus includes:
  • The obtaining module 10 is configured to obtain the encryption parameters of the current model update during the federated learning process;
  • the determining module 20 is configured to determine the to-be-encrypted part and the plaintext part of the model parameter update obtained by local training according to the encryption parameter;
  • the encryption module 30 is configured to use a preset encryption algorithm to encrypt the part to be encrypted to obtain the ciphertext part of the model parameter update;
  • the sending module 40 is configured to send the ciphertext part and the plaintext part of the model parameter update to the coordination device.
  • Further, the federated learning data encryption apparatus further includes:
  • The receiving module is configured to receive the encrypted global model parameter update sent by the coordinating device, where the coordinating device performs ciphertext fusion processing on the ciphertext part of the model parameter update of each participating device to obtain the ciphertext part of the encrypted global model parameter update, and performs plaintext fusion processing on the plaintext part of the model parameter update of each participating device to obtain the plaintext part of the encrypted global model parameter update;
  • the decryption module is configured to decrypt the ciphertext part of the encrypted global model parameter update, and combine the decryption result with the plaintext part of the encrypted global model parameter update to obtain the global model parameter update of the current model update.
  • Further, the encryption parameter includes at least an encryption ratio, and the determining module 20 is further configured to: determine the number of parameters to be encrypted or the number of bits to be encrypted according to the encryption ratio and the number of parameters or the number of bits of the model parameter update obtained by local training; and determine the to-be-encrypted part and the plaintext part of the model parameter update according to the number of parameters to be encrypted or the number of bits to be encrypted.
  • Further, the encryption parameters further include an encryption-part selection method, and the determining module 20 is further configured to: select the to-be-encrypted part and the plaintext part from the model parameter update according to the encryption-part selection method, where the number of parameters of the to-be-encrypted part is the number of parameters to be encrypted, or the number of bits of the to-be-encrypted part is the number of bits to be encrypted.
  • the acquisition module 10 includes:
  • the first acquiring unit is configured to acquire local device information, where the device information includes at least one or more of communication bandwidth information, computing resource information, and power information;
  • a determining unit configured to determine local encryption parameters according to the device information
  • the first sending unit is configured to send the local encryption parameters to the coordination device, so that the coordination device determines the encryption parameters of this model update according to the local encryption parameters of each of the participating devices;
  • the first receiving unit is configured to receive the encryption parameters of this model update sent by the coordination device.
  • the acquisition module 10 includes:
  • the second acquiring unit is configured to acquire local device information and the amount of training data, where the device information includes at least one or more of communication bandwidth information, computing resource information, and power information;
  • the second sending unit is configured to send the device information and data volume to the coordination device, so that the coordination device can determine the encryption parameters of this model update according to the device information and data volume of each participating device;
  • the second receiving unit is configured to receive the encryption parameters of this model update sent by the coordination device.
  • the preset encryption algorithm is a preset homomorphic encryption algorithm.
  • The expanded content of the specific implementation of the federated learning data encryption apparatus of this application is basically the same as the embodiments of the federated learning data encryption method described above and will not be repeated here.
  • In addition, an embodiment of the present application also provides a computer-readable storage medium on which a federated learning data encryption program is stored; when the federated learning data encryption program is executed by a processor, the steps of the federated learning data encryption method described above are implemented.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Storage Device Security (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

The present application relates to a federated learning data encryption method, apparatus, device, and readable storage medium, the method comprising: in a federated learning process, acquiring an encryption parameter of the current model update; determining, according to the encryption parameter, a to-be-encrypted part and a plaintext part of the model parameter update obtained by local training; encrypting the to-be-encrypted part using a preset encryption algorithm to obtain a ciphertext part of the model parameter update; and sending the ciphertext part and the plaintext part of the model parameter update to a coordinating device. According to the present invention, in the federated learning process it is guaranteed that the private data of the participating device will not be disclosed to the coordinating device, so that the data security of the participating device is ensured. Meanwhile, the computational complexity of encryption and the power consumption of the participating device are reduced, and the communication bandwidth requirement of the participating device is reduced.
PCT/CN2019/118845 2019-09-24 2019-11-15 Federated learning data encryption method, apparatus, device and readable storage medium WO2021056760A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910905457.X 2019-09-24
CN201910905457.XA CN110601814B (zh) 2019-09-24 2019-09-24 Federated learning data encryption method, apparatus, device and readable storage medium

Publications (1)

Publication Number Publication Date
WO2021056760A1 2021-04-01

Family

ID=68862799

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/118845 WO2021056760A1 (fr) 2019-09-24 2019-11-15 Federated learning data encryption method, apparatus, device and readable storage medium

Country Status (2)

Country Link
CN (1) CN110601814B (fr)
WO (1) WO2021056760A1 (fr)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113312169B (zh) * 2020-02-27 2023-12-19 香港理工大学深圳研究院 一种计算资源的分配方法及装置
CN113449872B (zh) * 2020-03-25 2023-08-08 百度在线网络技术(北京)有限公司 基于联邦学习的参数处理方法、装置和系统
CN111582508A (zh) * 2020-04-09 2020-08-25 上海淇毓信息科技有限公司 一种基于联邦学习框架的策略制定方法、装置和电子设备
CN111178547B (zh) * 2020-04-10 2020-07-17 支付宝(杭州)信息技术有限公司 一种基于隐私数据进行模型训练的方法及系统
CN111565174A (zh) * 2020-04-20 2020-08-21 中移雄安信息通信科技有限公司 车辆质量信息确定模型训练方法及车辆质量信息确定方法
CN113553602A (zh) * 2020-04-26 2021-10-26 华为技术有限公司 一种数据处理方法、装置、系统、设备及介质
CN113569301B (zh) * 2020-04-29 2024-07-05 杭州锘崴信息科技有限公司 基于联邦学习的安全计算系统和方法
CN111898137A (zh) * 2020-06-30 2020-11-06 深圳致星科技有限公司 一种联邦学习的隐私数据处理方法、设备及系统
CN111832050B (zh) * 2020-07-10 2021-03-26 深圳致星科技有限公司 用于联邦学习的基于FPGA芯片实现的Paillier加密方案
CN112016954A (zh) * 2020-07-14 2020-12-01 北京淇瑀信息科技有限公司 一种基于区块链网络技术的资源配置方法、装置和电子设备
CN112001452B (zh) * 2020-08-27 2021-08-27 深圳前海微众银行股份有限公司 特征选择方法、装置、设备及可读存储介质
CN112287377A (zh) * 2020-11-25 2021-01-29 南京星环智能科技有限公司 基于联邦学习的模型训练方法、计算机设备及存储介质
CN112560088B (zh) * 2020-12-11 2024-05-28 同盾控股有限公司 基于知识联邦的数据安全交换方法、装置及存储介质
CN112763845B (zh) * 2020-12-23 2022-07-08 广东电网有限责任公司梅州供电局 基于联邦学习的边缘物联网固件故障检测方法和系统
CN113239404B (zh) * 2021-06-04 2022-07-19 南开大学 一种基于差分隐私和混沌加密的联邦学习方法
CN113542228B (zh) * 2021-06-18 2022-08-12 腾讯科技(深圳)有限公司 基于联邦学习的数据传输方法、装置以及可读存储介质
CN115705301A (zh) * 2021-08-11 2023-02-17 华为技术有限公司 神经网络参数部署方法、ai集成芯片及其相关装置
CN113950046B (zh) * 2021-10-19 2022-05-03 北京工商大学 一种基于联邦学习的异构拓扑网络可信加密定位方法
CN114091690A (zh) * 2021-11-25 2022-02-25 支付宝(杭州)信息技术有限公司 联邦学习模型的训练方法和调用方法以及联邦学习系统
CN114124360B (zh) * 2021-12-10 2023-06-16 胜斗士(上海)科技技术发展有限公司 加密装置及方法、设备和介质
CN114444108A (zh) * 2021-12-22 2022-05-06 深圳市洞见智慧科技有限公司 同态加密处理方法及相关设备
CN117272389B (zh) * 2023-11-14 2024-04-02 信联科技(南京)有限公司 一种非交互可验证的联合安全建模方法

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101882995A (zh) * 2009-05-06 2010-11-10 中兴通讯股份有限公司 数据发送、接收和传输方法及装置
CN104702781A (zh) * 2015-02-04 2015-06-10 深圳市中兴移动通信有限公司 一种信息加密的方法及装置
WO2017079109A1 (fr) * 2015-11-02 2017-05-11 Servicenow, Inc. Configuration de chiffrement sélectif
CN107871160A (zh) * 2016-09-26 2018-04-03 谷歌公司 通信高效联合学习
CN109241752A (zh) * 2018-08-08 2019-01-18 北京君瑞立信科技有限公司 一种自有数据不泄漏给合作方的数据交互系统及方法
CN110263936A (zh) * 2019-06-14 2019-09-20 深圳前海微众银行股份有限公司 横向联邦学习方法、装置、设备及计算机存储介质
CN110263908A (zh) * 2019-06-20 2019-09-20 深圳前海微众银行股份有限公司 联邦学习模型训练方法、设备、系统及存储介质

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101114450B (zh) * 2007-07-20 2011-07-27 华中科技大学 一种语音编码选择性加密方法
US8370629B1 (en) * 2010-05-07 2013-02-05 Qualcomm Incorporated Trusted hybrid location system
CN103281299B (zh) * 2013-04-26 2016-12-28 天地融科技股份有限公司 一种加解密装置以及信息处理方法和系统
KR101847492B1 (ko) * 2015-05-19 2018-04-10 삼성에스디에스 주식회사 데이터 암호화 장치 및 방법, 데이터 복호화 장치 및 방법
US11475350B2 (en) * 2018-01-22 2022-10-18 Google Llc Training user-level differentially private machine-learned models
CN109002861B (zh) * 2018-08-10 2021-11-09 深圳前海微众银行股份有限公司 联邦建模方法、设备及存储介质
CN109167695B (zh) * 2018-10-26 2021-12-28 深圳前海微众银行股份有限公司 基于联邦学习的联盟网络构建方法、设备及可读存储介质
CN109871702B (zh) * 2019-02-18 2024-06-28 深圳前海微众银行股份有限公司 联邦模型训练方法、系统、设备及计算机可读存储介质


Also Published As

Publication number Publication date
CN110601814A (zh) 2019-12-20
CN110601814B (zh) 2021-08-27

Similar Documents

Publication Publication Date Title
WO2021056760A1 (fr) Dispositif, appareil et procédé de chiffrement de données d'apprentissage fédéré et support de stockage lisible
WO2020029585A1 (fr) Procédé et dispositif de modélisation de fédération de réseau neuronal faisant intervenir un apprentissage par transfert et support d'informations
WO2021095998A1 (fr) Procédé et système informatiques sécurisés
WO2020125251A1 (fr) Procédé d'apprentissage de paramètres de modèle basé sur un apprentissage fédéré, dispositif, appareil et support
WO2020147383A1 (fr) Procédé, dispositif et système d'examen et d'approbation de processus utilisant un système de chaîne de blocs, et support de stockage non volatil
WO2013025085A2 (fr) Appareil et procédé permettant de prendre en charge un nuage de famille dans un système informatique en nuage
WO2019132272A1 (fr) Identifiant en tant que service basé sur une chaîne de blocs
WO2021092973A1 (fr) Procédé et dispositif de traitement d'informations sensibles, et support de stockage pouvant être lu
WO2014069783A1 (fr) Procédé d'authentification par mot de passe et appareil pour l'exécuter
WO2017071363A1 (fr) Procédé de partage de mot de passe, système de partage de mot de passe, et dispositif terminal
WO2020155758A1 (fr) Procédé et dispositif de commande de transmission à chiffrement de données, appareil informatique et support de stockage
WO2014063455A1 (fr) Procédé et système de messagerie instantanée
WO2018151390A1 (fr) Dispositif de l'internet des objets
WO2013005989A2 (fr) Procédé et appareil de gestion de clé de groupe pour dispositif mobile
WO2020186775A1 (fr) Procédé, appareil et dispositif de fourniture de données de service, et support de stockage lisible par ordinateur
WO2015158038A1 (fr) Dispositif et procédé de chiffrement à protection contre les attaques par analyse de puissance différentielle
WO2020101325A1 (fr) Système et procédé de chiffrement utilisant une technologie de chiffrement basée sur un groupe de permutation
WO2019182377A1 (fr) Procédé, dispositif électronique et support d'enregistrement lisible par ordinateur permettant de générer des informations d'adresse utilisées pour une transaction de cryptomonnaie à base de chaîne de blocs
WO2023120906A1 (fr) Procédé permettant de recevoir un micrologiciel et procédé permettant de transmettre un micrologiciel
WO2020166879A1 (fr) Appareil permettant de réaliser une conception de seuil sur une clé secrète et son procédé
WO2021027134A1 (fr) Procédé, appareil et dispositif de stockage de données et support d'enregistrement informatique
WO2020032351A1 (fr) Procédé permettant d'établir une identité numérique anonyme
WO2017016272A1 (fr) Procédé, appareil et système de traitement de données de ressources virtuelles
WO2020114184A1 (fr) Procédé, appareil et dispositif de modélisation conjointe, et support de stockage lisible par ordinateur
WO2020116807A1 (fr) Appareil et procédé pour effectuer un calcul non polynomial sur un cryptogramme

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19947156

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19947156

Country of ref document: EP

Kind code of ref document: A1