CN114221811A - Model establishing method, device, equipment and computer storage medium - Google Patents

Model establishing method, device, equipment and computer storage medium

Info

Publication number
CN114221811A
CN114221811A (application CN202111533833.0A)
Authority
CN
China
Prior art keywords
gradient
key
parameter
model
decrypted
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111533833.0A
Other languages
Chinese (zh)
Other versions
CN114221811B (en)
Inventor
雷伟
邓杨
许长山
刘碧春
Current Assignee
CCB Finetech Co Ltd
Original Assignee
CCB Finetech Co Ltd
Priority date
Filing date
Publication date
Application filed by CCB Finetech Co Ltd filed Critical CCB Finetech Co Ltd
Priority to CN202111533833.0A
Publication of CN114221811A
Application granted
Publication of CN114221811B
Legal status: Active
Anticipated expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00: Network architectures or network communication protocols for network security
    • H04L63/04: Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks
    • H04L63/0407: Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the identity of one or more communicating identities is hidden
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00: Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/14: Network analysis or design
    • H04L41/145: Network analysis or design involving simulating, designing, planning or modelling of a network
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00: Network architectures or network communication protocols for network security
    • H04L63/04: Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks
    • H04L63/0428: Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the data content is protected, e.g. by encrypting or encapsulating the payload
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00: Network architectures or network communication protocols for network security
    • H04L63/04: Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks
    • H04L63/0428: Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the data content is protected, e.g. by encrypting or encapsulating the payload
    • H04L63/0478: Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the data content is protected, e.g. by encrypting or encapsulating the payload applying multiple layers of encryption, e.g. nested tunnels or encrypting the content with a first key and then with at least a second key

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Storage Device Security (AREA)

Abstract

The embodiment of the application provides a model building method, apparatus, device and computer storage medium. The method is applied to a first device and includes: encrypting a first parameter with a first key, the first parameter being calculated from first data that is sent by a second device and encrypted with a second key; receiving a first gradient sent by the second device, the first gradient being calculated by the second device from the first parameter encrypted with the first key and the second key; decrypting the first gradient and sending the decrypted first gradient to the second device, so that the second device updates a first model parameter of a preset combined model according to the decrypted first gradient; calculating a second gradient of the first device according to the first parameter; and decrypting the second gradient and updating a second model parameter of the preset combined model according to the decrypted second gradient to obtain a target combined model. According to the embodiments of the application, the combined model can be built without sharing data, eliminating the hidden danger of data leakage.

Description

Model establishing method, device, equipment and computer storage medium
Technical Field
The application belongs to the technical field of federated learning, and particularly relates to a model building method, device, equipment and computer storage medium.
Background
With the development of network technology, the privacy of user data has drawn increasing attention, so data islands now exist between organizations, between organizations and enterprises, and within organizations. To improve service quality, however, multiple organizations and enterprises need to obtain data collaboratively. How to protect data privacy while solving the data island problem has therefore become an urgent issue.
In the prior art, to protect data privacy and solve the data island problem, federated learning defines a machine learning framework in which different data owners can build a combined model without exchanging data with each other, and the built model serves only the local objectives of each party. Besides the data provider and the data application party, this approach requires a federated learning coordinator, which must act as a federated learning participant and also undertake part of the network management and arbitration responsibilities. However, the coordinator must be absolutely trusted; otherwise the participants' data may be leaked. The above approach therefore cannot build a combined model without sharing data and cannot eliminate the hidden danger of data leakage.
Disclosure of Invention
The embodiment of the application provides a model building method, apparatus, device and computer storage medium, which can solve the prior-art problems that a combined model cannot be built without sharing data and that the hidden danger of data leakage cannot be eliminated.
In a first aspect, an embodiment of the present application provides a model building method, where the method is applied to a first device, and the method includes:
encrypting a first parameter with a first key, where the first parameter is calculated from first data that is sent by a second device and encrypted with a second key;
receiving a first gradient sent by the second device, where the first gradient is calculated by the second device from the first parameter encrypted with the first key and the second key;
decrypting the first gradient and sending the decrypted first gradient to the second device, so that the second device updates a first model parameter of a preset joint model according to the decrypted first gradient;
calculating a second gradient of the first device according to the first parameter, where the second gradient is a gradient encrypted with the second key;
and decrypting the second gradient and updating a second model parameter of the preset joint model according to the decrypted second gradient to obtain a target joint model.
In one embodiment, decrypting the first gradient and sending the decrypted first gradient to the second device includes:
decrypting the first gradient by using a decryption key corresponding to the first key to obtain a first gradient encrypted by using a second key;
and sending the first gradient encrypted by the second key to a third device, so that the third device decrypts the first gradient encrypted by the second key by using a decryption key corresponding to the second key, and sends the decrypted first gradient to the second device.
In an embodiment, the decrypting the second gradient and updating a second model parameter of a preset joint model according to the decrypted second gradient to obtain a target joint model includes:
adding a random mask to the second gradient;
sending the second gradient added with the random mask to a third device, so that the third device decrypts the second gradient by adopting a decryption key corresponding to a second key;
and receiving the decrypted second gradient, and updating a second model parameter of a preset combined model according to the decrypted second gradient to obtain a target combined model.
In an embodiment, the first data includes a second parameter calculated by the second device according to a preset learning rate and a preset regularization parameter, and the decrypting the second gradient and updating a second model parameter of a preset joint model according to the decrypted second gradient to obtain a target joint model includes:
calculating a loss function according to the second parameter;
and decrypting the second gradient and the loss function, and updating the second model parameter of the preset joint model according to the decrypted second gradient and the decrypted loss function to obtain the target joint model.
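The patent does not fix a concrete model or loss function, but vertical federated learning schemes of this kind commonly train a logistic regression model, where the preset regularization parameter enters the loss as an L2 penalty. As a hedged illustration only (the choice of logistic loss and every name below are assumptions, not taken from the patent), a loss computed from model parameters might look like:

```python
import math

def regularized_logistic_loss(w, X, y, lam):
    """Illustrative L2-regularized logistic loss (an assumption, not the
    patent's formula): (1/n) * sum_i log(1 + exp(-y_i * <w, x_i>)) + (lam/2) * ||w||^2.

    w   -- model weights (list of floats)
    X   -- samples (list of feature lists)
    y   -- labels in {-1, +1}
    lam -- the preset regularization parameter
    """
    n = len(X)
    data_term = 0.0
    for xi, yi in zip(X, y):
        margin = yi * sum(wj * xj for wj, xj in zip(w, xi))
        data_term += math.log(1.0 + math.exp(-margin))
    reg_term = 0.5 * lam * sum(wj * wj for wj in w)
    return data_term / n + reg_term
```

In the scheme above such a value would be computed on encrypted quantities and decrypted only together with the second gradient.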
The embodiment of the application provides another model building method, which is applied to second equipment and comprises the following steps:
encrypting first data with a second key and sending the encrypted first data to a first device, so that the first device calculates a first parameter and a second gradient according to the first data, decrypts the second gradient, updates a second model parameter of a preset joint model according to the decrypted second gradient, and encrypts the first parameter with a first key;
receiving the first parameter encrypted with a first key;
calculating a first gradient according to the first parameter encrypted with the first key, and sending the first gradient to the first device so that the first device decrypts the first gradient, where the first gradient is a gradient encrypted with the second key and the first key;
receiving the decrypted first gradient sent by the first device;
and updating the first model parameter of the preset joint model according to the decrypted first gradient to obtain the target joint model.
In one embodiment, the receiving the decrypted first gradient sent by the first device includes:
and receiving the decrypted first gradient sent by a third device, where the decrypted first gradient is obtained by the third device decrypting, with a decryption key corresponding to the second key, the first gradient encrypted with the second key, and the first gradient encrypted with the second key is obtained by the first device decrypting the first gradient with a decryption key corresponding to the first key and is received by the third device from the first device.
In a second aspect, an embodiment of the present application provides a model building apparatus, where the apparatus is applied to a first device, and the model building apparatus includes:
the encryption module is used for encrypting a first parameter by adopting a first key, and the first parameter is obtained by calculation according to first data which is sent by second equipment and is encrypted by adopting a second key;
a receiving module, configured to receive a first gradient sent by the second device, where the first gradient is calculated by the second device according to the first parameter encrypted by using a first key and a second key;
the decryption module is used for decrypting the first gradient and sending the decrypted first gradient to the second equipment, so that the second equipment updates the first model parameter of the preset combined model according to the decrypted first gradient;
a calculation module, configured to calculate a second gradient of the first device according to the first parameter, where the second gradient is a gradient encrypted by using a second key;
and the decryption module is further used for decrypting the second gradient and updating a second model parameter of a preset combined model according to the decrypted second gradient to obtain the target combined model.
In one embodiment, the model building apparatus further comprises a sending module;
the decryption module is further configured to decrypt the first gradient by using a decryption key corresponding to the first key to obtain a first gradient encrypted by using a second key;
the sending module is configured to send the first gradient encrypted by using the second key to a third device, so that the third device decrypts the first gradient encrypted by using the second key by using a decryption key corresponding to the second key, and sends the decrypted first gradient to the second device.
In one embodiment, the model building apparatus further comprises an adding module;
the adding module is used for adding a random mask to the second gradient;
the sending module is further configured to send the second gradient to a third device after adding the random mask, so that the third device decrypts the second gradient by using a decryption key corresponding to a second key;
the receiving module is further configured to receive the decrypted second gradient, and update a second model parameter of a preset combined model according to the decrypted second gradient to obtain a target combined model.
In one embodiment, the first data includes a second parameter calculated by the second device according to a preset learning rate and a preset regularization parameter, and the calculation module is further configured to calculate a loss function according to the second parameter;
the decryption module is further configured to decrypt the second gradient and the loss function, and update a second model parameter of a preset combined model according to the decrypted second gradient and the decrypted loss function, so as to obtain a target combined model.
The embodiment of the application provides another model building device, which is applied to a second device, and the model building device comprises:
the encryption module is used for encrypting first data with a second key and sending the encrypted first data to a first device, so that the first device calculates a first parameter and a second gradient according to the first data, decrypts the second gradient, updates a second model parameter of a preset combined model according to the decrypted second gradient, and encrypts the first parameter with a first key;
a receiving module, configured to receive the first parameter encrypted by using a first key;
a calculation module, configured to calculate a first gradient according to the first parameter encrypted by using a first key, and send the first gradient to the first device, so that the first device decrypts the first gradient, where the first gradient is a gradient encrypted by using a second key and the first key;
the receiving module is further configured to receive the decrypted first gradient sent by the first device;
and the updating module is used for updating the first model parameter of the preset combined model according to the decrypted first gradient to obtain the target combined model.
In an embodiment, the receiving module is further configured to receive the decrypted first gradient sent by a third device, where the decrypted first gradient is obtained by the third device decrypting, with a decryption key corresponding to the second key, the first gradient encrypted with the second key, and the first gradient encrypted with the second key is obtained by the first device decrypting the first gradient with a decryption key corresponding to the first key and is received by the third device from the first device.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory storing computer program instructions;
the processor, when executing the computer program instructions, implements a model building method as described in any of the embodiments of the first aspect.
In a fourth aspect, the present application provides a computer storage medium having computer program instructions stored thereon, where the computer program instructions, when executed by a processor, implement the model building method as described in any one of the embodiments of the first aspect.
In a fifth aspect, the present application provides a computer program product, wherein instructions in the computer program product, when executed by a processor of an electronic device, enable the electronic device to perform the model building method as described in any one of the embodiments of the first aspect.
According to the model building method, apparatus, device and computer storage medium of the embodiments of the application, the first device receives first data that is sent by the second device and encrypted with the second key, calculates a first parameter from the encrypted first data, encrypts the first parameter with the first key, and then sends the encrypted first parameter to the second device. The second device calculates a first gradient from the encrypted first parameter and sends it to the first device. The first device receives and decrypts the first gradient and sends the decrypted first gradient back to the second device, so that the second device updates the first model parameter of the preset combined model according to the decrypted first gradient. The first device calculates the second gradient according to the first parameter and decrypts it, and then updates the second model parameter of the preset combined model according to the decrypted second gradient to obtain the target combined model. In this way, the combined model can be built without sharing data, and the hidden danger of data leakage is eliminated.
Drawings
In order to illustrate the technical solutions of the embodiments of the application more clearly, the drawings needed in the embodiments are briefly described below; those skilled in the art can obtain other drawings from these drawings without inventive effort.
FIG. 1 is a schematic flow chart diagram of a modeling method according to an embodiment of the present application;
FIG. 2 is a second schematic flowchart of a modeling method according to an embodiment of the present application;
FIG. 3 is a third schematic flowchart of a modeling method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an algorithm flow of a model building method according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of a modeling apparatus according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a modeling apparatus according to another embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Features and exemplary embodiments of various aspects of the present application will be described in detail below, and in order to make objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are intended to be illustrative only and are not intended to be limiting. It will be apparent to one skilled in the art that the present application may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present application by illustrating examples thereof.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual relationship or order between them. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a/an" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As described in the background, in the prior art a joint model cannot be built without sharing data, and the hidden danger of data leakage cannot be eliminated. To solve this problem, embodiments of the application provide a model building method, apparatus, device and computer storage medium. The first device receives first data that is sent by the second device and encrypted with a second key, calculates a first parameter from the encrypted first data, encrypts the first parameter with a first key, and sends the encrypted first parameter to the second device, so that the second device can calculate a first gradient from the encrypted first parameter. The first device then receives and decrypts the first gradient and sends the decrypted first gradient back to the second device, so that the second device updates a first model parameter of a preset joint model according to the decrypted first gradient. The first device also calculates a second gradient according to the first parameter and decrypts it, and updates a second model parameter of the preset joint model according to the decrypted second gradient to obtain a target joint model. This avoids the problems that a joint model cannot be built without sharing data and that the hidden danger of data leakage cannot be eliminated. A model building method provided by an embodiment of the application is first described below.
Fig. 1 shows a schematic flow chart of a model building method according to an embodiment of the present application.
As shown in fig. 1, the model building method may specifically include the following steps:
s101, the second device encrypts the first data by using the second key.
S102, first data are sent to first equipment.
S103, the first device calculates a first parameter and a second gradient according to the first data.
S104, the first device encrypts the first parameter by using the first key.
And S105, sending the encrypted first parameter to the second equipment.
And S106, the second equipment calculates a first gradient according to the encrypted first parameter.
S107, the first gradient is sent to the first device.
S108, the first device decrypts the first gradient and the second gradient.
And S109, the first device updates the second model parameter of the preset combined model according to the decrypted second gradient.
And S110, sending the decrypted first gradient to the second device.
And S111, the second equipment updates the first model parameter of the preset combined model according to the decrypted first gradient.
In this embodiment, the first device may be a device of the data application party among the federated learning participants, and the second device may be a device of the data provider among the federated learning participants. The first device receives the first data that is sent by the second device and encrypted with the second key, calculates the first parameter from the encrypted first data, encrypts the first parameter with the first key, and then sends the encrypted first parameter to the second device. The second device calculates the first gradient from the encrypted first parameter and sends it to the first device. The first device receives and decrypts the first gradient and sends the decrypted first gradient back to the second device, so that the second device updates the first model parameter of the preset combined model according to the decrypted first gradient. The first device calculates the second gradient according to the first parameter and decrypts it, and then updates the second model parameter of the preset combined model according to the decrypted second gradient to obtain the target combined model. In this way, the combined model can be built without sharing data, and the hidden danger of data leakage is eliminated.
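The message flow of S101 to S111 can be sketched end to end with a toy additive mask Enc_k(x) = x + k standing in for the (typically additively homomorphic) encryption such schemes rely on. Every concrete value, the linear computations, and the update rule below are illustrative assumptions, not the patent's algorithm:

```python
# Toy sketch of the S101-S111 message flow. "Encryption" is an additive
# mask Enc_k(x) = x + k, standing in for additively homomorphic encryption.

def enc(x, k): return x + k
def dec(c, k): return c - k

key1 = 11.0   # created by the first device (data application party)
key2 = 7.0    # created on the second/third party side

# S101-S102: the second device encrypts its first data with key2 and sends it.
first_data = 3.0
data_c2 = enc(first_data, key2)

# S103: the first device computes a first parameter and a second gradient from
# the encrypted data (both therefore still carry the key2 mask).
param_c2 = data_c2 + 1.0        # illustrative additive computation
grad2_c2 = param_c2 - 0.5

# S104-S105: the first device adds its key1 layer to the parameter and sends it.
param_c12 = enc(param_c2, key1)

# S106-S107: the second device computes a first gradient on the doubly
# encrypted parameter and returns it (the mask layers are preserved).
grad1_c12 = param_c12 - 2.0

# S108: the first device peels off its key1 layer; the key2 layer is removed
# by whoever holds key2 (the third device, in later embodiments).
grad1_c2 = dec(grad1_c12, key1)
grad1 = dec(grad1_c2, key2)
grad2 = dec(grad2_c2, key2)

# S109-S111: each side updates its own model parameter with its plain gradient.
lr = 0.1
model_param_first = 1.0 - lr * grad2    # first device's second model parameter
model_param_second = 1.0 - lr * grad1   # second device's first model parameter
```

Because the masks are never removed on the wire, neither party sees the other's plain data; only the holder of each key can strip its layer.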
Based on this, fig. 2 shows a schematic flow chart of a model building method according to another embodiment of the present application.
As shown in fig. 2, the model building method is applied to the first device, and may specifically include the following steps:
S210, encrypting the first parameter by using the first key, where the first parameter is calculated according to the first data that is sent by the second device and encrypted by using the second key.
The first device may be a device of the data application party among the federated learning participants, and the second device may be a device of the data provider. The device of the data application party receives the first data that is sent by the device of the data provider and encrypted with the second key, calculates the first parameter from the encrypted first data based on a preset algorithm, and then encrypts the first parameter with the first key. The first key may be obtained by the device of the data application party creating an encryption key pair, the second key may be the public key of an encryption key pair created by a third party, and the first data may be parameters provided by the device of the data provider for updating the joint model.
And S220, receiving a first gradient sent by the second equipment, wherein the first gradient is obtained by the second equipment through calculation according to a first parameter encrypted by using a first key and a second key.
In this embodiment of the application, the first parameter may be a parameter encrypted by using a second key and a first key, the first device sends the encrypted first parameter to the second device, and the second device calculates a first gradient according to the encrypted first parameter, where the first gradient may be a gradient at a second device end.
And S230, decrypting the first gradient, and sending the decrypted first gradient to the second device, so that the second device updates the first model parameter of the preset joint model according to the decrypted first gradient.
In this embodiment of the application, the first device decrypts the first gradient and sends the decrypted first gradient to the second device. The decrypted first gradient may be used by the second device to update the first model parameter of the preset joint model, where the first model parameter may be a parameter of the first model, at the second device end, in the preset joint model.
And S240, calculating a second gradient of the first device according to the first parameter, where the second gradient is a gradient encrypted with the second key.
Wherein the second gradient may be calculated by the first device based on the first parameter encrypted with the second key.
And S250, decrypting the second gradient, and updating a second model parameter of the preset combined model according to the decrypted second gradient to obtain the target combined model.
The second gradient may be used to update a second model parameter of the preset joint model, the second model parameter may be a parameter of a second model at the first device end in the preset joint model, and the second model is combined with the first model to obtain the target joint model.
In this way, the first device receives the first data that is sent by the second device and encrypted with the second key, calculates the first parameter from the encrypted first data, encrypts the first parameter with the first key, and then sends the encrypted first parameter to the second device. The second device calculates the first gradient from the encrypted first parameter and sends it to the first device. The first device receives and decrypts the first gradient and sends the decrypted first gradient back to the second device, so that the second device updates the first model parameter of the preset joint model according to the decrypted first gradient. The first device calculates the second gradient according to the first parameter and decrypts it, and then updates the second model parameter of the preset joint model according to the decrypted second gradient to obtain the target joint model. The joint model can thus be built without sharing data, and the hidden danger of data leakage is eliminated.
Based on this, in one embodiment, the above S230: decrypting the first gradient and sending the decrypted first gradient to the second device may specifically include:
decrypting the first gradient by using a decryption key corresponding to the first key to obtain a first gradient encrypted by using a second key;
and sending the first gradient encrypted by the second key to the third device, so that the third device decrypts the first gradient encrypted by the second key by using the decryption key corresponding to the second key, and sends the decrypted first gradient to the second device.
The first gradient is encrypted with both the second key and the first key; after the first device decrypts it with the decryption key corresponding to the first key, what remains is the first gradient encrypted with the second key, which is then sent to the third device. The third device may be a device of a coordinator that provides support or auxiliary functions for the activities of the federal learning participants. As shown in fig. 4, the third device and the second device may belong to the same party, may trust each other, and may be deployed in the same environment. After receiving the first gradient encrypted with the second key, the third device decrypts it with the decryption key corresponding to the second key and sends the fully decrypted first gradient to the second device.
Therefore, the first gradient encrypted by the second key is obtained by decrypting the first gradient by using the decryption key corresponding to the first key, and the first gradient encrypted by using the second key is further sent to the third device, so that the third device decrypts the first gradient encrypted by using the second key by using the decryption key corresponding to the second key, and sends the decrypted first gradient to the second device.
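The two-layer decryption order above can be sketched with a toy model in which "encryption" is just adding a key-specific secret offset (not real cryptography; the key values are invented for illustration):

```python
# Stand-ins for the first key (held by the first device) and the
# second key (held by the third device / coordinator).
KEY1, KEY2 = 17.0, 29.0

def enc(x, k):  # toy "encrypt": add the key's secret offset
    return x + k

def dec(x, k):  # toy "decrypt": remove the key's secret offset
    return x - k

first_gradient = 3.5
# The first gradient arrives encrypted with the second key, then the first key.
double = enc(enc(first_gradient, KEY2), KEY1)
after_first_device = dec(double, KEY1)        # first device peels off the first key
after_third_device = dec(after_first_device, KEY2)  # third device peels off the second key
assert after_third_device == first_gradient
```

The order matters: the first device can only remove the outer (first-key) layer, so the inner (second-key) layer still protects the gradient on its way to the third device.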
In one embodiment, the step S250: decrypting the second gradient, and updating a second model parameter of the preset joint model according to the decrypted second gradient to obtain the target joint model, which may specifically include:
adding a random mask to the second gradient;
sending the second gradient added with the random mask to the third device, so that the third device decrypts the second gradient by adopting a decryption key corresponding to the second key;
and receiving the decrypted second gradient, and updating a second model parameter of the preset combined model according to the decrypted second gradient to obtain the target combined model.
The random mask may be a mask used for encrypting the second gradient, the first device sends the second gradient added with the random mask to the third device, and the third device decrypts the second gradient by using a decryption key corresponding to the second key and sends the second gradient to the first device, so that the first device updates the second model parameter of the preset combined model according to the decrypted second gradient to obtain the target combined model.
Therefore, the random mask is added to the second gradient before it is sent to the third device, and the third device decrypts the second gradient with the decryption key corresponding to the second key. The first device then updates the second model parameter of the preset joint model according to the decrypted second gradient to obtain the target joint model. Because the second gradient received by the third device is blinded by the random mask, the third device can perform the decryption without learning the gradient itself, so the joint model is established without leaking data.
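The blinding step above can be sketched as follows (a minimal plaintext sketch; in the real protocol the mask is added inside the ciphertext, and the mask range here is an arbitrary choice):

```python
import random

def mask_and_recover(second_gradient):
    """The first device adds a random mask r before sending the gradient to the
    third device, so after decryption the third device sees only g + r; the
    first device alone knows r and can subtract it."""
    r = random.uniform(1.0, 1e6)       # random mask, known only to the first device
    blinded = second_gradient + r      # value the third device decrypts and returns
    return blinded - r                 # first device removes the mask

g = 0.42
assert abs(mask_and_recover(g) - g) < 1e-6
```

The third device thus serves purely as a decryption oracle: it learns a uniformly blinded value, not the gradient.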
In an embodiment, the first data includes a second parameter calculated by the second device according to a preset learning rate and a preset regularization parameter, and S250: decrypting the second gradient, and updating a second model parameter of the preset joint model according to the decrypted second gradient to obtain the target joint model, which may specifically include:
calculating a loss function according to the second parameter;
and decrypting the second gradient and the loss function, and updating a second model parameter of the preset combined model according to the decrypted second gradient and the decrypted loss function to obtain the target combined model.
The second parameter may be a parameter encrypted by a second key and calculated by the second device according to a preset learning rate and a preset regularization parameter, and may be used to calculate a loss function, decrypt the second gradient and the loss function, and update the second model parameter of the preset joint model according to the decrypted second gradient and the loss function, so as to obtain the target joint model by combining with the updated first model parameter.
Therefore, the target combined model can be obtained by calculating the loss function according to the second parameter, decrypting the second gradient and the loss function, and updating the second model parameter of the preset combined model according to the decrypted second gradient and the decrypted loss function, so that the updating of the second model parameter is more accurate, and the target combined model with better effect is obtained.
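A plausible shape for the loss computed from the second parameter, assuming the squared-error-plus-L2-regularization loss suggested by the regularization parameter in the text (the symbol names and the exact form are assumptions, not taken verbatim from the patent):

```python
def joint_loss(u_a, u_b, y, theta_a, theta_b, lam):
    """Sketch: residual d_i = u_A^(i) + u_B^(i) - y^(i); squared-error loss
    plus L2 regularization on both parties' model parameters."""
    residuals = [ua + ub - yi for ua, ub, yi in zip(u_a, u_b, y)]
    reg = lam * (sum(t * t for t in theta_a) + sum(t * t for t in theta_b))
    return sum(d * d for d in residuals) + reg

# one sample, one parameter per party: (1.0 + 0.5 - 2.0)^2 + 0.1*(0.09 + 0.04)
assert abs(joint_loss([1.0], [0.5], [2.0], [0.3], [0.2], 0.1) - 0.263) < 1e-9
```

Tracking this loss alongside the gradient is what lets the parties judge convergence and update the second model parameter more accurately.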
Based on this, an embodiment of the present application provides another model building method, as shown in fig. 3, where the model building method is applied to a second device, and specifically includes the following steps:
S310, encrypting the first data with a second key and sending the first data to the first device, so that the first device calculates a first parameter and a second gradient according to the first data, decrypts the second gradient, updates a second model parameter of the preset joint model according to the decrypted second gradient, and encrypts the first parameter with the first key.
The second device encrypts the first data with the second key of the encryption key pair created by the third device and sends the first data to the first device. After receiving the first data, the first device calculates the first parameter from the first data, encrypts the first parameter with the first key, and also calculates and decrypts the second gradient, so that the second model parameter of the preset joint model is updated according to the decrypted second gradient.
S320, receiving the first parameter encrypted by the first key.
The first device calculates the first parameter from the first data, encrypts it with the first key, and then sends the first parameter encrypted with the first key to the second device.
S330, calculating a first gradient according to the first parameter encrypted by the first key, and sending the first gradient to the first device for the first device to decrypt the first gradient, wherein the first gradient is the gradient encrypted by the second key and the first key.
S340, receiving the decrypted first gradient sent by the first device.
In an embodiment, the S340 may specifically include:
and receiving the decrypted first gradient sent by the third device, wherein the first gradient is a gradient obtained by the third device decrypting, with the decryption key corresponding to the second key, the first gradient encrypted with the second key, and the first gradient encrypted with the second key is a gradient received by the third device after the first device decrypted the first gradient with the decryption key corresponding to the first key.
In this embodiment of the application, the first device decrypts the first gradient by using the decryption key corresponding to the first key to obtain the first gradient encrypted by using the second key, and sends the first gradient encrypted by using the second key to the third device, and then the third device decrypts the first gradient encrypted by using the second key by using the decryption key corresponding to the second key, and sends the decrypted first gradient to the second device, so that the second device receives the decrypted first gradient sent by the third device.
Therefore, the second device receives the decrypted first gradient sent by the third device, where the first gradient is obtained by the third device decrypting, with the decryption key corresponding to the second key, the first gradient encrypted with the second key, and the first gradient encrypted with the second key is what the third device received from the first device after the first device decrypted the first gradient with the decryption key corresponding to the first key. In this way, the third device never receives the initial data of the first device but only the calculated first gradient, which ensures that no data is leaked while the joint model is established.
And S350, updating the first model parameter of the preset combined model according to the decrypted first gradient to obtain the target combined model.
And after the first model parameters of the first model in the preset combined model are updated, the first model and the second model with the updated second model parameters are combined to obtain the target combined model.
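Once a decrypted gradient is in hand, updating a model parameter is a plain gradient step, theta <- theta - eta * grad (the learning rate value below is an arbitrary illustration):

```python
def update_params(theta, grad, eta):
    """One plain gradient-descent step applied after decryption."""
    return [t - eta * g for t, g in zip(theta, grad)]

new = update_params([1.0, 2.0], [0.5, -1.0], eta=0.1)
assert all(abs(a - b) < 1e-12 for a, b in zip(new, [0.95, 2.1]))
```

Both the first and second model parameters are updated this way, each by its own device, and the updated models together form the target joint model.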
Therefore, the second device encrypts the first data with the second key and sends it to the first device, so that the first device can calculate the first parameter and the second gradient from the first data; the first device decrypts the second gradient, updates the second model parameter of the preset joint model according to the decrypted second gradient, encrypts the first parameter with the first key, and sends it to the second device. The second device calculates the first gradient from the first parameter encrypted with the first key and sends it to the first device, where it is decrypted and returned to the second device. The second device then updates the first model parameter of the preset joint model according to the decrypted first gradient to obtain the target joint model. In this way, the joint model can be established without sharing data, and the hidden danger of data leakage is resolved.
To better describe the whole scheme, based on the various embodiments described above, a specific example is given with reference to fig. 4, which is an algorithm flow diagram of a model building method. When the system is deployed, the device A of the data provider among the federal learning participants and the device of the federal learning coordinator are deployed under the same party. The device of the coordinator creates an encryption key pair and sends the first public key C of the pair to the device A of the data provider and to the device B of the data application party among the federal learning participants. The device A of the data provider calculates $u_A^{(i)}$ and $L_A$ according to the following formulas:

$$u_A^{(i)} = \theta_A x_A^{(i)}$$

$$L_A = \sum_i \left(u_A^{(i)}\right)^2 + \lambda \theta_A^2$$

where $\theta_A$ are the local model parameters of the data provider, $x_A^{(i)}$ is the $i$-th sample in the feature space of the data provider, $\lambda$ is the regularization parameter, $u_A^{(i)}$ is an intermediate parameter for calculating the gradient on the device A side of the data provider, and $L_A$ is a parameter used to calculate the loss function. The device A of the data provider encrypts $u_A^{(i)}$ and $L_A$ with the first public key C to obtain $[u_A^{(i)}]_C$ and $[L_A]_C$, which are transmitted to the device B of the data application party. After receiving the data transmitted from the device A of the data provider, the device B of the data application party calculates according to the following formulas:

$$u_B^{(i)} = \theta_B x_B^{(i)}$$

$$[d_i]_C = [u_A^{(i)}]_C + \left[u_B^{(i)} - y^{(i)}\right]_C$$

$$[L]_C = [L_A]_C + \left[\sum_i \left(u_B^{(i)} - y^{(i)}\right)^2 + \lambda \theta_B^2\right]_C + 2\sum_i \left(u_B^{(i)} - y^{(i)}\right)\left[u_A^{(i)}\right]_C$$

$$[g_B]_C = \sum_i [d_i]_C\, x_B^{(i)} + \left[\lambda \theta_B\right]_C$$

where $u_B^{(i)}$ is an intermediate parameter for calculating the gradient on the device B side of the data application party, $u_B^{(i)}$ and $y^{(i)}$ are parameters used to calculate the loss function, $x_B^{(i)}$ is the $i$-th sample in the feature space of the data application party, $[L]_C$ is the loss function encrypted with the first public key C, $[d_i]_C$ is the second intermediate parameter for calculating the gradient, encrypted with the first public key C, and $[g_B]_C$ is the gradient $b$ of the device B side of the data application party encrypted with the first public key C. While calculating the above parameters, the device B of the data application party creates an encryption key pair of its own, encrypts $[d_i]_C$ with the second public key B to obtain $[[d_i]_C]_B$, and transmits $[[d_i]_C]_B$ together with the second public key B to the device A of the data provider for calculating the gradient $a$ of the device A side. The device A of the data provider receives $[[d_i]_C]_B$ and calculates its gradient $a$ using the following formula:

$$[[g_A]_C]_B = \sum_i [[d_i]_C]_B\, x_A^{(i)} + \left[\left[\lambda \theta_A\right]_C\right]_B$$

The device A of the data provider sends the calculated gradient $a$ back to the device B of the data application party, and the device B decrypts it with the decryption key corresponding to the second public key B to obtain $[g_A]_C$. The device B of the data application party then adds random masks to $[g_A]_C$ and $[g_B]_C$ and sends them to the device of the coordinator, while $[L]_C$ is sent directly to the device of the coordinator without a mask. The device of the coordinator decrypts $[L]_C$ and the masked $[g_A]_C$ and $[g_B]_C$ with the decryption key corresponding to the first public key C, and then transmits the decrypted gradients and the decrypted loss function to the device B of the data application party and the device A of the data provider respectively, so that each updates its own model parameters after the random masks are removed. The device A of the data provider updates its parameter $\theta_A$ by the formula

$$\theta_A \leftarrow \theta_A - \eta\, g_A$$

and the device B of the data application party updates its parameter $\theta_B$ by the formula

$$\theta_B \leftarrow \theta_B - \eta\, g_B$$

where $\eta$ is the preset learning rate, finally obtaining the target model, namely the joint model.
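The fig. 4 flow can be simulated end to end in the clear on toy data (the encryption is omitted; comments mark which key would protect each value in the real protocol, and the data, learning rate, and regularization values are all invented for illustration):

```python
# Plaintext simulation of the fig. 4 flow: A holds feature x_A, B holds
# feature x_B and labels y; the joint model predicts theta_A*x_A + theta_B*x_B.
xA = [1.0, 2.0, 3.0]          # data provider's feature column
xB = [0.5, 1.0, 1.5]          # data application party's feature column
y  = [2.0, 4.0, 6.0]          # labels held by the data application party
tA, tB = 0.0, 0.0             # theta_A, theta_B
eta, lam = 0.05, 0.01         # learning rate eta, regularization lambda

for _ in range(200):
    uA = [tA * x for x in xA]                              # A side (sent under key C)
    uB = [tB * x for x in xB]                              # B side
    d = [a + b - yi for a, b, yi in zip(uA, uB, y)]        # residuals [d_i]_C
    gB = sum(di * xi for di, xi in zip(d, xB)) + lam * tB  # gradient b (under key C)
    gA = sum(di * xi for di, xi in zip(d, xA)) + lam * tA  # gradient a ([[.]_C]_B)
    # ...coordinator decrypts the (masked) gradients; each party then updates:
    tA -= eta * gA
    tB -= eta * gB

pred = [tA * a + tB * b for a, b in zip(xA, xB)]
assert max(abs(p - yi) for p, yi in zip(pred, y)) < 0.2
```

After 200 iterations the joint prediction fits the labels closely (up to the small bias introduced by the regularization term), matching what the encrypted protocol would compute.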
Fig. 5 is a schematic diagram illustrating a structure of a model building apparatus 500 according to an exemplary embodiment.
As shown in fig. 5, the model building apparatus 500 is applied to a first device, and may specifically include:
the encryption module 501 is configured to encrypt a first parameter with a first key, where the first parameter is obtained by calculation according to first data sent by a second device and encrypted with a second key;
a receiving module 502, configured to receive a first gradient sent by the second device, where the first gradient is calculated by the second device according to the first parameter encrypted by using a first key and a second key;
a decryption module 503, configured to decrypt the first gradient and send the decrypted first gradient to the second device, so that the second device updates the first model parameter of the preset joint model according to the decrypted first gradient;
a calculating module 504, configured to calculate a second gradient of the first device according to the first parameter, where the second gradient is a gradient encrypted by using a second key;
the decryption module 503 is further configured to decrypt the second gradient, and update a second model parameter of a preset combined model according to the decrypted second gradient, so as to obtain a target combined model.
In one embodiment, the model building apparatus 500 may further include a sending module;
the decryption module 503 is further configured to decrypt the first gradient by using a decryption key corresponding to the first key, so as to obtain a first gradient encrypted by using the second key;
and the sending module is used for sending the first gradient encrypted by the second key to the third equipment, so that the third equipment decrypts the first gradient encrypted by the second key by using the decryption key corresponding to the second key, and sends the decrypted first gradient to the second equipment.
In one embodiment, the model building apparatus 500 may further include an add module;
an adding module for adding a random mask to the second gradient;
the sending module is further configured to send the second gradient to the third device after the random mask is added, so that the third device decrypts the second gradient by using a decryption key corresponding to the second key;
the receiving module 502 is further configured to receive the decrypted second gradient, and update a second model parameter of the preset combination model according to the decrypted second gradient to obtain the target combination model.
In one embodiment, the first data includes a second parameter calculated by the second device according to a preset learning rate and a preset regularization parameter;
a calculating module 504, further configured to calculate a loss function according to the second parameter;
the decryption module 503 is further configured to decrypt the second gradient and the loss function, and update the second model parameter of the preset joint model according to the decrypted second gradient and the decrypted loss function, so as to obtain the target joint model.
The embodiment of the present application provides another model building apparatus 600, where the model building apparatus 600 is applied to a second device, and specifically may include:
the encryption module 601 is configured to encrypt first data by using a second key, send the first data to the first device, calculate a first parameter and a second gradient according to the first data, decrypt the second gradient, update a second model parameter of the preset joint model according to the decrypted second gradient, and encrypt the first parameter by using the first key;
a receiving module 602, configured to receive a first parameter encrypted by using a first key;
a calculating module 603, configured to calculate a first gradient according to a first parameter encrypted by using a first key, and send the first gradient to a first device, where the first gradient is used for the first device to decrypt the first gradient, and the first gradient is a gradient encrypted by using a second key and the first key;
a receiving module 602, further configured to receive the decrypted first gradient sent by the first device;
and an updating module 604, configured to update a first model parameter of the preset joint model according to the decrypted first gradient, so as to obtain the target joint model.
In an embodiment, the receiving module 602 is further configured to receive the decrypted first gradient sent by the third device, where the first gradient is a gradient obtained by the third device decrypting, with the decryption key corresponding to the second key, the first gradient encrypted with the second key, and the first gradient encrypted with the second key is a gradient received by the third device after the first device decrypted the first gradient with the decryption key corresponding to the first key.
Therefore, the first device receives the first data that is sent by the second device and encrypted with the second key, calculates the first parameter from that data, encrypts the first parameter with the first key, and sends the encrypted first parameter to the second device, so that the second device can calculate the first gradient from the encrypted first parameter. The first device receives and decrypts the first gradient sent by the second device and returns the decrypted first gradient to the second device, so that the second device updates the first model parameter of the preset joint model according to the decrypted first gradient. The first device further calculates the second gradient from the first parameter, decrypts it, and updates the second model parameter of the preset joint model according to the decrypted second gradient to obtain the target joint model. Thus the joint model can be established without sharing data, and the hidden danger of data leakage is resolved.
Fig. 7 shows a hardware schematic diagram of an electronic device provided in an embodiment of the present application.
The electronic device may include a processor 701 and a memory 702 that stores computer program instructions.
Specifically, the processor 701 may include a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
Memory 702 may include mass storage for data or instructions. By way of example, and not limitation, memory 702 may include a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disk, a magneto-optical disk, magnetic tape, a Universal Serial Bus (USB) drive, or a combination of two or more of these. Memory 702 may include removable or non-removable (or fixed) media, where appropriate. The memory 702 may be internal or external to the integrated gateway disaster recovery device, where appropriate. In a particular embodiment, the memory 702 is non-volatile solid-state memory.
The memory may include Read Only Memory (ROM), Random Access Memory (RAM), magnetic disk storage media devices, optical storage media devices, flash memory devices, electrical, optical, or other physical/tangible memory storage devices. Thus, in general, the memory includes one or more tangible (non-transitory) computer-readable storage media (e.g., memory devices) encoded with software comprising computer-executable instructions and when the software is executed (e.g., by one or more processors), it is operable to perform operations described with reference to the methods according to an aspect of the present disclosure.
The processor 701 may implement any of the model building methods described in the embodiments above by reading and executing computer program instructions stored in the memory 702.
In one example, the electronic device may also include a communication interface 703 and a bus 710. As shown in fig. 7, the processor 701, the memory 702, and the communication interface 703 are connected by a bus 710 to complete mutual communication.
The communication interface 703 is mainly used for implementing communication between modules, apparatuses, units and/or devices in this embodiment of the application.
Bus 710 comprises hardware, software, or both that couple the components of the model building apparatus to each other. By way of example, and not limitation, the bus may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front Side Bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, another suitable bus, or a combination of two or more of these. Bus 710 may include one or more buses, where appropriate. Although specific buses are described and shown in the embodiments of the application, any suitable buses or interconnects are contemplated by the application.
The electronic device may calculate the first parameter based on the first data encrypted with the second key and encrypt the first parameter with the first key, thereby performing the model building method of the embodiments of the present application described in conjunction with fig. 1.
In addition, in combination with the model building method in the foregoing embodiments, the embodiments of the present application may provide a computer storage medium to implement. The computer storage medium having computer program instructions stored thereon; the computer program instructions, when executed by a processor, implement any of the model building methods in the above embodiments.
It is to be understood that the present application is not limited to the particular arrangements and instrumentality described above and shown in the attached drawings. A detailed description of known methods is omitted herein for the sake of brevity. In the above embodiments, several specific steps are described and shown as examples. However, the method processes of the present application are not limited to the specific steps described and illustrated, and those skilled in the art can make various changes, modifications, and additions or change the order between the steps after comprehending the spirit of the present application.
The functional blocks shown in the above-described structural block diagrams may be implemented as hardware, software, firmware, or a combination thereof. When implemented in hardware, it may be, for example, an electronic circuit, an Application Specific Integrated Circuit (ASIC), suitable firmware, plug-in, function card, or the like. When implemented in software, the elements of the present application are the programs or code segments used to perform the required tasks. The program or code segments may be stored in a machine-readable medium or transmitted by a data signal carried in a carrier wave over a transmission medium or a communication link. A "machine-readable medium" may include any medium that can store or transfer information. Examples of a machine-readable medium include electronic circuits, semiconductor memory devices, ROM, flash memory, Erasable ROM (EROM), floppy disks, CD-ROMs, optical disks, hard disks, fiber optic media, Radio Frequency (RF) links, and so forth. The code segments may be downloaded via computer networks such as the internet, intranet, etc.
It should also be noted that the exemplary embodiments mentioned in this application describe some methods or systems based on a series of steps or devices. However, the present application is not limited to the order of the above-described steps, that is, the steps may be performed in the order mentioned in the embodiments, may be performed in an order different from the order in the embodiments, or may be performed simultaneously.
Aspects of the present disclosure are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, enable the implementation of the functions/acts specified in the flowchart and/or block diagram block or blocks. Such a processor may be, but is not limited to, a general purpose processor, a special purpose processor, an application specific processor, or a field programmable logic circuit. It will also be understood that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware for performing the specified functions or acts, or combinations of special purpose hardware and computer instructions.
As described above, only the specific embodiments of the present application are provided, and it can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system, the module and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. It should be understood that the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the present application, and these modifications or substitutions should be covered within the scope of the present application.

Claims (11)

1. A model building method, applied to a first device, the method comprising:
encrypting a first parameter by using a first key, wherein the first parameter is obtained by calculation according to first data which is sent by second equipment and encrypted by using a second key;
receiving a first gradient sent by the second equipment, wherein the first gradient is obtained by the second equipment through calculation according to the first parameter encrypted by adopting a first key and a second key;
decrypting the first gradient and sending the decrypted first gradient to the second device, so that the second device updates a first model parameter of a preset joint model according to the decrypted first gradient;
calculating a second gradient of the first equipment according to the first parameter, wherein the second gradient is a gradient encrypted by a second secret key;
and decrypting the second gradient, and updating a second model parameter of a preset combined model according to the decrypted second gradient to obtain the target combined model.
2. The method of claim 1, wherein decrypting the first gradient and sending the decrypted first gradient to the second device comprises:
decrypting the first gradient by using a decryption key corresponding to the first key to obtain a first gradient encrypted by using a second key;
and sending the first gradient encrypted by the second key to a third device, so that the third device decrypts the first gradient encrypted by the second key by using a decryption key corresponding to the second key, and sends the decrypted first gradient to the second device.
3. The method according to claim 1, wherein the decrypting the second gradient and updating a second model parameter of a preset joint model according to the decrypted second gradient to obtain a target joint model comprises:
adding a random mask to the second gradient;
sending the second gradient added with the random mask to a third device, so that the third device decrypts the second gradient by adopting a decryption key corresponding to a second key;
and receiving the decrypted second gradient, and updating a second model parameter of a preset combined model according to the decrypted second gradient to obtain a target combined model.
4. The method according to any one of claims 1 to 3, wherein the first data comprises a second parameter calculated by the second device according to a preset learning rate and a preset regularization parameter; the decrypting the second gradient and updating a second model parameter of a preset joint model according to the decrypted second gradient to obtain a target joint model, including:
calculating a loss function according to the second parameter;
and decrypting the second gradient and the loss function, and updating a second model parameter of a preset combined model according to the decrypted second gradient and the decrypted loss function to obtain a target combined model.
5. A model building method, applied to a second device, the method comprising:
encrypting first data with a second key and sending the encrypted first data to a first device, so that the first device calculates a first parameter and a second gradient according to the first data, decrypts the second gradient, updates a second model parameter of a preset joint model according to the decrypted second gradient, and encrypts the first parameter with a first key;
receiving the first parameter encrypted with the first key;
calculating a first gradient according to the first parameter encrypted with the first key, and sending the first gradient to the first device so that the first device decrypts the first gradient, wherein the first gradient is encrypted with both the second key and the first key;
receiving the decrypted first gradient sent by the first device;
and updating a first model parameter of the preset joint model according to the decrypted first gradient to obtain a target joint model.
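The exchange in claim 5 has the shape of vertically partitioned joint training: each device holds its own feature columns, the shared residual-like "first parameter" travels between them, and each device updates only its own share of the joint model. The sketch below shows that shape with the cryptography deliberately elided; all data, sizes, and the least-squares objective are assumptions for illustration.

```python
import numpy as np

# Two parties hold vertically split features of the same samples.
rng = np.random.default_rng(0)
n = 64
X_first, X_second = rng.normal(size=(n, 2)), rng.normal(size=(n, 2))
w_true = np.array([1.0, -1.0, 2.0, 0.5])
y = np.hstack([X_first, X_second]) @ w_true

w_first = np.zeros(2)    # the first device's share of the joint model
w_second = np.zeros(2)   # the second device's share of the joint model
lr = 0.05

def joint_loss():
    resid = X_first @ w_first + X_second @ w_second - y
    return 0.5 * np.mean(resid ** 2)

start = joint_loss()
for _ in range(300):
    # Shared residual: in the patent this quantity crosses the wire encrypted.
    resid = X_first @ w_first + X_second @ w_second - y
    w_first -= lr * X_first.T @ resid / n    # first device's gradient step
    w_second -= lr * X_second.T @ resid / n  # second device's gradient step
assert joint_loss() < start
```

Neither party ever needs the other's raw features, only the shared residual, which is what the key/mask machinery of the claims protects in transit.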
6. The method according to claim 5, wherein the receiving the decrypted first gradient sent by the first device comprises:
receiving the decrypted first gradient sent by a third device, wherein the decrypted first gradient is obtained by the third device decrypting, with a decryption key corresponding to the second key, the first gradient encrypted with the second key, and the first gradient encrypted with the second key is obtained by the first device decrypting the received first gradient with a decryption key corresponding to the first key.
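Claim 6 describes a two-layer decryption chain: the first gradient travels under both keys, the first device strips the first-key layer, and the third device strips the second-key layer before the result reaches the second device. The toy sketch below uses additive masking as a stand-in for the (unspecified) commutative encryption; the key values are hypothetical.

```python
# Toy "encryption" by additive masking, standing in for a real layered scheme.
KEY1, KEY2 = 13, 7   # hypothetical keys for the sketch

def enc(x, k):
    return x + k

def dec(c, k):
    return c - k

first_gradient = 42
both_keys = enc(enc(first_gradient, KEY2), KEY1)   # as produced by the second device
second_key_only = dec(both_keys, KEY1)             # first device removes the key-1 layer
plaintext = dec(second_key_only, KEY2)             # third device removes the key-2 layer
assert plaintext == first_gradient                 # delivered to the second device
```

Because the layers commute, neither intermediate party ever holds the gradient in the clear together with both keys.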
7. A model building apparatus, applied to a first device, the apparatus comprising:
an encryption module, configured to encrypt a first parameter with a first key, wherein the first parameter is calculated according to first data that is sent by a second device and encrypted with a second key;
a receiving module, configured to receive a first gradient sent by the second device, wherein the first gradient is calculated by the second device according to the first parameter encrypted with the first key, and is encrypted with both the second key and the first key;
a decryption module, configured to decrypt the first gradient and send the decrypted first gradient to the second device, so that the second device updates a first model parameter of a preset joint model according to the decrypted first gradient;
and a calculation module, configured to calculate a second gradient of the first device according to the first parameter, wherein the second gradient is encrypted with the second key;
wherein the decryption module is further configured to decrypt the second gradient and update a second model parameter of the preset joint model according to the decrypted second gradient to obtain a target joint model.
8. A model building apparatus, applied to a second device, the apparatus comprising:
an encryption module, configured to encrypt first data with a second key and send the encrypted first data to a first device, so that the first device calculates a first parameter and a second gradient according to the first data, decrypts the second gradient, updates a second model parameter of a preset joint model according to the decrypted second gradient, and encrypts the first parameter with a first key;
a receiving module, configured to receive the first parameter encrypted with the first key;
a calculation module, configured to calculate a first gradient according to the first parameter encrypted with the first key and send the first gradient to the first device, so that the first device decrypts the first gradient, wherein the first gradient is encrypted with both the second key and the first key;
wherein the receiving module is further configured to receive the decrypted first gradient sent by the first device;
and an updating module, configured to update a first model parameter of the preset joint model according to the decrypted first gradient to obtain a target joint model.
9. An electronic device, comprising: a processor, and a memory storing computer program instructions; wherein the processor reads and executes the computer program instructions to implement the model building method according to any one of claims 1 to 6.
10. A computer-readable storage medium having computer program instructions stored thereon which, when executed by a processor, implement the model building method according to any one of claims 1 to 6.
11. A computer program product, wherein instructions in the computer program product, when executed by a processor of an electronic device, cause the electronic device to perform the model building method according to any one of claims 1 to 6.
CN202111533833.0A 2021-12-15 2021-12-15 Model building method, device, equipment and computer storage medium Active CN114221811B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111533833.0A CN114221811B (en) 2021-12-15 2021-12-15 Model building method, device, equipment and computer storage medium


Publications (2)

Publication Number Publication Date
CN114221811A true CN114221811A (en) 2022-03-22
CN114221811B CN114221811B (en) 2023-05-26

Family

ID=80702289


Country Status (1)

Country Link
CN (1) CN114221811B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112288002A (en) * 2020-10-28 2021-01-29 同盾控股有限公司 Model training method and device, data prediction method, medium, and electronic device
CN113239391A (en) * 2021-07-13 2021-08-10 深圳市洞见智慧科技有限公司 Third-party-free logistic regression federal learning model training system and method
US20210319353A1 (en) * 2020-04-09 2021-10-14 International Business Machines Corporation Verification of stochastic gradient descent



Similar Documents

Publication Publication Date Title
CN101098222B (en) Wireless communication system, wireless communication apparatus, and method of exchanging cryptography key between wireless communication apparatuses
CN112182595A (en) Model training method and device based on federal learning
JP2024500489A (en) Secure access methods and devices
EP2961094A1 (en) System and method for generating a random number
CN112287377A (en) Model training method based on federal learning, computer equipment and storage medium
CN113569267A (en) Privacy safety data set intersection method, device, equipment and storage medium
AU2016269390A1 (en) System and method for secure communications between a computer test tool and a cloud-based server
CN103679000A (en) Apparatus and method for remotely deleting critical information
CN113190871A (en) Data protection method and device, readable medium and electronic equipment
JP2024512110A (en) Data transmission methods, devices, electronic devices and storage media
CN110730447B (en) User identity protection method, user terminal and core network
CN114912105A (en) Data storage method, device, system, equipment, medium and product
KR20150081410A (en) Communication information transmission method and system
CN114095277A (en) Power distribution network secure communication method, secure access device and readable storage medium
WO2021168614A1 (en) Data encryption processing method, data decryption processing method, apparatus, and electronic device
CN117540426A (en) Method and device for sharing energy power data based on homomorphic encryption and federal learning
CN114221811B (en) Model building method, device, equipment and computer storage medium
CN115344848B (en) Identification acquisition method, device, equipment and computer readable storage medium
CN109617676B (en) Password synchronization method, communication node, electronic equipment and readable storage medium
CN114386075B (en) Data transmission channel establishment, data transmission method, device, equipment and medium
CN114422159B (en) Data processing method and device based on block chain
KR20180080655A (en) System and method for rsa dispersed key managing with card
CN112667992A (en) Authentication method, authentication device, storage medium, and electronic apparatus
CN110851270A (en) Resource transfer method, device, equipment and medium
CN116522404B (en) Data processing method, device, equipment and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant