CN115438807A - Method, system, equipment, medium and product for optimizing horizontal federal model construction - Google Patents
- Publication number
- CN115438807A CN115438807A CN202211282836.6A CN202211282836A CN115438807A CN 115438807 A CN115438807 A CN 115438807A CN 202211282836 A CN202211282836 A CN 202211282836A CN 115438807 A CN115438807 A CN 115438807A
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/20—Ensemble learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
Abstract
The application discloses a method, system, device, medium and product for optimizing horizontal federated model construction, applied to a participant device and comprising the following steps: obtaining a local sample, and inputting the local sample into a private privacy protection module to perform a periodicity-based sample transformation on the local sample, obtaining a periodic transformation sample; inputting the periodic transformation sample into a local shared neural network to perform local iterative training optimization on the private privacy protection module and the local shared neural network; obtaining local network parameters of the local shared neural network, and uploading the local network parameters to a federal server to perform iterative optimization of the private privacy protection module and the local shared neural network based on horizontal federated learning. The method and device solve the technical problem of improving communication efficiency among participant devices during horizontal federated modeling while protecting data privacy.
Description
Technical Field
The application relates to the technical field of artificial intelligence in financial technology (Fintech), and in particular to a method, system, device, medium and product for optimizing horizontal federated model construction.
Background
With the continuous development of financial technology, especially internet technology, more and more technologies (such as distributed computing and artificial intelligence) are being applied in the financial field; at the same time, the financial industry places ever higher requirements on these technologies.
Federated learning, as a distributed machine learning paradigm, can solve the data-silo problem while protecting the data privacy of each participant device. At present, model construction based on horizontal federated learning usually adopts homomorphic encryption to protect privacy; however, homomorphic encryption or secret sharing causes the volume of data communicated between participant devices to surge, which reduces the communication efficiency between participant devices.
Disclosure of Invention
The main purpose of the application is to provide a method, system, device, medium and product for optimizing horizontal federated model construction, aiming to solve the technical problem of improving communication efficiency among participant devices during horizontal federated modeling while protecting data privacy.
To achieve the above object, the present application provides a method for optimizing horizontal federated model construction, applied to a participant device, the method comprising:
obtaining a local sample, and inputting the local sample into a private privacy protection module to perform periodic sample transformation on the local sample to obtain a periodic transformation sample;
performing local iterative training optimization on the private privacy protection module and the local shared neural network by inputting the periodic transformation samples into the local shared neural network;
local network parameters of the local shared neural network are obtained, and the local network parameters are uploaded to a federal server, wherein the federal server is used for aggregating the local network parameters uploaded by the participating equipment into federal network parameters;
receiving the federal network parameters issued by the federal server, and updating the local network parameters of the local shared neural network into the federal network parameters;
and returning to the execution step of obtaining a local sample and inputting the local sample into the private privacy protection module to perform the periodicity-based sample transformation on the local sample to obtain a periodic transformation sample, until completion of the horizontal federated learning modeling is detected.
Optionally, the private privacy protection module comprises a private periodic neural network and a private noise module,
the step of obtaining a periodically transformed sample by inputting the local sample into a private privacy protection module and performing a periodic-based sample transformation on the local sample comprises:
inputting the local sample into the private periodic neural network, and performing periodic-based sample mapping on the local sample to obtain a periodic mapping sample;
and according to the private noise module, carrying out noise addition on the periodic mapping sample to obtain the periodic transformation sample.
Optionally, the private periodic neural network comprises neural network parameters and a periodic activation function,
the step of inputting the local samples into the private periodic neural network, and performing periodic-based sample mapping on the local samples to obtain periodic mapping samples includes:
according to the neural network parameters, carrying out sample mapping on the local sample to obtain a mapping sample;
and according to the periodic activation function, carrying out periodic activation on the mapping sample to obtain the periodic mapping sample.
Optionally, the method for optimizing the horizontal federal model building further includes:
obtaining a sample to be predicted, and performing sample transformation based on periodicity on the sample to be predicted according to the private privacy protection module to obtain a first periodic transformation prediction sample;
and performing sample prediction on the first periodic transformation prediction sample according to the local shared neural network to obtain a first sample prediction result.
Optionally, the method for optimizing the horizontal federal model building further includes:
receiving a second periodic transformation prediction sample sent by a client, and performing sample prediction on the second periodic transformation prediction sample according to the local shared neural network to obtain a second sample prediction result, wherein the second periodic transformation prediction sample is obtained by the client performing the periodicity-based sample transformation on a sample to be predicted according to a target privacy protection module, the target privacy protection module being determined from the private privacy protection module issued by the participant device;
and feeding back the second sample prediction result to the client.
Optionally, before the step of receiving the second periodic transformation prediction samples sent by the client, the method for optimizing the horizontal federal model building further includes:
and issuing the private privacy protection module and a preset noise threshold value to a client, so that the client adjusts the private noise module in the private privacy protection module according to the local additional noise determined by the preset noise threshold value to obtain a target privacy protection module.
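A minimal sketch of this client-side adjustment, assuming an additive Gaussian noise module and a simple clamping rule (the patent fixes neither the noise distribution nor the adjustment rule; `PrivateNoiseModule` and `adjust_noise_module` are illustrative names):

```python
import random

class PrivateNoiseModule:
    """Illustrative additive-noise module; the Gaussian form is an assumption."""
    def __init__(self, scale):
        self.scale = scale

    def add_noise(self, values):
        return [v + random.gauss(0.0, self.scale) for v in values]

def adjust_noise_module(module, local_noise_scale, preset_threshold):
    """Client side: adopt the locally determined additional-noise scale,
    but never exceed the preset noise threshold issued by the participant
    device, yielding the target privacy protection module."""
    module.scale = min(local_noise_scale, preset_threshold)
    return module

# The participant device issues a threshold of 0.5; the client wanted 0.8.
target = adjust_noise_module(PrivateNoiseModule(0.1), 0.8, 0.5)
print(target.scale)  # 0.5
```

The threshold bounds the client's noise from above so the transformed samples remain usable for prediction, while any scale below the threshold remains the client's private choice.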
The present application further provides a horizontal federated learning system, including:
at least one participant device, configured to obtain a local sample and input the local sample into a private privacy protection module to perform a periodicity-based sample transformation on the local sample, obtaining a periodic transformation sample; input the periodic transformation sample into a local shared neural network to perform local iterative training optimization on the private privacy protection module and the local shared neural network; obtain local network parameters of the local shared neural network and upload the local network parameters to a federal server; receive the federal network parameters issued by the federal server and update the local network parameters of the local shared neural network to the federal network parameters; and return to the execution step of obtaining a local sample and inputting it into the private privacy protection module for the periodicity-based sample transformation, until completion of the horizontal federated learning modeling is detected;
and the federal server is used for aggregating the local network parameters uploaded by the participant devices into federal network parameters.
Optionally, the horizontal federated learning system further includes a client,
the participant equipment is also used for issuing the private privacy protection module to a client; receiving a periodic transformation prediction sample sent by the client, and performing sample prediction on the periodic transformation prediction sample according to a local shared neural network to obtain a sample prediction result; feeding back a sample prediction result to the client;
the client is configured to perform the periodicity-based sample transformation on a sample to be predicted according to a target privacy protection module determined from the private privacy protection module, obtaining a periodic transformation prediction sample; send the periodic transformation prediction sample to the participant device; and receive the sample prediction result fed back by the participant device.
Optionally, the private privacy protection module comprises a private periodic neural network and a private noise module, the participant device further configured to:
inputting the local sample into the private periodic neural network, and performing periodic-based sample mapping on the local sample to obtain a periodic mapping sample;
and according to the private noise module, carrying out noise addition on the periodic mapping sample to obtain the periodic transformation sample.
Optionally, the private periodic neural network comprises neural network parameters and a periodic activation function, the participant device further configured to:
according to the neural network parameters, carrying out sample mapping on the local samples to obtain mapping samples;
and according to the periodic activation function, carrying out periodic activation on the mapping sample to obtain the periodic mapping sample.
Optionally, the participant device is further configured to:
obtaining a sample to be predicted, and performing sample transformation based on periodicity on the sample to be predicted according to the private privacy protection module to obtain a first periodic transformation prediction sample;
and according to the local shared neural network, carrying out sample prediction on the first periodic transformation prediction sample to obtain a first sample prediction result.
Optionally, the participant device is further configured to:
and issuing the private privacy protection module and a preset noise threshold to a client.
Optionally, the client is further configured to:
and adjusting a private noise module in the private privacy protection module according to the local additional noise determined by the preset noise threshold value to obtain a target privacy protection module.
The present application further provides an electronic device, the electronic device including: a memory, a processor, and a program of the horizontal federated model construction optimization method stored on the memory and executable on the processor, wherein the program, when executed by the processor, implements the steps of the horizontal federated model construction optimization method as described above.
The present application also provides a computer-readable storage medium having stored thereon a program for implementing a horizontal federated model build optimization method that, when executed by a processor, implements the steps of the horizontal federated model build optimization method described above.
The present application also provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of the method for optimizing horizontal federated model construction as described above.
In the horizontal federated learning process, each participant device is provided with a local shared neural network and a private privacy protection module. The participant device inputs a local sample into the private privacy protection module and performs a periodicity-based sample transformation on it to obtain a periodic transformation sample; inputs the periodic transformation sample into the local shared neural network to perform local iterative training optimization on the private privacy protection module and the local shared neural network; obtains local network parameters of the local shared neural network and uploads them to a federal server, where the federal server aggregates the local network parameters uploaded by each participant device into federal network parameters; receives the federal network parameters issued by the federal server and updates the local network parameters of the local shared neural network to the federal network parameters; and returns to the execution step of obtaining a local sample and transforming it, until completion of the horizontal federated learning modeling is detected.
By performing a periodicity-based transformation on samples, one periodic transformation sample corresponds to multiple possible input samples of the private privacy protection module. Even if an outside party infers the periodic transformation sample from the model parameters of the local shared neural network, it is difficult to determine which period's input sample produced it; the difficulty of this inversion is comparable to that of the learning-with-errors problem. Data privacy on participant devices is therefore protected during horizontal federated learning modeling, while the local network parameters that participant devices upload to the federal server remain plaintext data. Compared with prior-art techniques that protect data privacy through homomorphic encryption during horizontal federated modeling, the magnitude of this plaintext data is far lower than that of homomorphically encrypted ciphertext, so the volume of data transmitted between participant devices and the federal server is reduced and communication efficiency is improved. This solves the technical problem of improving communication efficiency among participant devices during horizontal federated modeling while protecting data privacy.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed for describing the embodiments or the prior art are briefly introduced below; those skilled in the art can obtain other drawings from these drawings without inventive effort.
FIG. 1 is a schematic flow chart of a first embodiment of a horizontal federated model building optimization method of the present application;
FIG. 2 is a schematic flow chart of periodic sample transformation based on a private privacy protection module in the optimization method for constructing a horizontal federated model in the present application;
FIG. 3 is a schematic flow chart of the method for optimizing the horizontal federal model construction, in which a private privacy protection module and a local shared neural network are constructed based on horizontal federal learning;
FIG. 4 is a system architecture diagram of an embodiment of the horizontal federated learning system of the present application;
fig. 5 is a schematic device structure diagram of a hardware operating environment related to the horizontal federated model building optimization method in the embodiment of the present application.
The objectives, features, and advantages of the present application will be further described with reference to the accompanying drawings.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, embodiments of the present application are described in detail below with reference to the accompanying drawings. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Example one
In a first embodiment of the method for optimizing the construction of a horizontal federated model according to the present application, referring to fig. 1, the method of the present embodiment is applied to a participant device, and the method for optimizing the construction of a horizontal federated model includes:
step S10, obtaining a local sample, inputting the local sample into a private privacy protection module, and performing periodic sample conversion on the local sample to obtain a periodic conversion sample;
step S20, inputting the periodic transformation sample into a local shared neural network, and carrying out local iterative training optimization on the private privacy protection module and the local shared neural network;
step S30, local network parameters of the local shared neural network are obtained, and the local network parameters are uploaded to a federal server, wherein the federal server is used for aggregating the local network parameters uploaded by each participant device into federal network parameters;
step S40, receiving the federal network parameters issued by the federal server, and updating the local network parameters of the local shared neural network into the federal network parameters;
step S50, return to the execution step: and obtaining a local sample, inputting the local sample into a private privacy protection module, and carrying out sample transformation on the local sample on the basis of periodicity to obtain a periodic transformation sample until the transverse federated learning modeling is detected to be finished.
In this embodiment, it should be noted that the participant device is a participant of horizontal federated learning and the federal server is its coordinator. The private privacy protection module performs a periodicity-based sample transformation on a local sample, converting each sample feature value in the local sample into a periodic transformation feature value that varies periodically, thereby obtaining a periodic transformation sample. Because the transformation is periodic, the sample feature value corresponding to a given periodic transformation feature value is not unique; one periodic transformation feature value corresponds to multiple different sample feature values. Even if the server knows the periodic transformation sample and the module parameters of the private privacy protection module, it is difficult to invert a unique sample feature value for each periodic transformation feature value, and the difficulty of this inversion is comparable to that of the learning-with-errors problem, so the data privacy of local samples on the participant device is well protected. The participant device also holds a local shared neural network, which is iteratively trained and updated in step with the private privacy protection module; the local network parameters of the local shared neural network are uploaded directly, as plaintext data, to the federal server for aggregation, so that all participant devices share the neural network parameters.
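The many-to-one behavior described above can be seen with a plain sine function (the concrete choice of periodic activation is an assumption; the patent only requires that the transformation be periodic):

```python
import math

def periodic_activation(x):
    # Any function with f(x) = f(x + T) works; sine, with period 2*pi,
    # is one natural choice of periodic activation.
    return math.sin(x)

# Many distinct candidate inputs collapse onto the same transformed value,
# so the transformed value alone cannot identify the original feature value.
x = 0.7
preimages = [x + k * 2 * math.pi for k in range(-2, 3)]
outputs = [periodic_activation(p) for p in preimages]
print(max(outputs) - min(outputs) < 1e-9)  # True: all five outputs coincide
```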
As an example, steps S10 to S50 include: obtaining a local sample and the sample label corresponding to the local sample, wherein the local sample consists of at least one sample feature value; inputting the local sample into the private privacy protection module and performing the periodicity-based sample transformation, converting each sample feature value in the local sample into a feature value that varies periodically, to obtain the periodic transformation sample corresponding to the local sample; inputting the periodic transformation sample into the local shared neural network for sample prediction, obtaining a sample prediction result; computing the model loss from the difference between the sample prediction result and the sample label; if the model loss has converged, judging that the horizontal federated learning modeling is finished, and otherwise updating the private privacy protection module and the local shared neural network by back-propagating the model gradient computed from the model loss; returning to the execution step of obtaining a local sample and its sample label, and checking whether the number of iterative updates of the private privacy protection module and the local shared neural network has reached a preset number; if so, obtaining the local network parameters of the local shared neural network and uploading them to the federal server, wherein the federal server aggregates the local network parameters uploaded by each participant device into federal network parameters, for example by weighted summation or weighted averaging; receiving the federal network parameters issued by the federal server, and replacing the local network parameters of the local shared neural network with the federal network parameters;
and returning to the execution step of obtaining the local sample and its corresponding sample label for the next round of iterative training, until completion of the horizontal federated learning modeling is detected.
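Steps S10 to S50 might be sketched as follows for a single training round with two participants; the architecture (a fixed private periodic map followed by a shared linear layer), the squared loss, and the plain weighted averaging are all illustrative assumptions, not the patent's prescribed design:

```python
import numpy as np

rng = np.random.default_rng(0)

def periodic_transform(x, w_priv, noise_scale=0.01):
    """S10: private periodicity-based transformation -- linear map,
    periodic (sine) activation, then private additive noise."""
    z = np.sin(x @ w_priv)
    return z + rng.normal(0.0, noise_scale, size=z.shape)

def local_update(x, y, w_priv, w_shared, lr=0.1):
    """S20: one gradient step of the shared linear model on the
    periodically transformed samples, under squared loss."""
    z = periodic_transform(x, w_priv)
    grad = z.T @ (z @ w_shared - y) / len(y)
    return w_shared - lr * grad

def fed_avg(params, weights):
    """S30/S40 server side: aggregate plaintext local parameters
    by weighted average into the federal network parameters."""
    return sum(w * p for p, w in zip(params, weights)) / sum(weights)

# Two participants start from the same shared parameters.
dim = 3
w_shared = np.zeros(dim)
updated = []
for _ in range(2):
    w_priv = rng.normal(size=(dim, dim))      # private, never uploaded
    x, y = rng.normal(size=(8, dim)), rng.normal(size=8)
    updated.append(local_update(x, y, w_priv, w_shared))
w_shared = fed_avg(updated, weights=[8, 8])   # plaintext aggregation
print(w_shared.shape)  # (3,)
```

Only `w_shared` travels to the server; each `w_priv` (and the noise) stays on its participant device, which is what keeps the plaintext upload from exposing local samples.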
As an example, so that the privacy protection module can more efficiently learn the knowledge held at each participant device in horizontal federated learning, after the step of replacing the local network parameters of the local shared neural network with the federal network parameters, the method further comprises:
keeping the network parameters of the local shared neural network unchanged, training and updating the privacy protection module on the current local sample until the privacy protection module satisfies a preset condition for ending the training update, and then returning to the execution step of obtaining the local sample and its corresponding sample label for the next round of iterative training, until completion of the horizontal federated learning modeling is detected. The preset condition for ending the training update may be reaching a preset number of training updates or convergence of the model loss; the current local sample is the sample obtained in the current iteration round.
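This extra fine-tuning pass, which freezes the shared network and updates only the privacy module, might look like the following (a sketch under assumed forms: a scalar privacy parameter `theta`, a frozen shared weight `shared_w`, the model `shared_w * sin(theta * x)`, and squared loss):

```python
import math

def loss(theta, shared_w, xs, ys):
    return sum((shared_w * math.sin(theta * x) - y) ** 2
               for x, y in zip(xs, ys)) / len(xs)

def finetune_privacy_module(theta, shared_w, xs, ys, lr=0.1, max_updates=50):
    """Gradient descent on theta only; shared_w is held fixed throughout,
    matching 'keeping the network parameters of the local shared neural
    network unchanged'. max_updates plays the role of the preset number
    of training updates."""
    for _ in range(max_updates):
        grad = sum(2 * (shared_w * math.sin(theta * x) - y)
                   * shared_w * math.cos(theta * x) * x
                   for x, y in zip(xs, ys)) / len(xs)
        theta -= lr * grad
    return theta

# Labels generated by a "true" theta of 1.2; start the module at 0.8.
xs = [0.5, 1.0]
ys = [1.0 * math.sin(1.2 * x) for x in xs]
theta = finetune_privacy_module(0.8, 1.0, xs, ys)
print(loss(theta, 1.0, xs, ys) < loss(0.8, 1.0, xs, ys))  # True: loss decreased
```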
It should be noted that, when model construction is performed based on horizontal federal learning at present, in order to protect data privacy of participant devices, horizontal federal learning encryption is generally performed based on homomorphic encryption or secret sharing, but the data magnitude of encrypted ciphertext data is far larger than that of plaintext data, so that the communication transmission data volume between each participant device and a federal server is increased to a great extent, and the respective calculation data volumes of the participant device and the federal server are increased to a great extent, so that the communication efficiency between each participant device and the federal server is greatly influenced, and the data calculation efficiency of each participant device and the federal server in the horizontal federal learning modeling process is greatly influenced. In addition, although the federal server directly takes local network parameters of local shared neural networks of the participator devices which are plaintext data, the federal server can reversely deduce input data of the local shared neural networks, namely periodically transform samples, but the federal server does not know module parameters in the private privacy protection module, so that the federal server cannot reversely deduce local samples of the participator devices; even if the federal server takes the module parameters in the private privacy protection module, the periodic transformation samples are periodically changed, so that the sample characteristic value corresponding to each characteristic value in the same periodic transformation sample is not unique, the federal server is difficult to reversely deduce a unique local sample based on the periodic transformation samples, and the difficulty of the reverse deduction is equivalent to the difficulty of fault-tolerant learning; therefore, data privacy of the participant equipment can be well protected, 
In conclusion, in the embodiment of the application, the private privacy protection module and the local shared neural network are constructed on the basis of horizontal federated learning, so that the data privacy of the participant devices is protected, the communication efficiency between the participant devices and the federated server is improved, and the data calculation efficiency of the participant devices and the federated server in the horizontal federated learning modeling process is improved.
Wherein the private privacy protection module comprises a private periodic neural network and a private noise module,
the step of obtaining a periodic transformation sample by inputting the local sample into a private privacy protection module and performing periodicity-based sample transformation on the local sample comprises:
s11, inputting the local sample into the private periodic neural network, and carrying out periodic sample mapping on the local sample to obtain a periodic mapping sample;
and S12, carrying out noise addition on the periodic mapping sample according to the private noise module to obtain the periodic transformation sample.
In this embodiment, it should be noted that the private privacy protection module may be composed of a private periodic neural network and a private noise module, wherein the private periodic neural network is configured to perform the periodic-based sample transformation on the local samples, and the private noise module is configured to add noise to an output of the private periodic neural network.
As an example, steps S11 to S12 include: inputting the local sample into the private periodic neural network and performing periodic sample mapping on the local sample, so as to convert each sample feature value in the local sample into a periodic transformation feature value that varies periodically, thereby obtaining the periodic mapping sample corresponding to the local sample; and inputting the periodic mapping sample into the private noise module and adding corresponding additional noise to the periodic mapping sample to obtain the periodic transformation sample. In the embodiment of the application, the private privacy protection module is provided with the noise module, so that additional noise can be added on the basis of converting each sample feature value in the local sample into a periodically varying periodic transformation feature value. Since the private noise module is privately held by the participant device, the difficulty for the federated server to reversely deduce the local samples of the participant device is further increased when the federated server cannot obtain the additional noise, which improves the effect of data privacy protection in the horizontal federated learning modeling of the embodiment of the application.
Wherein the private periodic neural network comprises neural network parameters and a periodic activation function,
the step of inputting the local sample into the private periodic neural network and performing periodicity-based sample mapping on the local sample to obtain a periodic mapping sample includes:
step S111, according to the neural network parameters, carrying out sample mapping on the local sample to obtain a mapping sample;
and step S112, periodically activating the mapping sample according to the periodic activation function to obtain the periodic mapping sample.
In this embodiment, it should be noted that the private periodic neural network may be composed of a neural network parameter and a periodic activation function, where the neural network parameter is used to perform linear transformation on the local samples, and the periodic activation function is a periodic function and is used to activate the local samples after the linear transformation, so as to output periodic mapping samples.
As an example, steps S111 to S112 include: according to the neural network parameters, performing linear transformation on each sample feature value in the local sample to obtain a linear transformation sample; and activating each feature value in the linear transformation sample according to the periodic activation function to obtain the periodic mapping sample. Because each feature value in the linear transformation sample is fed into the periodic activation function for activation, each feature value in the periodic mapping sample varies periodically, so the feature values in the periodic mapping sample do not correspond one-to-one to the sample feature values in the local sample. For example, if the periodic activation function is sin X and one feature value in the periodic mapping sample is 1, then multiple corresponding values of X exist. Therefore, even if the outside obtains both the neural network parameters and the periodic mapping sample, it is difficult to reversely deduce the sample feature values in the local sample. In addition, the embodiment of the application applies additional noise to the periodic mapping sample, which further increases the difficulty of reversely deducing the local sample from the outside.
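The many-to-one property described above can be illustrated with a short sketch (illustrative only; the patent does not prescribe sin as the concrete activation):

```python
# Sketch: with a periodic activation such as sin, many distinct
# pre-activation values map to the same output, so the mapping cannot be
# uniquely inverted even when the linear-transformation weights are known.
import math

def periodic_activation(x: float) -> float:
    return math.sin(x)

# Two different inputs one full period apart produce identical activations.
x1 = 0.5
x2 = 0.5 + 2 * math.pi
assert abs(periodic_activation(x1) - periodic_activation(x2)) < 1e-9
```

Given only the activation output, an attacker cannot decide which of the infinitely many pre-images (x, x ± 2π, x ± 4π, …) was the true input.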
In addition, it should be noted that current data encryption schemes such as homomorphic encryption and secret sharing are all nonlinear data transformation processes, so the data magnitude of the ciphertext data finally obtained by encryption is usually far larger than that of the plaintext data. In the embodiment of the present invention, only a simple linear transformation is performed on the local sample; although the data magnitude of the plaintext data after the linear transformation is larger than that before the linear transformation, it is still far smaller than the data magnitude of the ciphertext data. Therefore, compared with horizontal federated learning modeling based on a data encryption method such as homomorphic encryption or secret sharing, the amount of data transmitted between a participant device and the federated server in the embodiment of the present invention is smaller, and the amount of data computed by the participant device and the federated server during horizontal federated learning modeling is smaller, so that both the communication efficiency between the participant device and the federated server and the efficiency of data calculation by the participant device and the federated server in the horizontal federated learning process can be improved.
As an example, the specific formula for performing the periodic sample transformation on the local sample according to the private privacy protection module is as follows:

O = φ_{1/r}(W·X) + ε

wherein O is the periodic transformation sample output by the private privacy protection module, W is the neural network parameter, X is the local sample, φ_{1/r} is a periodic activation function with a period of 1/r, and ε is the additive noise of the private noise module. Referring to fig. 2, fig. 2 is a schematic flow chart of the periodic sample transformation based on the private privacy protection module in the embodiment of the present application, where the input data X is the local sample, the periodic neurons are activation functions φ_{1/r} with period 1/r, the random noise ε is the additive noise, and the output data O is the periodic transformation sample output by the private privacy protection module.
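A minimal sketch of this transformation, under the assumption of a sine activation and uniform noise (the patent fixes neither; all matrix sizes, the seed, and the parameter values below are hypothetical):

```python
# Sketch of O = phi(W.X) + eps, where phi is taken to be a sine with
# period 1/r and eps is the private additive noise (both assumptions).
import math
import random

def periodic_transform(x, w, r, noise_scale, rng):
    out = []
    for row in w:                                     # linear transformation W.X
        linear = sum(wi * xi for wi, xi in zip(row, x))
        mapped = math.sin(2 * math.pi * r * linear)   # periodic activation, period 1/r
        eps = rng.uniform(-noise_scale, noise_scale)  # private additive noise
        out.append(mapped + eps)
    return out

rng = random.Random(0)
x = [0.2, 0.7, 1.3]                                   # local sample feature values
w = [[rng.gauss(0, 1) for _ in range(3)] for _ in range(4)]  # parameters W
o = periodic_transform(x, w, r=2.0, noise_scale=0.01, rng=rng)
assert len(o) == 4 and all(abs(v) <= 1.01 for v in o)
```

With noise_scale set to 0, shifting the linear pre-activation by exactly 1/r leaves the output unchanged, which is the periodicity that makes inversion ambiguous.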
The method for optimizing the horizontal federated model construction further comprises the following steps:
step A10, obtaining a sample to be predicted, and performing sample transformation based on periodicity on the sample to be predicted according to the private privacy protection module to obtain a first periodic transformation prediction sample;
step A20, according to the local shared neural network, performing sample prediction on the first periodic transformation prediction sample to obtain a first sample prediction result.
As an example, steps A10 to A20 include: obtaining a sample to be predicted, wherein the sample to be predicted consists of at least one sample feature value to be predicted; inputting the sample to be predicted into the private privacy protection module and performing the periodicity-based sample transformation on the sample to be predicted, so as to convert each sample feature value to be predicted into a feature value that varies periodically, thereby obtaining the first periodic transformation prediction sample; and performing sample prediction on the first periodic transformation prediction sample according to the local shared neural network to obtain the first sample prediction result, wherein the first sample prediction result may be a prediction label or a predicted probability value.
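Steps A10 to A20 can be sketched as follows, with both the privacy module and the shared network replaced by hypothetical stand-ins (the sine transform and threshold classifier are illustrative, not the patent's concrete models):

```python
# Hypothetical end-to-end prediction on the participant device: the sample
# to be predicted first passes through the private privacy protection
# module, then through the local shared neural network (both stubbed here).
import math

def privacy_module(sample):           # stand-in for the private module
    return [math.sin(2 * math.pi * v) for v in sample]

def shared_network(transformed):      # stand-in for the local shared network
    score = sum(transformed)
    return 1 if score > 0 else 0      # prediction label

sample_to_predict = [0.1, 0.3]
prediction = shared_network(privacy_module(sample_to_predict))
assert prediction in (0, 1)
```

Returning `score` directly instead of the thresholded label would correspond to the "predicted probability value" variant mentioned above.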
The method for optimizing the horizontal federated model construction further comprises the following steps:
step B10, receiving a second periodic transformation prediction sample sent by a client, and performing sample prediction on the second periodic transformation prediction sample according to the local shared neural network to obtain a second sample prediction result, wherein the second periodic transformation prediction sample is obtained by the client performing periodicity-based sample transformation on a sample to be predicted according to a target privacy protection module, and the target privacy protection module is determined from the private privacy protection module issued by the participant device;
and B20, feeding back the second sample prediction result to the client.
Before the step of receiving the second periodic transformation prediction samples sent by the client, the method for constructing and optimizing the horizontal federated model further includes:
in this embodiment, it should be noted that the participant device may deploy the private privacy protection module to the client, so that the client holds the private privacy protection module, and the participant device holds the local shared neural network, which may implement the two-party model security inference between the client and the participant device.
As an example, steps B10 to B20 include: the client uses the private privacy protection module as the target privacy protection module and inputs the sample to be predicted into it for the periodicity-based sample transformation, so that each feature value in the sample to be predicted is transformed into a periodic transformation feature value that varies periodically, thereby obtaining the second periodic transformation prediction sample; the participant device receives the second periodic transformation prediction sample sent by the client and performs sample prediction on it according to the local shared neural network to obtain the second sample prediction result, wherein the second sample prediction result may be a prediction label or a predicted probability value; and the participant device feeds back the second sample prediction result to the client.
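The exchange in steps B10 to B20 might look like the following sketch, where the client and participant device are simple in-process stand-in classes rather than networked parties, and the sine transform and summing predictor are illustrative assumptions:

```python
# Sketch of the two-party inference exchange in plaintext. Real deployments
# would run these roles on separate machines over a network.
import math

class Client:
    def __init__(self, noise):
        self.noise = noise            # local additive noise, held only by the client
    def transform(self, sample):      # target privacy protection module stand-in
        return [math.sin(2 * math.pi * v) + self.noise for v in sample]

class ParticipantDevice:
    def predict(self, transformed):   # local shared neural network stand-in
        return sum(transformed)       # predicted score

client = Client(noise=0.005)
server = ParticipantDevice()
transformed = client.transform([0.25, 0.5])   # sent to the server as plaintext
result = server.predict(transformed)          # fed back to the client
```

Only the periodically transformed (and locally noised) values cross the boundary; neither the raw sample nor the model changes hands.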
It should be noted that, in current two-party model security inference, the participant device is willing to provide an inference model as a service but does not want to hand the inference model directly to the client, while the client wants to use the inference model to predict its local sample but regards that sample as private information and does not want to transmit its plaintext to the participant device. At present, a homomorphically encrypted ciphertext of the client's local sample is usually transmitted to the participant device, and the participant device then computes on the ciphertext with the inference model. However, since the data magnitude of the homomorphically encrypted ciphertext is far larger than that of the plaintext data, this homomorphic-encryption-based two-party model security inference method greatly increases the volume of communication data between the client and the participant device and increases the respective computation loads of the client and the participant device, resulting in lower communication efficiency between the client and the participant device and thereby reducing the efficiency of two-party model security inference.
In the embodiment of the application, since the second periodic transformation prediction sample varies periodically, the sample feature value of the sample to be predicted corresponding to each periodic transformation feature value in the same second periodic transformation prediction sample is not unique, so the participant device can hardly reversely deduce a unique sample to be predicted of the client from the second periodic transformation prediction sample. The data privacy of the client is thus protected, and the participant device does not need to hand the local shared neural network serving as the inference model to the client. Because the second periodic transformation prediction sample is itself plaintext data, two-party model security inference is achieved by exchanging plaintext data between the client and the participant device, and the data magnitude of plaintext data is far smaller than that of ciphertext data. This overcomes the technical defects that the volume of communication data between the client and the participant device is greatly increased, the respective computation loads of the client and the participant device are increased, and the communication efficiency between the client and the participant device and their respective calculation efficiencies are reduced, thereby improving the efficiency of two-party model security inference.
And step C10, issuing the private privacy protection module and a preset noise threshold value to a client, so that the client can adjust the private noise module in the private privacy protection module according to the local additional noise determined by the preset noise threshold value to obtain a target privacy protection module.
In this embodiment, it should be noted that the participant device may issue the private privacy protection module together with the preset noise threshold to the client, and the client adjusts the private noise module in the private privacy protection module according to local additive noise determined by the preset noise threshold to obtain the target privacy protection module; that is, the client obtains a target privacy protection module whose private noise module contains the local additive noise. In this way, on the basis of converting each sample feature value in the sample to be predicted into a periodically varying feature value, local additive noise can be added to the sample to be predicted. Since the local additive noise is held solely by the client, the difficulty for the outside to reversely deduce the client's sample to be predicted is further increased, which improves the effect of data privacy protection in the two-party model security inference of the embodiment of the present application. Moreover, since the local additive noise is constrained to be smaller than the preset noise threshold, the prediction accuracy of the local shared neural network is not affected.
As an example, the local sample may be a local image sample and the local shared neural network may be an image recognition neural network, and steps S10 to S50 include: obtaining a local image sample, and inputting the local image sample into the private privacy protection module to perform the periodic sample transformation on the local image sample to obtain a periodic transformation image sample; performing local iterative training optimization on the private privacy protection module and the image recognition neural network by inputting the periodic transformation image sample into the image recognition neural network; obtaining the local network parameters of the image recognition neural network and uploading them to the federated server, wherein the federated server is used for aggregating the local network parameters uploaded by each participant device into federated network parameters; receiving the federated network parameters issued by the federated server and updating the local network parameters of the image recognition neural network to the federated network parameters; and returning to the execution step of obtaining a local image sample, inputting the local image sample into the private privacy protection module, and performing the periodic sample transformation on the local image sample to obtain a periodic transformation image sample, until completion of the horizontal federated learning modeling is detected.
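The per-round participant loop of steps S10 to S50 can be outlined as below; the transform, the "training" update, and the server are all toy stand-ins introduced for illustration, not the actual networks or aggregation of the patent:

```python
# Schematic of one participant-side round: transform local data, update
# local parameters, upload them, and replace them with federated parameters.
def run_participant_round(local_params, local_sample, server):
    transformed = [v % 1.0 for v in local_sample]       # stand-in periodic transform
    local_params = [p + 0.1 * t                         # stand-in training step
                    for p, t in zip(local_params, transformed)]
    fed_params = server.aggregate(local_params)         # upload, receive federated params
    return fed_params                                   # local params are replaced

class FakeServer:
    def aggregate(self, params):
        return [p / 2 for p in params]                  # toy aggregation rule

params = run_participant_round([0.0, 0.0], [1.25, 2.5], FakeServer())
assert len(params) == 2
```

In the patent's scheme this round repeats until completion of the horizontal federated learning modeling is detected.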
In this way, since the periodic transformation image samples are obtained through periodic sample transformation, one periodic transformation image sample corresponds to multiple possible input image samples of the private privacy protection module. Therefore, even if the outside reversely deduces the periodic transformation image sample from the model parameters of the image recognition neural network, it is difficult to reversely deduce which period's input image sample of the private privacy protection module the periodic transformation image sample specifically corresponds to, and the difficulty of this reverse deduction is equivalent to the difficulty of learning with errors. Image data privacy protection for the participant device in the process of constructing an image recognition model based on horizontal federated learning is thus realized. Moreover, the local network parameters uploaded by the participant device to the federated server are plaintext data, whose data magnitude is far lower than that of homomorphically encrypted ciphertext data, so the volume of communication data transmitted between the participant device and the federated server during construction of the image recognition model can be reduced, the amount of image data computed by the participant device and the federated server can be reduced, and both the communication efficiency and the image data calculation efficiency in the process of constructing the image recognition model based on horizontal federated learning can be improved.
As an example, it should be noted that the private privacy protection module may be composed of one or more passport-embedded network modules connected in series, and each passport-embedded network module may be composed of a private periodic neural network and a private noise module connected in series. The private privacy protection modules of the participant devices may be heterogeneous networks; that is, the number of passport-embedded network modules in the private privacy protection module at each participant device may differ, and each participant device may decide, according to its actual requirements, how many passport-embedded network modules its own private privacy protection module contains. For example, a participant device whose sample data has a more complicated data distribution may design a private privacy protection module composed of more passport-embedded network modules, so as to handle such sample data, improve the accuracy of the periodic transformation samples, and improve the accuracy of the final sample prediction; a participant device with a large number of samples may design a private privacy protection module composed of fewer passport-embedded network modules, so as to reduce system resource consumption, improve the efficiency of generating the periodic transformation samples, and improve the efficiency of the final sample prediction. With particular reference to fig. 3, fig. 3 is a schematic flow chart of constructing a private privacy protection module and a local shared neural network based on horizontal federated learning in an embodiment of the present application, where X_N is the local sample of a participant device, and the private neural network D_N is the private privacy protection module, which may be composed of one or more passport-embedded network modules, each consisting of a private periodic neural network and a private noise module connected in series; since the number of passport-embedded network modules may differ between participant devices, the private neural networks D_N of the participant devices may be heterogeneous networks. The private periodic neural network is used to perform the periodicity-based sample transformation on the training samples, and the private noise module is used to add noise to the output of the private periodic neural network. The shared neural network G_N is the local shared neural network, ŷ_N is the prediction label output by the local shared neural network, y_N is the sample label of the local sample, L_N is the model loss sent by each participant device to the federated server, G_1 to G_N are the local network parameters of the local shared neural networks of the different participant devices, and G_avg is the federated network parameter.
In the horizontal federated learning process, each participant device is provided with a local shared neural network and a private privacy protection module. The participant device can therefore input a local sample into the private privacy protection module and perform the periodicity-based sample transformation on the local sample to obtain a periodic transformation sample, and input the periodic transformation sample into the local shared neural network to perform local iterative training optimization on the private privacy protection module and the local shared neural network; obtain the local network parameters of the local shared neural network and upload them to the federated server, wherein the federated server is used for aggregating the local network parameters uploaded by each participant device into federated network parameters; receive the federated network parameters issued by the federated server and update the local network parameters of the local shared neural network to the federated network parameters; and return to the execution step of obtaining a local sample, inputting the local sample into the private privacy protection module, and performing the periodicity-based sample transformation on the local sample to obtain a periodic transformation sample, until completion of the horizontal federated learning modeling is detected.
The data magnitude of the plaintext data in the application is far lower than that of the homomorphically encrypted ciphertext data used in prior-art horizontal federated learning modeling, so the volume of communication data transmitted between the participant devices and the federated server in the horizontal federated learning modeling process can be reduced, the communication efficiency in the horizontal federated learning process can be improved, and the technical problem of how to improve the communication efficiency of the participant devices in the horizontal federated modeling process on the premise of protecting data privacy is solved.
Embodiment Two
The embodiment of the application provides a horizontal federated model construction optimization method, which is applied to a federated server and comprises the following steps:
step H10, receiving local network parameters uploaded by each participant device and aggregating the local network parameters into federated network parameters, wherein the local network parameters are obtained by the participant device through iterative training optimization of a local shared neural network and a private privacy protection module according to periodic transformation samples, and the periodic transformation samples are obtained by the participant device by performing periodicity-based sample transformation on local samples according to the private privacy protection module;
step H20, sending the federated network parameters to each of the participant devices, so that each participant device updates the local network parameters of its local shared neural network to the federated network parameters and returns to the execution step of obtaining a local sample, inputting the local sample into the private privacy protection module, and performing the periodicity-based sample transformation on the local sample to obtain a periodic transformation sample, until completion of the horizontal federated learning modeling is detected.
In this embodiment, it should be noted that the aggregation mode may be a weighted average, a weighted sum, or the like. For the specific implementation in which the participant device performs iterative training optimization on the local shared neural network and the private privacy protection module according to the periodic transformation samples, performs the periodicity-based sample transformation on the local samples according to the private privacy protection module, and updates the local network parameters of the local shared neural network to the federated network parameters, reference may be made to steps S10 to S50 above, which are not repeated here.
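The weighted-average aggregation mentioned above can be sketched as follows (the 3:1 weights are hypothetical, e.g. proportional to local sample counts; the patent does not fix a weighting rule):

```python
# Weighted averaging of plaintext parameter vectors uploaded by participants.
def federated_average(param_lists, weights):
    total = sum(weights)
    n = len(param_lists[0])
    return [
        sum(w * params[i] for w, params in zip(weights, param_lists)) / total
        for i in range(n)
    ]

# Two participants with local network parameters g1 and g2, weighted 3:1.
g1 = [1.0, 2.0]
g2 = [3.0, 6.0]
g_avg = federated_average([g1, g2], weights=[3, 1])
assert g_avg == [1.5, 3.0]
```

Setting all weights equal reduces this to a plain average; a weighted sum would simply omit the division by `total`.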
The embodiment of the application provides a horizontal federated model construction optimization method. In the horizontal federated learning process, each participant device is provided with a local shared neural network and a private privacy protection module, so that the participant device can input a local sample into the private privacy protection module, perform the periodicity-based sample transformation on the local sample to obtain a periodic transformation sample, and input the periodic transformation sample into the local shared neural network to perform local iterative training optimization on the private privacy protection module and the local shared neural network; obtain the local network parameters of the local shared neural network and upload them to the federated server, wherein the federated server is used for aggregating the local network parameters uploaded by each participant device into federated network parameters; receive the federated network parameters issued by the federated server and update the local network parameters of the local shared neural network to the federated network parameters; and return to the execution step of obtaining a local sample, inputting the local sample into the private privacy protection module, and performing the periodicity-based sample transformation on the local sample to obtain a periodic transformation sample, until completion of the horizontal federated learning modeling is detected.
Since the periodic transformation samples are obtained through periodic sample transformation, one periodic transformation sample corresponds to multiple possible input samples of the private privacy protection module. Even if the outside reversely deduces the periodic transformation sample from the model parameters of the local shared neural network, it is difficult to reversely deduce which period's input sample of the private privacy protection module the periodic transformation sample specifically corresponds to, and the difficulty of this reverse deduction is equivalent to the difficulty of learning with errors, so data privacy protection for the participant devices in the horizontal federated learning modeling process is realized. Moreover, the local network parameters uploaded by the participant devices to the federated server are plaintext data. Compared with the prior-art technical means of protecting data privacy through homomorphic encryption in horizontal federated learning modeling, the data magnitude of the plaintext data in the application is far lower than that of the homomorphically encrypted ciphertext data, so the volume of communication data transmitted between the participant devices and the federated server can be reduced, the communication efficiency in the horizontal federated learning process can be improved, and the technical problem of how to improve the communication efficiency among the participant devices in the horizontal federated modeling process on the premise of protecting data privacy is solved.
Embodiment Three
The embodiment of the present application further provides a two-party model security inference method, applied to a client, and the two-party model security inference method includes:
step D10, performing, according to a target privacy protection module, periodicity-based sample transformation on a sample to be predicted to obtain a periodic transformation prediction sample, wherein the target privacy protection module is determined by the private privacy protection module issued by the participant device and a preset noise threshold;
step D20, uploading the periodic transformation prediction samples to the participant equipment, so that the participant equipment performs sample prediction on the periodic transformation prediction samples according to a local shared neural network to obtain a sample prediction result;
and D30, receiving a sample prediction result fed back by the participant equipment.
As an example, steps D10 to D30 include: inputting the sample to be predicted into the target privacy protection module and performing the periodicity-based sample transformation on the sample to be predicted, so that each feature value in the sample to be predicted is converted into a feature value that varies periodically, thereby obtaining the periodic transformation prediction sample, wherein the target privacy protection module is determined by the private privacy protection module issued by the participant device and the preset noise threshold; uploading the periodic transformation prediction sample to the participant device, so that the participant device performs sample prediction on the periodic transformation prediction sample according to the local shared neural network to obtain a sample prediction result; and receiving the sample prediction result fed back by the participant device.
Before the step of performing, according to the target privacy protection module, the periodicity-based sample transformation on the sample to be predicted to obtain the periodic transformation prediction sample, the horizontal federated model construction optimization method further includes:
step E10, receiving a private privacy protection module and a preset noise threshold value issued by the participant equipment, and generating local additional noise according to the preset noise threshold value;
and E20, adjusting a private noise module in the private privacy protection module according to the local additional noise to obtain the target privacy protection module.
As an example, steps E10 to E20 include: receiving the private privacy protection module and the preset noise threshold issued by the participant device, and generating local additive noise smaller than the preset noise threshold; and adjusting the additive noise of the private noise module in the private privacy protection module to the local additive noise to obtain the target privacy protection module, so that the client obtains a target privacy protection module whose private noise module contains the local additive noise. In this way, on the basis of converting each sample feature value in the sample to be predicted into a periodically varying feature value, local additive noise can be added to the sample to be predicted. Since the local additive noise is held solely by the client, the difficulty for the outside to reversely deduce the client's sample to be predicted is further increased, which improves the effect of data privacy protection in the two-party model security inference of the embodiment of the application; and since the local additive noise is constrained to be smaller than the preset noise threshold, the prediction accuracy of the local shared neural network is not affected.
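Steps E10 to E20 can be sketched as below; representing the module as a dict, the seed, and drawing the noise uniformly below the threshold are all assumptions made for illustration:

```python
# Sketch of the client-side adjustment: draw local additive noise strictly
# below the preset threshold and install it in the received module.
import random

def make_target_module(private_module, noise_threshold, rng):
    local_noise = rng.uniform(0, noise_threshold * 0.999)  # keep below threshold
    target = dict(private_module)       # copy so the issued module is unchanged
    target["noise"] = local_noise       # replace issued noise with client-held noise
    return target

rng = random.Random(42)
issued = {"weights": [0.3, -0.7], "noise": 0.02}
target = make_target_module(issued, noise_threshold=0.01, rng=rng)
assert 0 <= target["noise"] < 0.01
assert target["weights"] == issued["weights"]
```

The participant device never learns `target["noise"]`, which is what makes the client's transformed samples harder to invert than with the issued module alone.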
The embodiment of the application provides a two-party model secure inference method: according to a target privacy protection module, a sample to be predicted is subjected to periodicity-based sample transformation to obtain a periodic transformation prediction sample, where the target privacy protection module is determined from the private privacy protection module issued by the participant device and a preset noise threshold; the periodic transformation prediction sample is uploaded to the participant device, so that the participant device performs sample prediction on it according to the local shared neural network to obtain a sample prediction result; and the sample prediction result fed back by the participant device is received. Because the periodic transformation prediction sample varies periodically, the sample feature value of the sample to be predicted corresponding to each periodic transformation feature value in the same periodic transformation prediction sample is not unique, so the participant device can hardly reverse-derive a unique sample to be predicted from the periodic transformation prediction sample, which protects the client's data privacy. At the same time, the participant device does not need to hand over to the client the local shared neural network that serves as the inference model, and the periodic transformation prediction sample is plaintext data, so two-party model secure inference is achieved by exchanging only plaintext data between the client and the participant device. Since the data volume of plaintext is far smaller than that of ciphertext, this overcomes the technical defects of homomorphic-encryption-based two-party model secure inference, namely the greatly increased amount of communication data between the client and the participant device and the increased amount of computation on each side, thereby improving both the communication efficiency between the client and the participant device and the respective computation efficiency of the client and the participant device.
Example four
Referring to fig. 4, an embodiment of the present application further provides a horizontal federal learning system, where the horizontal federal learning system includes:
the system comprises at least one participant device, where the participant device is used for: obtaining a local sample, and inputting the local sample into a private privacy protection module to perform periodicity-based sample transformation on the local sample to obtain a periodic transformation sample; performing local iterative training optimization on the private privacy protection module and a local shared neural network by inputting the periodic transformation sample into the local shared neural network; obtaining local network parameters of the local shared neural network, and uploading the local network parameters to a federal server; receiving the federal network parameters issued by the federal server, and updating the local network parameters of the local shared neural network into the federal network parameters; and returning to the execution step of obtaining a local sample, inputting the local sample into the private privacy protection module, and performing periodicity-based sample transformation on the local sample to obtain a periodic transformation sample, until completion of the horizontal federated learning modeling is detected;
and the federal server is used for aggregating the local network parameters uploaded by the participant equipment into federal network parameters.
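The federal server's role can be sketched as a weighted average over the uploaded parameter lists, in the spirit of federated averaging; the patent does not fix a particular aggregation rule, so the equal-weight average below is an assumption:

```python
import numpy as np

def aggregate_local_parameters(local_params, weights=None):
    """Sketch of the federal server: aggregate the local network parameters
    uploaded by each participant device into federal network parameters.
    `local_params` is a list with one entry per participant; each entry is
    a list of per-layer arrays for the local shared neural network."""
    n = len(local_params)
    weights = weights if weights is not None else [1.0 / n] * n
    # Group the same layer across participants, then take the weighted sum.
    return [sum(w * layer for w, layer in zip(weights, group))
            for group in zip(*local_params)]
```

Each participant then overwrites its local network parameters with the aggregated result, so every local shared neural network stays synchronized across rounds.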
Optionally, the horizontal federal learning system further comprises a client,
the participant equipment is also used for issuing the private privacy protection module to a client; receiving a periodic transformation prediction sample sent by the client, and performing sample prediction on the periodic transformation prediction sample according to a local shared neural network to obtain a sample prediction result; feeding back a sample prediction result to the client;
the client is used for performing periodicity-based sample transformation on a sample to be predicted according to the target privacy protection module determined by the private privacy protection module, to obtain a periodic transformation prediction sample; sending the periodic transformation prediction sample to the participant device; and receiving a sample prediction result fed back by the participant device.
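The client's side of this exchange can be sketched as follows; `send` and `receive` are stand-ins for the real client-participant transport, and all names are illustrative assumptions:

```python
def client_secure_inference(sample, target_privacy_module, send, receive):
    """Sketch of the client's role in two-party secure inference: transform
    the sample to be predicted with the target privacy protection module,
    send the resulting (plaintext) periodic transformation prediction
    sample to the participant device, and receive the prediction result."""
    periodic_prediction_sample = target_privacy_module(sample)
    send(periodic_prediction_sample)  # only transformed plaintext leaves
    return receive()                  # participant runs the shared network
```

Note that only the transformed sample crosses the network, so the participant device never observes the raw sample and the client never holds the inference model.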
Optionally, the private privacy protection module comprises a private periodic neural network and a private noise module, the participant device further configured to:
inputting the local sample into the private periodic neural network, and performing periodic-based sample mapping on the local sample to obtain a periodic mapping sample;
and according to the private noise module, carrying out noise addition on the periodic mapping sample to obtain the periodic transformation sample.
Optionally, the private periodic neural network comprises neural network parameters and a periodic activation function, the participant device is further configured to:
according to the neural network parameters, carrying out sample mapping on the local samples to obtain mapping samples;
and according to the periodic activation function, carrying out periodic activation on the mapping sample to obtain the periodic mapping sample.
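The two steps above can be sketched together with the private noise module. An affine layer stands in for the neural network parameters and `sin` stands in for the periodic activation function; both are assumptions, since the patent only requires that the activation be periodic:

```python
import numpy as np

class PrivatePeriodicNetwork:
    """Sketch of a private periodic neural network plus private noise module."""

    def __init__(self, weight, bias, noise_scale=0.0, rng=None):
        self.weight = weight            # neural network parameters
        self.bias = bias
        self.noise_scale = noise_scale  # private noise module strength
        self.rng = rng or np.random.default_rng(0)

    def transform(self, x):
        mapped = x @ self.weight + self.bias   # sample mapping
        periodic = np.sin(mapped)              # periodic activation
        noise = self.rng.normal(0.0, self.noise_scale, periodic.shape)
        return periodic + noise                # noise addition
```

Because `sin` is periodic, any two inputs whose affine images differ by a multiple of 2π produce the same transformed value, which is precisely why a transformed sample cannot be inverted to a unique original sample.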
Optionally, the participant device is further configured to:
obtaining a sample to be predicted, and performing sample transformation based on periodicity on the sample to be predicted according to the private privacy protection module to obtain a first periodic transformation prediction sample;
and performing sample prediction on the first periodic transformation prediction sample according to the local shared neural network to obtain a first sample prediction result.
Optionally, the participant device is further configured to:
and issuing the private privacy protection module and a preset noise threshold to a client.
Optionally, the client is further configured to:
and adjusting a private noise module in the private privacy protection module according to the local additional noise determined by the preset noise threshold value to obtain a target privacy protection module.
The horizontal federal learning system provided by the embodiment of the application adopts the horizontal federated model construction optimization method of the above embodiment, and thereby solves the technical problem of how to improve the communication efficiency among the participant devices in the horizontal federated modeling process on the premise of protecting data privacy. Compared with the prior art, the beneficial effects of the horizontal federal learning system provided by this embodiment are the same as those of the horizontal federated model construction optimization method provided by the above embodiment, and the other technical features of the system are the same as those disclosed in the method of the above embodiment, which are not repeated here.
Example five
An embodiment of the present application provides an electronic device, and the electronic device includes: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method for optimizing horizontal federated model building in the first embodiment.
Referring now to FIG. 5, shown is a schematic diagram of an electronic device suitable for use in implementing embodiments of the present disclosure. The electronic devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., car navigation terminals), and the like, and fixed terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, the electronic device may include a processing means (e.g., a central processing unit, a graphics processor, etc.) that can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) or a program loaded from a storage means into a random access memory (RAM). The RAM also stores various programs and data necessary for the operation of the electronic device. The processing means, the ROM, and the RAM are connected to one another via the bus. An input/output (I/O) interface is also connected to the bus.
Generally, the following systems may be connected to the I/O interface: input devices including, for example, touch screens, touch pads, keyboards, mice, image sensors, microphones, accelerometers, gyroscopes, and the like; output devices including, for example, liquid Crystal Displays (LCDs), speakers, vibrators, and the like; storage devices including, for example, magnetic tape, hard disk, etc.; and a communication device. The communication means may allow the electronic device to communicate wirelessly or by wire with other devices to exchange data. While the figures illustrate an electronic device with various systems, it is to be understood that not all illustrated systems are required to be implemented or provided. More or fewer systems may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means, or installed from a storage means, or installed from a ROM. The computer program, when executed by a processing device, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
According to the electronic equipment provided by the application, the method for constructing and optimizing the horizontal federated model in the embodiment is adopted, and the technical problem of how to improve the communication efficiency among the equipment of each participant in the horizontal federated modeling process on the premise of protecting data privacy is solved. Compared with the prior art, the beneficial effects of the electronic device provided by the embodiment of the application are the same as the beneficial effects of the horizontal federal model construction optimization method provided by the embodiment, and other technical features of the electronic device are the same as those disclosed by the embodiment method, which are not repeated herein.
It should be understood that portions of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof. In the foregoing description of embodiments, the particular features, structures, materials, or characteristics may be combined in any suitable manner in any one or more embodiments or examples.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto; any changes or substitutions that a person skilled in the art could readily conceive of within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Example six
The present embodiment provides a computer-readable storage medium having computer-readable program instructions stored thereon for performing the horizontal federated model construction optimization method in the first embodiment described above.
The computer-readable storage medium provided by the embodiments of the present application may be, for example, a USB flash disk, but is not limited thereto; it may be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present embodiment, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system or device. Program code embodied on a computer-readable storage medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer-readable storage medium may be embodied in an electronic device; or may be present alone without being incorporated into the electronic device.
The computer-readable storage medium carries one or more programs which, when executed by an electronic device, cause the electronic device to: obtain a local sample, and input the local sample into a private privacy protection module to perform periodicity-based sample transformation on the local sample to obtain a periodic transformation sample; perform local iterative training optimization on the private privacy protection module and the local shared neural network by inputting the periodic transformation sample into the local shared neural network; obtain local network parameters of the local shared neural network, and upload the local network parameters to a federal server, where the federal server is used for aggregating the local network parameters uploaded by each participant device into federal network parameters; receive the federal network parameters issued by the federal server, and update the local network parameters of the local shared neural network into the federal network parameters; and return to the execution step of obtaining a local sample, inputting the local sample into the private privacy protection module, and performing periodicity-based sample transformation on the local sample to obtain a periodic transformation sample, until completion of the horizontal federated learning modeling is detected.
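The round-based loop that this program describes can be sketched as follows; the participant, server, and network interfaces (`transform`, `train_step`, `get_params`, `set_params`, `aggregate`) are illustrative assumptions rather than names from the patent:

```python
def run_horizontal_federated_training(participants, server, rounds):
    """Sketch of one horizontal federated modeling run: each participant
    transforms its local samples with its private privacy module, locally
    trains the shared network on the transformed samples, uploads the
    shared network's parameters, and overwrites them with the federal
    network parameters aggregated by the server."""
    for _ in range(rounds):
        local_params = []
        for p in participants:
            periodic_samples = p.privacy_module.transform(p.local_samples)
            p.shared_net.train_step(periodic_samples, p.labels)
            local_params.append(p.shared_net.get_params())
        federal_params = server.aggregate(local_params)
        for p in participants:
            # Only the shared network is synchronized; the private
            # privacy module never leaves the participant.
            p.shared_net.set_params(federal_params)
```

Because only the shared network's parameters are uploaded in plaintext, while the privacy module and raw samples stay local, the per-round communication cost is limited to one parameter list per participant.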
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present disclosure may be implemented by software or hardware. The name of a module does not, in some cases, constitute a limitation on the module itself.
The computer-readable storage medium provided by the application stores computer-readable program instructions for executing the above-mentioned horizontal federated model construction optimization method, and solves the technical problem of how to improve the communication efficiency between the participating side devices in the horizontal federated modeling process on the premise of data privacy protection. Compared with the prior art, the beneficial effects of the computer-readable storage medium provided by the embodiment of the present application are the same as the beneficial effects of the horizontal federal model construction optimization method provided by the above embodiment, and are not described herein again.
Example seven
The present application also provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of the horizontal federated model construction optimization method described above.
The computer program product solves the technical problem of how to improve the communication efficiency among the participant devices in the horizontal federal modeling process on the premise of protecting data privacy. Compared with the prior art, the beneficial effects of the computer program product provided by the embodiment of the application are the same as those of the horizontal federal model construction optimization method provided by the embodiment, and are not repeated herein.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.
Claims (11)
1. A horizontal federated model construction optimization method, characterized by being applied to a participant device and comprising the following steps:
obtaining a local sample, and inputting the local sample into a private privacy protection module to perform periodic sample transformation on the local sample to obtain a periodic transformation sample;
performing local iterative training optimization on the private privacy protection module and the local shared neural network by inputting the periodic transformation samples into the local shared neural network;
obtaining local network parameters of the local shared neural network, and uploading the local network parameters to a federal server, wherein the federal server is used for aggregating the local network parameters uploaded by each participant device into federal network parameters;
receiving the federal network parameters issued by the federal server, and updating the local network parameters of the local shared neural network into the federal network parameters;
and returning to the execution step of obtaining a local sample, inputting the local sample into the private privacy protection module, and performing periodicity-based sample transformation on the local sample to obtain a periodic transformation sample, until completion of the horizontal federated learning modeling is detected.
2. The method of optimizing horizontal federated model building of claim 1, wherein the private privacy preserving module comprises a private periodic neural network and a private noise module,
the step of obtaining a periodically transformed sample by inputting the local sample into a private privacy protection module and performing a periodic-based sample transformation on the local sample comprises:
inputting the local sample into the private periodic neural network, and performing periodic-based sample mapping on the local sample to obtain a periodic mapping sample;
and according to the private noise module, carrying out noise addition on the periodic mapping sample to obtain the periodic transformation sample.
3. The method of claim 2, wherein the private periodic neural network comprises neural network parameters and a periodic activation function,
the step of inputting the local samples into the private periodic neural network, and performing periodic-based sample mapping on the local samples to obtain periodic mapping samples comprises:
according to the neural network parameters, carrying out sample mapping on the local sample to obtain a mapping sample;
and according to the periodic activation function, carrying out periodic activation on the mapping sample to obtain the periodic mapping sample.
4. The method for optimizing horizontal federal model build as in claim 1 further comprising:
obtaining a sample to be predicted, and performing sample transformation based on periodicity on the sample to be predicted according to the private privacy protection module to obtain a first periodic transformation prediction sample;
and performing sample prediction on the first periodic transformation prediction sample according to the local shared neural network to obtain a first sample prediction result.
5. The method for optimizing horizontal federated model build according to any one of claims 1 to 4, wherein the method for optimizing horizontal federated model build further comprises:
receiving a second periodic transformation prediction sample sent by a client, and performing sample prediction on the second periodic transformation prediction sample according to a local shared neural network to obtain a second sample prediction result, wherein the second periodic transformation prediction sample is obtained by performing sample transformation based on periodicity on a sample to be predicted by the client according to a target privacy protection module determined by a private privacy protection module issued by the participant equipment;
and feeding back the second sample prediction result to the client.
6. The method of claim 5, wherein prior to the step of receiving the second periodic transformed prediction samples sent by the client, the method of optimizing horizontal federated model building further comprises:
and issuing the private privacy protection module and a preset noise threshold value to a client, so that the client adjusts the private noise module in the private privacy protection module according to the local additional noise determined by the preset noise threshold value to obtain a target privacy protection module.
7. A lateral federal learning system, comprising:
at least one participant device, where the participant device is used for: obtaining a local sample, and inputting the local sample into a private privacy protection module to perform periodicity-based sample transformation on the local sample to obtain a periodic transformation sample; performing local iterative training optimization on the private privacy protection module and a local shared neural network by inputting the periodic transformation sample into the local shared neural network; obtaining local network parameters of the local shared neural network, and uploading the local network parameters to a federal server; receiving the federal network parameters issued by the federal server, and updating the local network parameters of the local shared neural network into the federal network parameters; and returning to the execution step of obtaining a local sample, inputting the local sample into the private privacy protection module, and performing periodicity-based sample transformation on the local sample to obtain a periodic transformation sample, until completion of the horizontal federated learning modeling is detected;
and the federal server is used for aggregating the local network parameters uploaded by the participant equipment into federal network parameters.
8. The lateral federal learning system of claim 7 further comprising a client,
the participant equipment is also used for issuing the private privacy protection module to a client; receiving a periodic transformation prediction sample sent by the client, and performing sample prediction on the periodic transformation prediction sample according to a local shared neural network to obtain a sample prediction result; feeding back a sample prediction result to the client;
the client is used for performing periodicity-based sample transformation on a sample to be predicted according to the target privacy protection module determined by the private privacy protection module, to obtain a periodic transformation prediction sample; sending the periodic transformation prediction sample to the participant device; and receiving a sample prediction result fed back by the participant device.
9. An electronic device, characterized in that the electronic device comprises:
at least one processor; and the number of the first and second groups,
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the steps of the horizontal federated model construction optimization method as claimed in any one of claims 1 to 6.
10. A computer-readable storage medium having stored thereon a program for implementing a horizontal federated model build optimization method, the program being executed by a processor to implement the steps of the horizontal federated model build optimization method as recited in any one of claims 1 to 6.
11. A computer program product comprising a computer program which, when executed by a processor, implements the steps of the horizontal federated model construction optimization method as claimed in any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211282836.6A CN115438807A (en) | 2022-10-19 | 2022-10-19 | Method, system, equipment, medium and product for optimizing horizontal federal model construction |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115438807A true CN115438807A (en) | 2022-12-06 |
Family
ID=84252877
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114091617B (en) | Federal learning modeling optimization method, electronic device, storage medium, and program product | |
CN112508118A (en) | Target object behavior prediction method aiming at data migration and related equipment thereof | |
WO2022103330A1 (en) | Data protection method, apparatus, medium, and device | |
CN113190872B (en) | Data protection method, network structure training method, device, medium and equipment | |
CN110009101A (en) | Method and apparatus for generating quantization neural network | |
CN112989203B (en) | Material throwing method, device, equipment and medium | |
CN112700067B (en) | Method and system for predicting service quality in unreliable mobile edge environment | |
CN113537512A (en) | Model training method, device, system, equipment and medium based on federal learning | |
CN116340632A (en) | Object recommendation method, device, medium and electronic equipment | |
CN117241092A (en) | Video processing method and device, storage medium and electronic equipment | |
CN114647721B (en) | Educational intelligent robot control method, device and medium | |
CN115438807A (en) | Method, system, equipment, medium and product for optimizing horizontal federal model construction | |
CN115099540B (en) | Carbon neutralization treatment method based on artificial intelligence | |
CN114595474A (en) | Federal learning modeling optimization method, electronic device, medium, and program product | |
CN114237962B (en) | Alarm root cause judging method, model training method, device, equipment and medium | |
CN113723712B (en) | Wind power prediction method, system, equipment and medium | |
CN115293889A (en) | Credit risk prediction model training method, electronic device and readable storage medium | |
CN115470908A (en) | Model security inference method, electronic device, medium, and program product | |
CN111680754B (en) | Image classification method, device, electronic equipment and computer readable storage medium | |
CN118536021B (en) | Real-time groundwater level dynamic monitoring system based on multiple sensors | |
CN114595634B (en) | Modeling method, device, equipment and medium for thermal power generating unit control system | |
CN115470292B (en) | Block chain consensus method, device, electronic equipment and readable storage medium | |
CN118245341B (en) | Service model switching method, device, electronic equipment and computer readable medium | |
CN117648167B (en) | Resource scheduling method, device, equipment and storage medium | |
CN114374738B (en) | Information pushing method and device, storage medium and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||