CN110874637A - Multi-target fusion learning method, device and system based on privacy data protection - Google Patents

Multi-target fusion learning method, device and system based on privacy data protection

Info

Publication number
CN110874637A
Authority
CN
China
Prior art keywords
hidden layer
layer parameters
learning
local
integrated
Prior art date
Legal status
Granted
Application number
CN202010048787.4A
Other languages
Chinese (zh)
Other versions
CN110874637B (en)
Inventor
刘磊
Current Assignee
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd
Priority to CN202010048787.4A
Publication of CN110874637A
Application granted
Publication of CN110874637B
Status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Image Analysis (AREA)

Abstract

The embodiments of the specification provide a multi-target fusion learning method, device and system based on privacy data protection. In the method, the cloud obtains a plurality of learning targets for fusion learning by a plurality of local ends, determines a plurality of hidden layer parameters corresponding to those learning targets, and sends the hidden layer parameters to the local ends. Any one of the local ends can then, based on a neural network model, perform multi-target learning training on its local privacy data using the hidden layer parameters sent by the cloud as initial training parameters, and share the plurality of updated hidden layer parameters obtained by learning with the cloud. The cloud integrates the updated hidden layer parameters of the different learning targets; when the integrated hidden layer parameters meet a preset condition, they are sent to the corresponding local end, which combines them with its learning target to obtain a target model.

Description

Multi-target fusion learning method, device and system based on privacy data protection
Technical Field
This document relates to the field of machine learning, and in particular to a multi-target fusion learning method, device and system based on privacy data protection.
Background
Generally, when a local end needs data from other local ends for learning, a plurality of local ends may join with a cloud end for fusion learning, such as federated learning, in order to ensure data privacy and security. In fusion learning, each of the local ends learns on its local data and shares the learning result with the cloud, thereby achieving the purpose of fusion learning.
When the plurality of local ends perform fusion learning, their learning targets are generally the same; for example, all of the local ends train to obtain the same model. In many practical application scenarios, however, the learning targets of different local ends may differ, and an effective scheme for multi-target learning in a fusion learning scenario is still lacking.
Disclosure of Invention
The embodiments of the specification provide a multi-target fusion learning method, device and system based on privacy data protection, to solve the problem that multi-target learning cannot be carried out effectively in a fusion learning scenario.
In order to solve the above technical problem, the embodiments of the present specification are implemented as follows:
in a first aspect, a multi-target fusion learning method based on privacy data protection is provided, which includes:
the cloud end determines a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during fusion learning of the plurality of local ends; sending the plurality of hidden layer parameters to the plurality of local terminals;
any one of the plurality of local terminals performs multi-target learning training on local privacy data by taking the plurality of hidden layer parameters as initial training parameters based on a neural network model to obtain a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters; sending the updated hidden layer parameters to the cloud;
the cloud end integrates the plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets; for the integrated hidden layer parameters corresponding to any learning target, when it is determined that the integrated hidden layer parameters meet a preset condition, the integrated hidden layer parameters are sent to the corresponding local end;
and the local end determines a target model corresponding to the learning target of the local end based on the integrated hidden layer parameters.
In a second aspect, a multi-target fusion learning method based on privacy data protection is provided, which is applied to a cloud, and includes:
determining a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during the fusion learning of a plurality of local terminals;
sending the plurality of hidden layer parameters to the plurality of local terminals, so that any one of the plurality of local terminals performs multi-target learning training on local privacy data by taking the plurality of hidden layer parameters as initial training parameters based on a neural network model, and obtaining a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters;
receiving a plurality of updated hidden layer parameters respectively sent by the plurality of local terminals;
integrating the plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets;
for the integrated hidden layer parameters corresponding to any learning target, when it is determined that the integrated hidden layer parameters meet a preset condition, sending the integrated hidden layer parameters to the corresponding local end, so that the local end determines a target model corresponding to its learning target according to the integrated hidden layer parameters.
In a third aspect, a multi-target fusion learning device based on private data protection is provided, which is applied to a cloud, and includes:
a determining unit configured to determine, based on a plurality of learning targets at the time of a plurality of local-side fusion learning, a plurality of hidden layer parameters corresponding to the plurality of learning targets;
the first sending unit is used for sending the hidden layer parameters to the local ends so that any local end of the local ends can perform multi-target learning training on local privacy data by taking the hidden layer parameters as initial training parameters based on a neural network model to obtain a plurality of updated hidden layer parameters corresponding to the hidden layer parameters;
a receiving unit configured to receive a plurality of updated hidden layer parameters transmitted from the plurality of local terminals, respectively;
the integration unit is used for integrating the plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets;
and a second sending unit, configured to send, for an integrated hidden layer parameter corresponding to any learning target, the integrated hidden layer parameter to a corresponding local end when it is determined that the integrated hidden layer parameter meets a preset condition, so that the local end determines, according to the integrated hidden layer parameter, a target model corresponding to a learning target of the local end.
In a fourth aspect, an electronic device is provided, which includes:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to:
determining a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during the fusion learning of a plurality of local terminals;
sending the plurality of hidden layer parameters to the plurality of local terminals, so that any one of the plurality of local terminals performs multi-target learning training on local privacy data by taking the plurality of hidden layer parameters as initial training parameters based on a neural network model, and obtaining a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters;
receiving a plurality of updated hidden layer parameters respectively sent by the plurality of local terminals;
integrating the plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets;
for the integrated hidden layer parameters corresponding to any learning target, when it is determined that the integrated hidden layer parameters meet a preset condition, the integrated hidden layer parameters are sent to the corresponding local end, so that the local end determines a target model corresponding to its learning target according to the integrated hidden layer parameters.
In a fifth aspect, a computer-readable storage medium is presented, the computer-readable storage medium storing one or more programs that, when executed by an electronic device comprising a plurality of application programs, cause the electronic device to perform the method of:
determining a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during the fusion learning of a plurality of local terminals;
sending the plurality of hidden layer parameters to the plurality of local terminals, so that any one of the plurality of local terminals performs multi-target learning training on local privacy data by taking the plurality of hidden layer parameters as initial training parameters based on a neural network model, and obtaining a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters;
receiving a plurality of updated hidden layer parameters respectively sent by the plurality of local terminals;
integrating the plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets;
for the integrated hidden layer parameters corresponding to any learning target, when it is determined that the integrated hidden layer parameters meet a preset condition, the integrated hidden layer parameters are sent to the corresponding local end, so that the local end determines a target model corresponding to its learning target according to the integrated hidden layer parameters.
In a sixth aspect, a multi-target fusion learning method based on privacy data protection is provided, which is applied to a local end and includes:
receiving a plurality of hidden layer parameters sent by a cloud, wherein the hidden layer parameters correspond to a plurality of learning targets during fusion learning of a plurality of local terminals and are determined by the cloud based on the learning targets;
based on a neural network model, learning and training local privacy data by taking the hidden layer parameters as initial training parameters to obtain a plurality of updated hidden layer parameters corresponding to the hidden layer parameters;
sending the updated hidden layer parameters to the cloud end so that the cloud end integrates the updated hidden layer parameters and the updated hidden layer parameters respectively sent by other local ends to obtain integrated hidden layer parameters corresponding to the learning targets;
receiving an integrated hidden layer parameter corresponding to a learning target of the local end and sent by the cloud end, wherein the integrated hidden layer parameter is sent by the cloud end when the integrated hidden layer parameter is determined to meet a preset condition;
and determining a target model corresponding to the learning target of the local end based on the integrated hidden layer parameters.
In a seventh aspect, a multi-target fusion learning apparatus based on privacy data protection is provided, which is applied to a local end and includes:
the first receiving unit is used for receiving a plurality of hidden layer parameters sent by a cloud end, wherein the hidden layer parameters correspond to a plurality of learning targets during fusion learning of a plurality of local ends, and the hidden layer parameters are determined by the cloud end based on the learning targets;
the learning training unit is used for performing learning training on local privacy data by taking the hidden layer parameters as initial training parameters based on a neural network model to obtain a plurality of updated hidden layer parameters corresponding to the hidden layer parameters;
the sending unit is used for sending the updated hidden layer parameters to the cloud end so that the cloud end integrates the updated hidden layer parameters and the updated hidden layer parameters respectively sent by other local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets;
a second receiving unit, configured to receive an integrated hidden layer parameter corresponding to a learning target of the local end, where the integrated hidden layer parameter is sent by the cloud end when it is determined that the integrated hidden layer parameter satisfies a preset condition;
and the determining unit is used for determining a target model corresponding to the learning target of the local end based on the integrated hidden layer parameters.
In an eighth aspect, an electronic device is provided, which includes:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to:
receiving a plurality of hidden layer parameters sent by a cloud, wherein the hidden layer parameters correspond to a plurality of learning targets during fusion learning of a plurality of local terminals and are determined by the cloud based on the learning targets;
based on a neural network model, learning and training local privacy data by taking the hidden layer parameters as initial training parameters to obtain a plurality of updated hidden layer parameters corresponding to the hidden layer parameters;
sending the updated hidden layer parameters to the cloud end so that the cloud end integrates the updated hidden layer parameters and the updated hidden layer parameters respectively sent by other local ends to obtain integrated hidden layer parameters corresponding to the learning targets;
receiving an integrated hidden layer parameter corresponding to a learning target of the local end and sent by the cloud end, wherein the integrated hidden layer parameter is sent by the cloud end when the integrated hidden layer parameter is determined to meet a preset condition;
and determining a target model corresponding to the learning target of the local end based on the integrated hidden layer parameters.
In a ninth aspect, a computer-readable storage medium is presented, the computer-readable storage medium storing one or more programs that, when executed by an electronic device comprising a plurality of application programs, cause the electronic device to perform the method of:
receiving a plurality of hidden layer parameters sent by a cloud, wherein the hidden layer parameters correspond to a plurality of learning targets during fusion learning of a plurality of local terminals and are determined by the cloud based on the learning targets;
based on a neural network model, learning and training local privacy data by taking the hidden layer parameters as initial training parameters to obtain a plurality of updated hidden layer parameters corresponding to the hidden layer parameters;
sending the updated hidden layer parameters to the cloud end so that the cloud end integrates the updated hidden layer parameters and the updated hidden layer parameters respectively sent by other local ends to obtain integrated hidden layer parameters corresponding to the learning targets;
receiving an integrated hidden layer parameter corresponding to a learning target of the local end and sent by the cloud end, wherein the integrated hidden layer parameter is sent by the cloud end when the integrated hidden layer parameter is determined to meet a preset condition;
and determining a target model corresponding to the learning target of the local end based on the integrated hidden layer parameters.
In a tenth aspect, a multi-target fusion learning system based on private data protection is provided, including a cloud and a plurality of local terminals, wherein:
the cloud end determines a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during fusion learning of the plurality of local ends; sending the plurality of hidden layer parameters to the plurality of local terminals;
any one of the plurality of local terminals performs learning training on local privacy data based on a neural network model, using the plurality of hidden layer parameters as initial training parameters, to obtain a plurality of updated hidden layer parameters; and sends the updated hidden layer parameters to the cloud;
the cloud end integrates the plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated parameters corresponding to the learning targets; for the integrated parameters corresponding to any learning target, when the integrated parameters meet a preset condition, the integrated parameters are sent to the corresponding local end;
and the local end determines a target model corresponding to the learning target of the local end based on the integrated parameters.
At least one technical scheme adopted by one or more embodiments of the specification can achieve the following technical effects:
according to the technical scheme provided by one or more embodiments of the specification, when multiple local ends perform multi-target fusion learning, because any local end can perform multi-target learning training on local private data based on hidden layer parameters issued by a neural network model and a cloud end, and sends a plurality of updated hidden layer parameters obtained by learning to the cloud end, the updated hidden layer parameters are integrated by the cloud end, and when the integrated hidden layer parameters corresponding to any learning target meet preset conditions, the integrated hidden layer parameters are issued to the corresponding local ends, and the local ends combine with the learning targets to obtain target models, so that the hidden layer parameters can be shared by the multiple local ends to the cloud end and the cloud end to be integrated, and the purpose of performing multi-target fusion learning by the multiple local ends is effectively achieved. In addition, in the fusion learning process, the local privacy data cannot be shared by the local ends, so that the safety of the local privacy data of the local ends can be ensured.
Drawings
To illustrate the technical solutions in the embodiments of the present specification or in the prior art more clearly, the drawings needed in describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some of the embodiments described in this specification, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an exemplary system architecture provided by an embodiment of the present disclosure;
FIG. 2 is a flow diagram illustrating a multi-objective fusion learning method based on privacy data protection according to an embodiment of the present disclosure;
FIG. 3 is a flow diagram illustrating a multi-objective fusion learning method based on privacy data protection according to an embodiment of the present disclosure;
FIG. 4 is a flow diagram illustrating a multi-objective fusion learning method based on privacy data protection according to an embodiment of the present disclosure;
FIG. 5 is a flow diagram illustrating a multi-objective fusion learning method based on privacy data protection according to an embodiment of the present disclosure;
FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present description;
FIG. 7 is a schematic structural diagram of a multi-objective fusion learning apparatus based on privacy data protection according to an embodiment of the present specification;
FIG. 8 is a schematic structural diagram of an electronic device according to one embodiment of the present description;
FIG. 9 is a schematic structural diagram of a multi-objective fusion learning apparatus based on privacy data protection according to an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of a multi-target fusion learning system based on privacy data protection according to an embodiment of the present specification.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the embodiments of the present disclosure, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in one or more embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, and not all embodiments. All other embodiments obtained by a person skilled in the art without making creative efforts based on the embodiments in the present description shall fall within the protection scope of this document.
In a scenario of fusion learning (such as federated learning), to achieve the purpose of multi-target fusion learning while protecting the security of local privacy data, embodiments of the present specification provide a multi-target fusion learning method, apparatus, and system based on privacy data protection. The method provided by the embodiments of the present description may be executed by an electronic device, where the electronic device may be a terminal device or a server device; in other words, the method may be performed by software or hardware installed on the terminal device or the server device. The server device includes but is not limited to: a single server, a server cluster, a cloud server, a cloud server cluster, and the like. The terminal device includes but is not limited to: any smart terminal device such as a smart phone, a personal computer (PC), a notebook computer, a tablet computer, an electronic reader, a web TV, or a wearable device.
A possible application scenario of the technical solution provided in the embodiment of the present specification is described below with reference to fig. 1.
As shown in fig. 1, the system architecture provided in the embodiment of the present specification includes: the cloud end 11 and the local ends 12, 13, …, 1N (N is an integer greater than 2, determined by the actual number of local ends). The local ends can connect to the cloud end 11 through a network for data interaction, and can also connect to one another through the network (not shown in the figure). Each local end can provide local privacy data, but the local ends do not exchange privacy data with one another; in a multi-target fusion learning scenario, the local ends have multiple learning targets.
In the application scenario shown in fig. 1, the cloud end 11 and the local ends 12, 13, …, 1N may serve as execution subjects of the multi-target fusion learning method based on privacy data protection provided in the embodiment of the present specification. In a more specific application scenario, the cloud end 11 may represent a centralized organization, and the local ends 12, 13, …, 1N may represent N business organizations capable of providing privacy data.
The technical solutions provided by the embodiments of the present description are described in detail below with reference to the accompanying drawings.
Fig. 2 is a flowchart illustrating a multi-objective fusion learning method based on privacy data protection according to an embodiment of the present disclosure. The method can be applied to the cloud end 11 and the local ends 12, 13, …, 1N shown in fig. 1, and the method can include:
s202: the cloud end determines a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during fusion learning of the plurality of local ends; and sending the plurality of hidden layer parameters to the plurality of local terminals.
In S202, when multiple local terminals perform multi-target fusion learning, the cloud may obtain multiple learning targets during the multi-target fusion learning. The cloud end can actively acquire a plurality of learning targets from a plurality of local ends, or the local ends can actively upload the learning targets to the cloud end, and no specific limitation is made here.
In this embodiment, a local end may have one or more learning targets, and the learning targets of different local ends may be the same or different. For the description here, it is assumed that each local end has one learning target and that the learning targets of different local ends differ.
After obtaining the plurality of learning targets, the cloud can determine a plurality of hidden layer parameters corresponding to them. A hidden layer parameter can be understood as the parameters of a hidden layer in the neural network model; one hidden layer parameter can correspond to one learning target, and a local end can use it to train on its local privacy data for that learning target, based on a neural network model carrying the hidden layer parameter.
In this embodiment, the cloud can determine the plurality of hidden layer parameters based on the plurality of learning targets by at least the following two methods:
the first method comprises the following steps:
based on a plurality of learning targets, a plurality of hidden layer parameters corresponding to the plurality of learning targets are determined from a plurality of standard hidden layer parameters obtained in advance.
Specifically, the cloud may store a plurality of standard hidden layer parameters corresponding to a plurality of different learning objectives in advance according to the plurality of different learning objectives. The plurality of standard hidden layer parameters may be obtained by learning in advance based on historical data, may also be obtained by determining through historical experience, and may also be obtained by other ways, which are not specifically limited herein.
After the cloud stores the multiple standard hidden layer parameters in advance, when the multiple learning targets of the multiple local ends are obtained, the cloud can search the corresponding standard hidden layer parameters from the multiple standard hidden layer parameters stored in advance based on the multiple learning targets, and use the searched standard hidden layer parameters as the multiple hidden layer parameters corresponding to the multiple learning targets of the multiple local ends.
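As a minimal sketch of this first method, the registry below maps learning-target identifiers to pre-stored standard hidden layer parameters; the identifiers, shapes, and random placeholders are all illustrative assumptions, not part of the patent:

```python
import numpy as np

# Hypothetical registry of standard hidden layer parameters, keyed by a
# learning-target identifier. In practice these would come from prior
# learning on historical data or from historical experience.
rng = np.random.default_rng(0)
STANDARD_HIDDEN_PARAMS = {
    "target_T1": rng.normal(size=(64, 32)),  # placeholder weight matrix
    "target_T2": rng.normal(size=(64, 32)),
}

def lookup_hidden_params(learning_targets):
    """First method: look up the pre-stored standard hidden layer
    parameter for each requested learning target."""
    return {t: STANDARD_HIDDEN_PARAMS[t] for t in learning_targets}
```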
The second method comprises the following steps:
based on a plurality of learning targets, learning and training the pre-stored data of a plurality of local terminals by using a neural network model to obtain a plurality of hidden layer parameters corresponding to the plurality of learning targets.
Specifically, the cloud end may store data in the plurality of local ends in advance, and the data may be data that can be shared among the plurality of local ends and that can be used for performing multi-target fusion learning. Therefore, when the cloud acquires a plurality of learning targets of a plurality of local ends, the learning training of the plurality of learning targets can be carried out on the data in the plurality of local ends stored in advance based on the neural network model, and after the training is finished, a plurality of hidden layer parameters corresponding to the plurality of learning targets can be obtained.
It should be noted that, when the cloud performs learning training of multiple learning targets on the data in the multiple local ends stored in advance based on the neural network model, in an implementation manner, the cloud may perform learning training on the data of the multiple local ends stored in advance based on one neural network model, and the learning training target of the one neural network model may be multiple learning targets of the multiple local ends; in another implementation manner, the cloud end may perform learning training on pre-stored data of a plurality of local ends based on a plurality of neural network models, where one neural network model may be used to perform learning training of a learning target on the pre-stored data of the plurality of local ends.
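A minimal sketch of the second method, in the one-model-per-target variant, might look as follows; the architecture, loss, and hyperparameters are assumptions for illustration (PyTorch is used here, though the patent does not name a framework):

```python
import torch
from torch import nn

def pretrain_hidden_params(shared_x, labels_by_target, epochs=5):
    """Second method (sketch): train one small network per learning target
    on data pre-stored at the cloud, keeping only the hidden layer's
    parameters for each target."""
    hidden_params = {}
    for target, y in labels_by_target.items():
        model = nn.Sequential(
            nn.Linear(shared_x.shape[1], 32),  # the "hidden layer"
            nn.ReLU(),
            nn.Linear(32, 1),                  # target-specific output head
        )
        opt = torch.optim.SGD(model.parameters(), lr=0.1)
        loss_fn = nn.BCEWithLogitsLoss()
        for _ in range(epochs):
            opt.zero_grad()
            loss = loss_fn(model(shared_x).squeeze(-1), y)
            loss.backward()
            opt.step()
        hidden_params[target] = model[0].state_dict()  # only the hidden layer
    return hidden_params
```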
Between the two methods, in practical application the first may be preferred when the cloud has no pre-stored data of the plurality of local ends, and the second may be preferred when it does. Of course, even when the data of the plurality of local ends is pre-stored at the cloud, the first method can still be chosen to determine the plurality of hidden layer parameters.
After determining the plurality of hidden layer parameters corresponding to the plurality of learning targets, the cloud can send them to the plurality of local ends. The cloud sends all of the hidden layer parameters to each local end; that is, any one local end receives the plurality of hidden layer parameters from the cloud. This allows any local end to perform multi-target learning on its local privacy data based on the plurality of hidden layer parameters, and to carry out multi-target fusion learning based on the results of that learning.
Optionally, when the cloud sends the plurality of hidden layer parameters to the plurality of local terminals, in order to ensure the security of data transmission, the cloud may encrypt the plurality of hidden layer parameters before sending the plurality of hidden layer parameters, and then send the encrypted hidden layer parameters to the plurality of local terminals. The encryption mode may be symmetric encryption or asymmetric encryption, and is not limited herein.
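As one concrete (and purely illustrative) choice of symmetric encryption, the hidden layer parameters could be serialized and wrapped with Fernet from the `cryptography` package; key distribution is assumed to happen out of band, and this sketch stands in for whichever cipher an implementation actually adopts:

```python
import pickle
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # assumed to be shared with the local ends securely
cipher = Fernet(key)

def encrypt_params(params):
    """Serialize and symmetrically encrypt a set of hidden layer parameters."""
    return cipher.encrypt(pickle.dumps(params))

def decrypt_params(token):
    """Inverse of encrypt_params; only holders of the key can recover them."""
    return pickle.loads(cipher.decrypt(token))
```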
S204: any one of the plurality of local terminals performs multi-target learning training on local privacy data by taking the plurality of hidden layer parameters as initial training parameters based on a neural network model to obtain a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters; and sending the updated hidden layer parameters to the cloud.
In S204, after the cloud sends the hidden layer parameters to the local ends, the local ends may perform learning training on the local privacy data based on the hidden layer parameters. The following description will be given taking any one of a plurality of local terminals as an example.
After receiving the plurality of hidden layer parameters, the local end can perform multi-target learning training on its local privacy data based on a neural network model, taking the received hidden layer parameters as initial training parameters. The local end may use one neural network model or several. With one neural network model, the local end performs multi-target learning training on the local privacy data with that model, and the initial training parameters of its hidden layer are the plurality of hidden layer parameters received from the cloud. With several neural network models, the local end uses one model per learning target, and the initial training parameters of that model's hidden layer are the hidden layer parameter corresponding to that learning target among the plurality of hidden layer parameters.
In this embodiment, during the process of performing multi-target learning training on the local privacy data, the plurality of hidden layer parameters will change, and after the training is finished, a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters can be obtained.
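A minimal sketch of this local training step, in the one-model-per-target variant, could look like the following; shapes, the optimizer, and the loss are illustrative assumptions consistent with the cloud-side sketch above:

```python
import torch
from torch import nn

def local_multi_target_training(hidden_params, local_x, labels_by_target, epochs=3):
    """S204 sketch at one local end: for each learning target, initialize the
    hidden layer from the cloud-issued parameters, train on local privacy
    data, and collect the updated hidden layer parameters."""
    updated = {}
    for target, init_state in hidden_params.items():
        hidden = nn.Linear(local_x.shape[1], 32)
        hidden.load_state_dict(init_state)       # cloud parameters as init
        model = nn.Sequential(hidden, nn.ReLU(), nn.Linear(32, 1))
        opt = torch.optim.SGD(model.parameters(), lr=0.05)
        loss_fn = nn.BCEWithLogitsLoss()
        for _ in range(epochs):
            opt.zero_grad()
            loss = loss_fn(model(local_x).squeeze(-1), labels_by_target[target])
            loss.backward()
            opt.step()
        updated[target] = hidden.state_dict()     # only hidden params leave the device
    return updated
```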
After obtaining the plurality of updated hidden layer parameters, the local terminal can send the plurality of updated hidden layer parameters to the cloud terminal.
Optionally, when the local end sends the plurality of updated hidden layer parameters to the cloud, in order to ensure the security of data transmission, the local end may encrypt the plurality of updated hidden layer parameters before sending the plurality of updated hidden layer parameters, and then send the plurality of encrypted updated hidden layer parameters to the cloud. The encryption mode may be symmetric encryption or asymmetric encryption, and is not limited herein.
In this embodiment, after the multiple local terminals perform multi-target learning training on the local privacy data based on the same method and obtain multiple updated hidden layer parameters, the multiple updated hidden layer parameters obtained by the multiple local terminals can be sent to the cloud.
S206: the cloud end integrates the plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets; for the integrated hidden layer parameters corresponding to any learning target, when it is determined that the integrated hidden layer parameters meet a preset condition, the integrated hidden layer parameters are sent to the corresponding local end.
In S206, after receiving the updated hidden layer parameters respectively sent by the local terminals, the cloud may integrate the updated hidden layer parameters.
Specifically, first, the cloud end may group a plurality of updated hidden layer parameters respectively sent by the plurality of local ends according to a plurality of learning targets of the plurality of local ends, so as to obtain a plurality of updated hidden layer parameters corresponding to different learning targets.
For a plurality of updated hidden layer parameters corresponding to a learning target, one of the updated hidden layer parameters may correspond to a local end, that is, one updated hidden layer parameter is from a local end.
Secondly, for any learning target in a plurality of learning targets, weighted average is performed on a plurality of updated hidden layer parameters corresponding to the learning target, so as to obtain integrated hidden layer parameters corresponding to the learning target.
When performing weighted averaging, the weights of the updated hidden layer parameters may all be set to 1, or may be determined based on a plurality of local terminals corresponding to the updated hidden layer parameters. For example, the weights of the updated hidden layer parameters may be determined based on differences in quality contributions of local private data of the local ends (including distribution imbalance, data noise, scale factor, and the like), where the larger the difference in quality contributions of the local ends is, the smaller the weight of the updated hidden layer parameter corresponding to the local end is; for another example, the weights of the updated hidden layer parameters may also be determined based on the learning and training effects of the local ends (e.g., the confidence of the neural network model obtained by training, etc.), where the better the learning effect of the local end is, the greater the weight of the updated hidden layer parameter corresponding to the local end is. And are not illustrated one by one here.
Finally, after obtaining the integrated hidden layer parameters corresponding to one learning target, the integrated hidden layer parameters corresponding to other learning targets can be obtained based on the same method, and then a plurality of integrated hidden layer parameters corresponding to a plurality of learning targets are obtained.
It should be noted that the integration described above uses weighted averaging; in other implementations, other modes may be adopted when integrating the plurality of updated hidden layer parameters corresponding to one learning target, such as a plain average or a root mean square.
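A sketch of the integration step for one learning target, with each updated hidden layer parameter represented as a flat array; the weighting scheme is left to the caller, and uniform weights (all 1) recover the plain average mentioned above:

```python
import numpy as np

def integrate(updates_by_local, weights=None):
    """Weighted average of one learning target's updated hidden layer
    parameters across local ends (the integration of S206)."""
    params = list(updates_by_local.values())
    if weights is None:
        weights = [1.0] * len(params)   # all weights set to 1
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                     # normalize so the result stays in scale
    return sum(wi * p for wi, p in zip(w, params))
```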
After the cloud obtains the plurality of integrated hidden layer parameters corresponding to the plurality of learning targets, it can further judge whether they can serve as the hidden layer parameters the local ends use to determine their target models (that is, the target results of the multi-target fusion learning). Taking the integrated hidden layer parameters corresponding to any one of the learning targets as an example, it can be judged whether those integrated hidden layer parameters meet a preset condition.
When determining whether an integrated hidden layer parameter meets the preset condition: first, according to the learning target corresponding to the integrated hidden layer parameter, the hidden layer parameter corresponding to that learning target is identified among the plurality of hidden layer parameters issued by the cloud in S202; second, it is judged whether the difference between the identified hidden layer parameter and the integrated hidden layer parameter is smaller than or equal to a preset threshold, where the threshold can be determined according to the actual situation and is not specifically limited here; finally, if the difference is smaller than or equal to the preset threshold, the integrated hidden layer parameter meets the preset condition, and otherwise it does not.
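The preset-condition test then reduces to a distance check; the Euclidean norm and the threshold value below are illustrative choices, since the patent leaves both open:

```python
import numpy as np

def meets_preset_condition(issued, integrated, threshold=1e-3):
    """True when the integrated parameters differ from the parameters the
    cloud last issued for the same learning target by at most a threshold."""
    return np.linalg.norm(integrated - issued) <= threshold
```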
In this embodiment, if the integrated hidden layer parameters satisfy the preset condition, it can be said that the integrated hidden layer parameters can be used as hidden layer parameters used when the local end determines the target model, at this time, the integrated hidden layer parameters can be sent to the corresponding local end, and the corresponding local end can be understood as a local end whose learning target is the learning target corresponding to the integrated hidden layer parameters. The local end may execute S208 after receiving the integrated hidden layer parameters.
If the integrated hidden layer parameters do not meet the preset conditions, the integrated hidden layer parameters cannot be used as hidden layer parameters used when the target model is determined by the local end, and at the moment, the cloud end can send the integrated hidden layer parameters to the plurality of local ends.
After receiving the integrated hidden layer parameters, the plurality of local ends can perform learning training on their local privacy data based on a neural network model, using the integrated hidden layer parameters as initial training parameters, to obtain a plurality of updated hidden layer parameters corresponding to the integrated hidden layer parameters; for ease of distinction, these can be called the updated integration parameters. For the specific implementation, refer to the corresponding content of S204 above, which is not repeated here.
After obtaining the plurality of updated integration parameters, the plurality of local ends can send them to the cloud, and the cloud can integrate them according to the method used for integrating the plurality of updated hidden layer parameters, obtaining integrated hidden layer parameters once again; for ease of distinction, these can be called the re-integrated parameters.
After the re-integrated parameters are obtained, whether they meet the preset condition can be judged: if the difference between the re-integrated parameters and the integrated hidden layer parameters previously issued by the cloud is smaller than or equal to the preset threshold, the re-integrated parameters meet the preset condition; otherwise, they do not.
If the re-integrated parameters meet the preset condition, they can be sent to the corresponding local end so that the local end can execute S208; otherwise, they can be sent to the plurality of local ends to continue learning and training. This process loops until the hidden layer parameters obtained after the cloud integrates the plurality of updated hidden layer parameters returned by the local ends meet the preset condition.
In this way, through multiple iterations of the hidden layer parameters between the cloud and the local ends, with the difference between two successively issued sets of hidden layer parameters serving as the termination test, a plurality of hidden layer parameters corresponding to the plurality of learning targets and satisfying the preset condition can finally be obtained. These hidden layer parameters are then sent to the corresponding local ends so that the local ends can execute S208.
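Putting the pieces together, the iteration for a single learning target might be orchestrated as in the sketch below, which reuses the `integrate` and `meets_preset_condition` sketches above and assumes each local end is reachable as a callable that runs local training and returns its update:

```python
def cloud_round_loop(initial_params, local_trainers, threshold=1e-3, max_rounds=50):
    """End-to-end sketch of S202-S206 for one learning target: issue
    parameters, collect local updates, integrate, and stop once the
    integrated result is close enough to what was last issued."""
    issued = initial_params
    for _ in range(max_rounds):
        updates = {i: train(issued) for i, train in enumerate(local_trainers)}
        integrated = integrate(updates)
        if meets_preset_condition(issued, integrated, threshold):
            return integrated   # sent only to the corresponding local end
        issued = integrated     # re-issue to all local ends and iterate
    return issued
```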
It should be noted that, for different learning targets, when determining whether the multiple integrated hidden layer parameters corresponding to the different learning targets satisfy the preset condition, the preset conditions corresponding to the different learning targets may be the same or different.
Optionally, when the cloud sends the integrated hidden layer parameters meeting the preset conditions to the corresponding local end, the cloud may encrypt the integrated hidden layer parameters to ensure the security of data transmission, and the encryption method may be symmetric encryption or asymmetric encryption.
S208: and the local end determines a target model corresponding to the learning target of the local end based on the integrated hidden layer parameters.
In S208, after receiving the integrated hidden layer parameters satisfying the preset conditions and sent by the cloud, the local end may determine a target model corresponding to its learning target in combination with its learning target.
Taking the target model of the local end as the neural network model as an example, after receiving the integrated hidden layer parameters meeting the preset conditions sent by the cloud end, the local end can take the integrated hidden layer parameters as the parameters of the hidden layer in the neural network model, and take the neural network model with the integrated hidden layer parameters as the target model corresponding to the learning target of the local end.
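As a sketch under the same illustrative assumptions as the earlier blocks, plugging the integrated parameters into the local model could look like this:

```python
import torch
from torch import nn

def build_target_model(integrated_state, in_dim, head=None):
    """S208 sketch: load the integrated hidden layer parameters into the
    hidden layer of the local neural network model; together with the
    target-specific output head, this yields the target model."""
    hidden = nn.Linear(in_dim, 32)
    hidden.load_state_dict(integrated_state)
    return nn.Sequential(hidden, nn.ReLU(), head or nn.Linear(32, 1))
```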
In the embodiment, the hidden layer parameters are shared by the plurality of local ends to the cloud end and are integrated by the cloud end, so that the aim of multi-target fusion learning by the plurality of local ends can be effectively fulfilled. In addition, in the fusion learning process, the local privacy data cannot be shared by the local ends, so that the safety of the local privacy data of the local ends can be ensured.
Fig. 3 is a flowchart illustrating a multi-objective fusion learning method based on privacy data protection according to an embodiment of the present disclosure. The execution subject in the embodiment shown in fig. 3 may be the cloud 11 shown in fig. 1, and specifically may include the following steps.
S302: and determining a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during the fusion learning of the plurality of local terminals.
Optionally, when determining a plurality of hidden layer parameters corresponding to a plurality of learning targets based on a plurality of learning targets during fusion learning of a plurality of local terminals, the cloud may include:
determining a plurality of hidden layer parameters corresponding to a plurality of learning targets from a plurality of standard hidden layer parameters obtained in advance based on the plurality of learning targets; or the like, or, alternatively,
based on a plurality of learning targets, learning and training the pre-stored data of a plurality of local terminals by using a neural network model to obtain a plurality of hidden layer parameters corresponding to the plurality of learning targets.
S304: and sending the plurality of hidden layer parameters to the plurality of local terminals, so that any one of the plurality of local terminals performs multi-target learning training on local privacy data by taking the plurality of hidden layer parameters as initial training parameters based on a neural network model, and obtaining a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters.
Optionally, when the cloud sends the plurality of hidden layer parameters to the plurality of local terminals, the cloud may include:
encrypting the hidden layer parameters to obtain a plurality of encrypted hidden layer parameters;
and sending the encrypted plurality of hidden layer parameters to a plurality of local terminals.
When any one of the plurality of local ends performs multi-target learning training based on the plurality of hidden layer parameters issued by the cloud end, the specific implementation manner may refer to the specific implementation of the corresponding steps in the embodiment shown in fig. 2, and a description thereof is not repeated.
S306: and receiving a plurality of updated hidden layer parameters respectively sent by the plurality of local terminals.
S308: integrating the plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets.
Optionally, the cloud integrates a plurality of updated hidden layer parameters respectively sent by the local terminals to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets, which may include:
grouping a plurality of updated hidden layer parameters respectively sent by the plurality of local terminals according to the plurality of learning targets to obtain a plurality of updated hidden layer parameters corresponding to different learning targets;
and aiming at any learning target, carrying out weighted average on a plurality of updated hidden layer parameters corresponding to the learning target to obtain integrated hidden layer parameters corresponding to the learning target.
S310: for the integrated hidden layer parameters corresponding to any learning target, when it is determined that the integrated hidden layer parameters meet a preset condition, the integrated hidden layer parameters are sent to the corresponding local end, so that the local end determines a target model corresponding to its learning target according to the integrated hidden layer parameters.
Optionally, when the cloud sends the integrated hidden layer parameters to the corresponding local end, the cloud may also encrypt and send the integrated hidden layer parameters, so as to ensure the security of data transmission.
Optionally, when determining that the integrated hidden layer parameters do not meet the preset conditions, the cloud may further send the integrated hidden layer parameters to the plurality of local terminals, so that the plurality of local terminals perform learning training on the local privacy data based on the neural network model by using the integrated hidden layer parameters as initial training parameters to obtain a plurality of updated integrated parameters corresponding to the integrated hidden layer parameters;
receiving a plurality of updated integration parameters sent by a plurality of local terminals;
integrating the plurality of updated integration parameters to obtain re-integrated parameters; and when the re-integrated parameters meet the preset condition, sending them to the corresponding local end so that it can determine a target model corresponding to its learning target according to the re-integrated parameters.
Specific implementation of the above S302 to S310 can refer to specific implementation of corresponding steps in the embodiment shown in fig. 2, and will not be described in detail here.
Fig. 4 is a flowchart illustrating a multi-objective fusion learning method based on privacy data protection according to an embodiment of the present disclosure. The execution main body of the embodiment shown in fig. 4 may be any one of the local terminals shown in fig. 1, and specifically may include the following steps.
S402: receiving a plurality of hidden layer parameters sent by a cloud, wherein the hidden layer parameters correspond to a plurality of learning targets during fusion learning of a plurality of local terminals, and the hidden layer parameters are determined and obtained by the cloud based on the learning targets.
S404: based on a neural network model, learning and training local privacy data by taking the hidden layer parameters as initial training parameters to obtain a plurality of updated hidden layer parameters corresponding to the hidden layer parameters.
S406: and sending the updated hidden layer parameters to the cloud end so that the cloud end integrates the updated hidden layer parameters and the updated hidden layer parameters respectively sent by other local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets.
Optionally, when the local end sends a plurality of updated hidden layer parameters to the cloud, the method may include:
encrypting the updated hidden layer parameters to obtain encrypted updated hidden layer parameters;
and sending the encrypted updated hidden layer parameters to the cloud.
S408: receiving the integrated hidden layer parameters corresponding to the learning target of the local end and sent by the cloud end, wherein the integrated hidden layer parameters are sent by the cloud end when the integrated hidden layer parameters meet preset conditions.
S410: and determining a target model corresponding to the learning target of the local end based on the integrated hidden layer parameters.
The specific implementation of S402 to S410 can refer to the specific implementation of the corresponding steps in the embodiment shown in fig. 2, and will not be described in detail here.
According to the technical scheme provided by one or more embodiments of the specification, when multiple local ends perform multi-target fusion learning, any local end can perform multi-target learning training on its local privacy data based on a neural network model and the hidden layer parameters issued by the cloud, and send the plurality of updated hidden layer parameters obtained by learning to the cloud. The cloud integrates the updated hidden layer parameters, and when the integrated hidden layer parameters corresponding to any learning target meet a preset condition, issues them to the corresponding local end, which combines them with its own learning target to obtain a target model. Since the local ends share only hidden layer parameters with the cloud, where they are integrated, the purpose of multi-target fusion learning across multiple local ends is effectively achieved. In addition, no local privacy data is shared between local ends during the fusion learning process, so the security of each local end's local privacy data can be ensured.
To facilitate understanding of the technical solutions provided by one or more embodiments of this specification, reference may be made to fig. 5, which is a schematic diagram of a scenario of the multi-target fusion learning method based on privacy data protection according to an embodiment of this specification.
When the local ends 1 to N shown in fig. 5 perform multi-target fusion learning, the cloud may obtain the plurality of learning targets T1 to Tn of the local ends, where each local end has one learning target and the learning targets of different local ends differ. It may be assumed that the learning target of local end 1 is T1, the learning target of local end 2 is T2, …, and the learning target of local end N is Tn.
After determining the plurality of learning targets, the cloud may determine, based on the method described in the embodiment shown in fig. 2, a plurality of hidden layer parameters H1 to Hn corresponding to the plurality of learning targets, where H1 may be the hidden layer parameter corresponding to learning target T1, H2 may be the hidden layer parameter corresponding to learning target T2, …, and Hn may be the hidden layer parameter corresponding to learning target Tn.
Fig. 5 does not show the steps of acquiring a plurality of learning targets and determining hidden layer parameters corresponding to the plurality of learning targets by the cloud.
After determining the hidden layer parameters H1 to Hn, the cloud may send the hidden layer parameters H1 to Hn to local end 1 to local end N, as shown in fig. 5.
After local end 1 to local end N receive the plurality of hidden layer parameters H1 to Hn, they may, according to the method described in the embodiment shown in fig. 2 and based on the neural network model, perform multi-target learning training on their local privacy data by taking the plurality of hidden layer parameters H1 to Hn as initial training parameters, to obtain a plurality of updated hidden layer parameters corresponding to H1 to Hn. The updated hidden layer parameters obtained by local end 1 may be represented by H11 to Hn1, the updated hidden layer parameters obtained by local end 2 may be represented by H12 to Hn2, …, and the updated hidden layer parameters obtained by local end N may be represented by H1N to HnN.
After obtaining the plurality of updated hidden layer parameters, local end 1 to local end N send them to the cloud, and the cloud groups the plurality of updated hidden layer parameters H11 to Hn1, H12 to Hn2, …, and H1N to HnN of the plurality of local ends according to the plurality of learning targets, to obtain the plurality of updated hidden layer parameters H11, H12, …, H1N corresponding to learning target T1, the plurality of updated hidden layer parameters H21, H22, …, H2N corresponding to learning target T2, …, and the plurality of updated hidden layer parameters Hn1, Hn2, …, HnN corresponding to learning target Tn.
Then, for any learning target, the cloud can integrate the plurality of updated hidden layer parameters corresponding to that target to obtain the integrated hidden layer parameters. The integrated hidden layer parameter corresponding to learning target T1 may be represented as H1', the integrated hidden layer parameter corresponding to learning target T2 may be represented as H2', …, and the integrated hidden layer parameter corresponding to learning target Tn may be represented as Hn'.
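A compact sketch of this grouping-and-integration step might look as follows, using the naming of fig. 5 (uploads["local2"]["T1"] would be H12, and the result for "T1" would be H1'); the per-local-end weights are an assumption matching the weighted-average integration described later.

```python
# Sketch of the cloud-side integration: group uploads by learning target,
# then take a weighted average per target. Uniform weights are assumed
# unless a weighting (e.g., by local sample count) is supplied.
import numpy as np


def integrate_by_target(uploads, weights=None):
    """uploads: {local_id: {target_id: array}}; returns {target_id: integrated array}."""
    if weights is None:
        weights = {lid: 1.0 for lid in uploads}
    grouped = {}
    for lid, per_target in uploads.items():
        for tid, w in per_target.items():
            grouped.setdefault(tid, []).append((weights[lid], np.asarray(w)))
    return {
        tid: sum(a * w for a, w in pairs) / sum(a for a, _ in pairs)
        for tid, pairs in grouped.items()
    }
```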
After evaluating the plurality of integrated hidden layer parameters against the preset condition, assuming that H2' meets the preset condition while H1' and H3' to Hn' do not, the cloud may, on one hand, send H2' to local end 2, which determines the corresponding target model based on H2'; on the other hand, the cloud may send H1' and H3' to Hn' to local end 1 to local end N, which perform learning training again based on H1' and H3' to Hn' to obtain a plurality of updated integration parameters.
After obtaining the plurality of updated integration parameters corresponding to H1', H3', …, Hn', local end 1 to local end N may send them to the cloud. The cloud performs grouping integration on the plurality of updated integration parameters; when determining that the re-integrated parameters satisfy the preset condition, it sends them to the corresponding local ends, which determine target models in combination with their own learning targets; when determining that the re-integrated parameters do not satisfy the preset condition, it sends them back to local end 1 to local end N for further learning training, iterating cyclically in this way until the parameters obtained by the cloud's grouping integration of the updated hidden layer parameters satisfy the preset condition.
Therefore, through the cyclic iteration of hidden layer parameters between the cloud and the plurality of local ends, hidden layer parameters meeting the preset condition can finally be obtained, and each local end can determine, based on these hidden layer parameters, the target model corresponding to its own learning target, achieving the purpose of multi-target fusion learning.
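Putting the two sketches above together, the cyclic iteration between the cloud and the local ends can be mimicked by the toy loop below; since the disclosure does not specify the preset condition, a parameter-change tolerance is used purely as a stand-in, and locals_data (per-local private features and labels) is likewise assumed.

```python
import numpy as np


def fusion_rounds(hidden_params, locals_data, max_rounds=100, tol=1e-4):
    """Iterate local training and cloud integration until each target's
    integrated parameters satisfy the (assumed) preset condition."""
    final_params = {}
    active = dict(hidden_params)            # targets still being trained
    for _ in range(max_rounds):
        if not active:
            break
        uploads = {
            lid: local_training(active, x, y)   # reuses the sketch above
            for lid, (x, y) in locals_data.items()
        }
        integrated = integrate_by_target(uploads)
        for tid, new_w in integrated.items():
            if np.linalg.norm(new_w - active[tid]) < tol:
                final_params[tid] = new_w   # deliver to the owning local end
                del active[tid]
            else:
                active[tid] = new_w         # send back for another round
    return final_params
```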
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of this specification. Referring to fig. 6, at the hardware level, the electronic device includes a processor, and optionally further includes an internal bus, a network interface, and a memory. The memory may include an internal memory, such as a random-access memory (RAM), and may further include a non-volatile memory, such as at least one disk memory. Of course, the electronic device may also include hardware required for other services.
The processor, the network interface, and the memory may be connected to each other via the internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one double-headed arrow is shown in fig. 6, but this does not mean that there is only one bus or one type of bus.
And the memory is used for storing programs. In particular, the program may include program code comprising computer operating instructions. The memory may include both memory and non-volatile storage and provides instructions and data to the processor.
The processor reads a corresponding computer program from the nonvolatile memory to the memory and then runs the computer program to form the multi-target fusion learning device based on the privacy data protection on the logic level. The processor is used for executing the program stored in the memory and is specifically used for executing the following operations:
determining a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during the fusion learning of a plurality of local terminals;
sending the plurality of hidden layer parameters to the plurality of local terminals, so that any one of the plurality of local terminals performs multi-target learning training on local privacy data by taking the plurality of hidden layer parameters as initial training parameters based on a neural network model, and obtaining a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters;
receiving a plurality of updated hidden layer parameters respectively sent by the plurality of local terminals;
integrating a plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets;
aiming at the integrated hidden layer parameters corresponding to any learning target, when the integrated hidden layer parameters are determined to meet preset conditions, the integrated hidden layer parameters are sent to the corresponding local end, so that the local end determines a target model corresponding to the learning target of the local end according to the integrated hidden layer parameters.
The method executed by the multi-target fusion learning device based on privacy data protection disclosed in the embodiment shown in fig. 6 of this specification can be applied to or implemented by a processor. The processor may be an integrated circuit chip having signal processing capability. In implementation, the steps of the above method may be completed by an integrated logic circuit of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The methods, steps, and logical blocks disclosed in the embodiments of this specification may be implemented or executed. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in combination with the embodiments of this specification may be directly embodied as being executed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a RAM, a flash memory, a ROM, a PROM or an EPROM, a register, or another storage medium well known in the art. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
The electronic device may further execute the method of fig. 3, and implement the functions of the multi-target fusion learning apparatus based on privacy data protection in the embodiment shown in fig. 3, which are not described herein again in this specification.
Of course, besides the software implementation, the electronic device of the embodiment of the present disclosure does not exclude other implementations, such as a logic device or a combination of software and hardware, and the like, that is, the execution subject of the following processing flow is not limited to each logic unit, and may also be hardware or a logic device.
An embodiment of this specification further proposes a computer-readable storage medium storing one or more programs, where the one or more programs include instructions that, when executed by a portable electronic device including a plurality of application programs, can cause the portable electronic device to perform the method of the embodiment shown in fig. 3, and in particular to perform the following operations:
determining a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during the fusion learning of a plurality of local terminals;
sending the plurality of hidden layer parameters to the plurality of local terminals, so that any one of the plurality of local terminals performs multi-target learning training on local privacy data by taking the plurality of hidden layer parameters as initial training parameters based on a neural network model, and obtaining a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters;
receiving a plurality of updated hidden layer parameters respectively sent by the plurality of local terminals;
integrating a plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets;
aiming at the integrated hidden layer parameters corresponding to any learning target, when the integrated hidden layer parameters are determined to meet preset conditions, the integrated hidden layer parameters are sent to the corresponding local end, so that the local end determines a target model corresponding to the learning target of the local end according to the integrated hidden layer parameters.
Fig. 7 is a schematic structural diagram of a multi-target fusion learning device 70 based on privacy data protection according to an embodiment of this specification, where the device 70 can be applied to the cloud. Referring to fig. 7, in a software implementation, the multi-target fusion learning device 70 may include: a determining unit 71, a first sending unit 72, a receiving unit 73, an integrating unit 74, and a second sending unit 75, where:
a determination unit 71 that determines, based on a plurality of learning targets at the time of a plurality of local-side fusion learning, a plurality of hidden layer parameters corresponding to the plurality of learning targets;
a first sending unit 72, configured to send the hidden layer parameters to the local ends, so that any local end of the local ends performs multi-objective learning training on local privacy data based on a neural network model by using the hidden layer parameters as initial training parameters, and obtains updated hidden layer parameters corresponding to the hidden layer parameters;
a receiving unit 73 that receives a plurality of updated hidden layer parameters transmitted from the plurality of local terminals, respectively;
an integrating unit 74, configured to integrate a plurality of updated hidden layer parameters respectively sent by the local ends, so as to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets;
the second sending unit 75 is configured to, for an integrated hidden layer parameter corresponding to any learning target, send the integrated hidden layer parameter to a corresponding local end when it is determined that the integrated hidden layer parameter satisfies a preset condition, so that the local end determines a target model corresponding to the learning target of the local end according to the integrated hidden layer parameter.
Optionally, when it is determined that the integrated hidden layer parameter does not satisfy the preset condition, the second sending unit 75 further sends the integrated hidden layer parameter to the plurality of local terminals, so that the plurality of local terminals perform learning training on local privacy data based on a neural network model by using the integrated hidden layer parameter as an initial training parameter, obtain a plurality of updated integrated parameters corresponding to the integrated hidden layer parameter, and send the plurality of updated integrated parameters to the cloud;
the integration unit 74 integrates the plurality of updated integration parameters to obtain re-integrated parameters;
the second sending unit 75, when determining that the re-integrated parameters satisfy the preset condition, sends the re-integrated parameters to the corresponding local end, so that the local end determines, according to the re-integrated parameters, a target model corresponding to its own learning target.
Optionally, the determining unit 71 determining, based on a plurality of learning targets during fusion learning of a plurality of local ends, a plurality of hidden layer parameters corresponding to the plurality of learning targets may include one of the following (a sketch follows this list):
determining, based on the plurality of learning targets, a plurality of hidden layer parameters corresponding to the plurality of learning targets from a plurality of standard hidden layer parameters obtained in advance; or
performing, based on the plurality of learning targets, learning training on pre-stored data of the plurality of local ends by using a neural network model, to obtain a plurality of hidden layer parameters corresponding to the plurality of learning targets.
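Purely as an illustration, the two routes could be sketched as below, reusing the hypothetical local_training helper from earlier; the standard-parameter library and the pre-stored datasets are assumed stand-ins for whatever the cloud actually holds.

```python
import numpy as np


def determine_initial_params(learning_targets, standard_library=None, prestored_data=None):
    """Route 1: look up pre-obtained standard hidden layer parameters.
    Route 2: pre-train on pre-stored data of the local ends."""
    params = {}
    for tid in learning_targets:
        if standard_library and tid in standard_library:
            params[tid] = np.asarray(standard_library[tid])       # route 1
        elif prestored_data and tid in prestored_data:
            x, y = prestored_data[tid]                            # route 2
            seed = {tid: np.zeros(x.shape[1])}
            params[tid] = local_training(seed, x, {tid: y})[tid]
        else:
            raise KeyError(f"no source of initial parameters for target {tid}")
    return params
```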
Optionally, the first sending unit 72 sends the plurality of hidden layer parameters to the plurality of local ends, including:
encrypting the hidden layer parameters to obtain a plurality of encrypted hidden layer parameters;
and sending the encrypted plurality of hidden layer parameters to the plurality of local terminals.
Optionally, the integrating unit 74 integrates the updated hidden layer parameters sent by the local ends respectively to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets, including:
grouping a plurality of updated hidden layer parameters respectively sent by the plurality of local terminals according to the plurality of learning targets to obtain a plurality of updated hidden layer parameters corresponding to different learning targets;
and aiming at any learning target, carrying out weighted average on a plurality of updated hidden layer parameters corresponding to the learning target to obtain integrated hidden layer parameters corresponding to the learning target.
The private data protection-based multi-target fusion learning device 70 provided in the embodiment of the present specification may further execute the method in fig. 3, and implement the functions of the private data protection-based multi-target fusion learning device in the embodiment shown in fig. 3, which are not described herein again in the embodiment of the present specification.
Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of this specification. Referring to fig. 8, at the hardware level, the electronic device includes a processor, and optionally further includes an internal bus, a network interface, and a memory. The memory may include an internal memory, such as a random-access memory (RAM), and may further include a non-volatile memory, such as at least one disk memory. Of course, the electronic device may also include hardware required for other services.
The processor, the network interface, and the memory may be connected to each other via the internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one double-headed arrow is shown in fig. 8, but this does not mean that there is only one bus or one type of bus.
And the memory is used for storing programs. In particular, the program may include program code comprising computer operating instructions. The memory may include both memory and non-volatile storage and provides instructions and data to the processor.
The processor reads a corresponding computer program from the nonvolatile memory to the memory and then runs the computer program to form the multi-target fusion learning device based on the privacy data protection on the logic level. The processor is used for executing the program stored in the memory and is specifically used for executing the following operations:
receiving a plurality of hidden layer parameters sent by a cloud, wherein the hidden layer parameters correspond to a plurality of learning targets during fusion learning of a plurality of local terminals and are determined by the cloud based on the learning targets;
based on a neural network model, performing learning training on local privacy data by taking the plurality of hidden layer parameters as initial training parameters, to obtain a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters;
sending the updated hidden layer parameters to the cloud end so that the cloud end integrates the updated hidden layer parameters and the updated hidden layer parameters respectively sent by other local ends to obtain integrated hidden layer parameters corresponding to the learning targets;
receiving an integrated hidden layer parameter corresponding to a learning target of the local end and sent by the cloud end, wherein the integrated hidden layer parameter is sent by the cloud end when the integrated hidden layer parameter is determined to meet a preset condition;
and determining a target model corresponding to the learning target of the local end based on the integrated hidden layer parameters.
The method executed by the multi-target fusion learning device based on privacy data protection disclosed in the embodiment shown in fig. 8 of this specification can be applied to or implemented by a processor. The processor may be an integrated circuit chip having signal processing capability. In implementation, the steps of the above method may be completed by an integrated logic circuit of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The methods, steps, and logical blocks disclosed in the embodiments of this specification may be implemented or executed. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in combination with the embodiments of this specification may be directly embodied as being executed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a RAM, a flash memory, a ROM, a PROM or an EPROM, a register, or another storage medium well known in the art. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
The electronic device may further execute the method shown in fig. 4, and implement the functions of the multi-target fusion learning apparatus based on privacy data protection in the embodiment shown in fig. 4, which are not described herein again in this specification.
Of course, besides the software implementation, the electronic device of the embodiment of the present disclosure does not exclude other implementations, such as a logic device or a combination of software and hardware, and the like, that is, the execution subject of the following processing flow is not limited to each logic unit, and may also be hardware or a logic device.
An embodiment of this specification further proposes a computer-readable storage medium storing one or more programs, where the one or more programs include instructions that, when executed by a portable electronic device including a plurality of application programs, can cause the portable electronic device to perform the method of the embodiment shown in fig. 4, and in particular to perform the following operations:
receiving a plurality of hidden layer parameters sent by a cloud, wherein the hidden layer parameters correspond to a plurality of learning targets during fusion learning of a plurality of local terminals and are determined by the cloud based on the learning targets;
based on a neural network model, performing learning training on local privacy data by taking the plurality of hidden layer parameters as initial training parameters, to obtain a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters;
sending the updated hidden layer parameters to the cloud end so that the cloud end integrates the updated hidden layer parameters and the updated hidden layer parameters respectively sent by other local ends to obtain integrated hidden layer parameters corresponding to the learning targets;
receiving an integrated hidden layer parameter corresponding to a learning target of the local end and sent by the cloud end, wherein the integrated hidden layer parameter is sent by the cloud end when the integrated hidden layer parameter is determined to meet a preset condition;
and determining a target model corresponding to the learning target of the local end based on the integrated hidden layer parameters.
Fig. 9 is a schematic structural diagram of a multi-target fusion learning device 90 based on privacy data protection according to an embodiment of this specification, where the device 90 can be applied to a local end. Referring to fig. 9, in a software implementation, the multi-target fusion learning device 90 may include: a first receiving unit 91, a learning training unit 92, a sending unit 93, a second receiving unit 94, and a determining unit 95, where:
a first receiving unit 91, configured to receive multiple hidden layer parameters sent by a cloud, where the multiple hidden layer parameters correspond to multiple learning targets during fusion learning of multiple local terminals, and are determined by the cloud based on the multiple learning targets;
a learning training unit 92, which is configured to perform learning training on the local privacy data by using the plurality of hidden layer parameters as initial training parameters based on a neural network model to obtain a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters;
a sending unit 93, configured to send the updated hidden layer parameters to the cloud, so that the cloud integrates the updated hidden layer parameters and the updated hidden layer parameters sent by other local terminals, respectively, to obtain multiple integrated hidden layer parameters corresponding to the learning targets;
a second receiving unit 94, configured to receive an integrated hidden layer parameter corresponding to the learning target of the local end, which is sent by the cloud end, where the integrated hidden layer parameter is sent by the cloud end when it is determined that the integrated hidden layer parameter meets a preset condition;
the determining unit 95 determines a target model corresponding to the learning target of the local end based on the integrated hidden layer parameter.
Optionally, the sending unit 93 sends the updated hidden layer parameters to the cloud, including:
encrypting the updated hidden layer parameters to obtain a plurality of encrypted updated hidden layer parameters;
and sending the encrypted updated hidden layer parameters to the cloud.
The private data protection-based multi-target fusion learning apparatus 90 provided in the embodiment of the present specification may further execute the method in fig. 4, and implement the functions of the private data protection-based multi-target fusion learning apparatus in the embodiment shown in fig. 4, which are not described herein again.
Fig. 10 is a schematic structural diagram of the multi-target fusion learning system 110 based on privacy data protection according to an embodiment of the present disclosure. The system 110 includes a cloud 111 and a plurality of local ends 112, wherein:
the cloud end 111 determines a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during fusion learning of the plurality of local ends 112; sending the plurality of hidden layer parameters to the plurality of local ends 112;
any one of the local ends 112 performs learning training on the local privacy data by taking the hidden layer parameters as initial training parameters based on a neural network model to obtain updated hidden layer parameters; sending the updated hidden layer parameters to the cloud 111;
the cloud end 111 integrates a plurality of updated hidden layer parameters respectively sent by the local ends 112 to obtain a plurality of integrated parameters corresponding to the learning targets; aiming at the integrated parameters corresponding to any learning target, when the integrated parameters meet preset conditions, the integrated parameters are sent to the corresponding local end;
and the local end determines a target model corresponding to the learning target of the local end based on the integrated parameters.
The plurality of local ends 112 may include local end 1, local end 2, …, and local end N, where N is an integer greater than or equal to 2 and may be determined according to the actual number of local ends.
The specific implementation of the above steps can refer to the content described in the embodiment shown in fig. 2, and the description is not repeated here.
The cloud 111 shown in fig. 10 may also execute the methods shown in fig. 2 to fig. 5, and implement the functions of the cloud in the embodiments shown in fig. 2 to fig. 5, which are not described herein again in this specification. The local ends 112 shown in fig. 10 may also execute the methods shown in fig. 2 to fig. 5, and implement the functions of the local ends in the embodiments shown in fig. 2 to fig. 5, which are not described herein again in this specification.
The above descriptions are merely preferred embodiments of this specification and are not intended to limit the protection scope of this document. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of one or more embodiments of this specification shall fall within the protection scope of this document.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprise," "include," or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or device that comprises the element.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.

Claims (15)

1. A multi-target fusion learning method based on privacy data protection comprises the following steps:
the cloud end determines a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during fusion learning of the plurality of local ends; sending the plurality of hidden layer parameters to the plurality of local terminals;
any one of the plurality of local terminals performs multi-target learning training on local privacy data by taking the plurality of hidden layer parameters as initial training parameters based on a neural network model to obtain a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters; sending the updated hidden layer parameters to the cloud;
the cloud end integrates a plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets; aiming at the integrated hidden layer parameters corresponding to any learning target, when the integrated hidden layer parameters are determined to meet preset conditions, the integrated hidden layer parameters are sent to the corresponding local end;
and the local end determines a target model corresponding to the learning target of the local end based on the integrated hidden layer parameters.
2. The method of claim 1, further comprising:
when the cloud end determines that the integrated hidden layer parameters do not meet the preset conditions, the cloud end sends the integrated hidden layer parameters to the local ends;
the plurality of local terminals are based on a neural network model, and learn and train local privacy data by taking the integrated hidden layer parameters as initial training parameters to obtain a plurality of updated integrated parameters corresponding to the integrated hidden layer parameters; sending the updated integration parameters to the cloud;
the cloud end integrates the updated integration parameters to obtain re-integrated parameters; and when the re-integrated parameters meet the preset condition, sends the re-integrated parameters to the corresponding local end;
and the local end determines, according to the re-integrated parameters, a target model corresponding to the learning target of the local end.
3. The method of claim 1, wherein determining, based on a plurality of learning targets during fusion learning of the plurality of local ends, a plurality of hidden layer parameters corresponding to the plurality of learning targets comprises:
determining, based on the plurality of learning targets, a plurality of hidden layer parameters corresponding to the plurality of learning targets from a plurality of standard hidden layer parameters obtained in advance; or
performing, based on the plurality of learning targets, learning training on pre-stored data of the plurality of local ends by using a neural network model, to obtain a plurality of hidden layer parameters corresponding to the plurality of learning targets.
4. The method of claim 1, sending the plurality of hidden layer parameters to the plurality of local ends, comprising:
encrypting the hidden layer parameters to obtain a plurality of encrypted hidden layer parameters;
and sending the encrypted plurality of hidden layer parameters to the plurality of local terminals.
5. The method of claim 1, sending the plurality of updated hidden layer parameters to the cloud, comprising:
encrypting the updated hidden layer parameters to obtain a plurality of encrypted updated hidden layer parameters;
and sending the encrypted updated hidden layer parameters to the cloud.
6. The method of claim 1, wherein the cloud integrating the updated hidden layer parameters sent by the local terminals respectively to obtain integrated hidden layer parameters corresponding to the learning targets comprises:
grouping a plurality of updated hidden layer parameters respectively sent by the plurality of local terminals according to the plurality of learning targets to obtain a plurality of updated hidden layer parameters corresponding to different learning targets;
and aiming at any learning target, carrying out weighted average on a plurality of updated hidden layer parameters corresponding to the learning target to obtain integrated hidden layer parameters corresponding to the learning target.
7. A multi-target fusion learning method based on privacy data protection is applied to a cloud end and comprises the following steps:
determining a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during the fusion learning of a plurality of local terminals;
sending the plurality of hidden layer parameters to the plurality of local terminals, so that any one of the plurality of local terminals performs multi-target learning training on local privacy data by taking the plurality of hidden layer parameters as initial training parameters based on a neural network model, and obtaining a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters;
receiving a plurality of updated hidden layer parameters respectively sent by the plurality of local terminals;
integrating a plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets;
aiming at the integrated hidden layer parameters corresponding to any learning target, when the integrated hidden layer parameters are determined to meet preset conditions, the integrated hidden layer parameters are sent to the corresponding local end, so that the local end determines a target model corresponding to the learning target of the local end according to the integrated hidden layer parameters.
8. A multi-target fusion learning method based on privacy data protection is applied to a local end and comprises the following steps:
receiving a plurality of hidden layer parameters sent by a cloud, wherein the hidden layer parameters correspond to a plurality of learning targets during fusion learning of a plurality of local terminals and are determined by the cloud based on the learning targets;
based on a neural network model, learning and training local privacy data by taking the hidden layer parameters as initial training parameters to obtain a plurality of updated hidden layer parameters corresponding to the hidden layer parameters;
sending the updated hidden layer parameters to the cloud end so that the cloud end integrates the updated hidden layer parameters and the updated hidden layer parameters respectively sent by other local ends to obtain integrated hidden layer parameters corresponding to the learning targets;
receiving an integrated hidden layer parameter corresponding to a learning target of the local end and sent by the cloud end, wherein the integrated hidden layer parameter is sent by the cloud end when the integrated hidden layer parameter is determined to meet a preset condition;
and determining a target model corresponding to the learning target of the local end based on the integrated hidden layer parameters.
9. A multi-target fusion learning system based on privacy data protection, comprising a cloud and a plurality of local ends, wherein:
the cloud end determines a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during fusion learning of the plurality of local ends; sending the plurality of hidden layer parameters to the plurality of local terminals;
any one of the plurality of local terminals is based on a neural network model, and the plurality of hidden layer parameters are used as initial training parameters to perform learning training on local privacy data to obtain a plurality of updated hidden layer parameters; sending the updated hidden layer parameters to the cloud;
the cloud end integrates a plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated parameters corresponding to the learning targets; aiming at the integrated parameters corresponding to any learning target, when the integrated parameters meet preset conditions, the integrated parameters are sent to the corresponding local end;
and the local end determines a target model corresponding to the learning target of the local end based on the integrated parameters.
10. A multi-target fusion learning device based on privacy data protection, applied to a cloud, comprising:
a determining unit configured to determine, based on a plurality of learning targets at the time of a plurality of local-side fusion learning, a plurality of hidden layer parameters corresponding to the plurality of learning targets;
the first sending unit is used for sending the hidden layer parameters to the local ends so that any local end of the local ends can perform multi-target learning training on local privacy data by taking the hidden layer parameters as initial training parameters based on a neural network model to obtain a plurality of updated hidden layer parameters corresponding to the hidden layer parameters;
a receiving unit configured to receive a plurality of updated hidden layer parameters transmitted from the plurality of local terminals, respectively;
the integration unit is used for integrating a plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets;
and a second sending unit, configured to send, for an integrated hidden layer parameter corresponding to any learning target, the integrated hidden layer parameter to a corresponding local end when it is determined that the integrated hidden layer parameter meets a preset condition, so that the local end determines, according to the integrated hidden layer parameter, a target model corresponding to a learning target of the local end.
11. An electronic device, comprising:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to:
determining a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during the fusion learning of a plurality of local terminals;
sending the plurality of hidden layer parameters to the plurality of local terminals, so that any one of the plurality of local terminals performs multi-target learning training on local privacy data by taking the plurality of hidden layer parameters as initial training parameters based on a neural network model, and obtaining a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters;
receiving a plurality of updated hidden layer parameters respectively sent by the plurality of local terminals;
integrating a plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets;
aiming at the integrated hidden layer parameters corresponding to any learning target, when the integrated hidden layer parameters are determined to meet preset conditions, the integrated hidden layer parameters are sent to the corresponding local end, so that the local end determines a target model corresponding to the learning target of the local end according to the integrated hidden layer parameters.
12. A computer readable storage medium storing one or more programs which, when executed by an electronic device including a plurality of application programs, cause the electronic device to perform a method of:
determining a plurality of hidden layer parameters corresponding to a plurality of learning targets based on the plurality of learning targets during the fusion learning of a plurality of local terminals;
sending the plurality of hidden layer parameters to the plurality of local terminals, so that any one of the plurality of local terminals performs multi-target learning training on local privacy data by taking the plurality of hidden layer parameters as initial training parameters based on a neural network model, and obtaining a plurality of updated hidden layer parameters corresponding to the plurality of hidden layer parameters;
receiving a plurality of updated hidden layer parameters respectively sent by the plurality of local terminals;
integrating a plurality of updated hidden layer parameters respectively sent by the local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets;
aiming at the integrated hidden layer parameters corresponding to any learning target, when the integrated hidden layer parameters are determined to meet preset conditions, the integrated hidden layer parameters are sent to the corresponding local end, so that the local end determines a target model corresponding to the learning target of the local end according to the integrated hidden layer parameters.
13. A multi-target fusion learning device based on privacy data protection is applied to a local end and comprises:
the first receiving unit is used for receiving a plurality of hidden layer parameters sent by a cloud end, wherein the hidden layer parameters correspond to a plurality of learning targets during fusion learning of a plurality of local ends, and the hidden layer parameters are determined by the cloud end based on the learning targets;
the learning training unit is used for performing learning training on local privacy data by taking the hidden layer parameters as initial training parameters based on a neural network model to obtain a plurality of updated hidden layer parameters corresponding to the hidden layer parameters;
the sending unit is used for sending the updated hidden layer parameters to the cloud end so that the cloud end integrates the updated hidden layer parameters and the updated hidden layer parameters respectively sent by other local ends to obtain a plurality of integrated hidden layer parameters corresponding to the learning targets;
a second receiving unit, configured to receive an integrated hidden layer parameter corresponding to a learning target of the local end, where the integrated hidden layer parameter is sent by the cloud end when it is determined that the integrated hidden layer parameter satisfies a preset condition;
and the determining unit is used for determining a target model corresponding to the learning target of the local end based on the integrated hidden layer parameters.
14. An electronic device, comprising:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to:
receiving a plurality of hidden layer parameters sent by a cloud, wherein the hidden layer parameters correspond to a plurality of learning targets during fusion learning of a plurality of local terminals and are determined by the cloud based on the learning targets;
based on a neural network model, learning and training local privacy data by taking the hidden layer parameters as initial training parameters to obtain a plurality of updated hidden layer parameters corresponding to the hidden layer parameters;
sending the updated hidden layer parameters to the cloud end so that the cloud end integrates the updated hidden layer parameters and the updated hidden layer parameters respectively sent by other local ends to obtain integrated hidden layer parameters corresponding to the learning targets;
receiving an integrated hidden layer parameter corresponding to a learning target of the local end and sent by the cloud end, wherein the integrated hidden layer parameter is sent by the cloud end when the integrated hidden layer parameter is determined to meet a preset condition;
and determining a target model corresponding to the learning target of the local end based on the integrated hidden layer parameters.
15. A computer readable storage medium storing one or more programs which, when executed by an electronic device including a plurality of application programs, cause the electronic device to perform a method of:
receiving a plurality of hidden layer parameters sent by a cloud, wherein the hidden layer parameters correspond to a plurality of learning targets during fusion learning of a plurality of local terminals and are determined by the cloud based on the learning targets;
based on a neural network model, learning and training local privacy data by taking the hidden layer parameters as initial training parameters to obtain a plurality of updated hidden layer parameters corresponding to the hidden layer parameters;
sending the updated hidden layer parameters to the cloud end so that the cloud end integrates the updated hidden layer parameters and the updated hidden layer parameters respectively sent by other local ends to obtain integrated hidden layer parameters corresponding to the learning targets;
receiving an integrated hidden layer parameter corresponding to a learning target of the local end and sent by the cloud end, wherein the integrated hidden layer parameter is sent by the cloud end when the integrated hidden layer parameter is determined to meet a preset condition;
and determining a target model corresponding to the learning target of the local end based on the integrated hidden layer parameters.
CN202010048787.4A 2020-01-16 2020-01-16 Multi-target fusion learning method, device and system based on privacy data protection Active CN110874637B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010048787.4A CN110874637B (en) 2020-01-16 2020-01-16 Multi-target fusion learning method, device and system based on privacy data protection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010048787.4A CN110874637B (en) 2020-01-16 2020-01-16 Multi-target fusion learning method, device and system based on privacy data protection

Publications (2)

Publication Number Publication Date
CN110874637A true CN110874637A (en) 2020-03-10
CN110874637B CN110874637B (en) 2020-04-28

Family

ID=69718311

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010048787.4A Active CN110874637B (en) 2020-01-16 2020-01-16 Multi-target fusion learning method, device and system based on privacy data protection

Country Status (1)

Country Link
CN (1) CN110874637B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090271257A1 (en) * 2008-04-25 2009-10-29 Microsoft Corporation Model for early adoption and retention of sources of funding to finance award program
CN103297503A (en) * 2013-05-08 2013-09-11 南京邮电大学 Mobile terminal swarm intelligent perception structure based on layered information extraction server
CN110537191A (en) * 2017-03-22 2019-12-03 维萨国际服务协会 Secret protection machine learning
CN107766889A (en) * 2017-10-26 2018-03-06 济南浪潮高新科技投资发展有限公司 A kind of the deep learning computing system and method for the fusion of high in the clouds edge calculations
CN108712260A (en) * 2018-05-09 2018-10-26 曲阜师范大学 The multi-party deep learning of privacy is protected to calculate Proxy Method under cloud environment
CN109684855A (en) * 2018-12-17 2019-04-26 电子科技大学 A kind of combined depth learning training method based on secret protection technology
CN110399742A (en) * 2019-07-29 2019-11-01 深圳前海微众银行股份有限公司 A kind of training, prediction technique and the device of federation's transfer learning model

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111160573A (en) * 2020-04-01 2020-05-15 支付宝(杭州)信息技术有限公司 Method and device for protecting business prediction model of data privacy joint training by two parties
CN113988254A (en) * 2020-07-27 2022-01-28 腾讯科技(深圳)有限公司 Method and device for determining neural network model for multiple environments
CN113988254B (en) * 2020-07-27 2023-07-14 腾讯科技(深圳)有限公司 Method and device for determining neural network model for multiple environments
CN114172638A (en) * 2020-09-11 2022-03-11 军事科学院系统工程研究院网络信息研究所 Quantum encryption communication method based on multi-model data fusion
CN114172638B (en) * 2020-09-11 2024-04-30 军事科学院系统工程研究院网络信息研究所 Quantum encryption communication method and system based on multi-model data fusion
WO2022217781A1 (en) * 2021-04-15 2022-10-20 腾讯云计算(北京)有限责任公司 Data processing method, apparatus, device, and medium
CN113537513A (en) * 2021-07-15 2021-10-22 青岛海尔工业智能研究院有限公司 Model training method, device, system, equipment and medium based on federal learning

Also Published As

Publication number Publication date
CN110874637B (en) 2020-04-28

Similar Documents

Publication Publication Date Title
CN110874637B (en) Multi-target fusion learning method, device and system based on privacy data protection
CN110874440B (en) Information pushing method and device, model training method and device, and electronic equipment
CN109862018B (en) Anti-crawler method and system based on user access behavior
CN107592292B (en) A kind of block chain communication method between nodes and device
CN109936525B (en) Abnormal account number prevention and control method, device and equipment based on graph structure model
CN109347787B (en) Identity information identification method and device
CN111008709A (en) Federal learning and data risk assessment method, device and system
CN110414567B (en) Data processing method and device and electronic equipment
CN112202908B (en) Method, device, electronic equipment and system for associating equipment with account
CN110874650B (en) Alliance learning method, device and system fusing public domain data and private data
CN105873049B (en) A kind of method, equipment, system and storage medium for being used to share WAP
WO2019052411A1 (en) A binding method, device and system for smart apparatus, and telecommunications system
CN111160572B (en) Multi-label-based federal learning method, device and system
CN110061930B (en) Method and device for determining data flow limitation and flow limiting values
CN110032931B (en) Method and device for generating countermeasure network training and removing reticulation and electronic equipment
CN103685140A (en) Resource sharing method and system based on cloud storage
CN112580085A (en) Model training method and device
CN109639747B (en) Data request processing method, data request processing device, query message processing method, query message processing device and equipment
CN110781153B (en) Cross-application information sharing method and system based on block chain
CN110874647A (en) Private data evaluation and league learning method, device and system in league learning
CN111832862B (en) Flow management method and system based on block chain
CN115118625B (en) Data verification method and device
CN110443746B (en) Picture processing method and device based on generation countermeasure network and electronic equipment
CN111461730B (en) Wind control method, device and system and electronic equipment
CN114638998A (en) Model updating method, device, system and equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant