CN110796267A - Machine learning method and machine learning device for data sharing


Info

Publication number
CN110796267A
CN110796267A
Authority
CN
China
Prior art keywords
learning
shared
local
data
platform
Prior art date
Legal status
Pending
Application number
CN201911102002.0A
Other languages
Chinese (zh)
Inventor
李克鹏
王益
章海涛
朴昕阳
Current Assignee
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd filed Critical Alipay Hangzhou Information Technology Co Ltd
Priority to CN201911102002.0A priority Critical patent/CN110796267A/en
Publication of CN110796267A publication Critical patent/CN110796267A/en


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 - Machine learning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 - Protecting data
    • G06F 21/602 - Providing cryptographic facilities or services


Abstract

Embodiments of this specification disclose a machine learning method and a machine learning apparatus for data sharing. The method comprises the following steps: performing local learning using local original data to obtain a local model; encrypting the local original data and/or the local model to obtain encrypted shared content; sending the encrypted shared content to a shared learning platform, so that the shared learning platform generates a shared model in a trusted execution environment from the encrypted shared content provided by a plurality of data providers; and acquiring the shared model from the shared learning platform. The method and apparatus of these embodiments can perform machine learning over the data of a plurality of data providers while ensuring data security and user privacy throughout the learning process.

Description

Machine learning method and machine learning device for data sharing
Technical Field
The present specification relates to the field of machine learning technologies, and more particularly, to a machine learning method and a machine learning apparatus for data sharing.
Background
Machine learning refers to techniques that automatically discover patterns in historical data and apply those patterns to make predictions on unseen data, helping people make better data-driven decisions. The quality and quantity of the training data have become among the most important factors affecting model performance, so the demand for improving models by expanding data volume through multi-party data sharing keeps growing. In a data-sharing machine learning process, ensuring data security is critical.
Disclosure of Invention
Embodiments disclosed herein provide a machine learning scheme for data sharing.
According to a first aspect disclosed in the present specification, there is provided a machine learning method of data sharing, comprising the steps of:
performing local learning using local original data to obtain a local model;
encrypting the local original data and/or the local model to obtain encrypted shared content;
sending the encrypted shared content to a shared learning platform, so that the shared learning platform generates a shared model in a trusted execution environment according to the encrypted shared content provided by a plurality of data providers;
and acquiring the shared model from the shared learning platform (a minimal code sketch of this flow follows).
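Purely as an illustration, the data-provider side of this first-aspect flow can be organized as in the following Python sketch. Every name in it (the PlatformClient stub, the toy averaging "training", the toy XOR "cipher") is a hypothetical placeholder: the specification prescribes neither an API nor a concrete learning or encryption algorithm.

```python
import json

class PlatformClient:
    """Stand-in for the shared learning platform's network interface."""
    def __init__(self):
        self._uploads = []
    def upload(self, encrypted_shared_content: bytes):
        self._uploads.append(encrypted_shared_content)
    def fetch_shared_model(self):
        return {"weights": [0.0]}  # placeholder shared model

def train_local_model(raw_rows):
    # Toy "local learning": per-column mean of the local original data.
    n, dims = len(raw_rows), len(raw_rows[0])
    return {"weights": [sum(r[i] for r in raw_rows) / n for i in range(dims)]}

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy XOR stream, purely illustrative; a real deployment would use
    # the platform-issued key with a vetted cipher (see later sketches).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def run_shared_learning(raw_rows, platform: PlatformClient, key: bytes):
    local_model = train_local_model(raw_rows)                 # local learning
    content = encrypt(key, json.dumps(local_model).encode())  # encrypt model
    platform.upload(content)                                  # send to platform
    return platform.fetch_shared_model(), local_model         # obtain shared model
```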
Optionally or preferably, the local original data contained in the encrypted shared content is local original data of a non-sensitive part.
Optionally or preferably, the local raw data used by the local learning comprises local raw data of sensitive parts.
Optionally or preferably, the method further comprises the steps of: and generating a fusion model according to the local model and the sharing model.
Optionally or preferably, the local learning using the local original data includes: exchanging parameters with other data providers, and performing local learning using the local original data and the exchanged parameters.
Optionally or preferably, the shared learning platform generating a shared model in the trusted execution environment from encrypted shared content provided by a plurality of data providers includes:
decrypting, in the trusted execution environment, the encrypted shared content provided by the plurality of data providers to obtain their local original data, and learning with that local original data to obtain a shared model; or,
performing parameter exchange, in the trusted execution environment, on the local models provided by different data providers to obtain a shared model, as sketched below.
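For the parameter-exchange branch, one plausible concrete reading is a federated-averaging-style combination of the uploaded local models. The sketch below assumes each local model is a flat weight vector; the specification does not fix the actual aggregation rule.

```python
def build_shared_model(local_models: list[list[float]]) -> list[float]:
    # Element-wise averaging of local model parameters; conceptually this
    # runs inside the trusted execution environment after decryption.
    n = len(local_models)
    return [sum(column) / n for column in zip(*local_models)]

# Two providers' local models combined into one shared model:
shared = build_shared_model([[0.2, 0.8, -0.1], [0.4, 0.6, 0.3]])
# shared is approximately [0.3, 0.7, 0.1]
```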
Optionally or preferably, the method further comprises the steps of:
authenticating code in the trusted execution environment of the shared learning platform, so as to supervise the shared learning platform's scope of use of the encrypted shared content (a verification sketch follows this list); and/or,
authenticating the identity of the shared learning platform.
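A real trusted execution environment (an SGX enclave, for instance) returns signed code measurements during remote attestation; those details are platform-specific and not given in the specification. The sketch below reduces the idea to its core: the data provider compares the reported measurement of the platform's TEE code against the hash of code it has audited, using a constant-time comparison.

```python
import hashlib
import hmac

def verify_tee_code(reported_measurement: bytes, audited_code: bytes) -> bool:
    """Accept the platform's TEE only if its code measurement matches code
    the data provider has audited (and so knows how the encrypted shared
    content will be used)."""
    expected = hashlib.sha256(audited_code).digest()
    return hmac.compare_digest(reported_measurement, expected)
```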
Optionally or preferably, the local learning is triggered by a local learning task issued by the shared learning platform.
Optionally or preferably, the method further comprises the steps of: and initiating a shared learning request to the shared learning platform so that the shared learning platform generates a plurality of local learning tasks according to the shared learning request and issues the local learning tasks to different data providers.
Optionally or preferably, the data provider is a terminal loaded with an App, the App is integrated with a shared learning SDK, and the method is implemented by calling the shared learning SDK.
Optionally or preferably, the data provider is an App server, a shared learning SDK is integrated in the App server, and the method is implemented by calling the shared learning SDK.
According to a second aspect disclosed in the present specification, there is provided a machine learning apparatus including a shared learning SDK unit and a data transmission unit, the shared learning SDK unit including a local learning module and an encryption module:
the local learning module is used for performing local learning by using local original data to obtain a local model;
the encryption module is used for encrypting the local original data or the local model to obtain encrypted shared content;
the data transmission unit is used for sending the encrypted shared content to the shared learning platform, so that the shared learning platform can generate a shared model in the trusted execution environment according to the encrypted shared content provided by a plurality of data providers, and the shared model can be acquired from the shared learning platform.
Optionally or preferably, the shared learning SDK unit further comprises an authentication module;
the authentication module is used for authenticating codes in a trusted execution environment of the shared learning platform so as to supervise the use range of the shared learning platform on the encrypted shared content; and/or for authenticating the identity of the shared learning platform.
Optionally or preferably, the machine learning device is an App terminal or an App server.
According to a third aspect disclosed herein, there is provided a machine learning apparatus comprising a processor and a memory, the memory storing computer instructions which, when executed by the processor, implement any of the foregoing methods.
Optionally or preferably, the machine learning device is an App terminal or an App server.
Features of embodiments of the present specification and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the specification and together with the description, serve to explain the principles of the embodiments of the specification.
FIG. 1 is a schematic diagram of a shared learning system provided by one embodiment of the present description;
FIGS. 2 and 3 are schematic diagrams of a machine learning method provided by one embodiment of the present description;
FIGS. 4 and 5 are schematic diagrams of a machine learning method provided by one embodiment of the present description;
FIG. 6 is a block diagram of a machine learning apparatus provided in one embodiment of the present description;
FIG. 7 is a block diagram of a machine learning apparatus provided in one embodiment of the present description;
FIG. 8 is a block diagram of a machine learning apparatus provided in an embodiment of the present specification.
Detailed Description
Various exemplary embodiments of the present specification will now be described in detail with reference to the accompanying drawings.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the embodiments, their application, or uses.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
< shared learning System >
Fig. 1 is a schematic diagram of a shared learning system provided in an embodiment of the present specification. As shown in fig. 1, the shared learning system includes a shared learning platform 101 and a plurality of data providers 103. The shared learning platform 101 and the data providers 103, as well as the data providers 103 among themselves, may communicate via the wireless network 102, or via one or more other wireless or wired networks.
The shared learning platform 101 may be a server, for example, a server deployed in the cloud, and the configuration thereof may include, but is not limited to: processor 1011, memory 1012, interface 1013, communication device 1014, input device 1015, output device 1016. The processor 1011 may include, but is not limited to, a central processing unit CPU, a microprocessor MCU, or the like. The memory 1012 may include, but is not limited to, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, and the like. Interface device 1013 may include, but is not limited to, a USB interface, a serial interface, a parallel interface, and the like. The communication device 1014 is capable of wired or wireless communication, for example, and may specifically include WiFi communication, bluetooth communication, 2G/3G/4G/5G communication, and the like. Input devices 1015 include, but are not limited to, a keyboard, a mouse, a touch screen, a microphone, and the like. Output devices 1016 include, but are not limited to, a display screen, speakers, and the like. The configuration of the shared learning platform 101 may also include only some of the above devices.
The data provider 103 may be, for example, an electronic device installed with an intelligent operating system (e.g., Windows, Linux, android, IOS, etc. systems). The data provider 103 may be a user terminal including, but not limited to, a laptop, a desktop, a cell phone, a tablet, etc. The data provider 103 may also be a server, such as a server deployed in the cloud. Configurations of data provider 103 include, but are not limited to, processor 1031, memory 1032, interface device 1033, communication device 1034, GPU 1035, display device 1036, input device 1037, speaker 1038, microphone 1039, and camera 1030. The processor 1031 includes, but is not limited to, a central processing unit CPU, a microprocessor MCU, and the like. The memory 1032 includes, but is not limited to, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, and the like. Interface device 1033 includes, but is not limited to, a USB interface, a serial interface, a parallel interface, and the like. The communication device 1034 is capable of wired or wireless communication, for example, and specifically may include WiFi communication, bluetooth communication, 2G/3G/4G/5G communication, and the like. The GPU 1035 is used to process the image. The display device 1036 includes, but is not limited to, a liquid crystal screen, a touch screen, and the like. Input devices 1037 include, but are not limited to, a keyboard, a mouse, a touch screen, and the like. The configuration of the data provider 103 may include only some of the above devices.
In the embodiments of the present description, the data provider 103 may be a user terminal loaded with an App (Application), or may be an App server, and the App server may be deployed in a cloud.
In one embodiment, a shared learning SDK (software development kit) is integrated in the data provider 103, the shared learning SDK includes a plurality of software modules for implementing a shared learning function, and the data provider 103 performs shared learning by calling the shared learning SDK. When the data provider is a user terminal loaded with an App, the App is integrated with a shared learning SDK. When the data provider is an App server, a shared learning SDK is integrated in a software system of the App server. The shared learning server SDK and the shared learning terminal SDK are collectively referred to as a shared learning SDK, and the shared learning SDK of the data provider 103 may be provided by a shared learning platform or a third party in a unified manner.
For example, three data providers 103 are included in the system; the first data provider 103 is a user terminal loaded with a certain shopping App, and the shopping App is integrated with a shared learning terminal SDK; the second data provider 103 is a server of an insurance App, and a shared learning server SDK is integrated in a software system of the server of the insurance App; the third data provider is a server of a certain bank App, and a shared learning server SDK is integrated in a software system of the bank App.
For example, three data providers 103 are included in the system; the first data provider 103 is a server of a certain shopping App, and a shared learning server SDK is integrated in a software system of the server of the shopping App; the second data provider 103 is a server of a certain bank App, and a shared learning server SDK is integrated in a software system of the server of the bank App; and the third data provider is a server of a certain payment App, and a shared learning server SDK is integrated in a software system of the server of the payment App.
For example, the system includes a plurality of data providers 103; each data provider 103 is a user terminal; some data providers 103 load a certain shopping App, and the shopping App is integrated with a shared learning terminal SDK; a part of data providers 103 are loaded with a certain video App, and a shared learning terminal SDK is integrated in the video App; some data providers 103 simultaneously load the shopping App and the video App, and the shopping App and the video App are respectively integrated with a shared learning terminal SDK.
The shared learning system shown in fig. 1 is merely illustrative and is in no way intended to suggest any limitation as to the embodiments of the specification, their application, or uses. It will be appreciated by those skilled in the art that, although multiple devices of the shared learning platform and of the data provider are described above, embodiments of the present specification may involve only some of them. For example, the shared learning platform may involve only a processor, memory, and communication device, and the data provider may involve only a processor, memory, communication device, display screen, and speaker. Those skilled in the art can design instructions based on the disclosed embodiments. How instructions control the operation of a processor is well known in the art and is not described in detail here.
< method for machine learning for data sharing >
The machine learning method for data sharing in the embodiments of the present specification is implemented based on a shared learning system, and may be initiated by a shared learning platform (hereinafter, referred to as "platform") or initiated by a data provider submitting a shared learning request to a platform.
The machine learning method based on the shared learning system in the embodiment of the specification comprises a first learning phase and a second learning phase.
In the first learning stage, a plurality of data providers perform local learning to obtain local models under the triggering and coordination of the platform. During local learning, the data providers may conduct shared training by exchanging parameters.
The second learning stage is that the data provider encrypts the local original data or the local model obtained in the first learning stage into encrypted shared content and uploads the encrypted shared content to the platform, and the platform generates the shared model by using the encrypted shared content uploaded by the plurality of data providers based on a Trusted Execution Environment (TEE).
In one embodiment, the first learning phase may be entered before the second learning phase. In one embodiment, the second learning phase may be entered before the first learning phase. In one embodiment, there may be one or more first learning stages, or one or more second learning stages, and the first learning stage and the second learning stage may be executed alternately or sequentially, and the specific execution order thereof may be set according to the actual situation, which is not limited in this specification.
< first embodiment >
The machine learning method provided by the first embodiment includes a plurality of steps, wherein steps S104-S106 belong to a first learning phase, and steps S108-S112 belong to a second learning phase.
S102, a data provider submits a shared learning request to the platform.
S104, the platform generates a plurality of local learning tasks according to the shared learning request and distributes them to a plurality of data providers.
In this step, the platform may decompose the shared learning request to generate the multiple local learning tasks.
S106, after receiving the local learning tasks, the data providers each read their own local original data and perform local learning, thereby obtaining local models.
During local learning, the data providers may exchange parameters one or more times and train on their local original data together with the exchanged parameters, thereby realizing multi-party shared learning.
The exchanged parameters may be statistical features extracted by a data provider from its local original data, model parameters obtained during training, generated random numbers, and the like; this specification does not specifically limit them.
In a specific example, the parameters may be encrypted based on a homomorphic encryption algorithm or a secret sharing algorithm before being exchanged, ensuring the security of the parameter exchange; a secret-sharing sketch follows.
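As one example of the secret-sharing option, additive secret sharing lets providers exchange parameter shares such that no single message reveals a raw parameter, while sums (e.g., of gradients) can still be reconstructed. A minimal sketch over an illustrative prime field:

```python
import secrets

MODULUS = 2**61 - 1  # illustrative prime modulus

def share(value: int, n_parties: int) -> list[int]:
    # Split `value` into n additive shares that sum to it mod MODULUS;
    # any n-1 of the shares are uniformly random and reveal nothing.
    parts = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    parts.append((value - sum(parts)) % MODULUS)
    return parts

def reconstruct(parts: list[int]) -> int:
    return sum(parts) % MODULUS

assert reconstruct(share(42, 3)) == 42  # only the aggregate is ever revealed
```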
Over steps S104 to S106, the platform may generate and assign local learning tasks multiple times, and it is responsible for coordinating the local learning tasks of the data providers: the numbers of tasks assigned to different data providers may be the same or different, and the tasks may start simultaneously or execute in a certain order. The specific execution order may be set according to the actual situation and is not limited by this specification.
It can be seen that in the first learning phase a data provider gives its local original data neither to the platform nor to other data providers; that data always remains safely stored locally. In the first learning phase, the data provider may therefore use all of its local original data, including the sensitive part, for local learning.
S108, the data provider encrypts the local original data and/or the local model to obtain encrypted shared content, and sends the encrypted shared content to the platform.
In this step, the encryption key used by the data provider may be issued by the platform: the platform issues the key to the data provider, which encrypts with it to obtain the encrypted shared content.
The encryption key used by the data provider may be a public key, with the platform later using the corresponding private key to decrypt the encrypted shared content; alternatively, the two sides may use a symmetric key. This specification does not limit the specific encryption algorithm; one possible hybrid scheme is sketched below.
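The public-key variant is commonly realized as a hybrid scheme: bulk content is encrypted under a fresh symmetric key, and only that key is wrapped with the platform's public key. A sketch using the Python `cryptography` package (the specification itself names no algorithm):

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Platform side: generate a key pair and publish the public key.
platform_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
platform_public = platform_private.public_key()

# Provider side: encrypt bulk content with a fresh symmetric key,
# then wrap that key with the platform's public key.
content_key = Fernet.generate_key()
encrypted_content = Fernet(content_key).encrypt(b"serialized local model")
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = platform_public.encrypt(content_key, oaep)

# Platform side (inside the TEE): unwrap the key and decrypt the content.
recovered_key = platform_private.decrypt(wrapped_key, oaep)
assert Fernet(recovered_key).decrypt(encrypted_content) == b"serialized local model"
```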
S110, the platform generates a shared model in the trusted execution environment from the encrypted shared content provided by the data providers.
In a specific example, the platform decrypts encrypted shared content provided by a plurality of data providers in the trusted execution environment to obtain local original data of the plurality of data providers, and performs shared learning in the trusted execution environment by using the local original data of the plurality of data providers to obtain a shared model.
In a specific example, the platform performs parameter exchange in the trusted execution environment on the local models provided by different data providers to obtain a shared model. The exchanged parameters may be parameters of the local models; they may also be statistical features extracted by a data provider from its local original data, or model parameters or random numbers generated during the first learning stage, uploaded to the platform in encrypted form. This specification does not specifically limit them. In a specific example, the parameters may be encrypted based on a homomorphic encryption algorithm or a secret sharing algorithm before being exchanged, to ensure the security of the parameter exchange.
In a specific example, the platform performs learning training on a local model provided by one data provider by using local original data provided by another data provider in a trusted execution environment to obtain a shared model.
In a specific example, the platform performs parameter exchange on local models provided by different data providers in a trusted execution environment to obtain a shared model, and then performs learning training on the shared model by using local original data provided by one or more data providers to optimize the shared model.
S112, the platform delivers the shared model to the data providers.
It can be seen that in the second learning phase the data provider may need to provide the platform with encrypted local original data. In one embodiment, the data provider may encrypt and provide only the non-sensitive part of its local original data; the sensitive part always remains stored locally at the data provider and is never disclosed.
In the embodiments of this specification, whether local original data is "sensitive" or "non-sensitive" is relative: each data provider may define for itself which data are sensitive and which are not. For example, local original data that involves user privacy belongs to the sensitive category. A partitioning sketch is given below.
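Because "sensitive" is provider-defined, the split can be as simple as a field list maintained by the data provider. The field names below are invented for illustration:

```python
SENSITIVE_FIELDS = {"name", "id_number", "phone", "address"}  # provider-defined

def split_record(record: dict) -> tuple[dict, dict]:
    # Partition one raw-data record into the sensitive part (kept local,
    # never uploaded) and the non-sensitive part (eligible for encryption
    # and upload as shared content).
    sensitive = {k: v for k, v in record.items() if k in SENSITIVE_FIELDS}
    shareable = {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}
    return sensitive, shareable
```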
In another embodiment, the method may further include step S114: after receiving the shared model, the data provider generates a fusion model from the local model and the shared model, for example as sketched below.
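The specification leaves the fusion method open. One simple possibility, assuming parameter vectors of equal shape, is a convex combination of local and shared weights; an ensemble over the two models' predictions would be an equally valid reading.

```python
def fuse(local_weights: list[float], shared_weights: list[float],
         alpha: float = 0.5) -> list[float]:
    # Convex combination: alpha keeps local behavior, (1 - alpha) pulls
    # toward the multi-party shared model. alpha is a tuning choice.
    return [alpha * l + (1 - alpha) * s
            for l, s in zip(local_weights, shared_weights)]
```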
< second embodiment >
The machine learning method provided by the second embodiment includes a plurality of steps, wherein steps S204-S208 belong to the second learning phase, and steps S210-S212 belong to the first learning phase.
S202, a data provider submits a shared learning request to the platform.
S204, under the coordination of the platform, a plurality of data providers encrypt their local original data to obtain encrypted shared content and send it to the platform.
In this step, the encryption key used by the data provider may be issued by the platform: the platform issues the key to the data provider, which encrypts with it to obtain the encrypted shared content.
The encryption key used by the data provider may be a public key, with the platform later using the corresponding private key to decrypt; alternatively, the two sides may use a symmetric key. This specification does not limit the specific encryption algorithm.
S206, the platform generates a shared model in the trusted execution environment from the encrypted shared content provided by the data providers.
In a specific example, the platform decrypts encrypted shared content provided by a plurality of data providers in the trusted execution environment to obtain local original data of the plurality of data providers, and performs shared learning in the trusted execution environment by using the local original data of the plurality of data providers to obtain a shared model.
S208, the platform delivers the shared model to the data providers.
It can be seen that in the second learning phase the data provider provides the platform with encrypted local original data. In one embodiment, the data provider may encrypt and provide only the non-sensitive part of its local original data; the sensitive part always remains stored locally at the data provider and is never disclosed.
S210, the platform generates a plurality of local learning tasks and distributes the local learning tasks to a plurality of data providers.
S212, after receiving the local learning tasks, the data providers each read their own local original data and perform local learning, thereby obtaining local models.
During local learning, the data providers may exchange parameters one or more times and train on their local original data together with the exchanged parameters, thereby realizing multi-party shared learning.
The exchanged parameters may be statistical features extracted by a data provider from its local original data, model parameters obtained during training, generated random numbers, and the like; this specification does not specifically limit them.
In a specific example, the parameters may be encrypted based on a homomorphic encryption algorithm or a secret sharing algorithm before being exchanged, to ensure the security of the parameter exchange.
Over steps S210 to S212, the platform may generate and assign local learning tasks multiple times, and it is responsible for coordinating the local learning tasks of the data providers: the numbers of tasks assigned to different data providers may be the same or different, and the tasks may start simultaneously or execute in a certain order. The specific execution order may be set according to the actual situation and is not limited by this specification.
It can be seen that in the first learning phase a data provider gives its local original data neither to the platform nor to other data providers; that data always remains safely stored locally. In the first learning phase, the data provider may therefore use all of its local original data, including the sensitive part, for local learning.
S214, the data provider generates a fusion model from the local model and the shared model.
< third embodiment >
The machine learning method provided by the third embodiment includes a plurality of steps, wherein steps S304-S308 belong to the second learning phase, and steps S310-S312 belong to the first learning phase.
S302, a data provider submits a shared learning request to the platform.
S304, under the coordination of the platform, a plurality of data providers encrypt their local original data to obtain encrypted shared content and send it to the platform.
S306, the platform generates a shared model in the trusted execution environment from the encrypted shared content provided by the data providers.
S308, the platform delivers the shared model to the data providers.
S310, the platform generates a plurality of local learning tasks and distributes the local learning tasks to a plurality of data providers.
S312, after receiving the local learning tasks, the data providers each read their own local original data and continue local learning on the basis of the shared model, thereby obtaining local models.
Similarly, in step S312 the data providers may exchange parameters one or more times and train on their local original data together with the exchanged parameters, thereby implementing multi-party shared learning.
< fourth embodiment >
Referring to fig. 2 and fig. 3, a machine learning method based on a shared learning system according to an embodiment of the present disclosure is described.
In this embodiment, App1 and App2 are different apps; for example, App1 is a shopping App and App2 is a music App. The data providers are the server of App1 (hereinafter "App1 server") and the server of App2 (hereinafter "App2 server"), both deployed in the cloud. Fig. 2 also schematically shows three user terminals; in practical applications the number of user terminals can be very large, each loaded with at least one of App1 and App2. The numbers of apps, App servers, and user terminals here are merely illustrative, chosen only for convenience of explaining the machine learning method of this embodiment, and do not limit the present specification.
Firstly, explaining local original data of an App server:
the local original data of the App server may include local original data originally possessed by the App server, such as registration information data, login record data, online duration data, and the like of the user, where the data is obtained by the App server in a process in which the user regularly uses the App.
The local raw data of the App server may also include local raw data held by the user terminal, which is acquired from the user terminal, and specifically, may be acquired in step S402.
S402, the App1 server acquires local original data from user terminals loaded with App1, and the App2 server acquires local original data from user terminals loaded with App2.
In this step, a user terminal loaded with App1 encrypts its local original data through App1 and uploads the encrypted data to the App1 server. For example, if App1 is a shopping App, the local original data may include data related to the use of App1, such as the user's registration information, login records, browsing records, and the names, categories, prices, purchase times, and order details of purchased articles; it may also include the model, brand, processor parameters, geographical location, and similar data of the user terminal.
In this step, a user terminal loaded with App2 encrypts its local original data through App2 and uploads the encrypted data to the App2 server. For example, if App2 is a music App, the local original data may include data related to the use of App2, such as the user's registration information, login records, and the names, types, and playing times of music; it may also include the model, brand, processor parameters, geographical location, and similar data of the user terminal.
The App1 server and the App2 server acquire the encrypted data from the user terminals and decrypt it.
In this step, the encryption key used by the user terminal and the decryption key used by the App server may be issued by the platform, or may be generated by the App server, which then issues the encryption key to the user terminal so that it can encrypt its local original data. The encryption key may be a public key and the decryption key the corresponding private key, or the two may be symmetric keys. This specification does not limit the specific encryption algorithm.
Referring to fig. 3, the learning process of the machine learning method of the present embodiment includes the following steps, wherein steps S406 to S414 belong to a first learning phase, and steps S416 to S426 belong to a second learning phase.
S402, the App1 server acquires local original data from user terminals loaded with App1, and the App2 server acquires local original data from user terminals loaded with App2.
S404, the App1 server initiates a shared learning request to the platform.
S406, the platform generates two local learning tasks according to the shared learning request.
S408, the platform distributes one local learning task to the App1 server and the other to the App2 server.
S410, after receiving their local learning tasks, the App1 server and the App2 server each read their own local original data and perform local learning.
S412, during local learning, the App1 server and the App2 server may exchange parameters one or more times to realize multi-party shared learning.
S414, the training task of the first learning phase completes: the App1 server obtains local model 1 and the App2 server obtains local model 2.
S416, the App1 server encrypts local model 1 and uploads it to the platform; the App2 server encrypts local model 2 and uploads it to the platform.
S418, the platform creates a trusted execution environment.
S420, in the trusted execution environment, the platform decrypts the encrypted content uploaded by the App1 and App2 servers to obtain local model 1 and local model 2.
S422, the platform performs parameter exchange between local model 1 and local model 2 to obtain a shared model.
S424, the platform delivers the shared model to the App1 and App2 servers.
S426, the platform destroys the trusted execution environment (see the lifecycle sketch after this list).
S428, after receiving the shared model, the App1 server generates a fusion model from local model 1 and the shared model, which is its final model; the App2 server likewise generates its final fusion model from local model 2 and the shared model.
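Steps S418 and S426 bracket the platform-side work with the TEE's lifetime, so decrypted content never outlives the session. A lifecycle wrapper in the spirit of those steps might look as follows; `create_tee` and `destroy_tee` are stand-ins, not a real TEE SDK:

```python
from contextlib import contextmanager

def create_tee():
    return object()  # stand-in for launching a real enclave (S418)

def destroy_tee(tee):
    pass  # stand-in for wiping enclave memory and tearing down (S426)

@contextmanager
def trusted_execution_environment():
    tee = create_tee()
    try:
        yield tee  # S420-S424 (decrypt, exchange parameters, deliver) go here
    finally:
        destroy_tee(tee)  # guaranteed even if aggregation fails
```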
Referring to fig. 2, the software system of the platform may include a control module, an authentication module, a decryption module, and a shared learning module. The control module is used for triggering and coordinating the learning tasks of a plurality of data providers. The authentication module, decryption module, and shared learning module of the platform operate in a trusted execution environment. The authentication module is mainly responsible for management of keys, such as providing keys to data providers, providing keys to user terminals, and providing keys for decrypting shared learning content to the decryption module of the platform. The authentication module is also used for performing identity authentication with a data provider, signing software codes running in a trusted execution environment, supporting the data provider to perform signature verification on the software codes and the like. The decryption module is mainly responsible for decrypting the shared encrypted content uploaded by the data provider. The sharing learning module is mainly responsible for generating a sharing model.
The platform provides the shared learning server SDK and the shared learning terminal SDK to the operators of App1 and App2, respectively. The operator of App1 puts the shared learning server SDK into the software system of the App1 server and integrates the shared learning terminal SDK into App1; user terminals obtain the shared learning terminal SDK when they download or update App1. The operator of App2 does the same for the App2 server and App2. In this way, the App1 and App2 servers are integrated with the shared learning server SDK, and the user terminals run App1 and App2, each integrated with the shared learning terminal SDK.
The shared learning server SDK comprises a local learning module, an encryption module, a decryption module and an authentication module. The encryption module is mainly responsible for encrypting the local original data or the local model to obtain the shared encrypted content. The decryption module is mainly responsible for decrypting encrypted data uploaded by the user terminal. The authentication module is mainly responsible for key management, such as obtaining a related key from a platform, generating the related key, issuing the related key to a user terminal, providing the related key to an encryption module and a decryption module of the authentication module, and the like. The authentication module is further configured to authenticate an identity of the platform, authenticate a code in a trusted execution environment of the platform, and the like. The local learning module is used for receiving local learning tasks issued by the platform and performing local learning by using local original data and parameters exchanged from other data providers to obtain a local model.
The shared learning terminal SDK comprises an authentication module and an encryption module. The encryption module is mainly used for encrypting the local original data of the user terminal. The authentication module is mainly responsible for management of keys, for example, obtaining related keys from a platform or an App server, providing the related keys to the encryption module, and the like.
< fifth embodiment >
Referring to fig. 4 and 5, a shared learning method provided in an embodiment of the present specification is described.
In this embodiment, App3, App4, and App5 are different apps. The data providers include the server of App4 (hereinafter "App4 server") and the server of App5 (hereinafter "App5 server"), both deployed in the cloud. The difference from the fourth embodiment is that the data providers further include a user terminal D, which is loaded with App3. In this embodiment the data providers thus include not only App servers but also user terminals D; the number of user terminals D in fig. 4 is only illustrative, and in practical applications it can be very large.
Referring to fig. 5, the learning process of the machine learning method of the present embodiment includes the following steps, wherein steps S506 to S514 belong to a first learning phase, and steps S516 to S526 belong to a second learning phase.
S502, the App5 server acquires local original data from user terminals loaded with App5. This step may refer to step S402.
S504, the App4 server initiates a shared learning request to the platform.
S506, the platform generates two local learning tasks according to the shared learning request.
S508, the platform distributes one local learning task to the App4 server and the other to the App5 server.
S510, after receiving their local learning tasks, the App4 server and the App5 server each read their own local original data and perform local learning.
S512, during local learning, the App4 server and the App5 server may exchange parameters one or more times to realize multi-party shared learning.
S514, the training task of the first learning phase completes: the App4 server obtains local model 4 and the App5 server obtains local model 5.
S516, the App4 server encrypts local model 4 and uploads it to the platform; the App5 server encrypts local model 5 and uploads it; and user terminal D encrypts the non-sensitive part of its local original data through App3 and uploads it to the platform.
S518, the platform creates a trusted execution environment.
S520, in the trusted execution environment, the platform decrypts the encrypted content uploaded by the App4 server, the App5 server, and user terminal D, obtaining local model 4, local model 5, and user terminal D's local original data.
S522, the platform performs parameter exchange between local model 4 and local model 5 to obtain a shared model, and then continues training the shared model with user terminal D's local original data to optimize it.
S524, the platform delivers the shared model to the App4 and App5 servers.
S526, the platform destroys the trusted execution environment.
Referring to fig. 4, the software system of the platform may include a control module, an authentication module, a decryption module, and a shared learning module. The control module is used for triggering and coordinating the learning tasks of a plurality of data providers. The authentication module, decryption module, and shared learning module of the platform operate in a trusted execution environment. The authentication module is mainly responsible for management of keys, such as providing keys to data providers, providing keys to user terminals, and providing keys for decrypting shared learning content to the decryption module of the platform. The authentication module is also used for performing identity authentication with a data provider, signing software codes running in a trusted execution environment, supporting the data provider to perform signature verification on the software codes and the like. The decryption module is mainly responsible for decrypting the shared encrypted content uploaded by the data provider. The sharing learning module is mainly responsible for generating a sharing model.
The shared learning server SDK comprises a local learning module, an encryption module, a decryption module and an authentication module. The encryption module is mainly responsible for encrypting the local original data or the local model to obtain the shared encrypted content. The decryption module is mainly responsible for decrypting encrypted data uploaded by the user terminal. The authentication module is mainly responsible for key management, such as obtaining a related key from a platform, generating the related key, issuing the related key to a user terminal, providing the related key to an encryption module and a decryption module of the authentication module, and the like. The authentication module is further configured to authenticate an identity of the platform, authenticate a code in a trusted execution environment of the platform, and the like. The local learning module is used for receiving local learning tasks issued by the platform and performing local learning by using local original data and parameters exchanged from other data providers to obtain a local model.
The shared learning terminal SDK comprises an authentication module and an encryption module. The encryption module is mainly used for encrypting the local original data of the user terminal. The authentication module is mainly responsible for key management, for example obtaining related keys from the platform or the App server and providing them to its own encryption module. The authentication module may also be used to authenticate the identity of the platform, authenticate code in the trusted execution environment of the platform, and the like.
In another embodiment, the shared learning terminal SDK may also include a local learning module, and the local learning module of the shared learning terminal SDK functions similarly to the local learning module of the shared learning server SDK. And the local learning module of the shared learning terminal SDK is used for receiving a local learning task issued by the platform and performing local learning by using the local original data and parameters exchanged from other data providers to obtain a local model.
In the foregoing embodiments, if the data provider encrypts the shared content with a homomorphic encryption algorithm, a secret sharing algorithm, or the like, the platform may perform training, parameter exchange, and similar operations directly on the encrypted shared content, without decrypting it, to obtain the fusion model; see the Paillier sketch below.
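With an additively homomorphic scheme such as Paillier, the platform can aggregate ciphertexts directly. A sketch using the python-paillier library (`pip install phe`); the specification does not name a specific scheme:

```python
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# Each provider encrypts a parameter under the shared public key.
encrypted = [public_key.encrypt(v) for v in (0.12, -0.40, 0.31)]

# The platform averages WITHOUT decrypting: Paillier supports adding
# ciphertexts and multiplying a ciphertext by a plaintext scalar.
encrypted_mean = (encrypted[0] + encrypted[1] + encrypted[2]) * (1 / 3)

assert abs(private_key.decrypt(encrypted_mean) - 0.01) < 1e-9
```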
In the foregoing embodiments, the data provider may also authenticate the identity of the platform before the shared learning begins or during the shared learning. In the foregoing embodiments, the data provider may authenticate code in the trusted execution environment of the platform to oversee the scope of use of the shared encrypted content by the platform.
In the foregoing embodiments, the decryption of the shared encrypted content by the platform and the derivation of the shared model using the shared encrypted content may be performed in one or more trusted execution environments of the platform. The platform can create and destroy the trusted execution environment according to requirements.
The data provider may be a terminal carrying an App, and the App is integrated with a shared learning SDK. The data provider can be an App server, and a shared learning SDK is integrated in the App server. The multiple data providers participating in the shared learning may include both App terminals and App servers, or may include only multiple App terminals, or may include only multiple App servers.
The shared learning SDK may further include a data alignment module, configured to organize the local original data according to a given format or rule, so that the data finally obtained by the platform is well-formed and directly usable, for example as sketched below.
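Concretely, alignment can mean coercing heterogeneous records onto one fixed column schema before encryption and upload; the schema below is invented for illustration:

```python
SCHEMA = ["user_hash", "age", "city", "spend_30d"]  # illustrative schema

def align(record: dict) -> list:
    # Project one heterogeneous raw record onto the fixed schema, filling
    # absent fields with None so uploads from different providers line up
    # column-for-column for the platform.
    return [record.get(field) for field in SCHEMA]
```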
It can be seen from the above embodiments that a user terminal can encrypt its local original data and upload it directly to the cloud platform, realizing terminal-cloud shared machine learning.
An App server deployed in the cloud can encrypt local original data and/or a local model and upload it to the cloud platform, realizing cloud-to-cloud shared machine learning.
A user terminal can also encrypt its local original data and upload it to a cloud App server, which in turn encrypts local original data and/or a local model and uploads it to the cloud platform, likewise realizing terminal-cloud shared machine learning.
In actual deployment, any one of these patterns or a combination of them may be adopted; the combinations should not be construed as limiting the embodiments of the present specification.
The machine learning method provided by the embodiments of this specification achieves data security and privacy protection while enabling shared learning. It can make full use of user-terminal data, realize combined terminal-cloud shared learning, and unlock the value of the data. Data providers can integrate in a lightweight manner at low development cost: a data provider need not implement the shared learning algorithms or computational logic itself, but only integrate the SDK, which greatly improves deployment efficiency.
< Machine learning apparatus >
< first embodiment >
Fig. 6 is a schematic diagram of a machine learning apparatus 700 provided in an embodiment of the present specification, where the machine learning apparatus 700 includes a shared learning SDK unit 710 and a data transmission unit 720, and the shared learning SDK unit 710 includes a local learning module 711 and an encryption module 712.
The local learning module 711 is configured to perform local learning using the local raw data to obtain a local model.
The encryption module 712 is configured to encrypt the local original data and/or the local model to obtain the encrypted shared content.
A data transmission unit 720, configured to send the encrypted shared content to the shared learning platform, so that the shared learning platform generates a shared model in the trusted execution environment according to the encrypted shared content provided by the multiple data providers, and acquires the shared model from the shared learning platform.
Optionally or preferably, the local original data contained in the encrypted shared content is local original data of a non-sensitive part.
Optionally or preferably, the local raw data used by the local learning comprises local raw data of sensitive parts.
The local learning using local original data includes: exchanging parameters with other data providers, and performing local learning using the local original data and the exchanged parameters.
Optionally or preferably, the local learning module 711 is further configured to generate a fusion model according to the local model and the sharing model.
Alternatively or preferably, the machine learning device 700 is an App terminal or an App server.
< second embodiment >
Fig. 7 is a schematic diagram of a machine learning apparatus 800 provided in an embodiment of the present specification, where the machine learning apparatus 800 includes a shared learning SDK unit 810 and a data transmission unit 820, and the shared learning SDK unit 810 includes a local learning module 811, an encryption module 812, and an authentication module 813.
The local learning module 811 is configured to perform local learning using the local raw data to obtain a local model.
And an encryption module 812, configured to encrypt the local original data and/or the local model to obtain encrypted shared content.
The authentication module 813 is configured to authenticate code in the trusted execution environment of the shared learning platform, so as to supervise the shared learning platform's scope of use of the encrypted shared content; and/or to authenticate the identity of the shared learning platform; and/or to obtain keys from the platform and provide them to the encryption module 812.
A data transmission unit 820, configured to transmit the encrypted shared content to the shared learning platform, so that the shared learning platform generates a shared model in the trusted execution environment according to the encrypted shared content provided by the multiple data providers, and acquires the shared model from the shared learning platform.
Optionally or preferably, the local original data contained in the encrypted shared content is local original data of a non-sensitive part.
Optionally or preferably, the local raw data used by the local learning comprises local raw data of sensitive parts.
The local learning using local original data includes: exchanging parameters with other data providers, and performing local learning using the local original data and the exchanged parameters.
Optionally or preferably, the local learning module 811 is further configured to generate a fusion model according to the local model and the sharing model.
Alternatively or preferably, the machine learning device 800 is an App terminal or an App server.
< third embodiment >
Fig. 8 is a schematic diagram of a machine learning apparatus 900 provided in an embodiment of the present specification, where the machine learning apparatus 900 includes a processor 901 and a memory 902.
The memory 902 stores a computer program which, when executed by the processor 901, implements the machine learning method disclosed in any one of the foregoing embodiments.
Alternatively or preferably, the machine learning device 900 is an App terminal or an App server.
The machine learning devices provided by the embodiments of this specification achieve data security and privacy protection while enabling shared learning. They make full use of data at the user terminal, realize end-cloud combined shared learning, and unlock the value of that data. Access is lightweight and development cost is low: a data provider only needs to integrate the SDK, without implementing the shared-learning algorithms or computation logic itself, which greatly improves deployment efficiency.
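Continuing the hypothetical client sketch given earlier, integration could then reduce to a few lines in the App terminal or App server; the in-process platform stub below exists only to make the example runnable.

    import os

    class _PlatformStub:
        # Hypothetical in-process stand-in for the shared learning platform.
        def __init__(self):
            self._shared = 2.0           # pretend shared model parameter
        def submit(self, blob):
            pass                         # would upload the encrypted shared content
        def fetch_model(self):
            return self._shared

    key = os.urandom(32)                 # in practice, provisioned after attestation
    client = SharedLearningClient(key=key, platform=_PlatformStub())
    print(client.run(raw_data=[1.0, 2.0, 3.0]))   # fused local/shared parameter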
< computer-readable medium >
Embodiments of the present specification further provide a computer-readable medium on which the shared learning SDK disclosed in any one of the foregoing embodiments is stored.
The embodiments in this specification are described in a progressive manner; for identical or similar parts, the embodiments may refer to one another, and each embodiment focuses on its differences from the others. In particular, the device and apparatus embodiments are described relatively briefly because they are substantially similar to the method embodiments; for relevant details, refer to the corresponding parts of the method embodiment descriptions.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Embodiments of the present description may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement aspects of embodiments of the specification.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (e.g., light pulses through a fiber-optic cable), or an electrical signal transmitted through a wire.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
Computer program instructions for carrying out operations of embodiments of the present specification may be assembly instructions, instruction set architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, an electronic circuit, such as a programmable logic circuit, a field-programmable gate array (FPGA), or a programmable logic array (PLA), can execute the computer-readable program instructions to implement various aspects of the embodiments of the present specification by utilizing state information of the computer-readable program instructions to personalize the electronic circuit.
Aspects of embodiments of the present specification are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the specification. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present description. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, by software, and by a combination of software and hardware are equivalent.
The foregoing description of the embodiments of the present specification has been presented for purposes of illustration and description, but is not intended to be exhaustive or to limit the embodiments to those disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application, or the technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (15)

1. A machine learning method for data sharing, comprising the steps of:
performing local learning using local raw data to obtain a local model;
encrypting the local raw data and/or the local model to obtain encrypted shared content;
sending the encrypted shared content to a shared learning platform, so that the shared learning platform generates a shared model in a trusted execution environment according to the encrypted shared content provided by a plurality of data providers; and
obtaining the shared model from the shared learning platform.
2. The method of claim 1, wherein the local raw data contained in the encrypted shared content is the non-sensitive portion of the local raw data.
3. The method of claim 1, wherein the local raw data used for the local learning comprises the sensitive portion of the local raw data.
4. The method of claim 1, further comprising the step of:
generating a fusion model according to the local model and the shared model.
5. The method of claim 1, wherein performing local learning using local raw data comprises:
exchanging parameters with other data providers, and performing local learning using the local raw data and the exchanged parameters.
6. The method of claim 1, wherein the shared learning platform generating a shared model in the trusted execution environment according to the encrypted shared content provided by the plurality of data providers comprises:
decrypting, in the trusted execution environment, the encrypted shared content provided by the plurality of data providers to obtain the local raw data of the plurality of data providers, and learning with the local raw data of the plurality of data providers to obtain the shared model; or,
performing parameter exchange on local models provided by different data providers in the trusted execution environment to obtain the shared model.
7. The method of claim 1, further comprising the steps of:
authenticating the code in the trusted execution environment of the shared learning platform, so as to monitor the scope within which the shared learning platform uses the encrypted shared content; and/or,
authenticating the identity of the shared learning platform.
8. The method of claim 1, wherein the local learning is triggered by a local learning task issued by the shared learning platform.
9. The method of claim 1, further comprising the step of:
initiating a shared learning request to the shared learning platform, so that the shared learning platform generates a plurality of local learning tasks according to the shared learning request and issues the local learning tasks to different data providers.
10. The method according to any one of claims 1 to 9, wherein the data provider is a terminal on which an App is installed, a shared learning SDK is integrated in the App, and the method is implemented by calling the shared learning SDK.
11. The method according to any one of claims 1 to 9, wherein the data provider is an App server, a shared learning SDK is integrated in the App server, and the method is implemented by calling the shared learning SDK.
12. A machine learning apparatus, comprising a shared learning SDK unit and a data transmission unit, the shared learning SDK unit comprising a local learning module and an encryption module, wherein:
the local learning module is configured to perform local learning using local raw data to obtain a local model;
the encryption module is configured to encrypt the local raw data and/or the local model to obtain encrypted shared content; and
the data transmission unit is configured to send the encrypted shared content to the shared learning platform, so that the shared learning platform generates a shared model in the trusted execution environment according to the encrypted shared content provided by a plurality of data providers, and to acquire the shared model from the shared learning platform.
13. The apparatus of claim 12, wherein the shared learning SDK unit further comprises an authentication module;
the authentication module is configured to authenticate the code in the trusted execution environment of the shared learning platform, so as to monitor the scope within which the shared learning platform uses the encrypted shared content; and/or to authenticate the identity of the shared learning platform.
14. A machine learning apparatus, comprising a processor and a memory, wherein the memory stores computer instructions which, when executed by the processor, implement the method of any one of claims 1-11.
15. The machine learning apparatus of any one of claims 12-14, which is an App terminal or an App server.
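For readers tracing claim 6, the following hedged sketch shows what the platform-side routine inside the trusted execution environment could look like, under the same illustrative envelope format as the client sketch above. The "raw" branch pools the decrypted raw data for centralized learning (the claim's first alternative); the default branch averages the decrypted local model parameters, a federated-averaging-style stand-in for the claimed parameter exchange. pickle is used only for brevity and is not a serialization recommendation.

    import pickle
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def build_shared_model(blobs, keys, mode="params"):
        # Decrypt every provider's encrypted shared content inside the TEE,
        # using the per-provider keys provisioned during attestation.
        payloads = []
        for blob, key in zip(blobs, keys):
            nonce, ciphertext = blob[:12], blob[12:]
            payloads.append(pickle.loads(AESGCM(key).decrypt(nonce, ciphertext, None)))

        if mode == "raw":
            # Alternative 1: learn directly on the pooled raw data.
            data = [x for p in payloads for x in p["raw_data"]]
            return sum(data) / len(data)        # toy centralized training
        # Alternative 2: exchange/aggregate the local models' parameters.
        params = [p["model"] for p in payloads]
        return sum(params) / len(params)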
CN201911102002.0A 2019-11-12 2019-11-12 Machine learning method and machine learning device for data sharing Pending CN110796267A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911102002.0A CN110796267A (en) 2019-11-12 2019-11-12 Machine learning method and machine learning device for data sharing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911102002.0A CN110796267A (en) 2019-11-12 2019-11-12 Machine learning method and machine learning device for data sharing

Publications (1)

Publication Number Publication Date
CN110796267A true CN110796267A (en) 2020-02-14

Family

ID=69444214

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911102002.0A Pending CN110796267A (en) 2019-11-12 2019-11-12 Machine learning method and machine learning device for data sharing

Country Status (1)

Country Link
CN (1) CN110796267A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109194523A (en) * 2018-10-01 2019-01-11 西安电子科技大学 The multi-party diagnostic model fusion method and system, cloud server of secret protection
CN109993308A (en) * 2019-03-29 2019-07-09 深圳先进技术研究院 Learning system and method, shared platform and method, medium are shared based on cloud platform

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021159684A1 (en) * 2020-02-14 2021-08-19 云从科技集团股份有限公司 Data processing method, system and platform, and device and machine-readable medium
CN111428880A (en) * 2020-03-20 2020-07-17 矩阵元技术(深圳)有限公司 Privacy machine learning implementation method, device, equipment and storage medium
CN111581663A (en) * 2020-04-30 2020-08-25 电子科技大学 Federal deep learning method for protecting privacy and facing irregular users
CN111581663B (en) * 2020-04-30 2022-05-03 电子科技大学 Federal deep learning method for protecting privacy and facing irregular users
CN111368338A (en) * 2020-05-27 2020-07-03 支付宝(杭州)信息技术有限公司 Data processing method and data processing system based on multi-party privacy protection
WO2021239005A1 (en) * 2020-05-27 2021-12-02 支付宝(杭州)信息技术有限公司 Data processing method and data processing system based on multi-party privacy protection
CN111490995A (en) * 2020-06-12 2020-08-04 支付宝(杭州)信息技术有限公司 Model training method and device for protecting privacy, data processing method and server
WO2021139476A1 (en) * 2020-08-07 2021-07-15 平安科技(深圳)有限公司 Intersection data generation method, and federated model training method based on intersection data
CN112100145A (en) * 2020-09-02 2020-12-18 南京三眼精灵信息技术有限公司 Digital model sharing learning system and method
CN112100145B (en) * 2020-09-02 2023-07-04 南京三眼精灵信息技术有限公司 Digital model sharing learning system and method
WO2022073264A1 (en) * 2020-10-09 2022-04-14 Huawei Technologies Co., Ltd. Systems and methods for secure and fast machine learning inference in trusted execution environment
WO2022151888A1 (en) * 2021-01-18 2022-07-21 中国农业科学院深圳农业基因组研究所 Data sharing method and apparatus

Similar Documents

Publication Publication Date Title
CN110796267A (en) Machine learning method and machine learning device for data sharing
CN110619220B (en) Method and device for encrypting neural network model and storage medium
CN104125055B (en) Encryption and decryption method and electronic equipment
US11210658B2 (en) Constructing a distributed ledger transaction on a cold hardware wallet
JP2020526050A (en) Key data processing method and apparatus, and server
CN109525989B (en) Data processing and identity authentication method and system, and terminal
KR20200027500A (en) Generate key certificates that provide device anonymity
US10999260B1 (en) Secure messaging between cryptographic hardware modules
US11108571B2 (en) Managing communications among consensus nodes and client nodes
CN110492990A (en) Private key management method, apparatus and system under block chain scene
CN107786331B (en) Data processing method, device, system and computer readable storage medium
CN113242224B (en) Authorization method and device, electronic equipment and storage medium
CN111143474B (en) One-key binding changing method for mobile phone number based on block chain technology
CN111935166B (en) Communication authentication method, system, electronic device, server, and storage medium
CN111669434B (en) Method, system, device and equipment for establishing communication group
CN111931209A (en) Contract information verification method and device based on zero knowledge certification
US20150310206A1 (en) Password management
CN108470279B (en) Electronic ticket transferring and verifying method, client, server and ticketing system
CN111130805B (en) Secure transmission method, electronic device, and computer-readable storage medium
CN112163046A (en) Block chain-based equipment data storage method, device and system
CN110414269B (en) Processing method, related device, storage medium and system of application installation package
US10785030B2 (en) System for decrypting encrypted data based upon acquired visual representation of encrypted data and related methods
CN107302519B (en) Identity authentication method and device for terminal equipment, terminal equipment and server
CN104065650A (en) Data processing system for voice communication
CN104080080A (en) Data processing system for voice communication

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200214