CN110059501B - Safe outsourcing machine learning method based on differential privacy - Google Patents

Safe outsourcing machine learning method based on differential privacy

Info

Publication number
CN110059501B
CN110059501B (Application CN201910302716.XA)
Authority
CN
China
Prior art keywords
machine learning
data
noise
privacy
provider
Prior art date
Legal status
Active
Application number
CN201910302716.XA
Other languages
Chinese (zh)
Other versions
CN110059501A (en)
Inventor
李进
雷震光
李同
姜冲
Current Assignee
Guangzhou University
Original Assignee
Guangzhou University
Priority date
Filing date
Publication date
Application filed by Guangzhou University filed Critical Guangzhou University
Priority to CN201910302716.XA priority Critical patent/CN110059501B/en
Publication of CN110059501A publication Critical patent/CN110059501A/en
Application granted granted Critical
Publication of CN110059501B publication Critical patent/CN110059501B/en
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/602Providing cryptographic facilities or services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a secure outsourced machine learning method based on differential privacy, belonging to the field of cyberspace security. Without revealing sensitive data to any third party, the data provider processes its data with homomorphic encryption and uploads it to a cloud server; the cloud server stores the encrypted data, adds noise to it, and obtains a query function through interaction with the machine learning model provider, who then performs the machine learning. The method effectively combines outsourced computation with differential privacy: it guarantees the security and privacy of machine learning while greatly reducing computational overhead and cost, improves computational efficiency, and alleviates the inefficiency and security problems of traditional outsourced machine learning methods.

Description

Safe outsourcing machine learning method based on differential privacy
Technical Field
The invention belongs to the field of cyberspace security, and in particular relates to a secure outsourced machine learning method based on differential privacy.
Background
With the development of the internet and information technology, more and more data is generated and used. Statistics put current global data growth at roughly 40% per year, and the global big data industry is expected to flourish over the next five years. Facing this ever-growing mass of data, cloud computing, as a new mode of data computation and storage, can largely satisfy big data's storage and processing requirements. Through the storage and computation outsourcing that cloud computing provides, users can shift local computing and storage demands to the cloud and improve data-processing efficiency by relying on the powerful computing and storage capabilities of cloud servers. Cloud computing, with its powerful computing capacity, has therefore become a natural partner of big data technology.
Meanwhile, machine learning, built on efficient learning algorithms, abundant data, and powerful computing environments, is widely applied to the vast data accumulated by humanity in scenarios such as pattern recognition, computer vision, and data mining. Driven by scientific research and industrial development, the fields and applications touched by machine learning keep expanding, especially in medicine, finance, and commerce. For example, in medical diagnosis, a machine learning model trained on massive case data can accurately estimate the probability that a patient suffers from a given disease.
Although cloud outsourcing services use the cloud's strong storage and computing power to relieve users of difficult computations, the cloud, as a not-fully-trusted third party, exposes personal sensitive information to many new security challenges, including the security and privacy of outsourced data storage and computation. For example, according to a New York Times report of March 17, 2018, Cambridge Analytica obtained access to the data of more than 50 million Facebook users, dragging Facebook, with its 2 billion users, into the largest personal-information leak in its history.
In response to these privacy challenges, the traditional solution is for the data provider to protect data privacy with encryption alone, but the results fall far short of ideal. Differential privacy, one of the most popular privacy-protection techniques, has been widely studied and applied; its main idea is that for two data sets differing in only one record, the probabilities of obtaining any given query result are nearly indistinguishable. The most common realization randomizes the query result by adding noise drawn from a suitable distribution. Differential privacy not only protects data privacy but can also improve the efficiency of data processing. The data provider can therefore outsource its data to a cloud server, which then interacts with a machine learning model provider to complete machine learning tasks safely and effectively.
Research on existing methods shows that the traditional approach has at least the following problems:
1) To accommodate different applications and privacy budgets, different types of noise must be added to the data for different query tasks, which inevitably increases computational overhead and interaction, raising the computational cost.
2) When data providers publish their data, a common entity, the cloud server, must be able to store all the different data sets with their different types of noise, which poses a significant challenge to the cloud server's storage space.
Disclosure of Invention
To solve the inefficiency caused by adding different types of noise to a data set in traditional schemes, the invention provides a secure outsourced machine learning method based on differential privacy. It combines cloud computing with differential privacy to outsource the complex computation and storage tasks, guaranteeing the security and privacy of machine learning while greatly reducing computational overhead and cost, improving computational efficiency, and effectively alleviating the inefficiency and security problems of traditional outsourced machine learning methods.
The invention is realized with the following technical scheme: the secure outsourced machine learning method based on differential privacy comprises the following steps:
S1, the data provider selects the Paillier encryption algorithm, which has the additive homomorphic property, and negotiates with the machine learning model provider DE to generate a key pair (sk, pk); the data provider holds the public key pk, and the machine learning model provider DE holds the private key sk;
S2, before uploading, the data provider preprocesses its data according to attribute distinctions, then encrypts the preprocessed data M = (m1, m2, …, mn) with the public key pk and sends the ciphertexts ||m1||pk, ||m2||pk, …, ||mn||pk to the cloud server CSP;
S3, the cloud server CSP receives the uploaded ciphertext data, obtains the query function F from the machine learning model provider DE, computes noise η satisfying the ε-differential privacy standard, adds η to the ciphertext of step S2 using the Add(||M||, ||η||) algorithm, and sends the noised data ||F(M) + η||pk to the machine learning model provider DE;
S4, the machine learning model provider DE receives the noised data and decrypts Dec(||m1||pk, ||m2||pk, …, ||mn||pk, sk) to obtain the noised data (F(M) + η), which it takes as input and analyzes with a machine learning algorithm to complete the machine learning task.
Compared with traditional fully homomorphic encryption, the method does not consume large amounts of the cloud server's storage and computation; using the homomorphic property, the cloud server in step S3 can safely add noise to the encrypted data, solving the data-security problem. Compared with the prior art, the method has the following advantages:
1) The data provider need not add noise locally; the noise addition is completed by the powerful cloud server through cloud computing technology.
2) Additive homomorphic encryption guarantees, by the property that addition over ciphertexts does not affect the integrity of the underlying data, that no data is leaked between the cloud server and the machine learning model provider during computation and storage. Compared with fully homomorphic encryption, it greatly reduces communication complexity, cuts the interactive operations of the encryption process, lowers computational cost, and improves computational efficiency. Meanwhile, the differential privacy technique achieves privacy protection by adding noise to the sensitive data.
3) The security of outsourced machine learning is guaranteed: machine learning is achieved without disclosing private data to an untrusted third party.
Drawings
FIG. 1 is a flow chart of the outsourcing machine learning method of the present invention;
FIG. 2 is a graph comparing the results of the same machine learning task performed on data processed by the method of the present invention and on the raw, un-noised data.
Detailed Description
Cloud-based data computation, as a new mode of data computation and storage, offers very strong data-processing capability and large storage space. In the method, a large amount of local computation (including applying the differential privacy technique and adding noise) is completed by the cloud server through cloud computing; the machine learning task is completed through interaction between the cloud server and the machine learning model provider, realizing secure and efficient outsourcing of machine learning. The invention is described in detail below with reference to the accompanying drawings and examples to help the skilled person understand it, but the embodiments of the invention are not limited thereto.
Some basic concepts to which the invention relates are as follows:
1) paillier homomorphic encryption: the homomorphic encryption technology is the same as the common encryption technology in that encryption operation is carried out on an encryption side message, namely, under the condition of not decrypting a ciphertext, various calculations on plaintext data can be carried out by executing operation on the ciphertext, and the security requirement of privacy protection is met. Second, homomorphic encryption techniques have natural attributes that are not available with common encryption techniques. The data in the general encryption state can destroy the corresponding plaintext through direct calculation, and the ciphertext data encrypted in the same state can be directly operated without destroying the integrity and confidentiality of the corresponding plaintext information. In summary, homomorphic encryption is a form of encryption that allows a particular type of computation to encrypt ciphertext, and performing a matching result operation on plaintext when decrypting results in an encrypted result. The Paillier homomorphic encryption is additive homomorphic encryption, has better application in ciphertext space calculation and is also suitable for the method.
2) ε -differential privacy: is a framework for formalizing privacy in statistical databases to prevent de-anonymization. Under the definition, the calculation processing result of the database is insensitive to the change of a specific record, and the influence of a single record in the data set or not in the data set on the calculation result is very little. Since differential privacy is a probabilistic concept, any differential privacy mechanism must be random. For the method, a Laplace mechanism is adopted, and data is interfered mainly by adding Laplace noise based on delta F and privacy budget epsilon.
3) Outsourced computation: a technique that hands costly, computationally complex work to untrusted servers, allowing resource-constrained data providers to offload their computational load to cloud servers with abundant computing resources.
4) Machine learning: the American artificial intelligence expert Arthur Samuel described machine learning as the field of study that gives computers the ability to learn without being explicitly programmed. The field divides broadly into three sub-fields: supervised learning, unsupervised learning, and reinforcement learning. Meanwhile, large internet companies now offer machine learning as a service on their cloud platforms, such as the Google Prediction API, Amazon Machine Learning (Amazon ML), and Microsoft Azure Machine Learning (Azure ML). A machine learning task can thus be completed using a machine learning application on a cloud platform.
As shown in fig. 1, there are three entities in the method: the user (data provider), the cloud server (CSP), and the data evaluator (DE), i.e. the machine learning model provider. The user owns the data and provides it for machine learning; the CSP interacts with the user, providing cloud storage and outsourced computation and performing the noise processing on the data; the DE interacts with the CSP to obtain the noised data and execute the corresponding machine learning task. The specific steps are as follows:
S1, the user selects the Paillier encryption algorithm, which has the additive homomorphic property, and negotiates with DE to generate a key pair (sk, pk). Randomly choose large primes p and q, let n = pq and λ(n) = lcm(p−1, q−1), and define the function L(u) = (u−1)/n; then randomly choose g ∈ (Z/n²Z)* such that gcd(L(g^λ(n) mod n²), n) = 1, where lcm and gcd denote the least common multiple and the greatest common divisor, respectively. The public key is pk = (n, g) and the private key is sk = (p, λ); the user holds the public key pk and DE holds the private key sk.
S2, the user preprocesses the data according to attribute distinctions before uploading. The preprocessed data M = (m1, m2, …, mn) is then encrypted with the public key pk, and the ciphertexts ||m1||pk, ||m2||pk, …, ||mn||pk are sent to the cloud server.
The encryption process is as follows: for a plaintext M ∈ Zn, select a random number r < n and compute the ciphertext C = g^M · r^n mod n².
S3, the cloud server CSP receives the ciphertext data uploaded in the step S2, obtains a query function F from a machine learning model provider DE, calculates noise eta meeting the epsilon-difference privacy standard by adopting a Laplace mechanism, adds the noise eta to the ciphertext data in the step S2 by using an Add (I M I, I eta I) algorithm, and adds the noisy data I F (M) + eta IpkSent to the DE.
Because the additively homomorphic Paillier algorithm is used, the cloud server cannot learn the private data once the user has encrypted it; at the same time, by the additive homomorphic property E(x + y) = Eval(E(x), E(y)), the cloud server can operate directly on the ciphertexts without destroying the integrity or confidentiality of the corresponding plaintexts.
The cloud server CSP computes the noise η satisfying the ε-differential privacy standard using the Laplace mechanism as follows:
S31, the cloud server CSP first interacts with the machine learning model provider DE to obtain the query function F and computes its sensitivity ΔF.
S32, compute the noise scale b = ΔF/ε from the configured privacy budget parameter ε.
S33, generate Laplace noise η with scale b.
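Steps S31–S33 can be sketched in Python as follows. The counting-query sensitivity ΔF = 1 and the budget ε = 0.5 below are illustrative assumptions, not values fixed by the patent; the sampler uses the fact that the difference of two independent Exp(1) variables is Laplace-distributed with scale 1.

```python
import random

def laplace_noise(delta_f: float, epsilon: float) -> float:
    """S32-S33: compute the scale b = ΔF/ε, then draw η ~ Lap(0, b)."""
    b = delta_f / epsilon                            # S32: noise scale
    # The difference of two iid Exp(1) draws is a standard Laplace variable.
    return b * (random.expovariate(1.0) - random.expovariate(1.0))

# S31 would supply ΔF from the query F; assume a counting query, so ΔF = 1.
eta = laplace_noise(delta_f=1.0, epsilon=0.5)        # scale b = 2
```

Note that to feed η into the Paillier ciphertext (which lives in Zn), the real-valued sample must still be encoded as an integer, e.g. by rounding or fixed-point scaling.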
S4, DE receives the noised data uploaded in step S3 and decrypts Dec(||m1||pk, ||m2||pk, …, ||mn||pk, sk) to obtain the noised data (F(M) + η), which it takes as input and analyzes with a machine learning algorithm, such as classification or regression, to complete the machine learning task. Because the data is noised, the machine learning model provider cannot obtain the original data, realizing privacy protection in the machine learning process.
The decryption process is as follows: compute F(M) + η = L(C^λ(n) mod n²) · μ mod n, where C is the ciphertext and μ = (L(g^λ(n) mod n²))^(−1) mod n.
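Putting S1–S4 together for a concrete sum query F(M) = m1 + … + mn, the flow can be sketched end to end as below. The toy primes, the generator g = n + 1, the values ΔF = 1 and ε = 0.5, and the rounding of the noise to an integer encoded mod n are all illustrative choices of this sketch, not the patent's text.

```python
import math
import random

def keygen(p, q):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pk, m):
    n, g = pk
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(g, m, n * n) * pow(r, n, n * n) % (n * n)

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

# S1: provider and DE negotiate keys; DE keeps sk (toy primes only).
pk, sk = keygen(101, 103)
n = pk[0]

# S2: provider encrypts M = (m1, ..., mn) and uploads the ciphertexts.
M = [3, 7, 11, 2]
cts = [encrypt(pk, m) for m in M]

# S3: CSP evaluates the sum query homomorphically (ciphertext product),
# draws Laplace noise with b = ΔF/ε = 1/0.5 = 2, and adds it under encryption.
c_F = 1
for c in cts:
    c_F = c_F * c % (n * n)
eta = round(2.0 * (random.expovariate(1.0) - random.expovariate(1.0)))
c_noisy = c_F * encrypt(pk, eta % n) % (n * n)    # Add(||F(M)||, ||η||)

# S4: DE decrypts and obtains F(M) + η (centered to allow negative η).
out = decrypt(pk, sk, c_noisy)
if out > n // 2:
    out -= n
assert out == sum(M) + eta
```

The CSP never sees the plaintext records: it works only on ciphertexts, and what DE recovers is already the noised statistic F(M) + η, matching the privacy claim of the method.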
As the implementation shows, this secure outsourced machine learning method based on ε-differential privacy combines cloud computing with differential privacy and uses the homomorphic property to outsource the noise-adding operation to the cloud server. Through interaction with the machine learning model provider, the cloud server lets the data participate safely in machine learning: the user's privacy is protected, no privacy is leaked, and operational efficiency is greatly improved over traditional schemes. The method is broadly applicable to scenarios requiring privacy-preserving machine learning.
In the present embodiment, the user is a hospital acting as data provider, which wishes to apply a machine learning algorithm to private data, namely patients' personal case records, to perform intelligent diagnosis. It must both provide useful data for machine learning and protect patient privacy. The other two entities, the cloud server and the machine learning model provider, are not fully trusted; using the proposed method, machine learning can be realized without revealing private information to either of them.
First, the hospital, as user and data provider, generates a key pair (sk, pk) with the Paillier additively homomorphic encryption algorithm; the hospital holds the public key pk, encrypts the sensitive training data M = (m1, m2, …, mn) with pk to produce the ciphertext, and hands the private key sk to the machine learning model provider for safekeeping.
Then, the hospital uploads the ciphertext to the cloud server; the cloud server receives it, obtains the query function F from the machine learning model provider DE, computes noise η satisfying the ε-differential privacy standard with the Laplace mechanism, adds η to the generated ciphertext using the Add(||M||, ||η||) algorithm, and sends the noised data ||F(M) + η||pk to the machine learning model provider.
The machine learning model provider decrypts the data with the private key sk to obtain the noised data, and can then perform machine learning tasks according to the hospital's requirements, such as diagnosing specific diseases.
As shown in FIG. 2, experiments comparing against the original, un-noised data show that, across different machine learning algorithms, adding noise by the method has little influence on the classification task. On the premise of guaranteeing security and privacy, the accuracy of machine learning is therefore not greatly affected, and the experiments demonstrate that the method is fully effective.
The embodiments described in this patent are only some, not all embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.

Claims (4)

1. A safety outsourcing machine learning method based on differential privacy is characterized by comprising the following steps:
S1, the data provider selects the Paillier encryption algorithm, which has the additive homomorphic property, and negotiates with the machine learning model provider DE to generate a key pair (sk, pk); the data provider holds the public key pk, and the machine learning model provider DE holds the private key sk;
S2, before uploading, the data provider preprocesses its data according to attribute distinctions, then encrypts the preprocessed data M = (m1, m2, …, mn) with the public key pk and sends the ciphertexts ||m1||pk, ||m2||pk, …, ||mn||pk to the cloud server CSP;
S3, the cloud server CSP receives the uploaded ciphertext data, obtains the query function F from the machine learning model provider DE, computes noise η satisfying the ε-differential privacy standard, adds η to the ciphertext of step S2 using the Add(||M||, ||η||) algorithm, and sends the noised data ||F(M) + η||pk to the machine learning model provider DE;
S4, the machine learning model provider DE receives the noised data and, after decrypting Dec(||m1||pk, ||m2||pk, …, ||mn||pk, sk), obtains the noised data (F(M) + η), which it takes as input and analyzes with a machine learning algorithm to complete the machine learning task;
in step S3, the cloud server CSP computes the noise η satisfying the ε-differential privacy standard using the Laplace mechanism, as follows:
S31, the cloud server CSP first interacts with the machine learning model provider DE to obtain the query function F and computes its sensitivity ΔF;
S32, compute b = ΔF/ε from the configured privacy budget parameter ε;
S33, generate the noise η.
2. The secure outsourcing machine learning method based on differential privacy of claim 1, wherein the key pair (sk, pk) is generated as follows: randomly choose large primes p and q, let n = pq and λ(n) = lcm(p−1, q−1), and define the function L(u) = (u−1)/n; then randomly choose g ∈ (Z/n²Z)* such that gcd(L(g^λ(n) mod n²), n) = 1; where lcm and gcd denote the least common multiple and the greatest common divisor, respectively, the public key is pk = (n, g) and the private key is sk = (p, λ).
3. The secure outsourcing machine learning method based on differential privacy of claim 2, wherein in step S2 the encryption process is: for a plaintext M ∈ Zn, select a random number r < n and compute the ciphertext C = g^M · r^n mod n².
4. The secure outsourcing machine learning method based on differential privacy of claim 2 or 3, wherein in step S4 the decryption process is: compute F(M) + η = L(C^λ(n) mod n²) · μ mod n, where C is the ciphertext information.
CN201910302716.XA 2019-04-16 2019-04-16 Safe outsourcing machine learning method based on differential privacy Active CN110059501B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910302716.XA CN110059501B (en) 2019-04-16 2019-04-16 Safe outsourcing machine learning method based on differential privacy

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910302716.XA CN110059501B (en) 2019-04-16 2019-04-16 Safe outsourcing machine learning method based on differential privacy

Publications (2)

Publication Number Publication Date
CN110059501A CN110059501A (en) 2019-07-26
CN110059501B true CN110059501B (en) 2021-02-02

Family

ID=67319169

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910302716.XA Active CN110059501B (en) 2019-04-16 2019-04-16 Safe outsourcing machine learning method based on differential privacy

Country Status (1)

Country Link
CN (1) CN110059501B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022214291A1 (en) * 2021-04-08 2022-10-13 Biotronik Se & Co. Kg Ai based patient assessment

Families Citing this family (6)

Publication number Priority date Publication date Assignee Title
WO2021120229A1 (en) * 2019-12-20 2021-06-24 深圳前海微众银行股份有限公司 Data processing method, apparatus and system
CN111260081B (en) * 2020-02-14 2023-03-14 广州大学 Non-interactive privacy protection multi-party machine learning method
CN111275202B (en) * 2020-02-20 2023-08-11 济南大学 Machine learning prediction method and system for data privacy protection
CN111526148B (en) * 2020-04-26 2022-02-25 中山大学 System and method for safely denoising encrypted audio in cloud computing environment
US11599806B2 (en) 2020-06-22 2023-03-07 International Business Machines Corporation Depth-constrained knowledge distillation for inference on encrypted data
CN113553610B (en) * 2021-09-22 2021-12-31 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) Multi-party privacy protection machine learning method based on homomorphic encryption and trusted hardware

Citations (3)

Publication number Priority date Publication date Assignee Title
CN108521326A (en) * 2018-04-10 2018-09-11 电子科技大学 A kind of Linear SVM model training algorithm of the secret protection based on vectorial homomorphic cryptography
CN108959958A (en) * 2018-06-14 2018-12-07 中国人民解放军战略支援部队航天工程大学 A kind of method for secret protection and system being associated with big data
CN109284626A (en) * 2018-09-07 2019-01-29 中南大学 Random forests algorithm towards difference secret protection

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
CN108259158B (en) * 2018-01-11 2021-03-23 西安电子科技大学 Single-layer sensing machine learning method with high efficiency and privacy protection under cloud computing environment
US11475350B2 (en) * 2018-01-22 2022-10-18 Google Llc Training user-level differentially private machine-learned models
CN108717514B (en) * 2018-05-21 2020-06-16 中国人民大学 Data privacy protection method and system in machine learning
CN109376549B (en) * 2018-10-25 2021-09-10 广州电力交易中心有限责任公司 Electric power transaction big data publishing method based on differential privacy protection
CN109327304B (en) * 2018-12-18 2022-02-01 武汉大学 Lightweight homomorphic encryption method for realizing privacy protection in cloud computing
CN109992979B (en) * 2019-03-15 2020-12-11 暨南大学 Ridge regression training method, computing device and medium

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN108521326A (en) * 2018-04-10 2018-09-11 电子科技大学 A kind of Linear SVM model training algorithm of the secret protection based on vectorial homomorphic cryptography
CN108959958A (en) * 2018-06-14 2018-12-07 中国人民解放军战略支援部队航天工程大学 A kind of method for secret protection and system being associated with big data
CN109284626A (en) * 2018-09-07 2019-01-29 中南大学 Random forests algorithm towards difference secret protection

Cited By (1)

Publication number Priority date Publication date Assignee Title
WO2022214291A1 (en) * 2021-04-08 2022-10-13 Biotronik Se & Co. Kg Ai based patient assessment

Also Published As

Publication number Publication date
CN110059501A (en) 2019-07-26

Similar Documents

Publication Publication Date Title
CN110059501B (en) Safe outsourcing machine learning method based on differential privacy
Liu et al. Hybrid privacy-preserving clinical decision support system in fog–cloud computing
Wang et al. A privacy-enhanced retrieval technology for the cloud-assisted internet of things
Kwabena et al. Mscryptonet: Multi-scheme privacy-preserving deep learning in cloud computing
Mohassel et al. Practical privacy-preserving k-means clustering
Zhang et al. GELU-Net: A Globally Encrypted, Locally Unencrypted Deep Neural Network for Privacy-Preserved Learning.
Liu et al. Privacy-preserving outsourced calculation toolkit in the cloud
Vaidya et al. Privacy-preserving naive bayes classification
Liu et al. Towards practical privacy-preserving decision tree training and evaluation in the cloud
Liu et al. Toward highly secure yet efficient KNN classification scheme on outsourced cloud data
CN108259158A (en) Efficient and secret protection individual layer perceptron learning method under a kind of cloud computing environment
CN109194507A (en) The protection privacy neural net prediction method of non-interactive type
Pang et al. Privacy-preserving association rule mining using homomorphic encryption in a multikey environment
Guo et al. A privacy-preserving online medical prediagnosis scheme for cloud environment
Baryalai et al. Towards privacy-preserving classification in neural networks
Acar et al. Achieving secure and differentially private computations in multiparty settings
Owusu-Agyemeng et al. MSDP: multi-scheme privacy-preserving deep learning via differential privacy
Jiang et al. Secure neural network in federated learning with model aggregation under multiple keys
Xue et al. Secure and privacy-preserving decision tree classification with lower complexity
Das et al. A secure softwarized blockchain-based federated health alliance for next generation IoT networks
CN112347473B (en) Machine learning security aggregation prediction method and system supporting bidirectional privacy protection
Nita et al. Homomorphic Encryption
Liu et al. Secure and fast decision tree evaluation on outsourced cloud data
Chen et al. Double rainbows: A promising distributed data sharing in augmented intelligence of things
Giannopoulos et al. Privacy preserving medical data analytics using secure multi party computation. an end-to-end use case

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant