CN110059501A - Secure outsourced machine learning method based on differential privacy - Google Patents
Secure outsourced machine learning method based on differential privacy
- Publication number
- CN110059501A (application number CN201910302716.XA)
- Authority
- CN
- China
- Prior art keywords
- machine learning
- data
- privacy
- cloud server
- differential privacy
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/602—Providing cryptographic facilities or services
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
Abstract
The invention discloses a secure outsourced machine learning method based on differential privacy, belonging to the field of cyberspace security. The method enables a data provider to upload its data to a cloud server after processing it with additively homomorphic encryption, without revealing sensitive data to any third party. The cloud server stores the encrypted data, performs the noise-adding operation on it, and obtains the query function by interacting with the machine learning model provider, thereby supporting machine learning. By effectively combining outsourced computation with differential privacy, the method not only guarantees the security and privacy of machine learning but also greatly reduces computation overhead and cost and improves computational efficiency, effectively alleviating the inefficiency and security problems faced by traditional outsourced machine learning methods.
Description
Technical field
The invention belongs to the field of cyberspace security, and in particular relates to a secure outsourced machine learning method based on differential privacy.
Background art
With the development of the Internet and information technology, more and more data are generated and used. Statistics indicate that global data is currently growing at roughly 40% per year, and the global big data industry is expected to grow strongly over the next five years. Facing this rapidly increasing mass of data, cloud computing, as a new model of data computation and storage, can largely meet the requirements for storing and processing it. Through the storage and computation outsourcing capability of cloud computing, users can move local computing and storage demands to the cloud, using the powerful computation and storage capacity of cloud servers to process data more efficiently. Cloud computing, with its powerful computing capability, has therefore become a natural partner of big data technology.
At the same time, machine learning, built on efficient learning algorithms, rich and massive data, and powerful computing environments, exploits the mass of data accumulated by humanity and is widely applied in scenarios such as pattern recognition, computer vision, and data mining. Driven by scientific research and industrial development, the fields and applications touched by machine learning keep broadening, especially in medicine, finance, and business. In medical diagnosis, for example, a machine learning model trained on massive collected case data can accurately estimate the probability that a patient suffers from a certain disease.
Although cloud computing outsourcing services use their powerful storage and computing capacity to solve users' computational difficulties, the cloud is an incompletely trusted third party, so personal sensitive information faces many new security challenges, including the security and privacy of outsourced data storage and computation. For example, on March 17, 2018, the New York Times reported that Cambridge Analytica had obtained access to the data of more than 50 million Facebook users, plunging Facebook, which sits on two billion users, into the largest personal-information leakage storm in its history.
To cope with such privacy challenges, the traditional solution is for the data provider to protect data privacy with encryption technology, but the effect finally achieved is far from satisfactory. Differential privacy, one of the most popular privacy protection techniques, has been widely applied and studied. Its main idea is that for two datasets differing in only one record, the difference between the probabilities that a query on each returns the same value is negligible. The most common method is to add noise following a certain distribution to the query result, making the query result random. Applied this way, differential privacy not only protects the privacy of the data but also improves processing efficiency. A data provider can thus outsource its data to a cloud server, and the cloud server then interacts with the machine learning model provider to complete machine learning tasks safely and effectively.
Research on existing methods shows that the prior art has at least the following problems:
1) To adapt to different applications and privacy budgets, the data used by different query tasks must carry different types of noise, which inevitably increases computation and interaction and raises the computational cost.
2) When data providers publish their data, there must be a public entity, namely the cloud server, that stores all the datasets carrying the different types of noise, which poses a great challenge to the cloud server's storage space.
Summary of the invention
To solve the inefficiency caused by adding different types of noise to datasets in traditional schemes, the present invention provides a secure outsourced machine learning method based on differential privacy. Combining cloud computing technology with differential privacy, it outsources the complex computation and storage tasks, which not only guarantees the security and privacy of machine learning but also greatly reduces computation overhead and cost, improves computational efficiency, and effectively alleviates the inefficiency and security problems faced by traditional outsourced machine learning methods.
The present invention is realized by the following technical scheme: a secure outsourced machine learning method based on differential privacy, comprising the steps of:
S1. The data provider chooses the Paillier encryption algorithm, which has the additive homomorphic property, and negotiates with the machine learning model provider DE to generate a key pair (sk, pk), where the data provider holds the public key pk and the machine learning model provider DE holds the private key sk;
S2. Before uploading, the data provider preprocesses its data according to attribute differentiation, then encrypts the preprocessed data M = (m_1, m_2, ..., m_n) with the public key pk and sends the ciphertexts ||m_1||_pk, ||m_2||_pk, ..., ||m_n||_pk to the cloud server CSP;
S3. The cloud server CSP receives the uploaded ciphertexts, obtains the query function F from the machine learning model provider DE, computes a noise value η satisfying the ε-differential privacy standard, adds it to the ciphertexts of step S2 with the Add(||M||, ||η||) algorithm, and sends the noised data ||F(M)+η||_pk to the machine learning model provider DE;
S4. The machine learning model provider DE receives the noised data and, after the decryption Dec(||m_1||_pk, ||m_2||_pk, ..., ||m_n||_pk, sk), obtains the noisy result (F(M)+η); taking it as input, DE analyzes the noisy data with machine learning algorithms to complete the machine learning task.
Compared with traditional fully homomorphic encryption, the present invention does not consume a large amount of the cloud server's storage and computation space; using the homomorphic property, the cloud server in the third step can safely add noise to the encrypted data, solving the data security problem. Compared with conventional methods, the main beneficial effects are the following:
1) The data provider does not need to add noise locally; the addition of noise is completed by the powerful cloud server using cloud computing technology.
2) By means of additively homomorphic encryption, whose addition operation on ciphertexts does not affect data integrity, the data are guaranteed not to leak during the operations and storage between the cloud server and the machine learning model provider. Compared with fully homomorphic encryption, the communication complexity is greatly reduced, the interactive operations during encryption are fewer, the computation overhead drops, and computational efficiency improves. Meanwhile, the introduction of differential privacy achieves privacy protection by adding noise to the sensitive data.
3) The security of outsourced machine learning is guaranteed: machine learning is realized without private data being revealed to untrusted third parties.
Brief description of the drawings
Fig. 1 is the flow chart of the outsourced machine learning method of the present invention;
Fig. 2 compares the effect of performing the same machine learning tasks on the original, un-noised data and on data processed with the method of the present invention.
Specific embodiments
Cloud-based data computation, as a new model of data computation and storage, has very powerful data processing capability and larger storage space. Through cloud computing technology, the present invention lets a large amount of local computation (including adding noise with differential privacy) be carried out by the cloud server; the machine learning task is completed through the interaction between the cloud server and the machine learning model provider, thus realizing secure and efficient outsourced machine learning. To help technical staff understand the present invention, it is described in detail below with reference to the drawings and examples; embodiments of the present invention are not limited to these.
The basic concepts involved in the present invention are as follows:
1) Paillier homomorphic encryption: like general encryption techniques, homomorphic encryption applies an encryption operation to the encrypting party's message; that is, without decrypting the ciphertext, operations executed on the ciphertext can accomplish the corresponding computations on the plaintext data, meeting the security requirements of privacy protection. Moreover, homomorphic encryption has a natural property that general encryption lacks: computing directly on ordinarily encrypted data destroys the corresponding plaintext, whereas ciphertexts under homomorphic encryption can be operated on directly without destroying the integrity and confidentiality of the corresponding plaintext information. In short, homomorphic encryption is a form of encryption that allows specific types of computation to be carried out on ciphertexts, such that the decrypted result matches the result of performing the corresponding operation on the plaintexts. Paillier homomorphic encryption is an additively homomorphic scheme; it performs well for computation in the ciphertext space and is well suited to the method of the present invention.
2) ε-differential privacy: a framework for formalizing privacy in statistical databases, used to prevent de-anonymization. Under this definition, the result of a computation over the database is insensitive to the change of any single specific record; whether a single record is in the dataset or not has very little influence on the computed result. Formally, a randomized mechanism K satisfies ε-differential privacy if for any two datasets D_1 and D_2 differing in a single record and any set S of outputs, Pr[K(D_1) ∈ S] ≤ e^ε · Pr[K(D_2) ∈ S]. Since differential privacy is a probabilistic concept, any differentially private mechanism is necessarily randomized. This method uses the Laplace mechanism, which perturbs the data mainly by adding Laplace noise determined by the sensitivity ΔF and the privacy budget ε.
3) Outsourced computation: a technique that outsources expensive and complex computations to an untrusted server; it allows resource-constrained data providers to outsource their computational load to cloud servers with effectively unlimited computing resources.
4) Machine learning: Arthur Samuel, an American expert in the field of artificial intelligence, described machine learning as the field of study that gives computers the ability to learn without being explicitly programmed for the problem. The field is generally divided into three subdomains: supervised learning, unsupervised learning, and reinforcement learning. At the same time, machine learning is offered as a service: large Internet companies now provide machine learning as a service on their cloud platforms, such as the Google Prediction API, Amazon Machine Learning (Amazon ML), and Microsoft Azure Machine Learning (Azure ML). We can complete our machine learning tasks by using the machine learning applications on a cloud platform.
As shown in Fig. 1, the method involves three entities: the user (data provider), the cloud server (CSP), and the machine learning model provider, the Data Evaluator (DE). The user owns the data and provides these data for machine learning; the CSP interacts with the user, supplies cloud storage and outsourced computation services, and carries out the noise-adding processing of the data for the user; the DE interacts with the CSP, obtains the noisy data, and executes the corresponding machine learning task. The specific steps are as follows:
S1. The user chooses the Paillier encryption algorithm with the additive homomorphic property and negotiates with DE to generate a key pair (sk, pk). Two large primes p and q are chosen at random; let n = pq and λ(n) = lcm(p−1, q−1), and define the function L(u) = (u−1)/n. Then g ∈ (Z/n^2 Z)* is chosen at random such that gcd(L(g^λ(n) mod n^2), n) = 1, where lcm and gcd denote the least common multiple and the greatest common divisor respectively. The public key is pk = (n, g) and the private key is sk = (p, λ); the user holds the public key pk and DE holds the private key sk. An illustrative sketch of this key generation follows.
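For illustration only, the following is a minimal Python sketch of this key generation, not the patent's implementation. It assumes the sympy library for prime sampling and Python 3.8+ for modular inverses, uses toy 32-bit primes where a real deployment needs far larger ones, takes the standard choice g = n + 1, and keeps (λ, μ) as the secret material, which is equivalent to the (p, λ) above since μ = (L(g^λ(n) mod n^2))^(−1) mod n is derivable from p and λ.

```python
from math import gcd
from sympy import randprime  # assumed available for prime sampling

def lcm(a, b):
    return a * b // gcd(a, b)

def L(u, n):
    # The Paillier L-function: L(u) = (u - 1) / n (exact division here)
    return (u - 1) // n

def keygen(bits=32):
    # Sample two distinct primes of equal bit length, which guarantees
    # that gcd(L(g^lambda mod n^2), n) = 1 holds for g = n + 1.
    p = randprime(2**(bits - 1), 2**bits)
    q = randprime(2**(bits - 1), 2**bits)
    while q == p:
        q = randprime(2**(bits - 1), 2**bits)
    n = p * q
    n2 = n * n
    lam = lcm(p - 1, q - 1)
    g = n + 1                                  # standard valid choice of g
    assert gcd(L(pow(g, lam, n2), n), n) == 1  # the condition from step S1
    mu = pow(L(pow(g, lam, n2), n), -1, n)     # precomputed for decryption
    return (n, g), (lam, mu)                   # pk, sk
```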
S2. Before uploading, the user preprocesses its data according to attribute differentiation. Then the preprocessed data M = (m_1, m_2, ..., m_n) are encrypted with the public key pk, and the ciphertexts ||m_1||_pk, ||m_2||_pk, ..., ||m_n||_pk are sent to the cloud server.
The encryption process is: for a plaintext M ∈ Z_n to be encrypted, select a random number r < n, then compute the ciphertext C = g^M · r^n mod n^2.
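A matching sketch of this encryption step, reusing keygen() from the sketch above; the attribute values are toy numbers chosen only for illustration.

```python
import secrets
from math import gcd

def encrypt(pk, m):
    n, g = pk
    n2 = n * n
    r = secrets.randbelow(n - 1) + 1       # random r with 1 <= r < n
    while gcd(r, n) != 1:                  # r must be invertible mod n
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2   # C = g^m * r^n mod n^2

pk, sk = keygen()
M = [12, 7, 30]                            # toy attribute values m_1..m_n
C = [encrypt(pk, m) for m in M]            # the ||m_i||_pk sent to the CSP
```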
S3. The cloud server CSP receives the ciphertexts uploaded in step S2 and obtains the query function F from the machine learning model provider DE. Using the Laplace mechanism, it computes a noise value η satisfying the ε-differential privacy standard, adds it to the ciphertexts of step S2 with the Add(||M||, ||η||) algorithm, and sends the noised data ||F(M)+η||_pk to DE.
Because the Paillier algorithm is additively homomorphic, the data are guaranteed not to leak privacy at the cloud server once the user has encrypted them; at the same time, the additive homomorphic property E(x+y) = Eval(E(x), E(y)) lets the cloud server operate directly on the ciphertexts without destroying the integrity and confidentiality of the corresponding plaintext information.
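In Paillier, the Eval operation above is simply ciphertext multiplication modulo n^2. The minimal sketch below, continuing the code above, shows the operation that the Add(||M||, ||η||) algorithm of the text is assumed to reduce to. A single modular multiplication per addition is what makes outsourcing the noise addition cheap for the CSP.

```python
def add_encrypted(pk, c1, c2):
    # Paillier additive homomorphism: Dec(c1 * c2 mod n^2) = m1 + m2 mod n,
    # so the CSP can add encrypted noise without ever seeing the plaintexts.
    n, _ = pk
    return (c1 * c2) % (n * n)
```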
The steps by which the cloud server CSP computes a noise value η satisfying the ε-differential privacy standard with the Laplace mechanism are as follows (a sketch follows the list):
S31. The cloud server CSP first interacts with the machine learning model provider DE to obtain the query function F and computes its sensitivity ΔF.
S32. The scale b = ΔF/ε is computed from the configured privacy budget parameter ε.
S33. The Laplace noise η is generated.
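A minimal sketch of steps S31 to S33, assuming numpy is available; the sensitivity value is a placeholder, since ΔF depends on the actual query F, and the noise is rounded to an integer here so that it can later be encrypted as a Paillier plaintext.

```python
import numpy as np

def laplace_noise(delta_f, epsilon, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    b = delta_f / epsilon                    # S32: scale b = ΔF / ε
    return rng.laplace(loc=0.0, scale=b)     # S33: draw η ~ Lap(0, b)

# S31 would yield ΔF from the query F; delta_f = 1.0 is a placeholder
# (e.g. a counting query). Rounding keeps η usable as a Paillier plaintext.
eta = round(laplace_noise(delta_f=1.0, epsilon=0.5))
```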
S4. DE receives the noised data uploaded in step S3 and, after the decryption Dec(||m_1||_pk, ||m_2||_pk, ..., ||m_n||_pk, sk), obtains the noisy result (F(M)+η). Taking it as input, DE carries out classification, regression, and other analyses on the noisy data with machine learning algorithms, thereby completing the machine learning task. Because the data carry noise, the machine learning model provider cannot learn the original data, realizing privacy protection in the machine learning process.
The decryption process is: compute F(M)+η = L(C^λ(n) mod n^2) · μ mod n, where C is the ciphertext and μ = (L(g^λ(n) mod n^2))^(−1) mod n.
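A minimal sketch of this decryption, matching the formula above and reusing keygen() and encrypt() from the earlier sketches (here sk carries (λ, μ), as noted before):

```python
def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    n2 = n * n
    return (L(pow(c, lam, n2), n) * mu) % n   # m = L(c^λ mod n^2) · μ mod n

# quick sanity check of the whole Paillier round trip
pk, sk = keygen()
assert decrypt(pk, sk, encrypt(pk, 42)) == 42
```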
From the above implementation process it can be seen that the secure outsourced machine learning method based on ε-differential privacy of the present invention combines cloud computing technology with differential privacy and uses the homomorphic encryption property to outsource the noise-adding operation to the cloud server. Through the interaction between the cloud server and the machine learning model provider, the data can participate in machine learning safely: the user's privacy is protected without any privacy leakage, and the operational efficiency is greatly improved compared with traditional schemes. The method is highly applicable and can be widely used in the various scenarios that require privacy-preserving machine learning.
In the present embodiment, the user is a hospital. As the data provider, the hospital wishes to apply machine learning algorithms to patients' personal cases, i.e. private data, to realize intelligent diagnosis. It must both provide useful data for machine learning and protect the patients' privacy. The other two entities, the cloud server and the machine learning model provider, are not fully trusted; using the method provided by the present invention, machine learning can be realized without revealing privacy to the other entities.
First, the hospital, as the user, i.e. the data provider, generates a key pair (sk, pk) with the Paillier additively homomorphic encryption algorithm. The hospital holds the public key pk and uses it to encrypt the sensitive data M = (m_1, m_2, ..., m_n) used for training the machine learning model into ciphertexts, while the private key sk is entrusted to the machine learning model provider.
The hospital then uploads the encrypted ciphertexts to the cloud server. The cloud server receives the uploaded ciphertexts, obtains the query function F from the machine learning model provider DE, computes a noise value η satisfying the ε-differential privacy standard using the Laplace mechanism, adds it to the generated ciphertexts with the Add(||M||, ||η||) algorithm, and sends the noised data ||F(M)+η||_pk to the machine learning model provider.
The machine learning provider decrypts with the private key sk to obtain the noisy data and can then carry out machine learning tasks according to the hospital's demand, such as the diagnosis of a specified disease. A toy end-to-end sketch of this flow follows.
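The sketch below ties the earlier pieces together for a hypothetical sum query F(M) = Σ m_i, which the CSP can evaluate homomorphically; all values and parameters are illustrative and are not taken from the patent.

```python
pk, sk = keygen()                            # S1: negotiated key pair
n = pk[0]

M = [3, 1, 4, 1, 5]                          # hospital's sensitive values (toy)
C = [encrypt(pk, m) for m in M]              # S2: hospital -> CSP

# S3 (CSP): evaluate F(M) = sum(M) under encryption, then add Laplace noise.
c_result = C[0]
for c in C[1:]:
    c_result = add_encrypted(pk, c_result, c)
eta = round(laplace_noise(delta_f=1.0, epsilon=0.5))        # placeholder ΔF
c_noisy = add_encrypted(pk, c_result, encrypt(pk, eta % n)) # negative η wraps mod n

# S4 (DE): decrypt to obtain F(M) + η without ever seeing M.
val = decrypt(pk, sk, c_noisy)
if val > n // 2:                             # map wrapped values back to signed range
    val -= n
print(val)                                   # ≈ sum(M) = 14, up to the noise η
```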
As shown in Fig. 2, an experimental comparison between the original, un-noised data and data noised by this method shows that in the machine learning process the added noise has very little influence on classification tasks performed with different machine learning algorithms. The method therefore does not appreciably affect the accuracy of machine learning while guaranteeing security and privacy, and the experiments show that the method is fully effective.
The embodiments described above are only some, not all, of the embodiments of this patent. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Claims (5)
1. A secure outsourced machine learning method based on differential privacy, characterized by comprising the steps of:
S1. the data provider chooses the Paillier encryption algorithm with the additive homomorphic property and negotiates with the machine learning model provider DE to generate a key pair (sk, pk), wherein the data provider holds the public key pk and the machine learning model provider DE holds the private key sk;
S2. before uploading, the data provider preprocesses its data according to attribute differentiation, then encrypts the preprocessed data M = (m_1, m_2, ..., m_n) with the public key pk and sends the ciphertexts ||m_1||_pk, ||m_2||_pk, ..., ||m_n||_pk to the cloud server CSP;
S3. the cloud server CSP receives the uploaded ciphertexts, obtains the query function F from the machine learning model provider DE, computes a noise value η satisfying the ε-differential privacy standard, adds it to the ciphertexts of step S2 with the Add(||M||, ||η||) algorithm, and sends the noised data ||F(M)+η||_pk to the machine learning model provider DE;
S4. the machine learning model provider DE receives the noised data and, after the decryption Dec(||m_1||_pk, ||m_2||_pk, ..., ||m_n||_pk, sk), obtains the noisy result (F(M)+η); taking it as input, DE analyzes the noisy data with machine learning algorithms to complete the machine learning task.
2. The secure outsourced machine learning method based on differential privacy according to claim 1, characterized in that in step S3 the cloud server CSP computes the noise value η satisfying the ε-differential privacy standard with the Laplace mechanism as follows:
S31. the cloud server CSP first interacts with the machine learning model provider DE to obtain the query function F and computes its sensitivity ΔF;
S32. the scale b = ΔF/ε is computed from the configured privacy budget parameter ε;
S33. the noise η is generated.
3. The secure outsourced machine learning method based on differential privacy according to claim 1, characterized in that the generation process of the key pair (sk, pk) is: choose large primes p and q at random, let n = pq and λ(n) = lcm(p−1, q−1), and define the function L(u) = (u−1)/n; then choose g ∈ (Z/n^2 Z)* at random such that gcd(L(g^λ(n) mod n^2), n) = 1, where lcm and gcd denote the least common multiple and the greatest common divisor respectively; the public key is pk = (n, g) and the private key is sk = (p, λ).
4. The secure outsourced machine learning method based on differential privacy according to claim 3, characterized in that the encryption process in step S2 is: for a plaintext M ∈ Z_n to be encrypted, select a random number r < n, then compute the ciphertext C = g^M · r^n mod n^2.
5. The secure outsourced machine learning method based on differential privacy according to claim 3 or 4, characterized in that the decryption process in step S4 is: compute F(M)+η = L(C^λ(n) mod n^2) · μ mod n, where C is the ciphertext and μ = (L(g^λ(n) mod n^2))^(−1) mod n.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910302716.XA CN110059501B (en) | 2019-04-16 | 2019-04-16 | Safe outsourcing machine learning method based on differential privacy |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110059501A true CN110059501A (en) | 2019-07-26 |
CN110059501B CN110059501B (en) | 2021-02-02 |
Family
ID=67319169
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910302716.XA Active CN110059501B (en) | 2019-04-16 | 2019-04-16 | Safe outsourcing machine learning method based on differential privacy |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110059501B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20240194347A1 (en) * | 2021-04-08 | 2024-06-13 | Biotronik Se & Co. Kg | Private AI Training |
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108259158A (en) * | 2018-01-11 | 2018-07-06 | Xidian University | Efficient and privacy-preserving single-layer perceptron learning method for cloud computing environments |
US20190227980A1 (en) * | 2018-01-22 | 2019-07-25 | Google Llc | Training User-Level Differentially Private Machine-Learned Models |
CN108521326A (en) * | 2018-04-10 | 2018-09-11 | University of Electronic Science and Technology of China | Privacy-preserving linear SVM model training algorithm based on vector homomorphic encryption |
CN108717514A (en) * | 2018-05-21 | 2018-10-30 | Renmin University of China | Data privacy protection method and system for machine learning |
CN108959958A (en) * | 2018-06-14 | 2018-12-07 | PLA Strategic Support Force Aerospace Engineering University | Privacy protection method and system for correlated big data |
CN109284626A (en) * | 2018-09-07 | 2019-01-29 | Central South University | Random forest algorithm for differential privacy protection |
CN109376549A (en) * | 2018-10-25 | 2019-02-22 | Guangzhou Power Exchange Center Co., Ltd. | Electricity-transaction big data publishing method based on differential privacy protection |
CN109327304A (en) * | 2018-12-18 | 2019-02-12 | Wuhan University | Lightweight homomorphic encryption method realizing privacy protection in cloud computing |
CN109992979A (en) * | 2019-03-15 | 2019-07-09 | Jinan University | Ridge regression training method, computing device, and medium |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021120229A1 (en) * | 2019-12-20 | 2021-06-24 | Shenzhen Qianhai WeBank Co., Ltd. | Data processing method, apparatus and system |
CN111260081A (en) * | 2020-02-14 | 2020-06-09 | Guangzhou University | Non-interactive privacy-preserving multi-party machine learning method |
CN111260081B (en) * | 2020-02-14 | 2023-03-14 | Guangzhou University | Non-interactive privacy-preserving multi-party machine learning method |
CN111275202A (en) * | 2020-02-20 | 2020-06-12 | University of Jinan | Machine learning prediction method and system for data privacy protection |
CN111275202B (en) * | 2020-02-20 | 2023-08-11 | University of Jinan | Machine learning prediction method and system for data privacy protection |
CN111526148A (en) * | 2020-04-26 | 2020-08-11 | Sun Yat-sen University | System and method for securely denoising encrypted audio in a cloud computing environment |
WO2021260451A1 (en) * | 2020-06-22 | 2021-12-30 | International Business Machines Corporation | Depth-constrained knowledge distillation for inference on encrypted data |
US11599806B2 (en) | 2020-06-22 | 2023-03-07 | International Business Machines Corporation | Depth-constrained knowledge distillation for inference on encrypted data |
GB2611686A (en) * | 2020-06-22 | 2023-04-12 | Ibm | Depth-constrained knowledge distillation for inference on encrypted data |
CN113553610A (en) * | 2021-09-22 | 2021-10-26 | Harbin Institute of Technology (Shenzhen) (Harbin Institute of Technology Shenzhen Institute of Science and Technology Innovation) | Multi-party privacy-preserving machine learning method based on homomorphic encryption and trusted hardware |
CN116248260A (en) * | 2022-11-29 | 2023-06-09 | The 15th Research Institute of China Electronics Technology Group Corporation | Quantum-secure outsourced machine learning method and system |
CN116248260B (en) * | 2022-11-29 | 2024-09-20 | The 15th Research Institute of China Electronics Technology Group Corporation | Quantum-secure outsourced machine learning method and system |
Also Published As
Publication number | Publication date |
---|---|
CN110059501B (en) | 2021-02-02 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |