CN113191396B - Modeling method and device based on data privacy security protection - Google Patents

Modeling method and device based on data privacy security protection

Info

Publication number
CN113191396B
CN113191396B (application CN202110381449.7A)
Authority
CN
China
Prior art keywords
data
model
training set
encrypted
key matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110381449.7A
Other languages
Chinese (zh)
Other versions
CN113191396A (en)
Inventor
袁烨
华丰
孙川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology
Priority to CN202110381449.7A
Publication of CN113191396A
Application granted
Publication of CN113191396B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 - Protecting data
    • G06F21/602 - Providing cryptographic facilities or services
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 - Protecting data
    • G06F21/62 - Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218 - Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245 - Protecting personal data, e.g. for financial or medical purposes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 - Machine learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Bioethics (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Computer Hardware Design (AREA)
  • Medical Informatics (AREA)
  • Computer Security & Cryptography (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Storage Device Security (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Life Sciences & Earth Sciences (AREA)

Abstract

The invention discloses a modeling method and device based on data privacy security protection, belonging to the technical field of data privacy security protection. The method comprises the following steps. S1: the data owner encrypts the training set using a key matrix, defined based on the characteristics of the training set, to obtain an encrypted training set, and sends the encrypted training set to the model and computing power provider. S2: the model and computing power provider builds a linear regression model based on the encrypted training set and trains it to obtain an encrypted data model. S3: the data owner receives the encrypted data model fed back by the model and computing power provider, and applies the key matrix to decrypt it into the target data model. The invention enables safe, efficient, and simple data-model building, thereby solving the technical problems of existing privacy-preserving computing frameworks, such as security that is difficult to guarantee and complicated encryption and decryption processes.

Description

Modeling method and device based on data privacy security protection
Technical Field
The invention belongs to the technical field of data privacy security protection, and particularly relates to a modeling method and device based on data privacy security protection.
Background
In recent years, machine learning has been widely applied in real-world scenarios in daily life and production, and the construction and training of these models rely on large amounts of data collected from multiple sources. At the same time, the collection and aggregation of large amounts of data raises concerns about data security, privacy leakage, and related problems. Data privacy protection technology has been proposed and developed to help users and enterprises better safeguard data privacy and to promote safer sharing and exchange of data and models. Data privacy security protection technology is therefore of great significance in practical applications.
The data privacy protection techniques in current use are encryption and perturbation: perturbation typically uses differential privacy, while encryption typically uses homomorphic encryption or secure multi-party computation. Differential privacy is usually realized by adding noise to the return value of a query function; the amount of noise added affects the availability of the data, so the trained algorithm model is not accurate enough and learns the data features with a bias. Homomorphic encryption allows a user to perform algebraic operations of a specific form directly on ciphertext and obtain a still-encrypted result, without needing the secret key to decrypt to plaintext before computing; however, the slow processing speed on encrypted data and the storage overhead of the encryption scheme increase the cost of data processing and learning. Secure multi-party computation lets users complete collaborative computation without gathering the data in one place, protecting the privacy of each party's original data; however, compared with aggregated multi-party data, the local data sets are not independent and identically distributed, the mapping relation of the data features is incomplete, and local model learning is biased. It also places high demands on every data party for model construction, training, and computation.
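For instance, the Laplace mechanism at the heart of most differential privacy deployments can be sketched in a few lines. This is a minimal illustration of the noise-utility trade-off described above, assuming NumPy; the query value and parameters are hypothetical, not from the patent:

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Perturb a query's return value with Laplace noise of scale sensitivity/epsilon."""
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

rng = np.random.default_rng(0)
true_mean = 41.7  # hypothetical return value of a mean query on a private dataset
print(laplace_mechanism(true_mean, sensitivity=1.0, epsilon=0.1, rng=rng))   # heavy noise, low utility
print(laplace_mechanism(true_mean, sensitivity=1.0, epsilon=10.0, rng=rng))  # light noise, high utility
```

Smaller epsilon gives stronger privacy but noisier answers, which is exactly why the trained model's accuracy degrades.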
In general, existing privacy-preserving mechanism-discovery computing frameworks suffer from problems such as data privacy that is difficult to guarantee once the data leaves its owner, limited availability of the encrypted data, complicated encryption and decryption processes, and high unilateral computing-power requirements.
Disclosure of Invention
Aiming at the defects or improvement requirements of the prior art, the invention provides a modeling method and device based on data privacy security protection, with the goal of safe, efficient, and simple data-model building, thereby solving the technical problems of existing privacy-preserving computing frameworks, such as security that is difficult to guarantee and complicated encryption and decryption processes.
To achieve the above object, according to an aspect of the present invention, there is provided a modeling method based on data privacy security protection, including:
s1: the data owner encrypts the training set using the key matrix to obtain an encrypted training set, and sends the encrypted training set to the model and computing power provider;
s2: the model and computing power provider builds a linear regression model based on the encrypted training set, and trains the linear regression model to obtain an encrypted data model;
s3: the data owner receives the encrypted data model fed back by the model and computing power provider, and applies the key matrix to decrypt the encrypted data model into a target data model.
In one embodiment, before S1, the method further includes:
when the dimension of the training set is m × n, the key matrix defined by the data owner includes an orthogonal matrix P of dimension m × m and an orthogonal matrix Q of dimension n × n;
where $PP^T = I$, $QQ^T = I$, m is the number of samples contained in the training set, and n is the number of sample features contained in each sample.
In one embodiment, S1 includes:
S11: the data owner writes the training set as $A = \{(x_i, y_i), i \in R\}$ and stacks the samples as $X = [x_1, x_2, \ldots, x_m]^T$ and $y = [y_1, y_2, \ldots, y_m]^T$. The training set is then encrypted using the key matrices as $\tilde{X} = PXQ$, $\tilde{y} = Py$, giving the encrypted training set $\tilde{A} = (\tilde{X}, \tilde{y})$;
S12: the encrypted training set $\tilde{A}$ is sent to the model and computing power provider.
In one embodiment, S2 includes:
S21: for the received encrypted training set, the model and computing power provider uses the least-squares method to obtain the objective function of the linear regression model, $\tilde{\theta} = \arg\min_{\theta} \|\tilde{X}\theta - \tilde{y}\|_2^2$, where $\arg\min_{\theta}$ denotes the value of $\theta$ at which $\|\tilde{X}\theta - \tilde{y}\|_2^2$ attains its minimum;
S22: the linear regression model is trained using this objective function, and the encrypted data model $\tilde{\theta}$ obtained from training is fed back to the data owner.
In one embodiment, S3 includes:
the data owner receives the encrypted data model $\tilde{\theta}$ and uses the key matrix to apply the inverse decryption transform $\theta = Q\tilde{\theta}$, obtaining the target data model $\theta$ that describes the characteristics of the real data set.
in one embodiment, before S1, the method further includes: S0: the data owner divides a data set into the training set and a test set;
after S3, the method further includes: S4: the data owner feeds the test set into the target data model and visualizes the results to test the performance of the target data model.
According to another aspect of the present invention, there is provided a modeling apparatus based on data privacy security protection, including:
the data owner, used for encrypting the training set with a key matrix to obtain an encrypted training set, the key matrix being defined based on the characteristics of the training set; the data owner is also used for applying the key matrix to decrypt the received encrypted data model into a target data model;
and the model and computing power provider, communicatively connected to the data owner, used for receiving the encrypted training set, constructing a linear regression model based on the encrypted training set, training the linear regression model to obtain the encrypted data model, and sending the encrypted data model to the data owner.
In general, compared with the prior art, the above technical solution contemplated by the present invention can achieve the following beneficial effects:
(1) The modeling method provided by the invention effectively protects the data privacy of all data parties, and addresses the difficulty of guaranteeing privacy once critical data leaves its owner. It helps the data owner safely exchange data and models with a collaborating party, and reduces the computing power and domain expertise the data owner would need for unilateral modeling. The invention enables safe, efficient, and simple data-model building, thereby solving the technical problems of existing privacy-preserving computing frameworks, such as security that is difficult to guarantee and complicated encryption and decryption processes.
(2) The invention simplifies the steps of data encryption and decryption: the model and computing power provider only needs to train the linear regression model on the encrypted data and feed it back to the data owner, without decrypting the encrypted data set before training. The provider therefore never learns the real data and cannot obtain a model describing the real data from which to infer the mechanism relations within it, ensuring the data owner's privacy.
(3) The key-matrix encryption method provided by the invention improves computational efficiency, preserves the availability of the data, and is not constrained by the storage overhead of the encryption scheme, making it well suited to lossless mechanism discovery and linear model training in big-data scenarios.
Drawings
FIG. 1 is a flow chart of a modeling method based on data privacy security protection of the present invention;
FIG. 2 is a schematic diagram illustrating a modeling method based on data privacy security protection according to the present invention;
FIG. 3a is a verification of the prediction results of the linear regression model trained with the method of the present invention on the Boston house-price dataset;
FIG. 3b is a verification of the prediction results after adding an L2 regularization term to the linear regression model, based on the Boston house-price dataset;
FIG. 4a is a verification of the prediction results of the linear regression model trained on the diabetes disease progression index dataset;
FIG. 4b is a verification of the prediction results after adding an L2 regularization term to the linear regression model, based on the diabetes disease progression index dataset.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
As shown in fig. 1 and fig. 2, the present invention provides a modeling method based on data privacy security protection, including:
s1: the data owner encrypts the training set using the key matrix to obtain an encrypted training set, and sends it to the model and computing power provider; the key matrix is defined based on the characteristics of the training set;
s2: the model and computing power provider builds a linear regression model based on the encrypted training set, and trains the linear regression model to obtain an encrypted data model;
s3: the data owner receives the encrypted data model fed back by the model and computing power provider, and applies the key matrix to decrypt the encrypted data model into the target data model.
Specifically, first, based on the characteristics of the training set, the data owner autonomously defines the key matrices P and Q according to the conditions set out in this method; then the training-set samples are encrypted according to the encryption method of the invention; next, the encrypted training set is securely sent to the model and computing power provider, which builds and trains a linear regression model on that training set using the least-squares method; finally, the model and computing power provider feeds the trained model back to the data owner, who applies the key matrices to decrypt and transform it into the target data model.
In one embodiment, before S1, the modeling method based on data privacy security protection further includes: S0: the data owner divides a data set into a training set and a test set; after S3, the method further includes: S4: the data owner feeds the test set into the target data model and visualizes the results to test the model's performance. The split between training set and test set is defined autonomously by the data owner.
In one embodiment, before S1, the modeling method based on data privacy security protection further includes: when the dimension of the training set is m × n, the key matrix defined by the data owner includes an orthogonal matrix P of dimension m × m and an orthogonal matrix Q of dimension n × n, where $PP^T = I$, $QQ^T = I$, m is the number of samples contained in the training set, and n is the number of sample features contained in each sample.
Specifically, the conditions on the key matrices P and Q are based on the characteristics of the divided training set and are defined as follows: the training set is denoted $A = \{(x_i, y_i), i \in R\}$ with dimension m × n, where m is the number of samples contained in the training set and n is the number of sample features. P and Q are required to be orthogonal matrices, $PP^T = I$ and $QQ^T = I$, with P of dimension m × m and Q of dimension n × n.
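As a minimal sketch of this key-matrix setup (assuming NumPy; the function name and sizes are illustrative, not from the patent), random orthogonal matrices satisfying $PP^T = I$ and $QQ^T = I$ can be drawn by QR-factorizing Gaussian matrices:

```python
import numpy as np

def generate_key_matrix(dim, rng):
    """Draw a random orthogonal key matrix by QR-factorizing a Gaussian matrix."""
    q, r = np.linalg.qr(rng.standard_normal((dim, dim)))
    return q * np.sign(np.diag(r))  # sign fix makes the draw uniform over orthogonal matrices

rng = np.random.default_rng(42)
m, n = 100, 10                      # number of samples, number of features (illustrative)
P = generate_key_matrix(m, rng)     # m x m key matrix, P P^T = I
Q = generate_key_matrix(n, rng)     # n x n key matrix, Q Q^T = I
assert np.allclose(P @ P.T, np.eye(m)) and np.allclose(Q @ Q.T, np.eye(n))
```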
In one embodiment, S1 includes: S11: the data owner encrypts the training set with the key matrices as $\tilde{X} = PXQ$, $\tilde{y} = Py$, obtaining the encrypted training set $\tilde{A} = (\tilde{X}, \tilde{y})$; S12: the encrypted training set $\tilde{A}$ is sent to the model and computing power provider.
Specifically, the encryption of the training-set samples is defined as follows: for the training set $A = \{(x_i, y_i), i \in R\}$, with $X = [x_1, x_2, \ldots, x_m]^T$ and $y = [y_1, y_2, \ldots, y_m]^T$, the encryption algorithm is expressed as $f(X, y) = (PXQ, Py)$, which yields the encrypted training set $\tilde{A} = (PXQ, Py)$.
In one embodiment, S2 includes: S21: for the received encrypted training set, the model and computing power provider uses the least-squares objective function of the linear regression model, $\tilde{\theta} = \arg\min_{\theta} \|\tilde{X}\theta - \tilde{y}\|_2^2$, where $\arg\min_{\theta}$ denotes the value of $\theta$ at which $\|\tilde{X}\theta - \tilde{y}\|_2^2$ attains its minimum; S22: the linear regression model is trained using this objective function, and the encrypted data model $\tilde{\theta}$ obtained from training is fed back to the data owner.
Specifically, the model and computing power provider builds and trains a linear regression model on the encrypted training set based on the least-squares method: the model is trained with the objective function $\min_{\theta} \|\tilde{X}\theta - \tilde{y}\|_2^2$, the training yields the encrypted data model $\tilde{\theta}$, and $\tilde{\theta}$ is securely fed back to the data owner.
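On the provider side, training reduces to an ordinary least-squares solve on the encrypted pair; a sketch using NumPy's lstsq (equivalently, the closed form (X'X)^-1 X'y applied to the encrypted data):

```python
def train_encrypted_model(X_tilde, y_tilde):
    """Provider-side least-squares fit on the encrypted training set."""
    theta_tilde, *_ = np.linalg.lstsq(X_tilde, y_tilde, rcond=None)
    return theta_tilde

theta_tilde = train_encrypted_model(X_tilde, y_tilde)  # encrypted data model, fed back to the owner
```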
In one embodiment, S3 includes: the data owner receives the encrypted data model $\tilde{\theta}$ and uses the key matrix to apply the inverse decryption transform, obtaining the target data model $\theta = Q\tilde{\theta}$ that describes the characteristics of the real data set.
Specifically, the linear regression model equation is $y = X\theta$. Multiplying both sides of the equation by the key matrix P gives $Py = PX\theta$. The key matrix Q is orthogonal, so multiplying Q by its inverse yields the identity matrix and leaves the equation unchanged: $Py = PXQQ^T\theta$. By the encryption method applied to the training data, $\tilde{X} = PXQ$ and $\tilde{y} = Py$; substituting these into the above equation gives $\tilde{y} = \tilde{X}Q^T\theta$, from which it can be deduced that $\tilde{\theta} = Q^T\theta$, i.e. $\theta = Q\tilde{\theta}$.
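Decryption is then a single matrix-vector product, and because P is orthogonal it preserves the least-squares objective, so the decrypted model should coincide with one trained directly on the plaintext. Continuing the sketch, this checks the lossless claim:

```python
theta = Q @ theta_tilde  # owner-side decryption: theta = Q theta_tilde

# Lossless check: the decrypted model matches least squares on the plaintext data
theta_plain, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(theta, theta_plain)
```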
in order to verify the method of the invention, experiments were carried out on public standard data sets: industrial steam quantity prediction and the diabetes disease progression index. In Fig. 3a and Fig. 4a, the left panel shows the prediction results on the test set of the decrypted model obtained with the method, and the right panel shows the prediction results on the test set of a linear model trained on the real data set. In Fig. 3b and Fig. 4b, the left panel shows the prediction results on the test set of the decrypted model with an L2 regularization term added, and the right panel shows the prediction results on the test set of a linear model with the L2 regularization term trained on the real data set.
The final prediction results of the comparative experiments are shown in Tables 1 and 2:

Table 1 - Industrial steam quantity prediction

  Method                                            | Training (five-fold mean ± std) | Testing
  Least squares, unencrypted                        | 0.24 ± 0.01                     | 0.27 ± 0.02
  Least squares, encrypted                          | 0.25 ± 0.01                     | 0.27 ± 0.02
  Least squares with L2 regularization, unencrypted | 0.24 ± 0.01                     | 0.27 ± 0.02
  Least squares with L2 regularization, encrypted   | 0.25 ± 0.01                     | 0.27 ± 0.02

Table 2 - Diabetes disease progression index prediction

  Method                                            | Training (five-fold mean ± std) | Testing
  Least squares, unencrypted                        | 43.15 ± 0.37                    | 44.29 ± 1.44
  Least squares, encrypted                          | 42.89 ± 0.42                    | 44.29 ± 1.44
  Least squares with L2 regularization, unencrypted | 48.43 ± 0.36                    | 48.81 ± 2.22
  Least squares with L2 regularization, encrypted   | 46.17 ± 0.34                    | 48.81 ± 2.22
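The diabetes rows can be reproduced in spirit (not the exact values, and assuming the reported metric is an RMSE-like error, which the patent does not state) with scikit-learn's bundled diabetes dataset and a five-fold split:

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.model_selection import KFold

X, y = load_diabetes(return_X_y=True)
X = np.c_[np.ones(len(X)), X]  # prepend an intercept column

def rmse(pred, target):
    return np.sqrt(np.mean((pred - target) ** 2))

fold_rmse = []
for train_idx, _ in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    theta, *_ = np.linalg.lstsq(X[train_idx], y[train_idx], rcond=None)
    fold_rmse.append(rmse(X[train_idx] @ theta, y[train_idx]))

print(f"training error: {np.mean(fold_rmse):.2f} ± {np.std(fold_rmse):.2f}")
```

Because the encryption is orthogonal, running the same loop on (PXQ, Py) yields the same errors up to floating-point noise, which is consistent with the matched encrypted/unencrypted rows in the tables.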
The lossless mechanism-discovery method based on data privacy security protection provided by the invention achieves privacy-preserving encryption of data without affecting data availability, simplifies the otherwise cumbersome encryption and decryption process, realizes training on encrypted data with a linear regression model and lossless decryption of the resulting model, and solves the difficulty of training and computing on data once it has been encrypted. The invention offers a reference scheme for practical scenarios in which the security of critical data is hard to guarantee once it leaves its owner: with this method, the data owner can efficiently and simply achieve one-way transparent privacy protection of the data, safely exchange data and models with the model and computing power provider, and obtain a lossless, faithful mechanism model.
The invention also provides a modeling device based on data privacy security protection, comprising a data owner and a model and computing power provider. The data owner is used for encrypting the training set with a key matrix to obtain an encrypted training set, the key matrix being defined based on the characteristics of the training set, and for applying the key matrix to decrypt the received encrypted data model into the target data model. The model and computing power provider is communicatively connected to the data owner and is used for receiving the encrypted training set, constructing a linear regression model based on it, training the model to obtain the encrypted data model, and sending the encrypted data model to the data owner.
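Tying the two roles of the device together, here is a compact sketch of the message flow (class names illustrative, reusing generate_key_matrix from the sketch above); only the encrypted pair and the encrypted model cross the boundary between the parties:

```python
class DataOwner:
    """Holds the plaintext data and the key matrices; only encrypted artifacts leave it."""
    def __init__(self, X, y, rng):
        m, n = X.shape
        self.P = generate_key_matrix(m, rng)
        self.Q = generate_key_matrix(n, rng)
        self.X, self.y = X, y

    def encrypted_training_set(self):
        return self.P @ self.X @ self.Q, self.P @ self.y

    def decrypt(self, theta_tilde):
        return self.Q @ theta_tilde  # target data model

class ModelProvider:
    """Sees only the encrypted training set and returns the encrypted model."""
    @staticmethod
    def fit(X_tilde, y_tilde):
        theta_tilde, *_ = np.linalg.lstsq(X_tilde, y_tilde, rcond=None)
        return theta_tilde

owner = DataOwner(X, y, np.random.default_rng(1))
theta = owner.decrypt(ModelProvider.fit(*owner.encrypted_training_set()))
```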
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (5)

1. A modeling method based on data privacy security protection, characterized by comprising the following steps:
s1: the data owner encrypts the training set using the key matrix to obtain an encrypted training set, and sends the encrypted training set to the model and computing power provider; the key matrix is defined based on the characteristics of the training set;
s2: the model and computing power provider builds a linear regression model based on the encrypted training set, and trains the linear regression model to obtain an encrypted data model;
s3: the data owner receives the encrypted data model fed back by the model and computing power provider, and applies the key matrix to decrypt the encrypted data model into a target data model;
before S1, the method further comprises: when the dimension of the training set is m × n, the key matrix defined by the data owner includes an orthogonal matrix P of dimension m × m and an orthogonal matrix Q of dimension n × n; $PP^T = I$, $QQ^T = I$, m is the number of samples contained in the training set, and n is the number of sample features contained in each sample;
S1 comprises: S11: the data owner writes the training set as $A = \{(x_i, y_i), i \in R\}$, sets $X = [x_1, x_2, \ldots, x_m]^T$ and $y = [y_1, y_2, \ldots, y_m]^T$, and encrypts the training set using the key matrices as $\tilde{X} = PXQ$, $\tilde{y} = Py$, obtaining the encrypted training set $\tilde{A} = (\tilde{X}, \tilde{y})$;
S12: the encrypted training set $\tilde{A}$ is sent to the model and computing power provider.
2. The modeling method based on data privacy security protection according to claim 1, wherein S2 comprises:
S21: for the received encrypted training set, the model and computing power provider uses the least-squares method to obtain the objective function of the linear regression model, $\tilde{\theta} = \arg\min_{\theta} \|\tilde{X}\theta - \tilde{y}\|_2^2$, where $\arg\min_{\theta}$ denotes the value of $\theta$ at which $\|\tilde{X}\theta - \tilde{y}\|_2^2$ attains its minimum;
S22: the linear regression model is trained using this objective function, and the encrypted data model $\tilde{\theta}$ obtained from training is fed back to the data owner.
3. The modeling method based on data privacy security protection according to claim 2, wherein S3 comprises:
the data owner receives the encrypted data model $\tilde{\theta}$ and uses the key matrix to apply the inverse decryption transform $\theta = Q\tilde{\theta}$, obtaining the target data model $\theta$ that describes the characteristics of the real data set.
4. The modeling method based on data privacy security protection according to any one of claims 1-3, wherein:
before S1, the method further comprises: S0: the data owner divides a data set into the training set and a test set;
after S3, the method further comprises: S4: the data owner feeds the test set into the target data model and visualizes the results to test the performance of the target data model.
5. A modeling device based on data privacy security protection, characterized in that the device executes the modeling method based on data privacy security protection of any one of claims 1-4, and comprises:
the data owner, used for encrypting the training set with a key matrix to obtain an encrypted training set, the key matrix being defined based on the characteristics of the training set, and for applying the key matrix to decrypt the received encrypted data model into a target data model;
and the model and computing power provider, communicatively connected to the data owner, used for receiving the encrypted training set, constructing a linear regression model based on the encrypted training set, training the linear regression model to obtain the encrypted data model, and sending the encrypted data model to the data owner.
CN202110381449.7A 2021-04-09 2021-04-09 Modeling method and device based on data privacy security protection Active CN113191396B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110381449.7A CN113191396B (en) 2021-04-09 2021-04-09 Modeling method and device based on data privacy security protection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110381449.7A CN113191396B (en) 2021-04-09 2021-04-09 Modeling method and device based on data privacy security protection

Publications (2)

Publication Number Publication Date
CN113191396A CN113191396A (en) 2021-07-30
CN113191396B (en) 2022-09-20

Family

ID=76975260

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110381449.7A Active CN113191396B (en) 2021-04-09 2021-04-09 Modeling method and device based on data privacy security protection

Country Status (1)

Country Link
CN (1) CN113191396B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117371558B (en) * 2023-12-04 2024-03-08 环球数科集团有限公司 System for executing machine learning in privacy protection environment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015057854A1 (en) * 2013-10-15 2015-04-23 University Of Florida Research Foundation, Inc. Privacy-preserving data collection, publication, and analysis
CN104836657B (en) * 2015-05-27 2018-01-26 华中科技大学 A kind of identity-based anonymity broadcast encryption method with efficient decryption features
EP3602422B1 (en) * 2017-03-22 2022-03-16 Visa International Service Association Privacy-preserving machine learning
CN112182649B (en) * 2020-09-22 2024-02-02 上海海洋大学 Data privacy protection system based on safe two-party calculation linear regression algorithm

Also Published As

Publication number Publication date
CN113191396A (en) 2021-07-30

Similar Documents

Publication Publication Date Title
CN110008717B (en) Decision tree classification service system and method supporting privacy protection
WO2022057631A1 (en) Data processing method and system based on node group, and device and medium
CN110011784B (en) KNN classification service system and method supporting privacy protection
Wang et al. Privacy-preserving pattern matching over encrypted genetic data in cloud computing
CN112182649A (en) Data privacy protection system based on safe two-party calculation linear regression algorithm
CN111275202A (en) Machine learning prediction method and system for data privacy protection
CN111162906B (en) Collaborative secret sharing method, device, system and medium based on vast transmission algorithm
CN104158880B (en) User-end cloud data sharing solution
CN113688999A (en) Training method of transverse federated xgboost decision tree
CN102710661B (en) Cloud storage and aggregation architecture and data storage and aggregation method by using same
CN104967693A (en) Document similarity calculation method facing cloud storage based on fully homomorphic password technology
Fan et al. PPMCK: Privacy-preserving multi-party computing for K-means clustering
Li et al. Homopai: A secure collaborative machine learning platform based on homomorphic encryption
CN109688143A (en) A kind of cluster data mining method towards secret protection in cloud environment
CN112052466A (en) Support vector machine user data prediction method based on multi-party secure computing protocol
CN113191396B (en) Modeling method and device based on data privacy security protection
DE112022003853T5 (en) PRIVACY-PRECIOUS COMPUTING WITH THIRD PARTY SERVICES
Yao et al. An efficient and robust system for vertically federated random forest
CN115664629A (en) Homomorphic encryption-based data privacy protection method for intelligent Internet of things platform
CN108282328A (en) A kind of ciphertext statistical method based on homomorphic cryptography
CN115630713A (en) Longitudinal federated learning method, device and medium under condition of different sample identifiers
US20220191178A1 (en) Method and system for secure information distribution based on group shared key
CN108880782B (en) Minimum value secret computing method under cloud computing platform
CN114697042A (en) Block chain-based Internet of things security data sharing proxy re-encryption method
CN117171779B (en) Data processing device based on intersection protection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant