CN110968893A - Privacy protection method for associated classified data sequence based on Pufferfish framework - Google Patents


Info

Publication number
CN110968893A
CN110968893A (application CN201911148569.1A)
Authority
CN
China
Prior art keywords
privacy
data
privacy protection
pufferfish
framework
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911148569.1A
Other languages
Chinese (zh)
Inventor
习芷铖
桑应朋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sun Yat Sen University
Original Assignee
Sun Yat Sen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sun Yat Sen University filed Critical Sun Yat Sen University
Priority to CN201911148569.1A priority Critical patent/CN110968893A/en
Publication of CN110968893A publication Critical patent/CN110968893A/en
Pending legal-status Critical Current

Classifications


    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218 Protecting access to data via a platform, e.g. using keys or access control rules, to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245 Protecting personal data, e.g. for financial or medical purposes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218 Protecting access to data via a platform, e.g. using keys or access control rules, to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6227 Protecting access to data via a platform, where protection concerns the structure of data, e.g. records, types, queries

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention relates to a privacy protection method for associated classified data sequences based on the Pufferfish framework. First, the Pufferfish framework is introduced to formulate a rigorous privacy protection definition; next, the two kinds of correlation between the data are described precisely with a multi-dimensional Markov chain model; finally, an achievable privacy protection mechanism is provided that adds appropriate noise to protect privacy. Based on the Pufferfish privacy protection framework, the invention formulates a privacy protection definition for the two-dimensional correlated data scenario, represents the two-dimensional correlation between the data with a multi-dimensional Markov chain model, and, combined with the privacy definition, provides an achievable noise-adding mechanism. This guarantees that the state of each individual at each moment is private data that an attacker cannot distinguish, while aggregate queries and analysis of the overall trend remain possible. The invention simultaneously considers two types of privacy protection, namely for the correlation between individuals and for the correlation inside each sequence, and can protect the private data of individuals while keeping the associated data sets usable.

Description

Privacy protection method for associated classified data sequence based on Pufferfish framework
Technical Field
The invention belongs to the field of privacy protection and information security, and particularly relates to a privacy protection method for an associated classified data sequence based on a Pufferfish framework.
Background
Although differential privacy is the most widely applied privacy definition at present, it cannot be applied when the data are correlated: the model assumes that the individuals in the data set are mutually independent, so directly applying differential privacy to a correlated-data scenario fails to meet the originally intended privacy definition and leads to privacy disclosure.
The Pufferfish framework can handle correlation between data because it uses a set D to represent all the background knowledge an attacker may possess, i.e., all possible probability distributions that could have generated the data set. Its drawback is the lack of a concrete achievable mechanism: since all possible distributions must be considered, the computational complexity is too high and it is difficult to enumerate all distributions in full. Some practical mechanisms exist for specific data sets, but they protect only the correlation between the attributes of a single sequence and are not suitable for the scenario addressed by the present invention.
Existing privacy protection methods for correlated data consider only one dimension of correlation, such as the correlation between individuals or between attributes. However, many real data sets consist of multiple related sequences, for example the time-series data of different people: each sequence is itself highly correlated, and the sequences of different people are also correlated with one another. Existing privacy methods for correlated data therefore cannot be applied to the scenario addressed by the present invention.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a privacy protection method for associated classified data sequences based on the Pufferfish framework, which solves the privacy protection problem of multiple associated classified sequences and remedies the shortcomings of existing schemes.
In order to solve the technical problem, the invention adopts the following technical scheme. In the privacy protection method for associated classified data sequences based on the Pufferfish framework, the Pufferfish framework is first introduced to formulate a rigorous privacy protection definition; the two kinds of correlation between the data are then described precisely with a multi-dimensional Markov chain model; finally, an achievable privacy protection mechanism is provided that adds appropriate noise to protect privacy. The method solves the privacy protection problem of multiple associated classified sequences and remedies the shortcomings of existing schemes: it is the first to consider both the correlation between individuals and the internal correlation of each individual data sequence in the privacy protection process. Based on the Pufferfish privacy protection framework, a privacy protection definition is formulated for the two-dimensional correlated data scenario, the two-dimensional correlation between the data is represented by a multi-dimensional Markov chain model, and, combined with the privacy definition, an achievable noise-adding mechanism is provided. This guarantees that the state of each individual at each moment is private data that an attacker cannot distinguish, while aggregate queries and analysis of the overall trend remain possible. The efficiency of the mechanism is also considered, and efficiency optimizations based on the underlying structure of the multi-dimensional Markov chain are explored. Because the method simultaneously provides both types of privacy protection, namely for the correlation between individuals and for the correlation inside each sequence, it can keep the associated data set usable while protecting the private data of individuals, which is of great practical significance.
Further, the Pufferfish framework comprises three parts. Secret set S: the set of sensitive information that needs to be protected, i.e., S represents the set of private information related to personal private data. Secret pairs S_pairs: the set of discrimination pairs (s_i, s_j) ∈ S × S, which expresses how the private information is to be protected, i.e., it must be ensured that an attacker cannot distinguish the members s_i and s_j of any discrimination pair. D: the set of all probability distributions that could have generated the data set, representing how much background knowledge an attacker possesses; each θ ∈ D represents one possible distribution generating the data set, so the correlation between the data can be captured in D.
Further, the Pufferfish framework is defined as follows:

ε-Pufferfish(S, S_pairs, D) privacy: given S, S_pairs, D, and a privacy budget ε, for every output w, every discrimination pair (s_i, s_j) ∈ S_pairs, and every θ ∈ D with P(s_i | θ) ≠ 0 and P(s_j | θ) ≠ 0, the privacy protection mechanism M satisfies ε-Pufferfish(S, S_pairs, D) privacy if

e^(-ε) ≤ p_{X,M}(M(X) = w | s_i, θ) / p_{X,M}(M(X) = w | s_j, θ) ≤ e^(ε)

where p_{X,M}(M(X) = w | s_i, θ) and p_{X,M}(M(X) = w | s_j, θ) denote the conditional probabilities that, under the known probability distribution θ and with the discrimination pair member s_i (respectively s_j) holding, the query result obtained after the data are processed by the privacy protection mechanism M equals w.

The final noisy query result must satisfy this privacy definition, in which ε expresses the degree of privacy protection: the smaller ε is, the stronger the protection. The three parts of Pufferfish, S, S_pairs and D, are defined according to the specific data.
Further, describing the two kinds of correlation between the data precisely with the multi-dimensional Markov chain model comprises the following steps:

Assume the data set contains s sequences; each sequence is itself highly correlated, and the sequences are also interdependent. Let x_n^(k) denote the state probability distribution of the k-th sequence at time n. The state distribution of the j-th sequence at time n+1 is related to the state distributions of all s sequences at time n and is unrelated to the states before time n, so the multi-dimensional Markov chain satisfies

x_{n+1}^(j) = Σ_{k=1}^{s} λ_{jk} P^{(jk)} x_n^(k),  j = 1, 2, ..., s,

where λ_{jk} ≥ 0 and Σ_{k=1}^{s} λ_{jk} = 1 for 1 ≤ j, k ≤ s; P^{(jk)} is the inter-column transition probability matrix and λ_{jk} is the inter-column weight.

This relation states that the state probability distribution of the j-th sequence at time n+1 is a weighted average of the products of the transition probability matrices with the states at the previous moment, and it can be written in matrix form as

X_{n+1} = Q X_n, with X_n = (x_n^(1), ..., x_n^(s))^T and Q the block matrix whose (j, k) block is λ_{jk} P^{(jk)}.

By specifying the parameters, namely the initial probability distribution X_1, the inter-column transition probability matrices P^{(jk)} and the inter-column weights λ_{jk}, the set D of possible probability distributions generating the data set can be defined, while the most likely distribution is estimated using the steady-state distribution properties of the Markov chain.
Further, adding appropriate noise with the achievable privacy protection mechanism comprises the following steps:

The distance between two distributions is first measured by the earth mover's distance (EMD), which represents the minimum amount of probability mass that must be moved to transform one distribution into the other and is defined as

W(μ, ν) = inf_{γ ∈ Γ(μ, ν)} E_{(x, y) ~ γ} [ |x - y| ],

where γ ∈ Γ(μ, ν) ranges over the joint probability distributions with marginals μ and ν.

In the mechanism of the invention, the multi-dimensional Markov chain is first used to measure the correlation between the variables precisely and obtain the possible probability distributions; the EMD is then used to measure the distance between the two conditional distributions P(F(X) | s_i, θ) and P(F(X) | s_j, θ); finally, Laplace noise proportional to Distance(P(F(X) | s_i, θ), P(F(X) | s_j, θ)) is added to the result of the query F.
Compared with the prior art, the beneficial effects are as follows: the privacy protection method for associated classified data sequences based on the Pufferfish framework is the first to consider data sets with two kinds of correlation, namely the correlation between individuals and the correlation inside each sequence; the invention adopts the Pufferfish framework as its privacy definition and provides a mechanism that is applicable whenever the correlation in the data set can be described by a multi-dimensional Markov chain; the proposed method protects the privacy of individuals while preserving the usefulness of the data set and can be applied to associated classified sequences.
Drawings
FIG. 1 is a schematic diagram of the Pufferfish privacy framework of the present invention, in which the ratio of the two conditional probabilities is bounded by e^ε, where ε is the privacy budget.
FIG. 2 is a diagram of a plurality of sorted associated data sequences in the present invention.
Fig. 3 is a schematic diagram of the privacy preserving mechanism of the present invention.
Detailed Description
The drawings are for illustration purposes only and are not to be construed as limiting the invention; for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product; it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted. The positional relationships depicted in the drawings are for illustrative purposes only and are not to be construed as limiting the invention.
Example 1:
a privacy protection method for associated classified data sequences based on a Pufferfish frame can be used for privacy protection of a plurality of associated classified sequences, the technical scheme is divided into three parts, firstly, the Pufferfish frame is introduced to make a strict privacy protection definition, then, a multi-dimensional Markov chain model is utilized to accurately describe two kinds of association between data, and finally, an achievable privacy protection mechanism is provided to add proper noise protection privacy.
A first part: formulating a privacy protection definition (the Pufferfish privacy protection framework), as shown in FIG. 1.
The Pufferfish privacy protection framework, proposed in 2014 as a generalization of differential privacy, can adapt to correlated data and allows the content of privacy protection to be customized as needed. The framework consists of three parts. Secret set S: the set of sensitive information that needs to be protected, i.e., S represents the set of private information related to personal private data, for example "Alice has the flu" or "Bob is sleeping at 10 o'clock". Secret pairs S_pairs: the set of discrimination pairs (s_i, s_j), which expresses how the private information is protected, i.e., it must be ensured that an attacker cannot distinguish the members of a discrimination pair, for example ("Alice has the flu", "Alice is healthy") or ("Bob is sleeping at 10 o'clock", "Bob is exercising at 10 o'clock"). Finally, D is the set of all probability distributions that could have generated the data set, representing how much background knowledge the attacker possesses; each θ ∈ D represents one possible distribution generating the data set, so the correlation between the data can be captured in D, for example by a Markov chain (initial state plus state transition matrix). With these three parts defined, the Pufferfish framework is defined as follows:
ε-Pufferfish(S, S_pairs, D) privacy: given S, S_pairs, D, and a privacy budget ε, for every output w, every discrimination pair (s_i, s_j) ∈ S_pairs, and every θ ∈ D with X ~ θ, P(s_i | θ) ≠ 0 and P(s_j | θ) ≠ 0, the privacy protection mechanism M satisfies ε-Pufferfish(S, S_pairs, D) privacy if

e^(-ε) ≤ p_{X,M}(M(X) = w | s_i, θ) / p_{X,M}(M(X) = w | s_j, θ) ≤ e^(ε)

where p_{X,M}(M(X) = w | s_i, θ) and p_{X,M}(M(X) = w | s_j, θ) denote the conditional probabilities that, under the known probability distribution θ and with the discrimination pair member s_i (respectively s_j) holding, the query result obtained after the data are processed by the privacy protection mechanism M equals w.

The final noisy query result must satisfy this privacy definition, in which ε expresses the degree of privacy protection: the smaller ε is, the stronger the protection. The three parts of Pufferfish, S, S_pairs and D, are defined according to the specific data.
Examples are as follows:
The data are the activity monitoring data of a group of people, a collection of multiple classified data sequences: each sequence is highly self-correlated because it is time-series data, and because the people form a group, their activities also influence one another. Let A denote the activity set {walking, sleeping, working, running}, and let X_t^(k) = a denote that the activity state of the k-th individual at time t is a, i.e., X_t^(k) ∈ A. In the Pufferfish framework, S is the set { X_t^(k) = a : a ∈ A, k an individual, t a time instant }; the activity state of each person at any time t is sensitive information, i.e., the information that needs to be protected. S_pairs is the set of all discrimination pairs ( X_t^(k) = a, X_t^(k) = b ) with a ≠ b, ensuring that an attacker cannot distinguish whether the person is performing activity a or activity b at any time. D represents the set of probability distributions of all possible generated data sets, where a reasonable probability distribution is a multi-dimensional Markov chain, as described in detail in the following paragraphs.
A second part: describing the two kinds of correlation between the data (using a multi-dimensional Markov chain model), as shown in FIG. 2.
The second part of the invention describes the correlation between the data precisely and gives a reasonable representation of the set D of all possible probability distributions; since the two-dimensional correlation of the data must be considered, a multi-dimensional Markov chain model is adopted to model the data.
Assume the data set contains s sequences; each sequence is itself highly correlated, and the sequences are also interdependent. Let x_n^(k) denote the state probability distribution of the k-th sequence at time n. The state distribution of the j-th sequence at time n+1 is related to the state distributions of all s sequences at time n and is unrelated to the states before time n, so the multi-dimensional Markov chain satisfies

x_{n+1}^(j) = Σ_{k=1}^{s} λ_{jk} P^{(jk)} x_n^(k),  j = 1, 2, ..., s,

where λ_{jk} ≥ 0 and Σ_{k=1}^{s} λ_{jk} = 1 for 1 ≤ j, k ≤ s; P^{(jk)} is the inter-column transition probability matrix and λ_{jk} is the inter-column weight.

This relation states that the state probability distribution of the j-th sequence at time n+1 is a weighted average of the products of the transition probability matrices with the states at the previous moment, and it can be written in matrix form as

X_{n+1} = Q X_n, with X_n = (x_n^(1), ..., x_n^(s))^T and Q the block matrix whose (j, k) block is λ_{jk} P^{(jk)}.

By specifying the parameters, namely the initial probability distribution X_1, the inter-column transition probability matrices P^{(jk)} and the inter-column weights λ_{jk}, the set D of possible probability distributions generating the data set can be defined, while the most likely distribution can be estimated using the steady-state distribution properties of the Markov chain.
Examples are as follows:
The set D of possible probability distributions of a set of activity monitoring data is represented by a set of multi-dimensional Markov chains. Assuming two activity classes {walking, working} and two classified sequences in the data set, a possible probability distribution θ ∈ D can be represented by the tuple

θ = ( x_1^(1), x_1^(2), P^(11), P^(12), P^(21), P^(22), λ_11, λ_12, λ_21, λ_22 ),

where x_1^(k) is the initial state distribution of sequence k, each P^(jk) is a 2 × 2 transition probability matrix over {walking, working}, and the λ_jk are non-negative inter-column weights with λ_j1 + λ_j2 = 1.
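One update step of the multi-dimensional Markov chain described above can be sketched in plain Python as follows (an illustration, not part of the original disclosure); the transition matrices and weights are hypothetical values chosen only so that each matrix column sums to 1 and each row of weights sums to 1:

```python
def matvec(M, v):
    """Multiply matrix M (list of rows) by column vector v."""
    return [sum(M[r][c] * v[c] for c in range(len(v))) for r in range(len(M))]

def multi_markov_step(x, P, lam):
    """x_{n+1}^(j) = sum_k lam[j][k] * P[j][k] @ x_n^(k) for each sequence j."""
    s = len(x)
    nxt = []
    for j in range(s):
        acc = [0.0] * len(x[j])
        for k in range(s):
            contrib = matvec(P[j][k], x[k])
            acc = [a + lam[j][k] * c for a, c in zip(acc, contrib)]
        nxt.append(acc)
    return nxt

# s = 2 binary sequences; P[j][k][r][c] = P(next state of j is r | state of k is c)
P = [[[[0.9, 0.2], [0.1, 0.8]], [[0.7, 0.4], [0.3, 0.6]]],
     [[[0.6, 0.3], [0.4, 0.7]], [[0.8, 0.5], [0.2, 0.5]]]]
lam = [[0.5, 0.5], [0.5, 0.5]]  # equal inter-column weights
x0 = [[0.5, 0.5], [1.0, 0.0]]   # initial state distributions

x1 = multi_markov_step(x0, P, lam)
print(x1)  # each x1[j] remains a probability vector
```

Because the weights for each j sum to 1 and every matrix is column-stochastic, each updated vector x1[j] again sums to 1.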
A third part: adding appropriate noise to protect privacy with the achievable privacy protection mechanism, as shown in FIG. 3.
The third part of the invention provides an achievable mechanism that realizes Pufferfish-based privacy protection for associated classified data sequences. The basic idea is to compute the distance between the distributions P(F(X) | s_i, θ) and P(F(X) | s_j, θ) and to add Laplace noise in an appropriate proportion on top of the correct query result.
The distance between two distributions is first measured by the earth mover's distance (EMD), which represents the minimum amount of probability mass that must be moved to transform one distribution into the other, and is defined as follows: let μ, ν be probability distributions over R and let Γ(μ, ν) denote the set of all joint distributions with marginals μ and ν; the distance between μ and ν is

W(μ, ν) = inf_{γ ∈ Γ(μ, ν)} E_{(x, y) ~ γ} [ |x - y| ],

where γ ∈ Γ(μ, ν) is a joint probability distribution of μ and ν.
In the mechanism of the invention, the multi-dimensional Markov chain is first used to measure the correlation between the variables precisely and obtain the possible probability distributions; the EMD is then used to measure the distance between the two conditional distributions P(F(X) | s_i, θ) and P(F(X) | s_j, θ); finally, Laplace noise proportional to Distance(P(F(X) | s_i, θ), P(F(X) | s_j, θ)) is added to the result of the query F. [The specific algorithm is presented as a figure in the original document.]
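The two ingredients of the mechanism can be sketched as follows (an illustration, not the patent's exact algorithm). It assumes that F(X) takes values on unit-spaced points, so the 1-D earth mover's distance reduces to a sum of cumulative differences; the distributions, budget, and query answer are hypothetical:

```python
import math
import random

def emd_1d(mu, nu):
    """Earth mover's distance between two distributions supported on the
    unit-spaced points 0, 1, ..., n-1 (W_1 via cumulative differences)."""
    cum, total = 0.0, 0.0
    for a, b in zip(mu, nu):
        cum += a - b
        total += abs(cum)
    return total

def laplace_noise(scale, rng=random):
    """Sample Lap(0, scale) by inverse-CDF."""
    u = rng.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_query(true_answer, max_distance, epsilon):
    """Release F(X) plus Laplace noise whose scale is proportional to the
    worst-case distance between the two conditional distributions."""
    return true_answer + laplace_noise(max_distance / epsilon)

mu = [0.1, 0.4, 0.5]  # hypothetical P(F(X) = f | s_i, theta)
nu = [0.3, 0.3, 0.4]  # hypothetical P(F(X) = f | s_j, theta)
print(emd_1d(mu, nu))                              # ~0.3 for these values
print(noisy_query(1.2, emd_1d(mu, nu), epsilon=1.0))
```

A smaller distance between the two conditional distributions means the secrets are harder to tell apart, so less noise is needed for the same ε.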
Through the above three steps, the invention guarantees the privacy protection of the associated classified data sequences while maintaining accuracy, i.e., the query result remains usable.
The following implements the scheme of the invention on a specific example, using simulated data.
1. Generating a data set:
The simulated data set is generated from a set of multi-dimensional Markov chains with sequence length T = 100, s = 2 sequences in total, and state set {0, 1}. The initial probability distribution is determined by two parameters representing the probability that the initial state of the first sequence and of the second sequence, respectively, is 0. The probability transition structure consists of four matrices, P^(11), P^(12), P^(21) and P^(22); each is determined by two parameters representing, respectively, the probability that the previous state 0 leads to the next state 0 and the probability that the previous state 1 leads to the next state 1. The inter-column weights λ_jk are all 0.5, indicating equal influence factors.
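The generation step can be sketched as follows (an illustration, not the original code); since the concrete parameter values appear in the source only as images, the initial probabilities and the four column-stochastic transition matrices below are assumed values, with λ_jk = 0.5 as stated:

```python
import random

def sample_state(p0, rng):
    """Return 0 with probability p0, else 1."""
    return 0 if rng.random() < p0 else 1

def generate_dataset(T, P, lam, init0, seed=0):
    """Simulate s coupled binary sequences of length T. The state of
    sequence j at time n+1 is drawn from the mixture
    sum_k lam[j][k] * (column of P[j][k] selected by the state of k)."""
    rng = random.Random(seed)
    s = len(init0)
    X = [[sample_state(init0[k], rng)] for k in range(s)]
    for n in range(T - 1):
        for j in range(s):
            # probability that the next state of sequence j is 0
            p_next0 = sum(lam[j][k] * P[j][k][0][X[k][n]] for k in range(s))
            X[j].append(sample_state(p_next0, rng))
    return X

# T = 100, s = 2, states {0, 1}, lambda_jk = 0.5 as in the text;
# the four matrices P[j][k][r][c] = P(next of j is r | state of k is c) are assumed
P = [[[[0.9, 0.2], [0.1, 0.8]], [[0.7, 0.4], [0.3, 0.6]]],
     [[[0.6, 0.3], [0.4, 0.7]], [[0.8, 0.5], [0.2, 0.5]]]]
lam = [[0.5, 0.5], [0.5, 0.5]]
X = generate_dataset(100, P, lam, init0=[0.5, 0.5], seed=42)
print(len(X), len(X[0]))  # 2 100
```

Note that all states at time n+1 are drawn from the states at time n only, matching the multi-dimensional Markov assumption.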
2. Formulating privacy preserving definitions using multi-dimensional Markov chains to represent associations between data
Based on the Pufferfish framework applied to this data set, S represents the private information to be protected, namely the set { X_t^(k) = a : a ∈ {0, 1}, k ∈ {1, 2}, 1 ≤ t ≤ T }; S_pairs is the set of all discrimination pairs ( X_t^(k) = 0, X_t^(k) = 1 ), i.e., an attacker must be unable to distinguish whether a single individual state in the two sequences is 0 or 1; and D is the set of multi-dimensional Markov chain combinations generated with the parameters in step 1.
3. Adding noise in the appropriate proportion, comparing the degree of privacy protection for different ε, and measuring the error between the true value and the result given by the mechanism
The expression for query F is:
Figure BDA0002282908310000083
in this example, s is 2, T is 100,
Figure BDA0002282908310000084
is 0 or 1;
The distributions P(F(X) | s_i, θ) and P(F(X) | s_j, θ) and their distance are calculated according to the algorithm, and noise proportional to Distance(P(F(X) | s_i, θ), P(F(X) | s_j, θ)) is added.
The privacy budget ε takes values in {0.2, 0.5, 1, 2, 5}, and the L1 error between the output value and the true value is calculated to verify that the invention preserves accuracy while protecting privacy; the table below shows the test results on the simulated data.
Table 1 simulation data test results
[The numerical results of Table 1 are provided as an image in the original document.]
The experimental results show that the error decreases as the privacy budget ε increases, i.e., a smaller privacy budget guarantees more privacy. At the same time, the invention keeps the data set usable while guaranteeing user privacy, and can be applied to the privacy protection of associated classified sequences.
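The evaluation loop can be reproduced in outline as follows (an illustration, not the original experiment); the worst-case distance, true answer, and trial count are assumptions, and only the qualitative trend, error shrinking as ε grows, mirrors the reported results:

```python
import math
import random

def laplace_noise(scale, rng):
    """Sample Lap(0, scale) by inverse-CDF."""
    u = rng.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

rng = random.Random(42)
max_emd = 0.3       # assumed worst-case EMD between conditional distributions
true_answer = 0.55  # assumed true value of query F
errors = {}
for eps in [0.2, 0.5, 1, 2, 5]:
    trials = [abs((true_answer + laplace_noise(max_emd / eps, rng)) - true_answer)
              for _ in range(5000)]
    errors[eps] = sum(trials) / len(trials)  # empirical mean L1 error
    print(eps, round(errors[eps], 4))
```

The expected L1 error of a Lap(b) sample is b = max_emd / ε, so the mean error should fall roughly in proportion to 1/ε, which is the trend the patent's Table 1 describes.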
It should be understood that the above-described embodiments of the present invention are merely examples for clearly illustrating the invention and are not intended to limit its embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to enumerate all embodiments here. Any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall be included in the protection scope of the claims of the present invention.

Claims (5)

1. A privacy protection method for associated classified data sequences based on the Pufferfish framework, characterized in that the Pufferfish framework is first introduced to formulate a rigorous privacy protection definition, the two kinds of correlation between the data are then described precisely with a multi-dimensional Markov chain model, and finally an achievable privacy protection mechanism is provided that adds appropriate noise to protect privacy.
2. The privacy protection method for associated classified data sequences based on the Pufferfish framework according to claim 1, wherein the Pufferfish framework comprises three parts. Secret set S: the set of sensitive information that needs to be protected, i.e., S represents the set of private information related to personal private data. Secret pairs S_pairs: the set of discrimination pairs (s_i, s_j) ∈ S × S, which expresses how the private information is to be protected, i.e., it must be ensured that an attacker cannot distinguish the members s_i and s_j of any discrimination pair. D: the set of all probability distributions that could have generated the data set, representing how much background knowledge an attacker possesses; each θ ∈ D represents one possible distribution generating the data set, so the correlation between the data can be captured in D.
3. The privacy protection method for associated classified data sequences based on the Pufferfish framework according to claim 2, wherein the Pufferfish framework is defined as follows: ε-Pufferfish(S, S_pairs, D) privacy: given S, S_pairs, D, and a privacy budget ε, for every output w, every discrimination pair (s_i, s_j) ∈ S_pairs, and every θ ∈ D with X ~ θ, P(s_i | θ) ≠ 0 and P(s_j | θ) ≠ 0, the privacy protection mechanism M satisfies ε-Pufferfish(S, S_pairs, D) privacy if

e^(-ε) ≤ p_{X,M}(M(X) = w | s_i, θ) / p_{X,M}(M(X) = w | s_j, θ) ≤ e^(ε),

where p_{X,M}(M(X) = w | s_i, θ) and p_{X,M}(M(X) = w | s_j, θ) denote the conditional probabilities that, under the known probability distribution θ and with the discrimination pair member s_i (respectively s_j) holding, the query result obtained after the data are processed by the privacy protection mechanism M equals w; the final noisy query result must satisfy this privacy definition, in which ε expresses the degree of privacy protection, the smaller ε the stronger the protection; and the three parts of Pufferfish, S, S_pairs and D, are defined according to the specific data.
4. The privacy protection method for associated classified data sequences based on the Pufferfish framework according to claim 2, wherein describing the two kinds of correlation between the data precisely with the multi-dimensional Markov chain model comprises the following steps: assume the data set contains s sequences; each sequence is itself highly correlated, and the sequences are also interdependent; let x_n^(k) denote the state probability distribution of the k-th sequence at time n; the state distribution of the j-th sequence at time n+1 is related to the state distributions of all s sequences at time n and is unrelated to the states before time n, so the multi-dimensional Markov chain satisfies

x_{n+1}^(j) = Σ_{k=1}^{s} λ_{jk} P^{(jk)} x_n^(k),  j = 1, 2, ..., s,

where λ_{jk} ≥ 0 for 1 ≤ j, k ≤ s and Σ_{k=1}^{s} λ_{jk} = 1; P^{(jk)} is the inter-column transition probability matrix and λ_{jk} is the inter-column weight; this relation states that the state probability distribution of the j-th sequence at time n+1 is a weighted average of the products of the transition probability matrices with the states at the previous moment, and can be written in matrix form as

X_{n+1} = Q X_n, with X_n = (x_n^(1), ..., x_n^(s))^T and Q the block matrix whose (j, k) block is λ_{jk} P^{(jk)};

by specifying the parameters, namely the initial probability distribution X_1, the inter-column transition probability matrices P^{(jk)} and the inter-column weights λ_{jk}, the set D of possible probability distributions generating the data set can be defined, while the most likely distribution is estimated using the steady-state distribution properties of the Markov chain.
5. The privacy protection method for associated classified data sequences based on the Pufferfish framework according to claim 2, wherein adding appropriate noise with the achievable privacy protection mechanism comprises the following steps: the distance between two distributions is first measured by the earth mover's distance (EMD), which represents the minimum amount of probability mass that must be moved to transform one distribution into the other and is defined as

W(μ, ν) = inf_{γ ∈ Γ(μ, ν)} E_{(x, y) ~ γ} [ |x - y| ],

where γ ∈ Γ(μ, ν) ranges over the joint probability distributions with marginals μ and ν; in the mechanism of the invention, the multi-dimensional Markov chain is first used to measure the correlation between the variables precisely and obtain the possible probability distributions, the EMD is then used to measure the distance between the two conditional distributions P(F(X) | s_i, θ) and P(F(X) | s_j, θ), and finally Laplace noise proportional to Distance(P(F(X) | s_i, θ), P(F(X) | s_j, θ)) is added to the result of the query F.
CN201911148569.1A 2019-11-21 2019-11-21 Privacy protection method for associated classified data sequence based on Pufferfish framework Pending CN110968893A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911148569.1A CN110968893A (en) 2019-11-21 2019-11-21 Privacy protection method for associated classified data sequence based on Pufferfish framework


Publications (1)

Publication Number Publication Date
CN110968893A true CN110968893A (en) 2020-04-07

Family

ID=70031097


Country Status (1)

Country Link
CN (1) CN110968893A (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105612529A (en) * 2013-08-19 2016-05-25 汤姆逊许可公司 Method and apparatus for utility-aware privacy preserving mapping in view of collusion and composition


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
SHUANG SONG et al.: 2017 55th Annual Allerton Conference on Communication, Control, and Computing (Allerton), 31 January 2018 *
SHUANG SONG et al.: In Proceedings of the 2017, 31 December 2017 *
WANG Hao et al.: "CLM: a differential privacy protection method for trajectory publishing", Journal on Communications *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111475852A (en) * 2020-06-19 2020-07-31 支付宝(杭州)信息技术有限公司 Method and device for preprocessing data aiming at business model based on privacy protection
CN111475852B (en) * 2020-06-19 2020-09-15 支付宝(杭州)信息技术有限公司 Method and device for preprocessing data aiming at business model based on privacy protection
CN112016123A (en) * 2020-09-04 2020-12-01 支付宝(杭州)信息技术有限公司 Verification method and device of privacy protection algorithm and electronic equipment

Similar Documents

Publication Publication Date Title
Alzahrani et al. Forecasting the spread of the COVID-19 pandemic in Saudi Arabia using ARIMA prediction model under current public health interventions
Xu et al. GANobfuscator: Mitigating information leakage under GAN via differential privacy
CN110085327A (en) Multichannel LSTM neural network Influenza epidemic situation prediction technique based on attention mechanism
WO2022160623A1 (en) Teacher consensus aggregation learning method based on randomized response differential privacy technology
WO2023000794A1 (en) Service prediction model training method and apparatus for protecting data privacy
CN110968893A (en) Privacy protection method for associated classified data sequence based on Pufferfish framework
Tchoumi et al. Dynamic of a two-strain COVID-19 model with vaccination
CN114332545A (en) Image data classification method and device based on low-bit pulse neural network
Zhang et al. Precursory pattern based feature extraction techniques for earthquake prediction
Verma et al. A new backpropagation neural network classification model for prediction of incidence of malaria
Duan Performance evaluation and practical use of supervised data mining algorithms for credit card approval
Chang et al. Sufficient and necessary conditions of near-optimal controls for a diffusion dengue model with Lévy noise
CN104573726B (en) Facial image recognition method based on the quartering and each ingredient reconstructed error optimum combination
Prasad et al. Multimodeling approach to evaluating the efficacy of layering pharmaceutical and nonpharmaceutical interventions for influenza pandemics
Shook et al. Investigating the influence of spatial and temporal granularities on agent‐based modeling
Sun et al. Dpauc: Differentially private auc computation in federated learning
Adedotun et al. Relationship between the standard of living, economic situation, and security situation of Nigerians during the Covid-19 pandemic. A non-parametric analysis approach
CN106529562A (en) OSS (Open Source software) project developer prediction method based on Email networks
Alsulaimawi A privacy filter framework for internet of robotic things applications
CN108876210A (en) A kind of recognition methods, system and the device of land use and change causal structure
Hu et al. Who are the ‘silent spreaders’?: Contact tracing in spatio-temporal memory models
Xie et al. Federated Learning With Personalized Differential Privacy Combining Client Selection
Rinott et al. A smoothing model for sample disclosure risk estimation
Khademolghorani An effective algorithm for mining association rules based on imperialist competitive algorithm
Lorenzi et al. Hierarchical infinite factor models for improving the prediction of surgical complications for geriatric patients

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200407