CN110334539A - Personalized privacy protection method and apparatus based on randomized response - Google Patents
- Publication number: CN110334539A
- Application number: CN201910507049.9A
- Authority: CN (China)
- Legal status: Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
Abstract
Embodiments of the invention provide a personalized privacy protection method and apparatus based on randomized response. The method is applied to a client and includes: obtaining the target sensitive value that a user inputs for a target private datum to be collected, wherein the target private datum corresponds to multiple sensitive values and the target sensitive value is one of the multiple sensitive values; and perturbing the target sensitive value using a preset personalized randomized response model to obtain a perturbed sensitive value. The model is a response model that introduces a weight for each sensitive value of the target private datum, and it ensures that the probability of outputting the true value for a high-weight sensitive value is lower than that for a low-weight sensitive value. This scheme solves the problem in existing randomized-response privacy collection that some private data are over-protected while others are under-protected.
Description
Technical field
The present invention relates to the field of private data protection, and in particular to a personalized privacy protection method and apparatus based on randomized response.
Background art
Data in real life reflect the concrete circumstances of respondents. To conduct surveys, investigators (data collectors) typically question respondents (data providers/participants/interviewees) through questionnaires and similar means to obtain the survey data they need. However, respondents' answers often contain highly sensitive personal information, and collecting these data directly may lead to serious leakage of individual privacy.
In the prior art, Warner first proposed in 1965 to collect privacy-sensitive data using the randomized response (RR) technique (hereinafter W-RR). Localized privacy protection mechanisms based on the RR model aim to control privacy leakage at the source while attending to the quality of the statistical data. The technique borrows a classical method from statistical research, simulating the process by which an investigator collects valuable statistics while intruding on interviewees' privacy as little as possible. As the mainstream perturbation mechanism among localized, distortion-based privacy protection techniques, randomized response has a model that is simple, intuitive, and easy to implement; its degree of perturbation can be quantified directly, and it performs excellently in estimating statistical properties, so it has received wide attention.
Because RR protects data providers' privacy by answering probabilistically, answers to sensitive questions are strongly deniable, which provides privacy protection; it has been applied in the privacy protection tools of Google Chrome and in Apple's systems. Meanwhile, RR fully accounts for the possibility that the data collector steals or leaks user privacy during collection: in this model interviewees can privatize their individual data autonomously, so that even the data collector cannot obtain the exact original private data, which greatly stimulates interviewees' enthusiasm to participate in data collection. Therefore, unlike centralized privacy protection mechanisms that assume a trusted third party, localized privacy protection based on RR no longer requires the involvement of a trusted third party, and it also eliminates the privacy leakage and privacy attacks that an untrusted third-party data collector might cause.
The classical W-RR model above is usually modeled as a discrete memoryless binary channel; scholars have generalized it to multivariate discrete symmetric privacy sources, successively proposing privacy protection mechanisms such as k-RR and O-RR, which have been widely applied.
Differential privacy, proposed in 2006 as a widely recognized strong privacy protection model, has won the favor of many scholars. Following this trend, many scholars have studied localized differential privacy mechanisms under the RR model. In this line of research the degree of privacy protection is measured by a single privacy budget parameter ε, and work concentrates on the design and analysis of optimal differential privacy mechanisms under the RR model, objectively attending to the privacy-utility trade-off of the whole system.
The conventional randomized response model above (CRR model for short, Conventional Randomized Response) and its related applications and research are mainly concerned with the objective privacy protection degree of the whole system. The model assumes by default that different private data are equally important and have identical privacy protection needs, and therefore applies identical privacy protection operations to all private data. However, data providers' subjective feelings about private data (i.e., sensitivity) differ greatly, and the sensitivity of private data reflects, to some extent, how much protection it needs; hence data providers' privacy protection needs differ and vary from person to person. In other words, in actual private data collection, private data are not equally important. In this situation, the existing CRR model's default assumption of equal importance and identical protection needs is likely to over-protect some private data and under-protect others: in particular, it over-protects some low-sensitivity private data and under-protects some highly sensitive private data.
Summary of the invention
Embodiments of the present invention aim to provide a personalized privacy protection method and apparatus based on randomized response, to solve the prior-art problem that during privacy collection some private data are over-protected while others are under-protected. The specific technical solutions are as follows:
In a first aspect, an embodiment of the invention provides a personalized privacy protection method based on randomized response, applied to a client, the method comprising:
obtaining the target sensitive value that a user inputs for a target private datum to be collected, wherein the target private datum corresponds to multiple sensitive values and the target sensitive value is one of the multiple sensitive values;
perturbing the target sensitive value using a preset personalized randomized response model to obtain a perturbed sensitive value; wherein the personalized randomized response model is a response model that introduces a weight for each sensitive value of the target private datum, and the model ensures that the probability of outputting the true value for a high-weight sensitive value is lower than that for a low-weight sensitive value; wherein the weight of a sensitive value with a high privacy protection need is higher than the weight of a sensitive value with a low privacy protection need.
Optionally, the expression of the personalized randomized response model is:
P(Y = y_j | X = x_i) = P_{i,j}(w_i) for j = i, and (1 − P_{i,j}(w_i)) / (m − 1) for j ≠ i;
wherein x_i is the sensitive value to be perturbed;
m is the number of sensitive values of the target private datum, m ≥ 2;
w_i is the weight of the sensitive value x_i;
P_{i,j}(w_i) is a decreasing function of the weight w_i;
y_j is the perturbed sensitive value corresponding to the sensitive value to be perturbed.
Optionally, in the personalized randomized response model,
P_{i,j}(w_i) = e^(μ_i / w_i) / (e^(μ_i / w_i) + m − 1),
wherein μ_i is a preset adjustment parameter for the degree of privacy protection.
Optionally, perturbing the target sensitive value using the preset personalized randomized response model to obtain the perturbed sensitive value comprises:
responding to the target sensitive value by outputting the true value with probability P = P_{i,j}(w_i), and responding by outputting a perturbed value with probability (1 − P_{i,j}(w_i)) / (m − 1), to obtain the perturbed sensitive value.
Optionally, in the personalized randomized response model, Σ_{i=1}^{m} w_i = m and 0 < w_i < m.
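The perturbation rule of the first aspect can be sketched in code. The piecewise response probabilities follow the claims above; the concrete form of P_{i,j}(w_i) used here, e^(μ/w) / (e^(μ/w) + m − 1), is an assumption consistent with the stated properties (decreasing in the weight w, reducing to the k-RR probability when w = 1), not necessarily the patent's exact function.

```python
import math
import random

def true_response_prob(w: float, mu: float, m: int) -> float:
    # Assumed form of P_{i,j}(w_i): e^(mu/w) / (e^(mu/w) + m - 1).
    # It is decreasing in the weight w, so higher-weight (more sensitive)
    # values are reported truthfully less often, as the claims require.
    e = math.exp(mu / w)
    return e / (e + m - 1)

def prr_perturb(i: int, values: list, weights: list, mus: list,
                rng: random.Random):
    # Report the true value values[i] with probability P_{i,i}(w_i),
    # otherwise one of the other m - 1 values uniformly at random.
    m = len(values)
    p = true_response_prob(weights[i], mus[i], m)
    if rng.random() < p:
        return values[i]
    return rng.choice([v for k, v in enumerate(values) if k != i])
```

With w = 1 and μ = λ this collapses to the k-RR probability e^λ / (e^λ + m − 1), matching the compatibility with the CRR model described later.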
In a second aspect, an embodiment of the invention further provides a personalized privacy protection apparatus based on randomized response, applied to a client, the apparatus comprising:
an obtaining module, configured to obtain the target sensitive value that a user inputs for a target private datum to be collected, wherein the target private datum corresponds to multiple sensitive values and the target sensitive value is one of the multiple sensitive values;
a perturbing module, configured to perturb the target sensitive value using a preset personalized randomized response model to obtain a perturbed sensitive value; wherein the personalized randomized response model is a response model that introduces a weight for each sensitive value of the target private datum, and the model ensures that the probability of outputting the true value for a high-weight sensitive value is lower than that for a low-weight sensitive value; wherein the weight of a sensitive value with a high privacy protection need is higher than the weight of a sensitive value with a low privacy protection need.
Optionally, the expression of the personalized randomized response model is:
P(Y = y_j | X = x_i) = P_{i,j}(w_i) for j = i, and (1 − P_{i,j}(w_i)) / (m − 1) for j ≠ i;
wherein x_i is the sensitive value to be perturbed;
m is the number of sensitive values of the target private datum, m ≥ 2;
w_i is the weight of the sensitive value x_i;
P_{i,j}(w_i) is a decreasing function of the weight w_i;
y_j is the perturbed sensitive value corresponding to the sensitive value to be perturbed.
Optionally, in the personalized randomized response model,
P_{i,j}(w_i) = e^(μ_i / w_i) / (e^(μ_i / w_i) + m − 1),
wherein μ_i is a preset adjustment parameter for the degree of privacy protection.
Optionally, the perturbing module is specifically configured to respond to the target sensitive value by outputting the true value with probability P = P_{i,j}(w_i), and by outputting a perturbed value with probability (1 − P_{i,j}(w_i)) / (m − 1), to obtain the perturbed sensitive value.
Optionally, in the personalized randomized response model, Σ_{i=1}^{m} w_i = m and 0 < w_i < m.
In a third aspect, an embodiment of the invention further provides an electronic device comprising a processor, a communication interface, a memory, and a communication bus, wherein the processor, the communication interface, and the memory communicate with one another via the communication bus;
the memory is configured to store a computer program;
the processor, when executing the program stored in the memory, implements the method steps of the first aspect.
In a fourth aspect, an embodiment of the invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method steps of the first aspect.
The method of the embodiments highlights the difference between highly sensitive and less sensitive privacy, fully considers the personalized privacy protection needs of different sensitive values, introduces sensitive value weights, and brings them into the randomized response decision. During localized data collection based on randomized response, data providers can privatize their private data according to their own privacy needs: highly sensitive value data undergo high-intensity random perturbation, while low-sensitivity data undergo lower-intensity random perturbation. This ensures that each sensitive value group reaches its desired degree of privacy protection and realizes personalized privacy protection, solving the prior-art problem that during privacy collection some private data are over-protected while others are under-protected.
In addition, the method of the embodiments provides noisy data of higher utility with excellent statistical properties. For a given subjective privacy leakage degree of the mechanism, the personalized randomized response model used in this embodiment (PRR model for short, Personalized Randomized Response) collects statistical data of higher quality than the CRR model: the PRR model's statistical estimation error is smaller and its estimation accuracy higher. It is thus a practical approach oriented toward statistics and analysis, with strong practical significance. By statistically analyzing the collected noisy data, the data collector can obtain the distribution of the sensitive value information, i.e., an effective statistical estimate.
Furthermore, in surveys based on a multivariate randomized response model, embodiments of the invention can not only give highly sensitive values sufficiently strong privacy protection while giving low-sensitivity values comparatively weaker protection (still meeting their privacy needs); they can also intuitively feed back each sensitive value's privacy leakage degree (or privacy protection level) to the interviewee (the local end user), letting the interviewee regain certain data control rights (e.g., access rights, restricted disposal rights). This stimulates interviewees' enthusiasm and encourages users to submit data perturbed by the randomized response mechanism more objectively, promising to fundamentally resolve data providers' misgivings. Compared with the CRR model, the collected data are therefore more objective, of higher quality, and of higher utility (the estimate of the privacy information distribution is more accurate).
Brief description of the drawings
To explain the technical solutions of the embodiments of the invention or of the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a personalized privacy protection method based on randomized response provided by an embodiment of the invention;
Fig. 2 is a schematic structural diagram of a personalized privacy protection apparatus based on randomized response provided by an embodiment of the invention;
Fig. 3 is a schematic structural diagram of an electronic device provided by an embodiment of the invention;
Fig. 4 is a schematic diagram of the localized privacy collection process based on the randomized response model;
Fig. 5 shows the objective privacy leakage degree of each sensitive value under the 3-CRR model;
Fig. 6 shows the subjective privacy leakage degree of each sensitive value under the 3-PRR model;
Fig. 7 is a schematic diagram of the estimation errors of the 3-CRR and 3-PRR models.
Specific embodiment
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art from the embodiments of the invention without creative effort fall within the scope of protection of the invention.
To solve the prior-art problem that during privacy collection some private data are over-protected while others are under-protected, embodiments of the invention provide a personalized privacy protection method and apparatus based on randomized response.
The personalized privacy protection method based on randomized response provided by the embodiments of the invention is introduced first.
Before introducing the method of the invention, the application scenario of privacy information collection is briefly explained.
During privacy collection, a user logs in to a client whose display interface shows a question related to the target private datum to be collected together with its options; the user selects an option as the answer and submits it. The options are the multiple sensitive values of the target private datum, and the answer the user chooses is the target sensitive value. The client obtains the answer the user submits and perturbs it based on randomized response, yielding the perturbed answer, i.e., noisy data. A data analyst obtains the noisy data of a large number of users about the target private datum and statistically analyzes the collected noisy data together with the personalized randomized response model to obtain the distribution of the sensitive value information, i.e., an effective statistical estimate. It should be noted that the invention focuses on perturbing the target sensitive value: target sensitive values with high privacy needs undergo stronger perturbation and those with low privacy needs undergo weaker perturbation, realizing personalized privacy protection. Statistically analyzing the large amount of perturbed noisy data to derive the statistical properties of the group belongs to the prior art and is not the focus of the invention.
It can be understood that the method provided by the embodiments of the invention is a method by which the client perturbs the answer submitted by the user, and it can solve the prior-art problem that during privacy collection some private data are over-protected while others are under-protected.
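The analyst-side aggregation mentioned in the scenario above, recovering the distribution of sensitive values from the collected noisy data, is prior art per the text. A common matrix-inversion estimator for randomized response can be sketched as follows; the per-value truth probabilities and the clipping step are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def transition_matrix(p_true):
    # Row i: true value i is reported as itself with probability p_true[i]
    # and as each of the other m - 1 values with (1 - p_true[i]) / (m - 1).
    m = len(p_true)
    return np.array([[p_true[i] if i == j else (1 - p_true[i]) / (m - 1)
                      for j in range(m)] for i in range(m)])

def estimate_distribution(noisy_counts, p_true):
    # Reported distribution y = P^T x, so the estimate is x_hat = (P^T)^{-1} y.
    # Clipping negatives and renormalizing is common post-processing.
    y = np.asarray(noisy_counts, dtype=float)
    y = y / y.sum()
    x_hat = np.linalg.solve(transition_matrix(p_true).T, y)
    x_hat = np.clip(x_hat, 0.0, None)
    return x_hat / x_hat.sum()
```

On noise-free expected counts the inversion recovers the true distribution exactly; on finite samples it is an unbiased estimate up to the clipping step.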
As shown in Fig. 1, this embodiment provides a personalized privacy protection method based on randomized response, applied to a client; the method may include:
S101: obtaining the target sensitive value that a user inputs for a target private datum to be collected, wherein the target private datum corresponds to multiple sensitive values and the target sensitive value is one of the multiple sensitive values.
It should be noted that the target private datum to be collected refers to the user's personal privacy, which a question in a questionnaire may expose. For example, for the question "Do you suffer from AIDS?", the collected answer "yes" or "no" contains the user's personal privacy.
The possible answers a user may give to the question related to the target private datum are exactly the sensitive values corresponding to that datum, and there are at least two of them. The target sensitive value is one of the multiple sensitive values, namely the actual answer the user gives when answering the question; a single user has exactly one target sensitive value.
Still taking the question "Do you suffer from AIDS?" as an example, "yes" and "no" are the two sensitive values corresponding to the target private datum; if the user answers "yes", then "yes" is the target sensitive value. Of course, it is understood that the target sensitive value is not confined to "yes" or "no". For example, when the question "Which of AIDS, cancer, and influenza do you suffer from?" is posed to a target group, the target private datum corresponds to three sensitive values, namely "AIDS", "cancer", and "influenza", and the target sensitive value, i.e., the user's answer, can be one of "AIDS", "cancer", and "influenza".
S102: perturbing the target sensitive value using a preset personalized randomized response model to obtain the perturbed sensitive value;
wherein the personalized randomized response model is a response model that introduces a weight for each sensitive value of the target private datum, and the model ensures that the probability of outputting the true value for a high-weight sensitive value is lower than that for a low-weight sensitive value; wherein the weight of a sensitive value with a high privacy protection need is higher than the weight of a sensitive value with a low privacy protection need.
Illustratively, in one implementation, the expression of the personalized randomized response model may be:
P(Y = y_j | X = x_i) = P_{i,j}(w_i) for j = i, and (1 − P_{i,j}(w_i)) / (m − 1) for j ≠ i;
wherein x_i is the sensitive value to be perturbed;
m is the number of sensitive values of the target private datum, m ≥ 2;
w_i is the weight of the sensitive value x_i;
P_{i,j}(w_i) is a decreasing function of the weight w_i;
y_j is the perturbed sensitive value corresponding to the sensitive value to be perturbed.
As for the sensitive value to be perturbed: following the prompt on the client's display interface, the user inputs some sensitive value to be perturbed as the target sensitive value x_i; the client obtains the target sensitive value x_i submitted by the user and perturbs it. If the weight w_i of the target sensitive value x_i is large, the client responds with the true value with a smaller probability, i.e., applies high-intensity random perturbation; if the weight w_i of x_i is small, the client responds with the true value with a larger probability, i.e., applies lower-intensity random perturbation; finally the perturbed sensitive value y_j is obtained.
Based on the expression of the personalized randomized response model above, a specific implementation of step S102 may include:
responding to the target sensitive value by outputting the true value with probability P = P_{i,j}(w_i), and by outputting a perturbed value with probability (1 − P_{i,j}(w_i)) / (m − 1), to obtain the perturbed sensitive value.
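The S102 decision rule above can be checked with a quick simulation: output the true value with probability P, otherwise one of the other m − 1 values uniformly. The values m = 3 and P = 0.7 are illustrative only, not from the patent.

```python
import random

def s102_respond(p_true: float, m: int, rng: random.Random) -> int:
    # Output the true value (index 0 here) with probability p_true,
    # otherwise one of the other m - 1 indices uniformly at random.
    if rng.random() < p_true:
        return 0
    return rng.randrange(1, m)

m, p = 3, 0.7                      # illustrative values
rng = random.Random(42)
n = 50_000
counts = [0] * m
for _ in range(n):
    counts[s102_respond(p, m, rng)] += 1
freq_true = counts[0] / n          # should be close to p = 0.7
freq_other = counts[1] / n         # should be close to (1 - p)/(m - 1) = 0.15
```

The empirical frequencies converge to the model probabilities, which is what makes the analyst-side statistical estimation possible.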
It should be noted that in the personalized randomized response model of the embodiments, the weight of a private datum's sensitive value reflects the individual's privacy protection need: the larger the weight, the higher the need. Weight setting follows two principles: first, the higher a sensitive value's sensitivity, the larger its weight; second, the lower a sensitive value's frequency, the larger its weight. The weight of a sensitive value is related to the specific application scenario, the individual's subjective feeling, individual preference, economic interest, and other factors. In addition, the weights are generally determined by the platform distributing the question related to the target private datum or by the data analyst weighing all factors; of course, permission can also be opened for users to customize them.
The method of the embodiments highlights the difference between highly sensitive and less sensitive privacy, fully considers the personalized privacy protection needs of different sensitive values, introduces sensitive value weights, and brings them into the randomized response decision. During localized data collection based on randomized response, data providers can privatize their private data according to their own privacy needs: highly sensitive value data undergo high-intensity random perturbation, while low-sensitivity data undergo lower-intensity random perturbation. This ensures that each sensitive value group reaches its desired degree of privacy protection and realizes personalized privacy protection. The method overcomes the disadvantage of the CRR model that performing identical operations on all sensitive values over-protects low-sensitivity values and under-protects highly sensitive values, and can solve the prior-art problem that during privacy collection some private data are over-protected while others are under-protected.
To satisfy the tendency that the larger the sensitive value weight w_i, the higher the privacy protection need and the lower the probability of responding with the true value, in one implementation, in the personalized randomized response model:
P_{i,j}(w_i) = e^(μ_i / w_i) / (e^(μ_i / w_i) + m − 1),
wherein μ_i is a preset adjustment parameter for the degree of privacy protection, and μ_i ≥ 0.
It should be noted that the parameter μ_i can be set according to the specific privacy protection need of the group for the sensitive value x_i. In practice, personalized privacy protection can be realized by adjusting the parameter μ_i, while controlling the subjective privacy leakage degree of the model: the higher the privacy protection need, the smaller the μ_i that should be set, and vice versa.
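The effect of μ can be illustrated numerically. The formula below is the assumed truth-probability form used throughout these sketches, increasing in μ and decreasing in w; μ = 0 gives the uniform response 1/m, i.e., maximum perturbation.

```python
import math

def true_response_prob(w: float, mu: float, m: int) -> float:
    # Assumed PRR truth probability: increasing in mu, decreasing in w.
    e = math.exp(mu / w)
    return e / (e + m - 1)

m, w = 3, 1.5                      # illustrative values
# Smaller mu -> lower truth probability -> stronger protection.
probs = [true_response_prob(w, mu, m) for mu in (0.0, 0.5, 1.0, 2.0)]
```

Tuning μ_i per sensitive value is what lets the mechanism trade utility for protection independently for each value group.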
Optionally, in the personalized randomized response model of the above embodiments, Σ_{i=1}^{m} w_i = m and 0 < w_i < m. It can be understood that when the sensitive value weights w_i satisfy Σ_{i=1}^{m} w_i = m and 0 < w_i < m, i.e., the "m-normalization" condition, all theoretical derivations and analyses based on the PRR model also apply to the CRR model; that is, the PRR model is compatible with the CRR model. Put differently, since the CRR model does not consider the sensitivity of each sensitive value, it can be regarded within the PRR model as the case where all weights are equal, i.e., w_i ≡ 1 (i = 1, 2, ..., m) for all sensitive values x_i; after m-normalization, the CRR model is a special case of the PRR model.
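The special-case relationship can be checked directly: with w_i ≡ 1 (which satisfies the m-normalization condition Σ w_i = m), the assumed PRR truth probability coincides with the k-RR probability for every sensitive value. The probability forms are the same working assumptions as in the earlier sketches.

```python
import math

def prr_true_prob(w: float, mu: float, m: int) -> float:
    # Assumed per-value PRR truth probability.
    return math.exp(mu / w) / (math.exp(mu / w) + m - 1)

def krr_true_prob(lam: float, m: int) -> float:
    # k-RR / CRR: one truth probability shared by all sensitive values.
    return math.exp(lam) / (math.exp(lam) + m - 1)

m, lam = 4, 1.2                    # illustrative values
weights = [1.0] * m                # w_i = 1: the CRR special case
p_each = [prr_true_prob(w, lam, m) for w in weights]
```

Every entry of p_each equals the single CRR probability, confirming that uniform weights collapse the personalized model to the conventional one.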
As shown in Fig. 2, corresponding to the method embodiment above, an embodiment of the invention also provides a personalized privacy protection apparatus based on randomized response, applied to a client, the apparatus comprising:
an obtaining module 201, configured to obtain the target sensitive value that a user inputs for a target private datum to be collected, wherein the target private datum corresponds to multiple sensitive values and the target sensitive value is one of the multiple sensitive values;
a perturbing module 202, configured to perturb the target sensitive value using a preset personalized randomized response model to obtain the perturbed sensitive value; wherein the personalized randomized response model is a response model that introduces a weight for each sensitive value of the target private datum, and the model ensures that the probability of outputting the true value for a high-weight sensitive value is lower than that for a low-weight sensitive value; wherein the weight of a sensitive value with a high privacy protection need is higher than the weight of a sensitive value with a low privacy protection need.
Optionally, the expression of the personalized randomized response model is:
P(Y = y_j | X = x_i) = P_{i,j}(w_i) for j = i, and (1 − P_{i,j}(w_i)) / (m − 1) for j ≠ i;
wherein x_i is the sensitive value to be perturbed;
m is the number of sensitive values of the target private datum, m ≥ 2;
w_i is the weight of the sensitive value x_i;
P_{i,j}(w_i) is a decreasing function of the weight w_i;
y_j is the perturbed sensitive value corresponding to the sensitive value to be perturbed.
Optionally, in the personalized randomized response model,
P_{i,j}(w_i) = e^(μ_i / w_i) / (e^(μ_i / w_i) + m − 1),
wherein μ_i is a preset adjustment parameter for the degree of privacy protection.
Optionally, the perturbing module 202 is specifically configured to respond to the target sensitive value by outputting the true value with probability P = P_{i,j}(w_i), and by outputting a perturbed value with probability (1 − P_{i,j}(w_i)) / (m − 1), to obtain the perturbed sensitive value.
Optionally, in the personalized randomized response model, Σ_{i=1}^{m} w_i = m and 0 < w_i < m.
As shown in Fig. 3, another embodiment of the invention further provides an electronic device comprising a processor 301, a communication interface 302, a memory 303, and a communication bus 304, wherein the processor 301, the communication interface 302, and the memory 303 communicate with one another via the communication bus 304;
the memory 303 is configured to store a computer program;
the processor 301, when executing the program stored in the memory, implements the personalized privacy protection method based on randomized response of the embodiments of the invention.
In another embodiment of the invention, a computer-readable storage medium is further provided, storing a computer program which, when executed by a processor, implements the personalized privacy protection method based on randomized response of the embodiments of the invention.
To further illustrate the beneficial effects of the present invention, the PRR model used by the invention is compared below with the CRR model in the prior art, and the design and analysis of the PRR model of the present invention are described in further detail.
For the CRR model in the prior art, private data are assumed to be of equal importance, so the privacy-protection demands of different sensitivity values are identical. Let the privacy protection model of the CRR model be P_CRR, which can be represented by an m×m row-stochastic matrix:
The same privacy-protection operation is applied to every sensitivity value: the true value (i.e., the real sensitivity value) is output with probability P, and each of the other m-1 sensitivity values is output with probability (1-P)/(m-1). This model is referred to as the m-CRR model, where P_CRR is an m×m symmetric matrix satisfying |P_CRR| ≠ 0, i.e., P_CRR is invertible.
Drawing on the parameter design of the k-RR model, a parameter λ ≥ 0 is introduced. To satisfy the privacy-protection demand of every sensitivity value x_i while minimizing the privacy leakage risk of the data, every sensitivity value x_i responds with its true value with the same probability; substituting into structural formula (1), the expression of the P_CRR privacy protection model is:
In practice, the parameter λ can be set according to the specific privacy-protection demand of the sensitivity value x_i: the larger the sensitivity value weight, i.e., the greater the privacy-protection demand, the smaller the corresponding λ should be set.
In the personalized random response model of the present invention, by contrast, the higher the subjective sensitivity of a sensitivity value x_i, the larger the corresponding weight w_i and the higher the demand for privacy protection. The PRR model takes personalized privacy-protection demands into account by introducing the sensitivity value weight w_i into the random response decision for the sensitivity value x_i; the privacy protection model P_PRR of the PRR model is then described as:
This PRR model is referred to as the m-PRR model, where P_{i,i}(w_i) vividly describes the probability that an individual holding sensitivity value x_i tells the truth. From formula (3), under the m-PRR model the probability that sensitivity value x_i outputs its true value is P_{i,i}(w_i), which is a decreasing function of w_i. When P_{i,i}(w_i) satisfies certain conditions, it reflects from the side the degree of privacy protection of the sensitivity value x_i: the smaller P_{i,i}(w_i), the higher the degree of privacy protection of x_i and the smaller the corresponding privacy leakage risk, and vice versa.
For each sensitivity value x_i, according to the relationship between the sensitivity value weight w_i and the privacy protection model P_PRR, all sensitivity values x_i (sensitivity value groups) respond with their true values with the corresponding probabilities, while satisfying P_{i1,i1} ≤ P_{i2,i2} ≤ ... ≤ P_{im,im}; substituting into structural formula (2), the expression of the P_PRR privacy protection model is:
where μ_i can be set according to the specific privacy-protection demand of the group holding sensitivity value x_i. This mainly overcomes the problem that the CRR model applies the same privacy-protection operation to all sensitivity values, and realizes personalized privacy protection for multiple sensitivity values. For a sensitivity value x_i, the larger the weight w_i, the higher the privacy-protection demand, and the lower the probability of responding with its true value.
Considering that different sensitivity values have different sensitivities, the embodiment of the present invention applies personalized random perturbation to the sensitivity values x_i of private data through the personalized random response model, thereby realizing personalized privacy protection: high-intensity random perturbation is applied to highly sensitive values, and lower-intensity random perturbation to values of low sensitivity. That is, an individual holding a highly sensitive value outputs the true value with a lower probability, while one holding a value of low sensitivity outputs the true value with a higher probability; the individuals are mutually independent; noisy data are thus obtained. By statistically analyzing the collected noisy data, the data collector obtains the distribution of the sensitive-value information, i.e., an effective statistical estimate.
According to the privacy-weighted maximum a posteriori probability (PMAP) criterion, the subjective privacy leakage degree L_cw of the PRR model and the subjective privacy leakage degree L_cwi of each sensitivity value can be obtained, so as to measure the degree of privacy protection of the whole system and of each sensitivity value; the definitions are therefore given first.
Definition 1 (PMAP criterion):
Select a reconstruction function satisfying the condition that the weighted posterior transition probability w_i P_{X|Y}(x_i|y_j) is maximized, where the maximizer corresponds to the weight of the target sensitivity value, and the reconstruction function maps the perturbed sensitivity value y_j to the denoised sensitive-value information. The above criterion is called the "privacy-weighted maximum a posteriori probability (PMAP) criterion" or the "privacy-weighted minimum total error probability criterion".
Definition 2 (privacy-weighted accuracy / Bayes utility):
The posterior information obtained by an adversary under the privacy scheme is characterized by the weights w_i (i = 1, 2, ..., m). Using the PMAP criterion, the weighted probability of correctly inferring the subjectively perceived private information is
This probability P_cw is called the privacy-weighted accuracy or the privacy-weighted Bayes utility, where P_Y(y_j) denotes the probability of the perturbed output sensitivity value y_j. P_cw measures the subjective data security of the model: the larger P_cw, the lower the subjective security of the model.
In addition, P_cw has an intuitive interpretation: in a single guess, taking the influence of the sensitivity value weights into account and following the privacy-weighted maximization principle, it is the maximum weighted probability of correctly guessing (or inferring) the true private information of a participating individual.
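Definition 1 and the interpretation above can be sketched as follows: given the channel matrix P_{Y|X}, the prior P_X and the weights w_i, the PMAP reconstruction maps each observed y_j to the x_i maximizing the weighted posterior w_i P_{X|Y}(x_i|y_j); since P_Y(y_j) is a common denominator, it suffices to maximize w_i P_{Y|X}(y_j|x_i) P_X(x_i). This is a sketch assuming that reading of the definition.

```python
def pmap_reconstruct(j, channel, prior, weights):
    """PMAP criterion: map observed output index j to the input index i
    maximizing w_i * P(y_j | x_i) * P(x_i); P_Y(y_j) is a common factor,
    so maximizing the weighted posterior reduces to this product."""
    scores = [w * channel[i][j] * p
              for i, (w, p) in enumerate(zip(weights, prior))]
    return max(range(len(scores)), key=scores.__getitem__)
```

With a noiseless (identity) channel and unit weights, the reconstruction simply returns the observed value; under a real PRR channel the weights bias the guess toward high-weight values.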
Definition 3 (subjective privacy leakage degree):
The subjective privacy leakage degree of the PRR model is defined as the ratio of the privacy-weighted accuracy under known posterior information to the privacy-weighted accuracy under known prior information only, described as
where L_cw ≥ 1. It reflects the relative privacy leakage degree introduced by the privacy protection model under the PMAP criterion when the influence of the sensitivity value weights is considered: the larger its value, the higher the corresponding privacy leakage degree.
For the group holding sensitivity value x_i, what matters more is the privacy leakage degree or privacy protection degree of x_i itself, rather than the subjective privacy protection degree of the entire privacy protection system. For the PRR model, under the PMAP criterion the weighted probability that the sensitivity value is x_i and x_i is correctly inferred is
where the indicator involved is defined through the Hamming distance (i.e., it takes one value when the reconstructed value equals x_i, and the other value otherwise), and P_cw(x_i) satisfies:
Definition 4 (subjective privacy leakage degree of sensitivity value x_i):
Under the PRR model, the subjective privacy leakage degree L_cwi of sensitivity value x_i is defined as the ratio of the weighted probability, under the PMAP criterion, that the sensitivity value is x_i and x_i is correctly inferred, to the privacy-weighted prior probability of x_i, described as
where P_X(x_i) denotes the probability of the sensitivity value x_i to be collected, and L_cwi satisfies 0 ≤ L_cwi ≤ 1. The larger L_cwi, the higher the subjective privacy leakage degree of x_i, meaning the lower the degree of protection the PRR model gives x_i. In practice, considering that each sensitivity value has a different sensitivity, the subjective privacy leakage degree of a highly sensitive value should be relatively small, so that a higher degree of privacy protection can be achieved; conversely, the subjective privacy leakage degree of a value of low sensitivity may be relatively large, because its demand for privacy protection is lower.
For ease of description, the notation is simplified as follows: P_X(x_i) is abbreviated as P_i (i ∈ {1, 2, ..., m}), and the CRR and PRR models are denoted P_CRR and P_PRR respectively. The sensitivity value weights are arranged in descending order, w_i1 ≥ w_i2 ≥ ... ≥ w_im, so the privacy-protection demands of the corresponding sensitivity values decrease in turn. According to the relationship between the sensitivity value weight w_i and the channel transition probability (formula (2)), the probabilities with which the RR model outputs the true values should satisfy P_{i1,i1} ≤ P_{i2,i2} ≤ ... ≤ P_{im,im}; the group holding the sensitivity value x_i1 with the largest weight w_i1 has the largest privacy-protection demand, and under the PRR model its degree of privacy protection should also be the largest.
Denote the empirical estimate of P_i under the PRR model by a hat, and likewise the empirical estimate of the prior distribution P_X. Since P_X P_PRR = P_Y, in the statistical estimation model, given P_PRR with |P_PRR| ≠ 0, the estimate can also be described as:
where the matrix involved is the inverse matrix of P_PRR.
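The relation P_X P_PRR = P_Y means the collector can estimate the prior by applying the inverse channel matrix to the empirical output distribution. The sketch below solves the linear system x · P = p̂_Y by Gauss-Jordan elimination (equivalent to multiplying by the inverse) without external libraries; it is an illustration of the estimator, not the patent's own code.

```python
def estimate_prior(p_y_hat, channel):
    """Recover the prior estimate from the empirical output distribution:
    solve x @ channel = p_y_hat, i.e. channel^T x^T = p_y_hat^T,
    by Gauss-Jordan elimination (channel must be invertible)."""
    m = len(channel)
    # augmented matrix [channel^T | p_y_hat]
    A = [[channel[i][r] for i in range(m)] + [p_y_hat[r]] for r in range(m)]
    for col in range(m):
        # partial pivoting for numerical stability
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        d = A[col][col]
        A[col] = [v / d for v in A[col]]
        for r in range(m):
            if r != col and A[r][col]:
                f = A[r][col]
                A[r] = [v - f * w for v, w in zip(A[r], A[col])]
    return [A[r][m] for r in range(m)]
```

When the output distribution is exactly P_X P_PRR, the estimator recovers P_X up to floating-point error; with an empirical output distribution it yields the consistent estimate described in the text.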
Drawing on the assessment of the privacy distribution estimation performance of the k-RR model, the expected norm of the difference between the estimate and the true distribution is used to assess the privacy distribution estimation of the PRR model, referred to as the estimation error: the smaller the estimation error, the higher the accuracy of the privacy distribution estimate, i.e., the higher the data availability.
For any m, N and P_{i,i} ≠ 1/m (i = 1, 2, ..., m; N >> m), the estimation error of the sensitive-value distribution under the PRR model can be described as:
where the expectation term denotes the expected value of the two-norm of the estimation difference, the matrix element term denotes the entry in row i, column j of the inverse channel transition matrix under the PRR model, the proportion term denotes the actual proportion of y_j in the output data Y^N under the PRR model, and the last term denotes the expected value of the posterior probability.
Since a closed-form solution cannot be derived, formula (12) gives the expression of the accuracy only in the numerical-solution case. In practice, P_PRR is usually known, i.e., its inverse matrix is known, so formula (12) can be obtained by calculation.
Under the CRR model, different sensitivity values are of equal importance, so the privacy demand of each sensitivity value is identical. In this case, the same privacy-protection operation is applied to every sensitivity value, and the probabilities of responding with the true sensitivity value satisfy:
P_{i1,i1} = P_{i2,i2} = ... = P_{im,im} = P.
Considering that the sensitivity value x_i1 has the largest weight w_i1 and the highest privacy-protection demand, the probability P_{i1,i1} = P of responding with the true value is the smallest.
Denote the empirical estimate of P_i under the CRR model by a hat, and likewise the empirical estimate of the prior distribution P_X; the estimation error of its privacy distribution can then be described as:
The smaller this value, i.e., the smaller the estimation error, the higher the accuracy of the privacy distribution estimate.
The efficiency of the privacy distribution estimate of the PRR model relative to that of the CRR model is defined as the relative efficiency R_e:
Under a given subjective privacy leakage degree, the availability of the data is maximized, i.e., its statistical quality is maximized, and the relative efficiency satisfies R_e ≥ 1. This is because the CRR model applies the same privacy-protection operation to all sensitivity values: to satisfy the respective privacy demands while minimizing the privacy leakage degree, it generally adopts the privacy-protection parameter setting required by the sensitivity value with the largest protection demand, which over-protects the groups holding values of low sensitivity. The PRR model, by contrast, sets its privacy parameters according to the personalized privacy demands of the specific sensitivity values, realizing personalized privacy protection. On the other hand, groups holding values of low sensitivity have a lower demand for privacy protection and tend to respond with their true values with a relatively large probability, thus contributing more to the availability of the statistical data. Clearly, the estimate of the PRR model for multiple sensitivity values proposed by the present invention is more efficient than that of the CRR model.
For ease of understanding, a MATLAB simulation is given below with reference to an example, simulating the process in which an investigator collects data using a random response model and then estimates the distribution of the private information.
The prior distribution of the private information source is P_X = [P_1, P_2, ..., P_m], and the sample size of the participants is N (N >> m). The samples X_1, X_2, ..., X_n, ..., X_N (n = 1, 2, ..., N) are drawn from the distribution P_X; the private data of different individuals are mutually independent and identically distributed, and each individual is assumed to hold only one sensitivity value.
As shown in Fig. 4, the samples of the participants pass through the random perturbation processing of the RR model P_{Y|X} (the sensitivity values to be perturbed are perturbed to obtain the corresponding perturbed sensitivity values), yielding noisy data. In Fig. 4, the samples of the participants are X_1, X_2, ..., X_n, ..., X_N (n = 1, 2, ..., N), the sensitivity values to be perturbed are x_1, ..., x_i, ..., x_m (i = 1, 2, ..., m), the corresponding perturbed sensitivity values are y_1, ..., y_j, ..., y_m (j = 1, 2, ..., m), and the perturbed samples (i.e., the noisy data) are Y_1, ..., Y_n, ..., Y_N (n = 1, 2, ..., N).
In the simulation, the number of participating samples is set to N = 10^5, and the sample value set is X = {x_1, x_2, x_3} = {AIDS, Cancer, Flu}, i.e., m = 3; each disease represents one sensitivity value, and the samples are drawn from the prior distribution of the private information source P_X = [P_1, P_2, P_3] = [0.2, 0.3, 0.5]. (Note: in practice the prior distribution P_X is unknown; m = 3 and P_X = [0.2, 0.3, 0.5] here are only a special case. The PRR model proposed by the present invention is applicable to RR-based personalized privacy protection over any m ≥ 2 discrete finite sensitivity values.)
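The sample-generation step can be sketched with inverse-CDF sampling from P_X. This is illustrative Python, not the MATLAB code used in the simulation.

```python
import random

def sample_prior(prior, n, rng=random):
    """Draw n i.i.d. sensitivity-value indices from the prior distribution
    by inverse-CDF sampling."""
    cdf = []
    acc = 0.0
    for p in prior:
        acc += p
        cdf.append(acc)
    out = []
    for _ in range(n):
        u = rng.random()
        out.append(next(i for i, c in enumerate(cdf) if u <= c))
    return out
```

Each sampled index is then passed through the RR channel to produce the noisy data the collector actually sees.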
The CRR model and the PRR model adopt the privacy protection models of formula (2) and formula (4) respectively, with m = 3, and are denoted the 3-CRR model and the 3-PRR model.
For ease of analysis, μ_i = μ_j = μ (i, j = 1, 2, 3) is set in the PRR model.
The three sensitivity value weights, set according to the weight-setting principle for w_i, are shown in Table 1.
As can be seen from Table 1, for the sensitivity values {AIDS, Cancer, Flu} the weights decrease in turn: "AIDS", corresponding to sensitivity value x_1, has the highest sensitivity and the largest weight setting, while "Flu", corresponding to sensitivity value x_3, has the lowest sensitivity and the smallest weight setting.
Table 1. Three sensitivity value weights and their prior probability distribution

| Sensitivity value X | x1 | x2 | x3 |
|---|---|---|---|
| Disease type | AIDS | Cancer | Flu |
| Sensitivity value weight wi (×3) | 0.6 | 0.3 | 0.1 |
| Probability | 0.2 | 0.3 | 0.5 |
In the PRR model, individuals holding a sensitivity value of high sensitivity output the true value with a lower probability, and those holding a value of low sensitivity output the true value with a higher probability; the individuals are mutually independent. As shown in Table 1, when submitting sensitive data, an individual with "AIDS" outputs the true value with probability 20% and responds with either of the other two sensitivity values with probability 40% each; similarly, an individual with "Cancer" outputs the true value with probability 30% and responds with either of the other two sensitivity values with probability 35% each; an individual with "Flu" outputs the true value with probability 50% and responds with either of the other two sensitivity values with probability 25% each.
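The concrete probabilities above define the full 3-PRR channel matrix. As a worked check using only the numbers stated in the text, the matrix is row-stochastic and, with the prior P_X = [0.2, 0.3, 0.5], yields the output distribution P_Y = P_X P_PRR:

```python
P_PRR = [
    [0.20, 0.40, 0.40],  # AIDS:   20% truth, 40% each other value
    [0.35, 0.30, 0.35],  # Cancer: 30% truth, 35% each other value
    [0.25, 0.25, 0.50],  # Flu:    50% truth, 25% each other value
]
P_X = [0.2, 0.3, 0.5]

# output distribution seen by the collector: P_Y = P_X @ P_PRR
P_Y = [sum(P_X[i] * P_PRR[i][j] for i in range(3)) for j in range(3)]
# P_Y works out to [0.27, 0.295, 0.435], which still sums to 1
```

The collector observes this noisy distribution and inverts the known channel to recover an estimate of P_X, as described in the estimation analysis above.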
Fig. 5 gives the curves of the objective privacy leakage degree L_ci of each sensitivity value under the 3-CRR model with respect to the parameter λ under the CMAP criterion.
As can be seen from Fig. 5, when λ ≤ 1.65 (the critical value is related to the specific prior distribution), the privacy leakage degrees from low to high are x_1, x_2, x_3, showing that the most sensitive value x_1 has the highest degree of privacy protection; x_1 is essentially completely protected.
However, when λ > 1.65, the privacy leakage degree curves L_ci of x_1, x_2, x_3 coincide, indicating that the three sensitivity values then receive objectively the same degree of privacy protection under the CRR model. In practice, the "Flu" group corresponding to x_3 may not have a high demand for privacy protection, while the "AIDS" group corresponding to x_1 has a relatively high demand; this over-protects the low-sensitivity value x_3 and under-protects the highly sensitive value x_1.
Therefore, under certain conditions the CRR model provides the same degree of privacy protection for all sensitivity values, which is its shortcoming.
Fig. 6 gives the curves of the subjective privacy leakage degree L_cwi of each sensitivity value under the 3-PRR model with respect to the parameter μ under the PMAP criterion.
As can be seen from Fig. 6, the subjective privacy leakage degrees of the three sensitivity values increase with the parameter μ.
For a fixed μ, for the sensitivity values x_1, x_2, x_3 of decreasing sensitivity (corresponding in turn to AIDS, Cancer, Flu), the subjective privacy leakage degree increases in turn, meaning that the subjective degree of privacy protection of x_1, x_2, x_3 decreases in turn; this is because the lower the sensitivity of a value, the lower its privacy-protection demand. In other words, the PRR model gives the most sensitive value x_1 the highest degree of privacy protection and the least sensitive value x_3 a relatively low degree of privacy protection, which is an advantage of the PRR model of the present invention over the CRR model.
To verify that the PRR model proposed by the present invention is better than the CRR model in terms of statistical quality, the two must be compared under identical conditions; the simulation of the embodiment of the present invention therefore fixes the subjective privacy leakage degree (Definition 3) to be the same under the two models.
Fig. 7 gives the simulated values of the privacy-distribution estimation error for the 3-CRR and 3-PRR models, together with the theoretical (calculated) values of the estimation error as a reference (the Monte Carlo simulation is set to 10^4 runs).
As can be seen from Fig. 7, the simulated and calculated values of the privacy-distribution estimation error are nearly identical under both models, verifying the correctness of the above theoretical analysis of data quality.
In addition, for a fixed subjective privacy leakage degree L_cw, the estimation error of the privacy distribution under the CRR model is higher than that under the PRR model, i.e., the statistical estimation of the PRR model is more accurate and its performance is better.
On the other hand, under both models the estimation error of the privacy distribution decreases as the subjective privacy leakage degree L_cw increases. In surveys and analyses, however, respondents generally wish their subjective privacy leakage degree to be as small as possible, which reduces the availability of the data for statistical analysis; data availability and privacy are thus in conflict. In practice, a good trade-off should be made according to the required accuracy of the privacy distribution and the privacy-protection demand.
Table 2 shows, for five subjective privacy leakage degrees, the parameter settings required by each of the two models and the simulated values of the relative efficiency R_e of the privacy distribution estimation (formula (14)), with the theoretical values given as a reference (the Monte Carlo simulation is set to 10^4 runs).
As can be seen from Table 2, when the subjective privacy leakage degrees of the two models are identical, the calculated and simulated values of the relative efficiency R_e are nearly identical, demonstrating the correctness and validity of the theoretical analysis of the present invention (consistent with the conclusion of Fig. 7).
Table 2. Parameter settings of the two models under a fixed subjective privacy leakage degree, and the simulated and theoretical values of their relative efficiency R_e
On the other hand, the relative efficiency always satisfies R_e ≥ 1, meaning that when private data are not of equal importance and personalized privacy demands are considered, the PRR model is better than the CRR model in terms of statistical data availability under a fixed subjective privacy leakage degree; moreover, the PRR model can flexibly adjust the privacy parameter μ for different sensitivity values to realize personalized privacy protection. This is consistent with the above theoretical analysis: under a fixed subjective privacy leakage, the PRR model can realize personalized privacy-protected data collection based on random response according to the privacy demand of each sensitivity value.
Furthermore, the subjective privacy leakage degrees of the two models increase with the parameters λ and μ, and the security of the private data correspondingly decreases. Therefore, in practice suitable λ and μ values need to be set according to the specific privacy-protection demand, so as to control the privacy leakage degree within a reasonable range and achieve a good trade-off between data security and statistical quality.
In short, if private data were of equal importance, the CRR model would suffice for such surveys and analyses. In practice, however, data are not of equal importance and privacy-protection demands differ. Compared with the CRR model, the PRR model used in the method of this embodiment introduces sensitivity value weights, takes the personalized privacy demands of different sensitivity values into account, and overcomes the CRR model's application of the same privacy-protection operation to all sensitivity values. It thereby ensures that the groups holding different sensitivity values all attain the expected degree of privacy protection, with highly sensitive values protected to a higher degree than values of low sensitivity, realizing personalized privacy protection of important practical significance.
It should be noted that, in this document, relational terms such as "first" and "second" are used merely to distinguish one entity or operation from another, and do not necessarily require or imply any actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article or device including that element.
The embodiments in this specification are described in a related manner; for identical or similar parts, the embodiments may refer to each other, and each embodiment focuses on its differences from the others. In particular, the system embodiment is described relatively simply since it is substantially similar to the method embodiment; for relevant details, refer to the description of the method embodiment.
The foregoing is merely a preferred embodiment of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.
Claims (10)
1. A personalized privacy protection method based on random response, applied to a client, characterized in that the method comprises:
obtaining a target sensitivity value input by a user for target private data to be collected, wherein the target private data corresponds to multiple sensitivity values, and the target sensitivity value is one of the multiple sensitivity values;
perturbing the target sensitivity value using a preset personalized random response model to obtain a perturbed sensitivity value;
wherein the personalized random response model is a response model that introduces a weight for each sensitivity value of the target private data, and the personalized random response model makes the probability of outputting the true value for a high-weight sensitivity value lower than that for a low-weight sensitivity value; wherein the weight of a sensitivity value with a high privacy-protection demand is higher than the weight of a sensitivity value with a low privacy-protection demand.
2. The method according to claim 1, characterized in that the expression of the personalized random response model is as follows:
where x_i is the sensitivity value to be perturbed;
m is the number of sensitivity values of the target private data, m >= 2;
w_i is the weight of the sensitivity value x_i;
P_{i,j}(w_i) is a decreasing function of the weight w_i;
y_j is the perturbed sensitivity value corresponding to the sensitivity value to be perturbed.
3. The method according to claim 2, characterized in that, in the personalized random response model,
where μ_i is a preset adjustment parameter for the degree of privacy protection.
4. The method according to claim 2 or 3, characterized in that perturbing the target sensitivity value using the preset personalized random response model to obtain the perturbed sensitivity value comprises:
responding with the true value of the target sensitivity value with probability P = P_{i,j}(w_i), and responding with a perturbed value with the complementary probability, to obtain the perturbed sensitivity value.
5. The method according to claim 2 or 3, characterized in that, in the personalized random response model, the weights further satisfy 0 < w_i < m.
6. A personalized privacy protection device based on random response, applied to a client, characterized in that the device comprises:
an obtaining module, configured to obtain a target sensitivity value input by a user for target private data to be collected, wherein the target private data corresponds to multiple sensitivity values, and the target sensitivity value is one of the multiple sensitivity values;
a perturbation module, configured to perturb the target sensitivity value using a preset personalized random response model to obtain a perturbed sensitivity value; wherein the personalized random response model is a response model that introduces a weight for each sensitivity value of the target private data, and the personalized random response model makes the probability of outputting the true value for a high-weight sensitivity value lower than that for a low-weight sensitivity value; wherein the weight of a sensitivity value with a high privacy-protection demand is higher than the weight of a sensitivity value with a low privacy-protection demand.
7. The device according to claim 6, characterized in that the expression of the personalized random response model is as follows:
where x_i is the sensitivity value to be perturbed;
m is the number of sensitivity values of the target private data, m >= 2;
w_i is the weight of the sensitivity value x_i;
P_{i,j}(w_i) is a decreasing function of the weight w_i;
y_j is the perturbed sensitivity value corresponding to the sensitivity value to be perturbed.
8. The device according to claim 7, characterized in that, in the personalized random response model,
where μ_i is a preset adjustment parameter for the degree of privacy protection.
9. The device according to claim 7 or 8, characterized in that the perturbation module is specifically configured to respond with the true value of the target sensitivity value with probability P = P_{i,j}(w_i), and to respond with a perturbed value with the complementary probability, to obtain the perturbed sensitivity value.
10. The device according to claim 7 or 8, characterized in that, in the personalized random response model, the weights further satisfy 0 < w_i < m.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910507049.9A CN110334539B (en) | 2019-06-12 | 2019-06-12 | Personalized privacy protection method and device based on random response |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910507049.9A CN110334539B (en) | 2019-06-12 | 2019-06-12 | Personalized privacy protection method and device based on random response |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110334539A true CN110334539A (en) | 2019-10-15 |
CN110334539B CN110334539B (en) | 2021-06-22 |
Family
ID=68140983
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910507049.9A Expired - Fee Related CN110334539B (en) | 2019-06-12 | 2019-06-12 | Personalized privacy protection method and device based on random response |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110334539B (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111242194A (en) * | 2020-01-06 | 2020-06-05 | 广西师范大学 | Differential privacy protection method for affinity propagation clustering |
CN112084493A (en) * | 2020-09-18 | 2020-12-15 | 支付宝(杭州)信息技术有限公司 | Content risk applet identification method and device based on differential privacy protection |
CN112528607A (en) * | 2020-12-18 | 2021-03-19 | 辽宁工程技术大学 | Questionnaire survey system and method for sensitive questions |
CN112541574A (en) * | 2020-12-03 | 2021-03-23 | 支付宝(杭州)信息技术有限公司 | Privacy-protecting business prediction method and device |
CN113704827A (en) * | 2021-09-17 | 2021-11-26 | 支付宝(杭州)信息技术有限公司 | Privacy protection method and device in biological identification process |
CN117349896A (en) * | 2023-12-05 | 2024-01-05 | 中国电子科技集团公司第十研究所 | Data collection method, analysis method and analysis system based on sensitivity classification |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190065775A1 (en) * | 2017-08-25 | 2019-02-28 | Immuta, Inc. | Calculating differentially private queries using local sensitivity on time variant databases |
CN107871087A (en) * | 2017-11-08 | 2018-04-03 | 广西师范大学 | Personalized differential privacy protection method for high-dimensional data publishing in a distributed environment |
CN109829320A (en) * | 2019-01-14 | 2019-05-31 | 珠海天燕科技有限公司 | Information processing method and device |
Non-Patent Citations (4)
Title |
---|
JINFEI LIU, et al.: "Rating: Privacy Preservation for Multiple Attributes with Different Sensitivity", 2011 11th IEEE International Conference on Data Mining Workshops * |
YUE WANG: "Using Randomized Response for Differential Privacy", Workshop Proceedings of the EDBT/ICDT 2016 Joint Conference * |
SUN CHONGJING: "Research on privacy-preserving data mining theory for attributes and relations", China Doctoral Dissertations Full-text Database, Information Science and Technology Series * |
YANG GAOMING: "Invariant post-randomized response perturbation of correlated attributes under local differential privacy constraints", Acta Electronica Sinica * |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111242194A (en) * | 2020-01-06 | 2020-06-05 | 广西师范大学 | Differential privacy protection method for affinity propagation clustering |
CN111242194B (en) * | 2020-01-06 | 2022-03-08 | 广西师范大学 | Differential privacy protection method for affinity propagation clustering |
CN112084493A (en) * | 2020-09-18 | 2020-12-15 | 支付宝(杭州)信息技术有限公司 | Content risk applet identification method and device based on differential privacy protection |
CN112084493B (en) * | 2020-09-18 | 2024-03-26 | 支付宝(杭州)信息技术有限公司 | Content risk applet identification method and device based on differential privacy protection |
CN112541574A (en) * | 2020-12-03 | 2021-03-23 | 支付宝(杭州)信息技术有限公司 | Privacy-protecting business prediction method and device |
CN112541574B (en) * | 2020-12-03 | 2022-05-17 | 支付宝(杭州)信息技术有限公司 | Privacy-protecting business prediction method and device |
CN112528607A (en) * | 2020-12-18 | 2021-03-19 | 辽宁工程技术大学 | Questionnaire survey system and method for sensitive questions |
CN112528607B (en) * | 2020-12-18 | 2024-05-14 | Questionnaire survey system and method for sensitive questions |
CN113704827A (en) * | 2021-09-17 | 2021-11-26 | 支付宝(杭州)信息技术有限公司 | Privacy protection method and device in biological identification process |
CN113704827B (en) * | 2021-09-17 | 2024-03-29 | 支付宝(杭州)信息技术有限公司 | Privacy protection method and device in biological identification process |
CN117349896A (en) * | 2023-12-05 | 2024-01-05 | 中国电子科技集团公司第十研究所 | Data collection method, analysis method and analysis system based on sensitivity classification |
CN117349896B (en) * | 2023-12-05 | 2024-02-06 | 中国电子科技集团公司第十研究所 | Data collection method, analysis method and analysis system based on sensitivity classification |
Also Published As
Publication number | Publication date |
---|---|
CN110334539B (en) | 2021-06-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110334539A (en) | Personalized privacy protection method and device based on random response | |
Birkeland et al. | The network approach to posttraumatic stress disorder: A systematic review | |
Lanza et al. | Latent class analysis with distal outcomes: A flexible model-based approach | |
Yang | Health expenditure, human capital, and economic growth: an empirical study of developing countries | |
King et al. | The balance‐sample size frontier in matching methods for causal inference | |
Franck et al. | Accurate characterization of delay discounting: A multiple model approach using approximate Bayesian model selection and a unified discounting measure | |
Newman et al. | Mixture models and exploratory analysis in networks | |
Acock | Working with missing values | |
Feng et al. | On the simulation repetition and temporal discretization of stochastic occupant behaviour models in building performance simulation | |
Olawumi Israel‐Akinbo et al. | An investigation of multidimensional energy poverty among South African low‐income households | |
Thomson | Same effects in different worlds: the transposition of EU directives | |
Hotz et al. | Balancing data privacy and usability in the federal statistical system | |
Ozdemir et al. | Measuring development levels of NUTS-2 regions in Turkey based on capabilities approach and multi-criteria decision-making | |
Bang et al. | Average cost-effectiveness ratio with censored data | |
Zheng et al. | A solution approach to the weak linear bilevel programming problems | |
Ding et al. | How and when does perceived supervisor support for strengths use influence employee strengths use? The roles of positive affect and self-efficacy | |
Moodie et al. | Causal inference: Critical developments, past and future | |
Musiałkowska et al. | Successes & failures in EU cohesion policy: An introduction to EU cohesion policy in Eastern, Central, and Southern Europe | |
Ştefănescu et al. | Performance audit in the vision of public sector management. The case of Romania | |
Greyling et al. | Access to micro-and informal loans: Evaluating the impact on the quality of life of poor females in South Africa | |
Guha et al. | Bayesian causal inference with bipartite record linkage | |
Chiu et al. | Statistical inference for food webs with emphasis on ecological networks via Bayesian melding | |
Blom | Nonresponse bias adjustments: what can process data contribute? | |
Costa et al. | Deriving a preference-based utility measure for cancer patients from the European Organisation for the Research and Treatment of Cancer’s Quality of Life Questionnaire C30: a confirmatory versus exploratory approach | |
Acar et al. | Assessing robustness against potential publication bias in coordinate based fMRI meta-analyses using the Fail-Safe N |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20210622 |