CN108959961B - Privacy protection method for inquiring average score - Google Patents


Info

Publication number
CN108959961B
CN108959961B (application CN201810671583A)
Authority
CN
China
Prior art keywords
noise
score
processing
privacy budget
privacy
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810671583.9A
Other languages
Chinese (zh)
Other versions
CN108959961A (en)
Inventor
陈志立
钱伟
张海龙
江婉榕
张宝根
姚明明
王宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui University
Original Assignee
Anhui University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui University filed Critical Anhui University
Priority to CN201810671583.9A priority Critical patent/CN108959961B/en
Publication of CN108959961A publication Critical patent/CN108959961A/en
Application granted granted Critical
Publication of CN108959961B publication Critical patent/CN108959961B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Storage Device Security (AREA)

Abstract

The invention discloses a privacy protection method, apparatus, computer device, and storage medium for querying an average score. The method comprises the following steps: calculating a personal privacy budget threshold; generating a query condition and judging whether the user's current privacy budget exceeds the personal privacy budget threshold; if the threshold is not exceeded, calculating the system sensitivity; adding noise to the real scores according to the system sensitivity; and performing numerical processing on the noise-added scores and sending the processed score information to the user. By calculating a personal privacy budget threshold, computing the system sensitivity only while the current privacy budget remains below that threshold, adding noise to the real results, and numerically processing the noised results before returning them, the method improves the security of the score query system against organized, repeated attacks mounted by multiple users simultaneously.

Description

Privacy protection method for inquiring average score
Technical Field
The invention relates to the technical field of computers, in particular to a privacy protection method and device for inquiring average score, computer equipment and a storage medium.
Background
As enrollment grows, the data held in educational administration systems increases sharply; in particular, the volume of student score data has become very large, yet its processing remains at the primitive stage of backup, query, and simple statistics. Third-party services (measurement platforms, statistical software, data analysis agencies, and the like) are therefore needed to mine the data further (admission prediction, teaching analysis, score evaluation, and so on). The involvement of third-party service organizations means that published data may leak the privacy of some students' scores, and harmful incidents caused by leaked student information continue to occur. As privacy-disclosure events multiply, society has become increasingly aware of the importance of privacy; organizations such as enterprises and schools have strengthened their safeguards, raising the difficulty of attack by a lone individual. Attacks are consequently no longer individual behavior but organized, repeated, simultaneous attacks by multiple parties, posing a new challenge to privacy protection.
Disclosure of Invention
The invention provides a privacy protection method and device for inquiring average scores, computer equipment and a storage medium, and aims to further improve the safety performance of an average score inquiring system.
In a first aspect, the present application provides a privacy protection method for querying an average performance, including:
calculating a personal privacy budget threshold according to a preset collusion threshold and a preset global security privacy budget;
generating a query condition, and judging whether the current privacy budget queried by the user under the query condition exceeds the personal privacy budget threshold value;
if the current privacy budget queried by the user under the query condition does not exceed the personal privacy budget threshold, calculating the system sensitivity;
carrying out noise adding processing on the real score according to the system sensitivity;
and carrying out numerical value processing on the real score after the noise processing, and sending score information after the numerical value processing to a user.
In a second aspect, the present application provides a privacy protecting apparatus for querying average performance, comprising:
the first calculating unit is used for calculating the personal privacy budget threshold according to the preset collusion threshold and the preset global security privacy budget;
the judging unit is used for generating a query condition and judging whether the current privacy budget queried by the user under the query condition exceeds the personal privacy budget threshold value;
the second calculating unit is used for calculating the sensitivity of the system if the current privacy budget queried by the user under the query condition does not exceed the personal privacy budget threshold;
the first execution unit is used for carrying out noise adding processing on the real score according to the system sensitivity;
and the second execution unit is used for carrying out numerical value processing on the real score after the noise processing and sending the score information after the numerical value processing to the user.
In a third aspect, the present application further provides a computer device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the steps of the privacy protection method for querying the average score provided in any one of the embodiments of the present application when executing the program.
In a fourth aspect, the present application further provides a storage medium, wherein the storage medium stores a computer program, the computer program comprises program instructions, which when executed by a processor, cause the processor to execute the steps of the privacy protection method for querying average performance according to any embodiment provided in the present application.
The embodiment of the invention calculates a personal privacy budget threshold and judges whether the user's current privacy budget exceeds it; when it does not, the system sensitivity is calculated, the real scores are noised according to that sensitivity, and the noised scores are numerically processed before being sent to the user, thereby improving the security of the score query system against organized, repeated attacks mounted by multiple users simultaneously.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic flowchart of a privacy protection method for querying an average score according to an embodiment of the present invention;
FIG. 2 is a graph of the Laplace density function distribution;
FIG. 3 is a graph of the Laplace probability cumulative function distribution;
FIG. 4 is a block diagram illustrating an exemplary privacy preserving apparatus for querying average performance according to an embodiment of the disclosure;
fig. 5 is a schematic block diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
The embodiment of the application provides a privacy protection method and device for inquiring average scores, computer equipment and a storage medium.
The privacy protection method for inquiring the average score is applicable to application scenes comprising a user terminal and a server.
The user terminal can be an electronic device such as a smart phone, a tablet computer, a notebook computer, a desktop computer, a personal digital assistant and a wearable device; the server may be an independent server or a server cluster composed of a plurality of servers.
Referring to fig. 1, fig. 1 is a schematic flow chart of a privacy protection method for querying an average performance according to an embodiment of the present application. As shown in fig. 1, the privacy protecting method for the average score includes steps S101 to S105.
Step S101: and calculating the personal privacy budget threshold according to the preset collusion threshold and the preset global security privacy budget.
Specifically, the global privacy budget ε is set by the system administrator. Considering the worst case of attacker collusion, the collusion threshold is set to k, and the personal privacy budget threshold is
ε₀ = ε / k
Step S102: and generating a query condition, and judging whether the current privacy budget queried by the user under the query condition exceeds the personal privacy budget threshold.
Specifically, the query condition is set: the system contains the scores of multiple colleges and multiple classes, and the user may set the query condition according to need. Let the privacy budget of each query belong to the set M = {ε₁, ε₂, ε₃, …, ε_i, …, ε_n}, where the per-query budget ε_i = p(i) is a function of the query number i. The queries must satisfy
ε₁ + ε₂ + … + ε_n ≤ ε₀
If this condition is not met, the user's query is terminated.
In one embodiment, a user can set different query conditions as needed before querying, such as querying the average score of a data-structure exam for class two of the computer science and technology major, School of Computer Science and Technology, 2015 intake. If the query condition is unchanged, the consumed privacy budget accumulates until it is exhausted. If the user changes the query condition, the privacy budget the user has consumed is reset to 0.
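The budget bookkeeping of steps S101–S102 (threshold ε₀ = ε/k, accumulation per query, reset on a condition change) can be sketched as follows. This is an illustrative sketch only; the class and method names are assumptions, not code from the patent.

```python
# Illustrative sketch of steps S101-S102; names are hypothetical.
class BudgetTracker:
    def __init__(self, global_epsilon, collusion_threshold):
        # Personal privacy budget threshold: eps_0 = eps / k
        self.epsilon_0 = global_epsilon / collusion_threshold
        self.spent = 0.0
        self.condition = None

    def try_query(self, condition, epsilon_i):
        # Changing the query condition resets the consumed budget to 0.
        if condition != self.condition:
            self.condition = condition
            self.spent = 0.0
        # Refuse the query once cumulative consumption would exceed eps_0.
        if self.spent + epsilon_i > self.epsilon_0:
            return False
        self.spent += epsilon_i
        return True
```

With ε = 10 and k = 10 as in the worked example below, the tracker admits queries only while the running sum of per-query budgets stays within ε₀ = 1.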
Step S103: and if the current privacy budget queried by the user under the query condition does not exceed the personal privacy budget threshold, calculating the sensitivity of the system.
Specifically, two data sets D and D' that have the same attribute structure and differ in at most one record are called a pair of bounded sibling (neighboring) sets. For example, in this system, if there is a score data set D = {98, 69, 78, 80}, then changing one of the values yields a set D' = {98, 69, 78, 10}, and D and D' are bounded siblings. Let D and D' be any pair of neighboring score sets in the present system. The sensitivity is calculated according to the sensitivity formula
Δf = max_{D,D'} ||f(D) − f(D')||₁
where ||f(D) − f(D')||₁ is the 1-norm distance between f(D) and f(D'). For example, suppose the highest score is 100 and the test scores of 5 students are {30, 70, 80, 90, 100}, whose average is 74; if the record with score 100 is replaced, the set becomes {30, 70, 80, 90, 0} with average 54, so the sensitivity is 20. The maximum impact of replacing one record on the result of the average score query is τ/N, i.e. the sensitivity is
Δf = τ / N
where N is the number of students and τ is the highest score.
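The worked numbers above can be checked directly; the function names below are illustrative, not from the patent.

```python
# Sketch of the sensitivity computation of step S103 (illustrative names).
def sensitivity(n_students, max_score):
    # Replacing one record changes the average by at most tau / N.
    return max_score / n_students

def average(scores):
    return sum(scores) / len(scores)
```

For the 5-student example in the text, replacing the score 100 by 0 shifts the average from 74 to 54, matching the bound τ/N = 100/5 = 20.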
Step S104: performing noise addition on the real scores according to the system sensitivity.
In order to better protect the user's personal privacy, random noise obeying the Laplace distribution must be added to each query result, so that the whole algorithm satisfies ε-differential privacy.
Laplace noise is added; in the field of differential privacy, Laplace noise is the most common noising method. The density function of the distribution is:
f(x | μ, b) = (1 / 2b) × exp(−|x − μ| / b)
The distribution image is shown in fig. 2.
The Laplace probability cumulative function is:
F(x) = 0.5 + 0.5 × sign(x − μ) × (1 − exp(−|x − μ| / b))
For the Laplace distribution, the expectation and variance are μ and 2b², respectively.
Therefore, a probability cumulative function image corresponding to Lap(0,1) can be drawn, as shown in fig. 3.
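As an illustrative check (not code from the patent), the density and cumulative functions above can be written directly; the helper names are assumptions.

```python
import math

# Density and cumulative functions of Lap(mu, b), as given above.
def laplace_pdf(x, mu=0.0, b=1.0):
    return math.exp(-abs(x - mu) / b) / (2.0 * b)

def laplace_cdf(x, mu=0.0, b=1.0):
    # sign(x - mu) handled explicitly; F is 0.5 at the median mu.
    s = 1.0 if x >= mu else -1.0
    return 0.5 + 0.5 * s * (1.0 - math.exp(-abs(x - mu) / b))
```

For Lap(0,1) this gives the fig. 2/fig. 3 shapes: the density peaks at 1/2b = 0.5 at the origin, and the cumulative function passes through 0.5 there.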
Epsilon-differential privacy protection:
given a pair of bounded sibling sets D and D', there is at most one data difference between them. Given a mapping function f: D → Rd. Which is a mapping representing the data set D to a D-dimensional space. After which the function f (d) ═ x is obtained1,x2,x3,…,xn)TLaplace noise is added to obtain an output function a (d).
Then there are:
Figure GDA0003470213940000055
wherein
Figure GDA0003470213940000056
The condition that the algorithm A satisfies the differential privacy is as follows:
Pr[A(D)=O]≤eεPr[A(D')=O]
wherein O is a query result; when this condition holds, the algorithm A satisfies ε-differential privacy protection.
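The condition above can be checked numerically for the Laplace mechanism. The helper below is an illustration under the assumption b = Δf/ε (the standard Laplace scale), not code from the patent: the density ratio between neighboring answers differing by at most Δf never exceeds e^ε.

```python
import math

def laplace_pdf(x, mu, b):
    return math.exp(-abs(x - mu) / b) / (2.0 * b)

def dp_ratio_holds(delta_f, eps, outputs):
    # For b = delta_f / eps, the density ratio at any output O between
    # f(D) = 0 and f(D') = delta_f is bounded by exp(eps).
    b = delta_f / eps
    for o in outputs:
        ratio = laplace_pdf(o, 0.0, b) / laplace_pdf(o, delta_f, b)
        if ratio > math.exp(eps) + 1e-9:
            return False
    return True
```

The bound follows because |o − Δf| − |o| ≤ Δf for every o, so the ratio exp((|o − Δf| − |o|)/b) is at most exp(Δf/b) = exp(ε).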
Implementation of Laplace mechanism
Assume D is the set of scores of the students in one class, N is the number of students, the collusion threshold is k, ε denotes the global privacy budget, and the personal privacy budget is
ε₀ = ε / k
Let the privacy budget consumed by each query of an attacker belong to the set M = {ε₁, ε₂, ε₃, …, ε_i, …, ε_n}, where i denotes the query number and ε_i = p(i) is a function of i. The consumption must satisfy
ε₁ + ε₂ + … + ε_n ≤ ε₀
avg is the average score output after noising, e denotes the accumulated consumption of the privacy budget, f is the average-score query function of the students, delta is the precision, count is the number of attackers, and τ is the highest score. The specific algorithm is as follows:
inputting: data set D, k, ε, N, average score query function f (D)
(The algorithm listing appears as an image in the original document and is not reproduced here.)
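Since the algorithm listing itself is an image in the original, the following is a hedged reconstruction of the described steps (budget check, sensitivity Δf = τ/N, per-score Laplace noising via the inverse CDF, precision interception, clamping, averaging). Function and parameter names are illustrative assumptions.

```python
import math
import random

# Hedged sketch of the overall algorithm; not the patent's own listing.
def noisy_average(scores, k, eps_global, eps_i, tau=100.0, delta=1.0):
    eps_0 = eps_global / k                 # personal privacy budget threshold
    if eps_i > eps_0:
        raise ValueError("query refused: privacy budget threshold exceeded")
    n = len(scores)
    b = (tau / n) / eps_i                  # Laplace scale from sensitivity tau/N
    noisy = []
    for gr in scores:
        xi = random.random() - 0.5         # xi ~ Uni(-0.5, 0.5); |xi| = 0.5 has ~0 probability
        x = -b * math.copysign(1.0, xi) * math.log(1.0 - 2.0 * abs(xi))
        g = gr + x                         # gr' = gr + x
        g = round(g / delta) * delta       # precision interception to step delta
        g = min(max(g, 0.0), tau)          # clamp into [0, tau]
        noisy.append(g)
    return sum(noisy) / n
```

A query whose budget ε_i exceeds ε₀ is refused outright; otherwise the returned average always lies in the valid score range.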
For the Laplace distribution, the expectation and variance are μ and 2b², respectively. Since the probability cumulative function takes values in [0,1], a random value uniformly distributed on [0,1] is generated first, and noise x satisfying the Laplace distribution is obtained by solving the inverse of the probability cumulative function. Writing ξ ~ Uni(−0.5, 0.5) for a uniformly distributed random variable, the inverse cumulative function gives:
x = u − b × sign(ξ) × ln(1 − 2abs(ξ))
wherein
b = Δf / ε_i
Suppose the system obtains each student's score gr; after noising, the noisy score is gr' = gr + x, where x is random noise obeying the Lap(b) distribution. This yields the noisy score.
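The inverse-CDF noise formula above can be sketched as a small function; the name is illustrative, not from the patent.

```python
import math

# Inverse-CDF transform: xi ~ Uni(-0.5, 0.5) maps to a Lap(u, b) noise sample.
def laplace_noise(b, xi, u=0.0):
    return u - b * math.copysign(1.0, xi) * math.log(1.0 - 2.0 * abs(xi))
```

At ξ = 0 the formula returns the expectation u, and the output is antisymmetric in ξ, as expected for a distribution symmetric about its mean.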
Step S105: and performing numerical value processing on the real score subjected to the noise processing, and sending score information subjected to the numerical value processing to the user.
Numerical processing
Each student's noisy score gr' returned by the query must be intercepted to the precision delta; the output function after precision interception is I(D), where gr' = gr + x and gr', gr ∈ (0, 100). I(D) is defined with step delta by taking, around each grid value, the interval bounded by the midpoints to its neighboring grid values.
The treatment process is as follows:
I(gr') = j × delta, if gr' ∈ (j × delta − delta/2, j × delta + delta/2]
Assuming delta = 1, for the grid value gr = 90 all values in the interval (89.5, 90.5) are assigned gr' = 90. If the returned value is less than 0, gr' is set to 0; if greater than 100, gr' is set to 100. The average is then calculated from the processed results and displayed and fed back to the user.
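The numerical processing of step S105 (precision interception plus clamping) can be sketched as follows; the function name is an illustrative assumption.

```python
# Sketch of step S105: intercept to precision delta, then clamp to the score range.
def numeric_process(noisy_score, delta=1.0, low=0.0, high=100.0):
    g = round(noisy_score / delta) * delta   # nearest multiple of delta
    return min(max(g, low), high)            # clamp into [low, high]
```

Note that Python's built-in `round` uses banker's rounding exactly at midpoints, so the behavior at interval boundaries like 89.5 may differ slightly from the half-open intervals defined above.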
The following is a run of an embodiment of the invention with concrete data. In one embodiment, a user uses the system to obtain the average score of a class. Assume the global privacy budget is ε = 10 and the collusion threshold is k = 10, so the personal privacy budget is
ε₀ = ε / k = 10 / 10 = 1
Take the per-query budget p(i) = 1 × 0.5^i, so ε₁ = 0.5 and the cumulative consumption after n queries is
ε₁ + ε₂ + … + ε_n = 1 − 0.5^n
Since the common ratio q = 0.5 is less than 1, it is easy to obtain that
1 − 0.5^n < 1 = ε₀
always holds. With q = 0.5, the first query consumes ε₁ = 0.5, the second ε₂ = 0.25, …, and the i-th query consumes
ε_i = 0.5^i
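The geometric budget schedule of the example can be verified with a short computation (illustrative only):

```python
# Cumulative budget consumption under eps_i = 0.5**i; it never reaches eps_0 = 1.
def consumed(n_queries):
    return sum(0.5 ** i for i in range(1, n_queries + 1))
```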
Setting query conditions
Suppose that before querying, the user sets this query condition to class two of the computer science and technology major of the School of Computer Science and Technology. Part of the initial scores of class two are shown in Table 1 below.
Table one is the initial score table
(Table 1 appears as an image in the original document and is not reproduced here.)
Calculation of sensitivity
Suppose the class has 100 students and the highest score is 100, i.e. the sensitivity is
Δf = τ / N = 100 / 100 = 1
Take the random variable ξ ~ Uni(−0.5, 0.5) satisfying the uniform distribution, the scale
b = Δf / ε₁ = 1 / 0.5 = 2
and u = 0, and substitute into the expression for the noise value:
x = u − b × sign(ξ) × ln(1 − 2abs(ξ))
The partial noising results with Laplace noise added, obtained from the expression gr' = gr + x, are shown in Table 2 below.
Table two is a noise score table
(Table 2 appears as an image in the original document and is not reproduced here.)
Numerical processing
The results in Table 2 are numerically processed by invoking the I(D) conversion to standardize the data, giving the numerical processing results shown in Table 3.
And averaging according to the results in the third table, and returning the results to the user.
Table three is a numerical processing achievement table
(Table 3 appears as an image in the original document and is not reproduced here.)
Fig. 4 is a schematic block diagram of a privacy protecting apparatus for querying an average performance according to an embodiment of the present application. As shown in fig. 4, the present application also provides a privacy protecting apparatus for querying an average score, corresponding to the above privacy protecting method for querying an average score. The device for protecting privacy of the average score comprises a unit for executing the steps of the privacy protection method of the average score query, and the device can be configured in a server.
As shown in fig. 4, the privacy protecting apparatus 400 for querying the average score includes: a first calculating unit 410, a judging unit 420, a second calculating unit 430, a first executing unit 440 and a second executing unit 450.
A first calculating unit 410, configured to calculate a personal privacy budget threshold according to a preset collusion threshold and a preset global security privacy budget;
a determining unit 420, configured to generate a query condition, and determine whether a current privacy budget queried by a user under the query condition exceeds a personal privacy budget threshold;
the second calculating unit 430 is configured to calculate a system sensitivity if the current privacy budget of the user for querying under the querying condition does not exceed the personal privacy budget threshold;
the first execution unit 440 is used for performing noise processing on the real score according to the system sensitivity;
the second execution unit 450 performs numerical processing on the real score after the noise processing, and sends score information after the numerical processing to the user.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the privacy protecting apparatus and unit for querying an average score described above may refer to the corresponding process in the foregoing method embodiment, and is not described herein again.
The above-described apparatus may be implemented in the form of a computer program which is executable on a computer device as shown in fig. 5.
Referring to fig. 5, fig. 5 is a schematic block diagram of a computer device according to an embodiment of the present application. The computer device 700 may be a terminal or a server.
Referring to fig. 5, the computer device 700 includes a processor 720, a memory, which may include a non-volatile storage medium 730 and an internal memory 740, and a network interface 750, which are connected by a system bus 710.
The non-volatile storage medium 730 may store an operating system 731 and computer programs 732. The computer program 732, when executed, may cause the processor 720 to perform any of a number of privacy preserving methods for querying the average performance.
The processor 720 is used to provide computing and control capabilities, supporting the operation of the overall computer device 700.
The internal memory 740 provides an environment for the execution of a computer program 732 on the non-volatile storage medium 730, and when executed by the processor 720, the computer program 732 may cause the processor 720 to perform any of a number of privacy preserving methods for querying for average performance.
The network interface 750 is used for network communication, such as sending assigned tasks. Those skilled in the art will appreciate that the configuration shown in fig. 5 is a block diagram of only the portion relevant to the present solution and does not limit the computer device 700 to which the present solution is applied; a particular computer device 700 may include more or fewer components than shown, combine certain components, or arrange components differently. The processor 720 is configured to execute the program code stored in the memory to perform the following steps:
calculating a personal privacy budget threshold according to a preset collusion threshold and a preset global security privacy budget; generating a query condition, and judging whether the current privacy budget queried by the user under the query condition exceeds a personal privacy budget threshold value; if the current privacy budget queried by the user under the query condition does not exceed the personal privacy budget threshold, calculating the system sensitivity; carrying out noise adding processing on the real score according to the system sensitivity; and performing numerical value processing on the real score subjected to the noise processing, and sending score information subjected to the numerical value processing to the user.
It should be understood that, in the embodiment of the present Application, the Processor 720 may be a Central Processing Unit (CPU), and the Processor 720 may also be other general-purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, and the like. Wherein a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
Those skilled in the art will appreciate that the configuration of computer device 700 depicted in fig. 5 is not intended to limit computer device 700, which may include more or fewer components than shown, combine certain components, or arrange components differently.
It will be understood by those skilled in the art that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, and the computer program may be stored in a storage medium, which is a computer-readable storage medium. In the embodiment of the present invention, the computer program may be stored in a storage medium of a computer system and executed by at least one processor in the computer system to implement the flow steps of the embodiments including the methods as described above.
The computer-readable storage medium may be a magnetic disk, an optical disk, a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), or another medium capable of storing program code.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two; to clearly illustrate the interchangeability of hardware and software, the components and steps of the examples have been described above in general functional terms. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In several embodiments provided in the present application, it should be understood that the disclosed privacy protecting apparatus and method for querying average performance may be implemented in other ways. For example, the privacy-preserving apparatus embodiments described above for querying average achievements are merely illustrative. For example, the division of each unit is only one logic function division, and there may be another division manner in actual implementation. For example, various elements or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented.
The steps in the method of the embodiment of the application can be sequentially adjusted, combined and deleted according to actual needs.
The units in the device of the embodiment of the application can be combined, divided and deleted according to actual needs.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially or partially implemented in the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a terminal, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application.
While the invention has been described with reference to specific embodiments, the protection scope of the invention is not limited thereto, and any equivalent modification or substitution that a person skilled in the art can readily conceive within the technical scope disclosed herein shall be covered. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (8)

1. A privacy protection method for inquiring average score is characterized by comprising the following steps:
calculating a personal privacy budget threshold according to a preset collusion threshold and a preset global security privacy budget;
generating a query condition, and judging whether the current privacy budget queried by the user under the query condition exceeds the personal privacy budget threshold value;
if the current privacy budget queried by the user under the query condition does not exceed the personal privacy budget threshold, calculating the system sensitivity, wherein the system sensitivity is generated through a sensitivity formula;
the sensitivity formula is:
Δf = τ / N
wherein Δf is the system sensitivity, N is the total number of people, and τ is the highest score;
performing noise-adding processing on the real score according to the system sensitivity;
performing numerical processing on the noise-added real score, and sending the numerically processed score information to the user;
wherein the noise-adding processing of the real score according to the system sensitivity comprises:
calculating a noise value according to the system sensitivity by combining a Laplace distribution function, and adding the real score to the noise value to obtain a noise score after noise processing;
the expression of the noise value is:
x = u - b × sign(ξ) × ln(1 - 2|ξ|)
wherein u is the expected value of the Laplace distribution function, b = Δf/ε is the scale parameter of the Laplace distribution (Δf being the system sensitivity and ε the privacy budget), and ξ ~ Uni(-0.5, 0.5) is a random variable satisfying the uniform distribution;
the expression of the noise adding result is as follows:
gr'=gr+x
wherein, gr' is the noise-added result, and gr is the real result.
2. The privacy protection method for querying an average score according to claim 1, characterized in that, after the step of generating a query condition, the method further comprises:
if the privacy budget currently consumed by the user under the query condition exceeds the personal privacy budget threshold, terminating the query action of the user.
3. The privacy protection method for querying an average score according to claim 1, characterized in that the method further comprises:
if a change in the query condition is detected, resetting the current privacy budget of the user.
4. The privacy protection method for querying an average score according to claim 1, wherein the personal privacy budget threshold is generated through a privacy budget threshold formula;
the privacy budget threshold formula is:
ε_p = ε / k
wherein ε_p is the personal privacy budget threshold, ε is the global privacy budget, and k is the preset collusion threshold.
5. The privacy protection method for querying an average score according to claim 1, wherein the numerical processing of the noise-processed real score comprises performing regularization processing on the noise score according to a preset rule.
6. A privacy protection apparatus for querying an average score, characterized by comprising:
a first calculating unit, configured to calculate the personal privacy budget threshold according to a preset collusion threshold and a preset global security privacy budget;
a judging unit, configured to generate a query condition and judge whether the privacy budget currently consumed by the user under the query condition exceeds the personal privacy budget threshold;
a second calculating unit, configured to calculate the system sensitivity if the privacy budget currently consumed by the user under the query condition does not exceed the personal privacy budget threshold, wherein the system sensitivity is generated through a sensitivity formula;
the sensitivity formula is:
Δf = τ / N
wherein Δf is the system sensitivity, N is the total number of people, and τ is the highest score;
a first execution unit, configured to perform noise-adding processing on the real score according to the system sensitivity;
a second execution unit, configured to perform numerical processing on the noise-added real score and send the numerically processed score information to the user;
wherein the noise-adding processing of the real score according to the system sensitivity comprises:
calculating a noise value according to the system sensitivity by combining a Laplace distribution function, and adding the real score to the noise value to obtain a noise score after noise processing;
the expression of the noise value is:
x = u - b × sign(ξ) × ln(1 - 2|ξ|)
wherein u is the expected value of the Laplace distribution function, b = Δf/ε is the scale parameter of the Laplace distribution (Δf being the system sensitivity and ε the privacy budget), and ξ ~ Uni(-0.5, 0.5) is a random variable satisfying the uniform distribution;
the expression of the noise adding result is as follows:
gr'=gr+x
wherein, gr' is the noise-added result, and gr is the real result.
7. A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 5 when executing the computer program.
8. A storage medium, characterized in that the storage medium stores a computer program comprising program instructions which, when executed by a processor, cause the processor to carry out the steps of the method according to any one of claims 1 to 5.
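Taken together, claims 1–5 describe a simple pipeline: derive a personal budget threshold ε/k, check the budget already spent under the current query condition, compute the sensitivity τ/N, sample Laplace noise via the inverse-CDF expression in claim 1, and regularize the noisy average before returning it. The Python sketch below illustrates that pipeline under stated assumptions: the per-query budget parameter `eps_query`, the function names, and clamping to [0, τ] as the "preset rule" of claim 5 are illustrative choices, not the patented implementation.

```python
import math
import random

def laplace_noise(b, u=0.0):
    # Inverse-CDF sampling per claim 1: x = u - b*sign(xi)*ln(1 - 2*|xi|),
    # with xi ~ Uniform(-0.5, 0.5) and b the Laplace scale parameter.
    xi = random.uniform(-0.5, 0.5)
    return u - b * math.copysign(1.0, xi) * math.log(1.0 - 2.0 * abs(xi))

def query_average(scores, tau, eps_global, k, spent, eps_query):
    """Answer one average-score query with differential privacy.

    scores     -- list of true scores
    tau        -- highest possible score
    eps_global -- global (security) privacy budget
    k          -- preset collusion threshold
    spent      -- budget the user has already consumed under this query condition
    eps_query  -- budget charged for this query (illustrative parameter)
    """
    eps_personal = eps_global / k                 # claim 4: personal threshold
    if spent + eps_query > eps_personal:
        raise PermissionError("personal privacy budget exhausted")  # claim 2
    n = len(scores)
    sensitivity = tau / n                         # claim 1: sensitivity of the average
    b = sensitivity / eps_query                   # Laplace scale b = sensitivity / budget
    gr = sum(scores) / n                          # true average gr
    noisy = gr + laplace_noise(b)                 # gr' = gr + x
    clamped = min(max(noisy, 0.0), tau)           # claim 5: regularize to a valid score
    return clamped, spent + eps_query
```

Claim 3's reset step would correspond to setting `spent` back to zero whenever the query condition changes.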
CN201810671583.9A 2018-06-26 2018-06-26 Privacy protection method for inquiring average score Active CN108959961B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810671583.9A CN108959961B (en) 2018-06-26 2018-06-26 Privacy protection method for inquiring average score


Publications (2)

Publication Number Publication Date
CN108959961A CN108959961A (en) 2018-12-07
CN108959961B true CN108959961B (en) 2022-03-22

Family

ID=64486863

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810671583.9A Active CN108959961B (en) 2018-06-26 2018-06-26 Privacy protection method for inquiring average score

Country Status (1)

Country Link
CN (1) CN108959961B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109670341A (en) * 2018-12-29 2019-04-23 中山大学 The method for secret protection that a kind of pair of structural data and semi-structured data combine
CN110889141B (en) * 2019-12-11 2022-02-08 百度在线网络技术(北京)有限公司 Data distribution map privacy processing method and device and electronic equipment
CN112767693A (en) * 2020-12-31 2021-05-07 北京明朝万达科技股份有限公司 Vehicle driving data processing method and device
CN112989411A (en) * 2021-03-15 2021-06-18 Oppo广东移动通信有限公司 Privacy budget calculation method, device, storage medium and terminal
CN113792343A (en) * 2021-09-17 2021-12-14 国网山东省电力公司电力科学研究院 Data privacy processing method and device, storage medium and electronic equipment
CN114090656B (en) * 2021-11-23 2023-05-26 抖音视界有限公司 Data processing method, device, computer equipment and storage medium
CN113987309B (en) * 2021-12-29 2022-03-11 深圳红途科技有限公司 Personal privacy data identification method and device, computer equipment and storage medium

Citations (2)

Publication number Priority date Publication date Assignee Title
WO2009139807A2 (en) * 2008-05-16 2009-11-19 Alcatel-Lucent Usa Inc. Service induced privacy with synchronized noise insertion
CN105653981A (en) * 2015-12-31 2016-06-08 中国电子科技网络信息安全有限公司 Sensitive data protection system and method of data circulation and transaction of big data platform


Non-Patent Citations (1)

Title
Student score privacy protection system based on differential privacy; Zhu Xiaoyan et al.; Journal of Jianghan University (Natural Science Edition); Oct. 31, 2017; Vol. 45, No. 5; pp. 429-433 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant