CN117633892A - Private data processing method, device, equipment and storage medium - Google Patents

Private data processing method, device, equipment and storage medium Download PDF

Info

Publication number
CN117633892A
CN117633892A (application number CN202311649883.4A)
Authority
CN
China
Prior art keywords
random number
quantum
laplace
character string
data processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311649883.4A
Other languages
Chinese (zh)
Inventor
陈鋆昊
康洁
余剑斌
许锦标
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Agricultural Bank of China
Original Assignee
Agricultural Bank of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Agricultural Bank of China filed Critical Agricultural Bank of China
Priority to CN202311649883.4A priority Critical patent/CN117633892A/en
Publication of CN117633892A publication Critical patent/CN117633892A/en
Pending legal-status Critical Current

Landscapes

  • Storage Device Security (AREA)

Abstract

The invention discloses a private data processing method, device, equipment and storage medium. The method includes the following steps: generating a quantum random number through a quantum random number generator using a quantum physical process as an entropy source; substituting the quantum random number into a Laplace distribution function and calculating the Laplace distribution function to obtain a random Laplace noise value; and adding the random Laplace noise value to target privacy data. The technical scheme of the embodiments of the invention can improve the security of users' private data.

Description

Private data processing method, device, equipment and storage medium
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for processing private data.
Background
With the emergence and development of database applications such as data mining and data publishing, protecting private data and preventing the disclosure of sensitive information has become a major challenge. Differential privacy is a rigorous, provable privacy protection model: it makes no assumption about the background knowledge an attacker may possess and achieves privacy protection by adding noise to the data.
For numerical data, the most commonly used noise mechanism in differential privacy is the Laplace mechanism. Generating Laplace noise depends on a random value as a parameter, and the random numbers currently used are pseudo-random numbers produced by a programming language's built-in algorithm. Once the random number parameters used to generate the Laplace noise are predicted and mastered by an attacker, the differential privacy mechanism completely loses its data protection capability.
Disclosure of Invention
The invention provides a method, a device, equipment and a storage medium for processing private data, which can improve the security of private data of a user.
According to an aspect of the present invention, there is provided a method of processing private data, the method comprising:
generating a quantum random number by using a quantum physical process as an entropy source through a quantum random number generator;
substituting the quantum random number into a Laplace distribution function, and calculating the Laplace distribution function to obtain a random Laplace noise value;
and adding the random Laplace noise value to target privacy data.
Optionally, generating, by the quantum random number generator, the quantum random number using the quantum physical process as an entropy source, includes:
generating a random number character string by using a quantum physical process as an entropy source through a quantum random number generator;
and converting the random number character string into a target value in a preset value interval to obtain the quantum random number.
Optionally, converting the random number string into a target value in a preset value interval includes:
and taking the random number character string as a Seed of a programming built-in algorithm, and converting the random number character string into a target value in a preset value interval by adopting the programming built-in algorithm according to a conversion function.
Optionally, converting the random number string into a target value in a preset value interval includes:
and processing the random number character string by adopting a linear or nonlinear algorithm, and mapping a processing result with a target value in a preset value interval.
Optionally, the privacy data processing method is applied to a global differential privacy data processing scene and a local differential privacy data processing scene.
Optionally, generating, by the quantum random number generator, the quantum random number using the quantum physical process as an entropy source, includes:
responding to a target privacy data processing request triggered by a user, and generating a quantum random number by using a quantum physical process as an entropy source through a quantum random number generator;
after adding the random laplace noise value to the target privacy data, further comprising:
and feeding the processed target privacy data back to the user.
According to another aspect of the present invention, there is provided a private data processing apparatus comprising:
the random number generation module is used for generating a quantum random number by using a quantum physical process as an entropy source through the quantum random number generator;
the noise generation module is used for substituting the quantum random number into a Laplace distribution function, and obtaining a random Laplace noise value through calculating the Laplace distribution function;
and the noise adding module is used for adding the random Laplace noise value to the target privacy data.
Optionally, the random number generation module includes:
the character string generation unit is used for generating a random number character string by using a quantum physical process as an entropy source through the quantum random number generator;
and the character string conversion unit is used for converting the random number character string into a target value in a preset value interval to obtain the quantum random number.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the method of privacy data processing according to any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to execute a method of processing private data according to any one of the embodiments of the present invention.
According to the technical scheme provided by the embodiment of the invention, the quantum random number generator is used for generating the quantum random number by taking the quantum physical process as an entropy source, the quantum random number is substituted into the Laplace distribution function, the Laplace distribution function is operated to obtain the random Laplace noise value, and the random Laplace noise value is added into the target privacy data by the technical means, so that the security of the privacy data of a user can be improved.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1a is a graph of a probability density function corresponding to a Laplace distribution function according to an embodiment of the present invention;
FIG. 1b is a flow chart of a method for processing private data according to an embodiment of the present invention;
FIG. 2 is a flow chart of another method of processing private data provided in accordance with an embodiment of the present invention;
FIG. 3 is a flow chart of another method of processing private data provided in accordance with an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a private data processing apparatus according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device implementing a method for processing private data according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Differential privacy is a strict privacy protection mechanism whose privacy protection level can be proved. It does not care about the background knowledge of an attacker, achieves data distortion by adding noise, and at the same time keeps the statistical properties of the data unchanged. Adding or deleting a single record in the data set will not significantly affect the output result. The formal definition is as follows: for an algorithm A, let S be a subset of the set of all values that algorithm A can output. If for any pair of adjacent data sets X and X' (two data sets differing in at most one record), algorithm A satisfies:
Pr[A(X) ∈ S] ≤ e^ε · Pr[A(X') ∈ S]
then algorithm A is said to satisfy ε-differential privacy, where the parameter ε is the privacy budget, typically a small number such as 0.1 (e^0.1 ≈ 1.105), so that the output probabilities on the two adjacent data sets are close. When this inequality holds, the differential privacy definition is strictly satisfied. In this case, even if an attacker possesses a neighboring data set of the original data set, i.e., a data set that differs from the original by at most one record, the data is not compromised.
The Laplace distribution is a continuous probability distribution in statistics. A random variable satisfies the Laplace distribution if its probability density function is f(x) = (1/(2λ)) · exp(−|x − μ|/λ), where μ is the position parameter and λ is the scale parameter. The corresponding probability density curves for μ = 0 and different values of λ are shown in Fig. 1a.
The distribution diagram shown in Fig. 1a includes three distribution curves: the curve with the highest peak corresponds to λ = 0.5, the curve with the middle peak to λ = 1, and the curve with the lowest peak to λ = 2.
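A minimal Java sketch of this density function (class and parameter names are illustrative, not part of the original disclosure) evaluates the peak heights described above:

public final class LaplacePdf {
    // Probability density of the Laplace distribution: f(x) = exp(-|x - mu| / lambda) / (2 * lambda).
    static double density(double x, double mu, double lambda) {
        return Math.exp(-Math.abs(x - mu) / lambda) / (2.0 * lambda);
    }

    public static void main(String[] args) {
        // Peak heights at x = mu = 0 for the three curves in Fig. 1a:
        System.out.println(density(0, 0, 0.5)); // 1.0  (highest peak)
        System.out.println(density(0, 0, 1.0)); // 0.5  (middle peak)
        System.out.println(density(0, 0, 2.0)); // 0.25 (lowest peak)
    }
}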
Laplace noise is a random value satisfying the Laplace distribution and is the most common noise-adding mechanism in differential privacy data processing. In the Laplace mechanism the position parameter is μ = 0 and the scale parameter is λ = Δf/ε, where Δf is the sensitivity of the query and ε is the privacy budget.
In the prior art, pseudo-random numbers are generally produced by a programming language's built-in algorithm. Specifically, the cumulative distribution function is obtained from the Laplace probability density function as follows:
F(x) = 1/2 + (1/2) · sgn(x − μ) · (1 − exp(−|x − μ|/λ))
The range of the Laplace cumulative distribution function is [0, 1] and its domain is (−∞, +∞). Therefore, noise satisfying the Laplace distribution can be obtained by inverting the cumulative distribution function. If a random variable α ~ UNI(0, 1) satisfying the uniform distribution is substituted into the inverse of the Laplace cumulative distribution function, a noise value satisfying the condition is obtained:
x = F⁻¹(α) = μ − λ · sgn(α − 1/2) · ln(1 − 2·|α − 1/2|)
If instead α ~ UNI(−0.5, 0.5) is taken, the shift by 1/2 cancels and the piecewise function can be written as a single expression, in which the sign function (sgn) extracts the sign of its argument and the abs function takes the absolute value, giving the following noise value:
noise = μ − λ · sgn(α) · ln(1 − 2·|α|)
from the above deductions, it can be easily understood that the existing codes for generating the laplace noise by taking the main stream Java language as an example are as follows:
it is worth specifically describing that the core of Random implementation (i.e., random entropy source) is seed, and the structure of seed is in two ways, one by system. Whether for the former system. Nanotime () represents nanoseconds since some fixed but arbitrary original time, or the latter custom fixed seed, the random number to be generated can be derived by breaking down with sufficient computational effort.
Once the random number parameters used to generate the Laplace noise are predicted and mastered by an attacker, the differential privacy mechanism completely loses its data protection capability. The prior art therefore carries a serious security risk.
Accordingly, the embodiments of the present invention improve the Laplace noise generation mechanism used for differential privacy: pseudo-random numbers are replaced by quantum random numbers (Quantum Random Number, QRN), and the unpredictability, non-repeatability and unbiasedness of quantum random numbers are used to generate more secure Laplace noise, which is then added to the data to be protected, thereby further protecting users' private data. Fig. 1b is a flowchart of a private data processing method according to an embodiment of the present invention. The method may be performed by a private data processing apparatus, which may be implemented in hardware and/or software and configured in an electronic device. As shown in Fig. 1b, the method includes:
step 110, generating a quantum random number by using a quantum physical process as an entropy source through a quantum random number generator.
In the present embodiment, the quantum random number is a random number generated using a system based on quantum physics principles (i.e., a quantum random number generator). The quantum random number generator is used for reading the intrinsic randomness of the microscopic particles and taking the intrinsic randomness as an original random number entropy source.
The advantage of this arrangement is that the quantum random number is guaranteed to be unpredictable, non-repeatable and unbiased, so that even an attacker with strong computing power cannot predict it.
In one implementation of the present embodiment, generating, by a quantum random number generator, a quantum random number using a quantum physical process as an entropy source, includes: and responding to a target privacy data processing request triggered by a user, and generating a quantum random number by using a quantum physical process as an entropy source through a quantum random number generator.
And 120, substituting the quantum random number into a Laplace distribution function, and calculating the Laplace distribution function to obtain a random Laplace noise value.
In this step, the quantum random number may be substituted into the laplace distribution function, and the random laplace noise value may be obtained by performing an operation on the laplace distribution function.
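As a minimal sketch of this step (assuming the quantum random number has already been converted into a uniform value q in [0, 1], as described in the later embodiments, and with illustrative class and parameter names), the calculation amounts to evaluating the inverse cumulative distribution function with the quantum value in place of a pseudo-random one:

final class QuantumNoise {
    // Laplace noise from a quantum-derived uniform value q in [0, 1];
    // mu: position parameter, deltaF: sensitivity, epsilon: privacy budget.
    static double quantumLaplaceNoise(double q, double mu, double deltaF, double epsilon) {
        double alpha = q - 0.5;           // shift the uniform value to UNI(-0.5, 0.5)
        double lambda = deltaF / epsilon; // scale parameter
        return mu - lambda * Math.signum(alpha) * Math.log(1 - 2 * Math.abs(alpha));
    }
}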
And 130, adding the random Laplace noise value to target privacy data.
In this step, the Laplace noise value enhanced by the quantum random number may be added to the target privacy data requested by the user, so as to protect the target privacy data.
In one implementation of this embodiment, after adding the random laplace noise value to the target privacy data, the method further includes: and feeding the processed target privacy data back to the user.
In this embodiment, a privacy data processing method based on enhanced differential privacy Laplace noise is provided, so that even an attacker with strong computing power cannot predict the noise result, and users' private data can be further protected at the source.
According to the technical scheme provided by the embodiment of the invention, the quantum random number generator is used for generating the quantum random number by taking the quantum physical process as an entropy source, the quantum random number is substituted into the Laplace distribution function, the Laplace distribution function is operated to obtain the random Laplace noise value, and the random Laplace noise value is added into the target privacy data by the technical means, so that the security of the privacy data of a user can be improved.
On the basis of the above embodiment, generating, by a quantum random number generator, a quantum random number using a quantum physical process as an entropy source, includes: generating a random number character string by using a quantum physical process as an entropy source through a quantum random number generator; and converting the random number character string into a target value in a preset value interval to obtain the quantum random number.
In this embodiment, several flexible ways of converting the random number string into the target value are provided: the string conversion may be implemented by using the random number string as the random Seed of a programming language built-in algorithm, or the random number string may be converted into the target value by a self-developed algorithm (e.g., a linear or nonlinear algorithm).
In the technical solution of the present disclosure, the collection, storage, use, processing, transmission, provision and disclosure of users' personal information comply with the provisions of relevant laws and regulations and do not violate public order and good morals.
Fig. 2 is a flowchart of another method for processing private data according to an embodiment of the present invention, where the method further refines the foregoing embodiment, as shown in fig. 2, and includes:
step 210, generating a random number character string by using a quantum physical process as an entropy source through a quantum random number generator.
Step 220, the random number character string is used as a Seed of a programming built-in algorithm, and the programming built-in algorithm is adopted to convert the random number character string into a target value in a preset value interval according to a conversion function, so that a quantum random number is obtained.
In this step, the random number string may be specifically converted by the following programming code:
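An illustrative sketch of such a conversion is given below (class and method names are assumptions, and the exact conversion function used in the embodiment is not fixed, so this sketch need not reproduce the example value that follows); it parses the binary string into a long seed and lets nextDouble() act as the conversion function:

import java.util.Random;

final class SeedConversion {
    // Use the quantum random number string as the Seed of the built-in algorithm.
    static double convertWithSeed(String quantumBits) {
        long seed = Long.parseLong(quantumBits, 2); // interpret the binary string as a long seed
        return new Random(seed).nextDouble();       // target value in the preset interval [0, 1)
    }
}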
in a specific embodiment, assuming a random number string of 010100110101, this can be converted to the target value 0.982749448194 in the manner described above. The method has the advantages that the random number character string is converted according to the conversion function by adopting the programming built-in algorithm, not only can unpredictable random numerical values be obtained, but also the current noise generation mode can be compatible, and the rapid integration application can be realized.
And 230, substituting the quantum random number into a Laplace distribution function, and calculating the Laplace distribution function to obtain a random Laplace noise value.
And step 240, adding the random Laplace noise value to the target privacy data.
According to the technical scheme provided by the embodiment of the invention, the quantum random number generator is used for generating the random number character string by taking a quantum physical process as an entropy source, the random number character string is used as a Seed of a programming built-in algorithm, the programming built-in algorithm is adopted to convert the random number character string into a target value in a preset value interval according to a conversion function, the quantum random number is obtained, the quantum random number is substituted into a Laplace distribution function, a random Laplace noise value is obtained by calculating the Laplace distribution function, and the random Laplace noise value is added into target privacy data by the technical means, so that the security of user privacy data can be improved.
Fig. 3 is a flowchart of another method for processing private data according to an embodiment of the present invention, as shown in fig. 3, where the method includes:
step 310, generating a random number character string by using a quantum physical process as an entropy source through a quantum random number generator.
And 320, processing the random number character string by adopting a linear or nonlinear algorithm, and mapping the processing result with a target value in a preset value interval to obtain the quantum random number.
In this step, specifically, a one-to-one mapping may be performed between the 32-bit random number string and values in the interval [0, 1], so as to obtain Double-type values uniformly distributed in the [0, 1] interval.
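One possible linear mapping, given as an illustrative assumption since the embodiment does not fix the algorithm, reads the 32-bit string as an unsigned integer and scales it by 2^32:

final class LinearMapping {
    // One-to-one linear mapping from a 32-bit random number string to a Double in [0, 1).
    static double mapToUnitInterval(String quantumBits) {
        long value = Long.parseLong(quantumBits, 2); // unsigned 32-bit value in [0, 2^32 - 1]
        return value / 4294967296.0;                 // divide by 2^32
    }
}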
The advantage of this arrangement is that converting the random number string into the target value with a self-developed algorithm (e.g., a linear or nonlinear algorithm) increases the flexibility of the string conversion and makes it possible to convert the random string into a random value in scenarios with higher security requirements.
And 330, substituting the quantum random number into a Laplace distribution function, and calculating the Laplace distribution function to obtain a random Laplace noise value.
And step 340, adding the random Laplace noise value to the target privacy data.
According to the technical scheme provided by the embodiment of the invention, the quantum random number generator is used for generating the random number character string by taking a quantum physical process as an entropy source, a linear or nonlinear algorithm is adopted for processing the random number character string, a processing result is mapped with a target value in a preset value interval to obtain the quantum random number, the quantum random number is substituted into a Laplace distribution function, a random Laplace noise value is obtained by calculating the Laplace distribution function, and the random Laplace noise value is added into target privacy data.
On the basis of the embodiment, the privacy data processing method is applied to a global differential privacy data processing scene and a local differential privacy data processing scene.
The global differential privacy data processing scenario means that a trusted data manager collects the data and adds noise perturbation to the statistical results of the data set. The local differential privacy data processing scenario means that each user adds noise perturbation to local data before sending it to an untrusted data manager.
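As a conceptual sketch of where the noise is applied in the two scenarios (class, method and parameter names are illustrative, and quantumLaplaceNoise stands for the quantum-enhanced noise generation sketched earlier):

final class DpScenarios {
    // Global DP: a trusted data manager perturbs the aggregated statistic before publishing it.
    static double publishGlobalStatistic(double trueStatistic, double deltaF, double epsilon, double q) {
        return trueStatistic + quantumLaplaceNoise(q, 0, deltaF, epsilon);
    }

    // Local DP: each user perturbs the raw value locally before sending it to the data manager.
    static double perturbLocalValue(double rawValue, double deltaF, double epsilon, double q) {
        return rawValue + quantumLaplaceNoise(q, 0, deltaF, epsilon);
    }

    // Quantum-enhanced Laplace noise from a uniform value q in [0, 1] (see the earlier sketch).
    static double quantumLaplaceNoise(double q, double mu, double deltaF, double epsilon) {
        double alpha = q - 0.5;
        double lambda = deltaF / epsilon;
        return mu - lambda * Math.signum(alpha) * Math.log(1 - 2 * Math.abs(alpha));
    }
}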
In a specific embodiment, taking the global differential privacy data processing scenario as an example, assume that a bank needs to publish statistical data externally and hopes to present the overall trend without revealing personal information. For this privacy data processing request, enhanced differential privacy Laplace noise based on quantum random numbers may be added to the statistical data. For example, the original statistics are: 100 people with deposits below ten thousand yuan, 200 people with deposits between ten thousand and twenty thousand, and 300 people with deposits above twenty thousand. Noise is added to these head-count statistics using the enhanced differential privacy Laplace noise based on quantum random numbers. First, the quantum random number string 01001101010101110010101010100101 is generated; the corresponding random value random is 0.27898797843975 and alpha is −0.22101202156025. Noise is then generated with μ = 0, Δf = 1 and ε = 0.1:
noise = 0 − (1/0.1) × Math.sign(−0.22101202156025) × Math.log(1 − 2 × Math.abs(−0.22101202156025)), which is 1 after rounding, so the statistic finally output according to the Laplace noise mechanism is "101 people with deposits below ten thousand yuan"; similarly, the statistics "198 people with deposits between ten thousand and twenty thousand" and "302 people with deposits above twenty thousand" are obtained.
It should be noted that if an attacker knows that the original data contains 100 people with deposits below ten thousand yuan and 200 people with deposits between ten thousand and twenty thousand, the attacker can derive the corresponding noise values from the published figures. If pseudo-random noise generation were used, the noise contained in the published figure "302 people with deposits above twenty thousand" could then be further deduced from the properties of the pseudo-random numbers. With quantum random numbers, the noise in the published figure "302 people with deposits above twenty thousand" cannot be deduced from the known noise values, so the security of the data is guaranteed.
In another specific embodiment, taking the local differential privacy data processing scenario as an example, assume that a bank APP needs to collect users' usage habits and build personal user profiles, while hoping to hide part of the private information. For this privacy data processing request, enhanced differential privacy Laplace noise based on quantum random numbers may be added to the statistics. For example, the original statistics are: 100 transactions, 2 purchases of a certain product, and an amount of 100. Noise is added to these values using the enhanced differential privacy Laplace noise based on quantum random numbers. First, the quantum random number string 01001101010101110010101010100101 is generated; the corresponding random value random is 0.27898797843975 and alpha is −0.22101202156025. Noise is then generated with μ = 0, Δf = 1 and ε = 0.1:
noise = 0 − (1/0.1) × Math.sign(−0.22101202156025) × Math.log(1 − 2 × Math.abs(−0.22101202156025)), which is 1 after rounding, so the statistic finally output according to the Laplace noise mechanism is "the number of transactions is 101"; similarly, the statistics "the number of purchases of a certain product is 3 and the amount is 90" are obtained.
The technical scheme of this embodiment is fully applicable to existing differential privacy protection scenarios: whether global differential privacy or local differential privacy is used, the enhanced differential privacy Laplace noise scheme based on quantum random numbers preserves data availability while protecting user data privacy.
Fig. 4 is a schematic structural diagram of a private data processing apparatus according to an embodiment of the present invention, where the apparatus is applied to an electronic device, as shown in fig. 4, and the apparatus includes: a random number generation module 410, a noise generation module 420, and a noise addition module 430.
A random number generation module 410, configured to generate a quantum random number by using a quantum physical process as an entropy source through a quantum random number generator;
the noise generation module 420 is configured to substitute the quantum random number into a laplace distribution function, and obtain a random laplace noise value by calculating the laplace distribution function;
and the noise adding module 430 is configured to add the random laplace noise value to the target privacy data.
According to the technical scheme provided by the embodiment of the invention, the quantum random number generator is used for generating the quantum random number by taking the quantum physical process as an entropy source, the quantum random number is substituted into the Laplace distribution function, the Laplace distribution function is operated to obtain the random Laplace noise value, and the random Laplace noise value is added into the target privacy data by the technical means, so that the security of the privacy data of a user can be improved.
On the basis of the embodiment, the privacy data processing method is applied to a global differential privacy data processing scene and a local differential privacy data processing scene.
The random number generation module 410 includes:
the character string generation unit is used for generating a random number character string by using a quantum physical process as an entropy source through the quantum random number generator;
the character string conversion unit is used for converting the random number character string into a target value in a preset value interval to obtain the quantum random number;
the programming conversion unit is used for taking the random number character string as a Seed of a programming built-in algorithm and converting the random number character string into a target value in a preset value interval by adopting the programming built-in algorithm according to a conversion function;
the numerical mapping unit is used for processing the random number character string by adopting a linear or nonlinear algorithm and mapping a processing result with a target numerical value in a preset numerical value interval;
and the request response unit is used for responding to a target privacy data processing request triggered by a user, and generating a quantum random number by using a quantum physical process as an entropy source through the quantum random number generator.
The noise adding module 430 includes:
and the data feedback unit is used for feeding the processed target privacy data back to the user.
The device can execute the method provided by all the embodiments of the invention, and has the corresponding functional modules and beneficial effects of executing the method. Technical details not described in detail in the embodiments of the present invention can be found in the methods provided in all the foregoing embodiments of the present invention.
Fig. 5 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic equipment may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 5, the electronic device 10 includes at least one processor 11, and a memory, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, etc., communicatively connected to the at least one processor 11, in which the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the various methods and processes described above, such as the private data processing method.
In some embodiments, the private data processing method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into the RAM 13 and executed by the processor 11, one or more steps of the privacy data processing method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the private data processing method in any other suitable way (e.g. by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor, and that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, and is a host product in a cloud computing service system, so that the defects of high management difficulty and weak service expansibility in the traditional physical hosts and VPS service are overcome.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A method of private data processing, the method comprising:
generating a quantum random number by using a quantum physical process as an entropy source through a quantum random number generator;
substituting the quantum random number into a Laplace distribution function, and calculating the Laplace distribution function to obtain a random Laplace noise value;
and adding the random Laplace noise value to target privacy data.
2. The method of claim 1, wherein generating, by the quantum random number generator, the quantum random number using the quantum physical process as an entropy source, comprises:
generating a random number character string by using a quantum physical process as an entropy source through a quantum random number generator;
and converting the random number character string into a target value in a preset value interval to obtain the quantum random number.
3. The method of claim 2, wherein converting the string of random numbers to target values within a predetermined value interval comprises:
and taking the random number character string as a Seed of a programming built-in algorithm, and converting the random number character string into a target value in a preset value interval by adopting the programming built-in algorithm according to a conversion function.
4. A method according to claim 3, wherein converting the string of random numbers into target values within a predetermined value interval comprises:
and processing the random number character string by adopting a linear or nonlinear algorithm, and mapping a processing result with a target value in a preset value interval.
5. The method of claim 1, wherein the privacy data processing method is applied to a global differential privacy data processing scenario and a local differential privacy data processing scenario.
6. The method of claim 1, wherein generating, by the quantum random number generator, the quantum random number using the quantum physical process as an entropy source, comprises:
responding to a target privacy data processing request triggered by a user, and generating a quantum random number by using a quantum physical process as an entropy source through a quantum random number generator;
after adding the random laplace noise value to the target privacy data, further comprising:
and feeding the processed target privacy data back to the user.
7. A private data processing apparatus, the apparatus comprising:
the random number generation module is used for generating a quantum random number by using a quantum physical process as an entropy source through the quantum random number generator;
the noise generation module is used for substituting the quantum random number into a Laplace distribution function, and obtaining a random Laplace noise value through calculating the Laplace distribution function;
and the noise adding module is used for adding the random Laplace noise value to the target privacy data.
8. The apparatus of claim 7, wherein the random number generation module comprises:
the character string generation unit is used for generating a random number character string by using a quantum physical process as an entropy source through the quantum random number generator;
and the character string conversion unit is used for converting the random number character string into a target value in a preset value interval to obtain the quantum random number.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the method of private data processing of any one of claims 1-6.
10. A computer readable storage medium storing computer instructions for causing a processor to perform the method of private data processing according to any one of claims 1 to 6.
CN202311649883.4A 2023-12-04 2023-12-04 Private data processing method, device, equipment and storage medium Pending CN117633892A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311649883.4A CN117633892A (en) 2023-12-04 2023-12-04 Private data processing method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311649883.4A CN117633892A (en) 2023-12-04 2023-12-04 Private data processing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117633892A 2024-03-01

Family

ID=90033597

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311649883.4A Pending CN117633892A (en) 2023-12-04 2023-12-04 Private data processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117633892A (en)


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination