EP3036678A1 - Method and apparatus for utility-aware privacy preserving mapping in view of collusion and composition - Google Patents
Method and apparatus for utility-aware privacy preserving mapping in view of collusion and composition
- Publication number
- EP3036678A1 (application EP13812233.8A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- data
- bound
- public
- private
- privacy
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
Definitions
- This invention relates to a method and an apparatus for preserving privacy, and more particularly, to a method and an apparatus for preserving privacy of user data in view of collusion or composition.
- This service, or other benefit that the user derives from allowing access to the user's data, may be referred to as utility.
- privacy risks arise as some of the collected data may be deemed sensitive by the user, e.g., political opinion, health status, income level, or may seem harmless at first sight, e.g., product ratings, yet lead to the inference of more sensitive data with which it is correlated.
- the latter threat refers to an inference attack, a technique of inferring private data by exploiting its correlation with publicly released data.
- FIG. 1 is a pictorial example illustrating collusion and composition.
- FIG. 2 is a flow diagram depicting an exemplary method for preserving privacy, in accordance with an embodiment of the present principles.
- FIG. 3 is a flow diagram depicting another exemplary method for preserving privacy, in accordance with an embodiment of the present principles.
- FIG. 4 is a block diagram depicting an exemplary privacy agent, in accordance with an embodiment of the present principles.
- FIG. 5 is a block diagram depicting an exemplary system that has multiple privacy agents, in accordance with an embodiment of the present principles.
- the present principles provide a method for processing user data for a user, comprising the steps of: accessing the user data, which includes private data, a first public data and a second public data, the first public data corresponding to a first category of data, and the second public data corresponding to a second category of data; determining a first information leakage bound between the private data and a first and second released data; determining a second information leakage bound between the private data and the first released data, and a third information leakage bound between the private data and the second released data, responsive to the first information leakage bound; determining a first privacy preserving mapping that maps the first category of data to the first released data responsive to the second bound and a second privacy preserving mapping that maps the second category of data to the second released data responsive to the third bound; modifying the first and second public data for the user, based on the first and second privacy preserving mappings respectively, to form the first and second released data; and releasing the modified first and second public data to at least one of a service provider and a data collecting agency.
- the present principles also provide a method for processing user data for a user, comprising the steps of: accessing the user data, which includes private data, a first public data and a second public data, the first public data corresponding to a first category of data, and the second public data corresponding to a second category of data; determining a first information leakage bound between the private data and a first and second released data; determining a second information leakage bound between the private data and the first released data, and a third information leakage bound between the private data and the second released data, responsive to the first information leakage bound, wherein each of the second bound and the third bound substantially equals the first bound; determining a first privacy preserving mapping that maps the first category of data to the first released data responsive to the second bound and a second privacy preserving mapping that maps the second category of data to the second released data responsive to the third bound; modifying the first and second public data for the user, based on the first and second privacy preserving mappings respectively, to form the first and second released data; and releasing the modified first and second public data to at least one of a service provider and a data collecting agency.
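To make the relationship among the three bounds concrete, the following sketch uses my own notation (not the patent's numbered equations): write the first bound as ε on I(S; Y₁, Y₂) and the second and third bounds as ε₁ and ε₂ on the individual leakages. The two allocation strategies elaborated later in the description (the additive bound of Lemma 1 and the maximal-correlation argument) then correspond roughly to:

```latex
\begin{align*}
&\text{additive split (under Lemma 1's Markov assumptions):} \quad
I(S;Y_1,Y_2) \le I(S;Y_1) + I(S;Y_2) \le \epsilon_1 + \epsilon_2 \le \epsilon,\\
&\text{equal split (the ``substantially equal'' variant):} \quad
\epsilon_1 = \epsilon_2 = \epsilon, \ \text{enforced via } S^{*}(X_i;Y_i) \le \epsilon .
\end{align*}
```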
- the term analyst, which for example may be a part of a service provider's system, as used in the present application, refers to a receiver of the released data, who ostensibly uses the data in order to provide utility to the user. Often the analyst is a legitimate receiver of the released data. However, an analyst could also illegitimately exploit the released data and infer some information about private data of the user. This creates a tension between privacy and utility requirements. To reduce the inference threat while maintaining utility, the user may release a "distorted version" of data, generated according to a conditional probabilistic mapping, called a "privacy preserving mapping," designed under a utility constraint.
- we refer to the data a user would like to remain private as "private data," the data the user is willing to release as "public data," and the data the user actually releases as "released data."
- a user may want to keep his political opinion private, and is willing to release his TV ratings with modification (for example, the user's actual rating of a program is 4, but he releases the rating as 3).
- the user's political opinion is considered to be private data for this user
- the TV ratings are considered to be public data
- the released modified TV ratings are considered to be the released data.
- another user may be willing to release both political opinion and TV ratings without modifications, and thus, for this other user, there is no distinction between private data, public data and released data when only political opinion and TV ratings are considered. If many people release political opinions and TV ratings, an analyst may be able to derive the correlation between political opinions and TV ratings, and thus, may be able to infer the political opinion of the user who wants to keep it private.
- private data: this refers to data that the user not only indicates should not be publicly released, but also does not want to be inferred from other data that he would release.
- Public data is data that the user would allow the privacy agent to release, possibly in a distorted way to prevent the inference of the private data.
- public data is the data that the service provider requests from the user in order to provide him with the service. The user, however, will distort (i.e., modify) it before releasing it to the service provider.
- public data is the data that the user indicates as being "public” in the sense that he would not mind releasing it as long as the release takes a form that protects against inference of the private data.
- whether a specific category of data is considered as private data or public data is based on the point of view of a specific user. For ease of notation, we refer to a specific category of data as private data or public data from the perspective of the current user. For example, when trying to design a privacy preserving mapping for a current user who wants to keep his political opinion private, we call political opinion private data for both the current user and for another user who is willing to release his political opinion.
- we use the distortion between the released data and the public data as a measure of utility.
- when the distortion is larger, the released data is more different from the public data and more privacy is preserved, but the utility derived from the distorted data may be lower for the user.
- when the distortion is smaller, the released data is a more accurate representation of the public data and the user may receive more utility, for example, more accurate content recommendations.
- we model the privacy-utility tradeoff and design the privacy preserving mapping by solving an optimization problem minimizing the information leakage, which is defined as mutual information between private data and released data, subject to a distortion constraint.
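In symbols, the optimization just described can be sketched as follows (a reconstruction from this description, not the patent's own numbered formulation):

```latex
\[
\min_{P_{Y|X}} \; I(S;Y)
\quad \text{subject to} \quad
\mathbb{E}\big[d(X,Y)\big] \le D,
\qquad \text{where} \quad
I(S;Y) = \sum_{s,y} P_{S,Y}(s,y)\,\log\frac{P_{S,Y}(s,y)}{P_S(s)\,P_Y(y)} .
\]
```

Here d(·,·) is the chosen distortion metric and D the distortion level allowed by the utility constraint.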
- finding the privacy preserving mapping relies on the fundamental assumption that the prior joint distribution that links private data and released data is known and can be provided as an input to the optimization problem.
- the true prior distribution may not be known, but rather some prior statistics may be estimated from a set of sample data that can be observed.
- the prior joint distribution could be estimated from a set of users who do not have privacy concerns and publicly release different categories of data, which may be considered to be private or public data by the users who are concerned about their privacy.
- the marginal distribution of the public data to be released, or simply its second order statistics may be estimated from a set of users who only release their public data.
- the statistics estimated based on this set of samples are then used to design the privacy preserving mapping mechanism that will be applied to new users, who are concerned about their privacy.
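As a minimal illustration of this estimation step, the empirical joint distribution of (S, X) can be computed from samples released by users without privacy concerns. The function name and alphabets below are mine, chosen for illustration; they are not the patent's statistics collecting module.

```python
from collections import Counter
from itertools import product

def estimate_joint_pmf(samples, s_alphabet, x_alphabet):
    """Empirical estimate of P_{S,X} from (s, x) pairs released by users
    who have no privacy concerns (illustrative estimator only)."""
    counts = Counter(samples)
    n = len(samples)
    return {(s, x): counts[(s, x)] / n for s, x in product(s_alphabet, x_alphabet)}

# Example: S = political opinion in {0, 1}, X = TV rating in {1, ..., 5}
samples = [(0, 4), (0, 5), (1, 2), (1, 1), (0, 4), (1, 3)]
p_sx = estimate_joint_pmf(samples, s_alphabet=[0, 1], x_alphabet=[1, 2, 3, 4, 5])
```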
- the public data is denoted by a random variable X ∈ 𝒳 with the probability distribution P_X.
- X is correlated with the private data, denoted by random variable S ∈ 𝒮.
- the correlation of S and X is defined by the joint distribution P_{S,X}.
- the released data, denoted by random variable Y ∈ 𝒴, is a distorted version of X.
- Y is achieved via passing X through a kernel, P_{Y|X}.
- the term "kernel" refers to a conditional probability that maps data X to data Y probabilistically. That is, the kernel P_{Y|X} is the privacy preserving mapping that we wish to design.
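A small sketch of how the information leakage I(S; Y) follows from the joint distribution P_{S,X} and a candidate kernel P_{Y|X}; the data structures and function name are mine, for illustration only.

```python
import math

def information_leakage(p_sx, p_y_given_x):
    """Compute I(S; Y) in bits when Y is obtained by passing X through the
    kernel P_{Y|X}.  p_sx maps (s, x) -> P_{S,X}(s, x); p_y_given_x maps
    (y, x) -> P(Y=y | X=x)."""
    # Joint distribution P_{S,Y}(s, y) = sum_x P_{S,X}(s, x) * P_{Y|X}(y | x)
    p_sy = {}
    for (s, x), p in p_sx.items():
        for (y, xx), q in p_y_given_x.items():
            if xx == x:
                p_sy[(s, y)] = p_sy.get((s, y), 0.0) + p * q
    # Marginals P_S and P_Y
    p_s, p_y = {}, {}
    for (s, y), p in p_sy.items():
        p_s[s] = p_s.get(s, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    # Mutual information I(S; Y)
    return sum(p * math.log2(p / (p_s[s] * p_y[y]))
               for (s, y), p in p_sy.items() if p > 0)
```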
- D(·) is the K-L divergence
- E(·) is the expectation of a random variable
- H(·) is the entropy
- ε ∈ [0,1] is called the leakage factor
- I(S; Y) represents the information leakage.
- any distortion metric can be used.
- the leakage factor, ε, and the distortion level, D, characterize a privacy preserving mapping.
- our objective is to limit the amount of private information that can be inferred, given a utility constraint.
- the objective can be mathematically formulated as finding the probability mapping P_{Y|X} that minimizes the maximum information leakage I(S; Y) given a distortion constraint, where the maximum is taken over the uncertainty in the statistical knowledge of the distribution P_{S,X} available at the privacy agent:
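One way to write this worst-case formulation in symbols, reconstructed from the sentence above rather than copied from the patent's own equation:

```latex
\[
\min_{P_{Y|X}} \;\; \max_{P_{S,X} \in \mathcal{P}} \; I(S;Y)
\quad \text{subject to} \quad
\mathbb{E}\big[d(X,Y)\big] \le D,
\]
```

where 𝒫 denotes the class of joint distributions consistent with the statistical knowledge available at the privacy agent.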
- the probability distribution P_{S,Y} can be obtained from the joint distribution P_{S,X} and the privacy preserving mapping P_{Y|X}.
- Theorem 1 decouples the dependency of Y and S into two terms, one relating S and X, and one relating X and Y. Thus, one can upper bound the information leakage even without knowing P_{S,X}, by minimizing the term relating X and Y.
- the application of this result in our problem is the following:
- I(S; X) is the intrinsic information embedded in X about S, which we do not have control on.
- the value of the bound on I(S; X) does not affect the mapping we will find, but it affects what we think is the privacy guarantee (in terms of the leakage factor) resulting from this mapping. If the bound is tight, then the privacy guarantee will be tight. If the bound is not tight, we may then be paying more distortion than is actually necessary for a target leakage factor, but this does not affect the privacy guarantee.
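The decoupling described above is consistent with a bound of the following form, with S*(X; Y) the quantity discussed in the next bullets; this is a sketch of how Theorem 1 appears to be used, not a verbatim restatement of it:

```latex
\[
I(S;Y) \;\le\; S^{*}(X;Y)\; I(S;X) .
\]
```

Under this reading, S*(X; Y), which depends only on the mapping P_{Y|X} and on P_X, can be minimized (or constrained to be at most ε) without knowledge of P_{S,X}, while I(S; X) is the intrinsic term we do not control.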
- Maximal correlation is a measure of correlation between two random variables with applications both in information theory and computer science.
- maximal correlation provides its relation with S*(X; Y).
- R. Ahlswede and P. Gacs, "Spreading of sets in product spaces and hypercontraction of the Markov operator," The Annals of Probability (hereinafter "Ahlswede").
- Collusion: a private data, S, is correlated with two public data, X₁ and X₂.
- Each privacy preserving mapping is designed to protect against the inference of S from each of the released data separately.
- Decentralization simplifies the design, by breaking one large optimization with many variables (joint design) into several smaller optimizations with fewer variables.
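As a rough illustration of the size reduction (my own count of the free parameters of a conditional probability mapping over finite alphabets, assuming all alphabets have size 10):

```latex
\[
\underbrace{|\mathcal{X}_1||\mathcal{X}_2|\,\big(|\mathcal{Y}_1||\mathcal{Y}_2|-1\big)}_{\text{joint design}}
= 10\cdot 10\cdot 99 = 9900
\qquad \text{vs.} \qquad
\underbrace{|\mathcal{X}_1|\big(|\mathcal{Y}_1|-1\big) + |\mathcal{X}_2|\big(|\mathcal{Y}_2|-1\big)}_{\text{decentralized design}}
= 90 + 90 = 180 .
\]
```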
- Composition: a private data S is correlated with the public data X₁ and X₂ through the joint probability distribution P_{S,X₁,X₂}.
- FIG. 1 provides examples on collusion and composition:
- Example 1: collusion when a single private data and multiple public data are considered.
- Example 2: collusion when multiple private data and multiple public data are considered.
- Example 3: composition when a single private data and multiple public data are considered.
- Example 4: composition when multiple private data and multiple public data are considered.
- a private data, S, is correlated with two public data, X₁ and X₂.
- Netflix is a legitimate receiver of information about TV rating, but not snack rating
- Kraft Foods is a legitimate receiver of information about snack rating, but not TV rating. However, they may share information in order to infer more about the user's private data.
- In Example 2, private data S₁ is correlated with public data X₁, and private data S₂ is correlated with public data X₂.
- income as private data S₁
- gender as private data S₂
- TV rating as public data X₁
- snack rating as public data X₂.
- Two privacy preserving mappings are applied on these public data to obtain two released data, Y₁ and Y₂, provided to two analysts, respectively.
- In Example 3, a private data S is correlated with public data X₁ and X₂ through the joint probability distribution P_{S,X₁,X₂}.
- we consider political opinion as private data S, TV rating for Fox News as public data X₁, and TV rating for ABC News as public data X₂.
- An analyst, for example Comcast, asks for both X₁ and X₂.
- the privacy preserving mappings are designed separately and we want to analyze the privacy guarantees when the privacy agent combines her information Y₁ and Y₂ about both S₁ and S₂.
- Comcast is a legitimate receiver of both TV ratings for Fox News and ABC News.
- In Example 4, two private data, S₁ and S₂, are correlated with public data X₁ and X₂ through the joint probability distribution P_{S₁,S₂,X₁,X₂}.
- income as private data S₁
- gender as private data S₂
- TV rating as public data X₁
- snack rating as public data X₂.
- mappings for large size X are more difficult to design than mappings for small size X (possibly one variable, or a small vector), as the complexity of the optimization problem which provides a solution to the privacy mapping scales with the size of vector X.
- a private random variable S is correlated with X₁ and X₂.
- Distorted versions of X₁ and X₂ are denoted by Y₁ and Y₂, respectively.
- privacy preserving mappings P(Y₁|X₁) and P(Y₂|X₂) are applied on X₁ and X₂ to obtain Y₁ and Y₂, respectively, given distortion constraints.
- the individual information leakages are I(S; Y₁) and I(S; Y₂).
- Y₁ and Y₂ are combined together into a pair (Y₁, Y₂), either by colluding entities, or by a privacy agent through composition.
- a private random variable S is correlated with X₁ and X₂.
- Distorted versions of X₁ and X₂ are denoted by Y₁ and Y₂, respectively.
- privacy preserving mappings P_{Y₁|X₁} and P_{Y₂|X₂} are designed with given distortion constraints, and the individual information leakages are I(S; Y₁) and I(S; Y₂), respectively.
- the two released data Y₁ and Y₂ are combined together into a pair (Y₁, Y₂), either by colluding entities, or by a privacy agent through composition.
- Lemma 1: Assume Y₁, Y₂, and S form a Markov chain in any order. If the privacy preserving mappings leak I(Y₁; S) and I(Y₂; S) bits through Y₁ and Y₂, respectively, then at most I(Y₁; S) + I(Y₂; S) bits of information are leaked by the pair (Y₁, Y₂). In other words, I(S; Y₁, Y₂) ≤ I(Y₁; S) + I(Y₂; S). Moreover, if S → Y₁ → Y₂, then I(S; Y₁, Y₂) = I(S; Y₁).
- Lemma 1 applies regardless of how much knowledge on P s x is available when the mapping is designed.
- the bounds in Lemma 1 hold when P_{S,X} is known. They also hold if the privacy preserving mappings are designed using the method based on the separability result in Theorem 1.
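A short derivation sketch for one of the Markov orders covered by Lemma 1, namely Y₁ – S – Y₂ (Y₁ and Y₂ conditionally independent given S); this is my expansion of the stated result, not the patent's proof:

```latex
\begin{align*}
I(S;Y_1,Y_2) &= I(S;Y_1) + I(S;Y_2 \mid Y_1)\\
             &= I(S;Y_1) + H(Y_2 \mid Y_1) - H(Y_2 \mid S, Y_1)\\
             &= I(S;Y_1) + H(Y_2 \mid Y_1) - H(Y_2 \mid S) && \text{(since } Y_1 - S - Y_2 \text{)}\\
             &\le I(S;Y_1) + H(Y_2) - H(Y_2 \mid S) \;=\; I(S;Y_1) + I(S;Y_2).
\end{align*}
```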
- FIG. 2 illustrates an exemplary method 200 for preserving privacy in view of collusion or composition, in accordance with an embodiment of the present principles.
- Method 200 starts at step 205.
- it collects statistical information based on the single private data S and public data X₁ and X₂.
- it decides the cumulative privacy guarantee for the private data S in view of collusion or composition of released data Y₁ and Y₂. That is, it decides a leakage factor ε for I(S; Y₁, Y₂).
- the privacy preserving mappings are designed in a decentralized fashion for public data X₁ and X₂.
- it determines a privacy preserving mapping P_{Y₁|X₁} for public data X₁, given leakage factor ε₁ for I(S; Y₁).
- it determines a privacy preserving mapping P_{Y₂|X₂} for public data X₂, given leakage factor ε₂ for I(S; Y₂).
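A minimal sketch of this decentralized flow, assuming the additive split suggested by Lemma 1. The function design_mapping stands in for whatever per-variable optimization the privacy agent uses to meet a leakage target; both this helper and the even budget split are my assumptions, not steps the patent mandates.

```python
def method_200(stats_s_x1, stats_s_x2, epsilon, design_mapping):
    """Decentralized design of two privacy preserving mappings (sketch of method 200).

    epsilon        : cumulative leakage target chosen for I(S; Y1, Y2).
    design_mapping : caller-supplied routine design_mapping(stats, eps) returning
                     a kernel P_{Yi|Xi} whose leakage I(S; Yi) is at most eps.
    Under Lemma 1's Markov assumptions, I(S; Y1, Y2) <= I(S; Y1) + I(S; Y2),
    so an additive split eps1 + eps2 <= epsilon suffices (even split assumed here).
    """
    eps1 = eps2 = epsilon / 2.0
    p_y1_given_x1 = design_mapping(stats_s_x1, eps1)   # bound I(S; Y1), cf. step 230
    p_y2_given_x2 = design_mapping(stats_s_x2, eps2)   # bound I(S; Y2), cf. step 235
    return p_y1_given_x1, p_y2_given_x2
```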
- collusion may occur when a legitimate receiver of released data Y₁ (but not Y₂) exchanges information with a legitimate receiver of released data Y₂ (but not Y₁).
- both released data are legitimately received by the same receiver, and composition occurs when the receiver combines information from both released data to infer more information about the user.
- S*(X₁, X₂; Y₁, Y₂) ≤ max{S*(X₁; Y₁), S*(X₂; Y₂)}.
- I(Y₁, Y₂; S) ≤ S*(X₁, X₂; Y₁, Y₂) I(X₁, X₂; S) is the only required inequality, as mentioned in Anantharam, to obtain the inequality (20) (see Anantharam, page 10, part C).
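Putting the two previous bullets together gives the chain of inequalities that appears to justify constraining each mapping through its maximal-correlation term, as method 300 does below; this is my arrangement of the stated pieces, not a quotation of the patent's derivation:

```latex
\[
I(S;Y_1,Y_2)
\;\le\; S^{*}(X_1,X_2;\,Y_1,Y_2)\, I(S;X_1,X_2)
\;\le\; \max\{S^{*}(X_1;Y_1),\, S^{*}(X_2;Y_2)\}\, I(S;X_1,X_2)
\;\le\; \epsilon\, I(S;X_1,X_2),
\]
```

provided each separately designed mapping satisfies S*(Xᵢ; Yᵢ) ≤ ε.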
- FIG.3 illustrates an exemplary method 300 for preserving privacy in view of collusion or composition, in accordance with an embodiment of the present principles.
- Method 300 is similar to method 200, except that it requires S*(X₁; Y₁) ≤ ε (330) and S*(X₂; Y₂) ≤ ε (335). Note that method 200 works under some Markov chain assumptions stated in Lemma 1, while method 300 works more generally.
Multiple private data, multiple public data
- the cumulative information leakage of the pair Y₁ and Y₂ is bounded by (21). In particular, if X₁ and X₂ are independent, then this bound holds.
- method 200 determines privacy preserving mappings considering a single private data and two public data in view of collusion or composition.
- method 200 can be applied with some modifications.
- at step 210, we collect statistical information based on S₁, S₂, X₁ and X₂.
- at step 230, we design a privacy preserving mapping P_{Y₁|X₁} for public data X₁, given leakage factor ε₁ for I(S₁; Y₁).
- at step 235, we design a privacy preserving mapping P_{Y₂|X₂} for public data X₂, given leakage factor ε₂ for I(S₂; Y₂).
- at step 310, we collect statistical information based on S₁, S₂, X₁ and X₂.
- at step 335, we design a privacy preserving mapping P_{Y₂|X₂} for public data X₂, given leakage factor ε for I(S₂; Y₂).
- a privacy agent is an entity that provides privacy service to a user.
- a privacy agent may perform any of the following:
- FIG. 4 depicts a block diagram of an exemplary system 400 where a privacy agent can be used.
- Public users 410 release their private data (S) and/or public data (X).
- the information released by the public users becomes statistical information useful for a privacy agent.
- a privacy agent 480 includes statistics collecting module 420, privacy preserving mapping decision module 430, and privacy preserving module 440.
- Statistics collecting module 420 may be used to collect joint distribution P_{S,X}, marginal probability measure P_X, and/or mean and covariance of public data.
- Statistics collecting module 420 may also receive statistics from data aggregators, such as bluekai.com.
- privacy preserving mapping decision module 430 designs several privacy preserving mapping mechanisms.
- Privacy preserving module 440 distorts public data of private user 460 before it is released, according to the conditional probability.
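A minimal sketch of what "distorting public data according to the conditional probability" can look like in practice: sampling the released value Y from a designed kernel P_{Y|X}. The data layout and example kernel below are mine for illustration; the actual kernel would come from the privacy preserving mapping decision module 430.

```python
import random

def release(x, p_y_given_x, rng=random):
    """Sample a released value y ~ P(Y | X = x) from a designed kernel.

    p_y_given_x: dict mapping x -> list of (y, probability) pairs summing to 1.
    """
    r = rng.random()
    cumulative = 0.0
    for y, p in p_y_given_x[x]:
        cumulative += p
        if r <= cumulative:
            return y
    return p_y_given_x[x][-1][0]   # guard against floating-point round-off

# Example: the user's actual TV rating is 4, but with probability 0.4 a 3 is released.
kernel = {4: [(4, 0.6), (3, 0.4)]}
released_rating = release(4, kernel)
```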
- the privacy preserving module may design separate privacy preserving mappings for X₁ and X₂, respectively, in view of composition.
- each colluding entity may use system 400 to design a separate privacy preserving mapping.
- the privacy agent needs only the statistics to work, without knowledge of the entire data that was collected in the data collection module and that allowed the statistics to be computed.
- the data collection module could be a standalone module that collects data and then computes statistics, and need not be part of the privacy agent. The data collection module shares the statistics with the privacy agent.
- a privacy agent sits between a user and a receiver of the user data (for example, a service provider).
- a privacy agent may be located at a user device, for example, a computer, or a set-top box (STB).
- a privacy agent may be a separate entity.
- All the modules of a privacy agent may be located at one device, or may be distributed over different devices. For example, statistics collecting module 420 may be located at a data aggregator who only releases statistics to module 430; the privacy preserving mapping decision module 430 may be located at a "privacy service provider" or at the user end on the user device connected to module 420; and the privacy preserving module 440 may be located at a privacy service provider, who then acts as an intermediary between the user and the service provider to whom the user would like to release data, or at the user end on the user device.
- the privacy agent may provide released data to a service provider, for example, Comcast or Netflix, in order for private user 460 to improve the received service based on the released data; for example, a recommendation system provides movie recommendations to a user based on the user's released movie rankings.
- In FIG. 5, we show that there are multiple privacy agents in the system. In different variations, there need not be privacy agents everywhere, as this is not a requirement for the privacy system to work. For example, there could be a privacy agent only at the user device, or at the service provider, or at both. In FIG. 5, we show the same privacy agent "C" for both Netflix and Facebook. In another embodiment, the privacy agents at Facebook and Netflix can, but need not, be the same.
- the implementations described herein may be implemented in, for example, a method or a process, an apparatus, a software program, a data stream, or a signal.
- An apparatus may be implemented in, for example, appropriate hardware, software, and firmware.
- the methods may be implemented in, for example, an apparatus such as, for example, a processor, which refers to processing devices in general, including, for example, a computer, a microprocessor, an integrated circuit, or a programmable logic device.
- processors also include communication devices, such as, for example, computers, cell phones, portable/personal digital assistants ("PDAs"), and other devices that facilitate communication of information between end-users.
- the appearances of the phrase “in one embodiment” or “in an embodiment” or “in one implementation” or “in an implementation”, as well any other variations, appearing in various places throughout the specification are not necessarily all referring to the same embodiment.
- Determining the information may include one or more of, for example, estimating the information, calculating the information, predicting the information, or retrieving the information from memory.
- Accessing the information may include one or more of, for example, receiving the information, retrieving the information (for example, from memory), storing the information, processing the information, transmitting the information, moving the information, copying the information, erasing the information, calculating the information, determining the information, predicting the information, or estimating the information. Additionally, this application or its claims may refer to "receiving" various pieces of information. Receiving is, as with “accessing", intended to be a broad term. Receiving the information may include one or more of, for example, accessing the information, or retrieving the information (for example, from memory).
- receiving is typically involved, in one way or another, during operations such as, for example, storing the information, processing the information, transmitting the information, moving the information, copying the information, erasing the information, calculating the information, determining the information, predicting the information, or estimating the information.
- implementations may produce a variety of signals formatted to carry information that may be, for example, stored or transmitted.
- the information may include, for example, instructions for performing a method, or data produced by one of the described implementations.
- a signal may be formatted to carry the bitstream of a described embodiment.
- Such a signal may be formatted, for example, as an electromagnetic wave (for example, using a radio frequency portion of spectrum) or as a baseband signal.
- the formatting may include, for example, encoding a data stream and modulating a carrier with the encoded data stream.
- the information that the signal carries may be, for example, analog or digital information.
- the signal may be transmitted over a variety of different wired or wireless links, as is known.
- the signal may be stored on a processor-readable medium.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Bioethics (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Computer Security & Cryptography (AREA)
- Databases & Information Systems (AREA)
- Medical Informatics (AREA)
- Storage Device Security (AREA)
- Data Mining & Analysis (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Pure & Applied Mathematics (AREA)
- Mathematical Physics (AREA)
- Computational Mathematics (AREA)
- Mathematical Optimization (AREA)
- Mathematical Analysis (AREA)
- Evolutionary Biology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Operations Research (AREA)
- Probability & Statistics with Applications (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Algebra (AREA)
- Bioinformatics & Computational Biology (AREA)
- Automation & Control Theory (AREA)
Abstract
The described embodiments address the privacy-utility tradeoff faced by a user who wishes to disclose to an analyst public data that is correlated with his private data, in the hope of deriving some utility. When multiple data are disclosed to one or more analysts, privacy preserving mappings are designed in a decentralized fashion. In particular, each privacy preserving mapping is designed to protect against the inference of private data from each of the disclosed data separately. Decentralization simplifies the design by breaking one large joint optimization problem with many variables into several smaller optimizations with fewer variables.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361867544P | 2013-08-19 | 2013-08-19 | |
PCT/US2013/071287 WO2015026385A1 (fr) | 2013-08-19 | 2013-11-21 | Procédé et appareil d'association préservant la confidentialité tenant compte de l'utilité dans une optique de collusion et de composition |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3036678A1 true EP3036678A1 (fr) | 2016-06-29 |
Family
ID=49880941
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13812233.8A Withdrawn EP3036678A1 (fr) | 2013-08-19 | 2013-11-21 | Procédé et appareil d'association préservant la confidentialité tenant compte de l'utilité dans une optique de collusion et de composition |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP3036678A1 (fr) |
JP (1) | JP2016535898A (fr) |
KR (1) | KR20160044485A (fr) |
CN (1) | CN105612529A (fr) |
WO (1) | WO2015026385A1 (fr) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014031551A1 (fr) * | 2012-08-20 | 2014-02-27 | Thomson Licensing | Procédé et appareil de mise en correspondance de données de préservation de confidentialité avec compromis confidentialité-exactitude |
CN108073821B (zh) * | 2016-11-09 | 2021-08-06 | 中国移动通信有限公司研究院 | 数据安全处理方法及装置 |
EP3729319A1 (fr) * | 2017-12-18 | 2020-10-28 | Privitar Limited | Procédé ou système de diffusion de produit de données |
CN108763947B (zh) * | 2018-01-19 | 2020-07-07 | 北京交通大学 | 时间-空间型的轨迹大数据差分隐私保护方法 |
CN108763954B (zh) * | 2018-05-17 | 2022-03-01 | 西安电子科技大学 | 线性回归模型多维高斯差分隐私保护方法、信息安全系统 |
CN109766710B (zh) * | 2018-12-06 | 2022-04-08 | 广西师范大学 | 关联社交网络数据的差分隐私保护方法 |
JP2021056435A (ja) | 2019-10-01 | 2021-04-08 | 株式会社東芝 | 情報処理装置、情報処理方法、およびプログラム |
CN110968893A (zh) * | 2019-11-21 | 2020-04-07 | 中山大学 | 一种基于Pufferfish框架的针对关联分类数据序列的隐私保护方法 |
CN111461858B (zh) * | 2020-03-10 | 2023-02-17 | 支付宝(杭州)信息技术有限公司 | 基于隐私保护的连乘计算方法、装置、系统和电子设备 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7533808B2 (en) * | 2005-02-09 | 2009-05-19 | Yuh-Shen Song | Privacy protected cooperation network |
US20100036884A1 (en) * | 2008-08-08 | 2010-02-11 | Brown Robert G | Correlation engine for generating anonymous correlations between publication-restricted data and personal attribute data |
US8312273B2 (en) * | 2009-10-07 | 2012-11-13 | Microsoft Corporation | Privacy vault for maintaining the privacy of user profiles |
CN102624708A (zh) * | 2012-02-23 | 2012-08-01 | 浙江工商大学 | 一种面向云存储的高效数据加密、更新和访问控制方法 |
-
2013
- 2013-11-21 EP EP13812233.8A patent/EP3036678A1/fr not_active Withdrawn
- 2013-11-21 KR KR1020167004285A patent/KR20160044485A/ko not_active Application Discontinuation
- 2013-11-21 WO PCT/US2013/071287 patent/WO2015026385A1/fr active Application Filing
- 2013-11-21 JP JP2016536078A patent/JP2016535898A/ja not_active Withdrawn
- 2013-11-21 CN CN201380078967.5A patent/CN105612529A/zh active Pending
Non-Patent Citations (1)
Title |
---|
See references of WO2015026385A1 * |
Also Published As
Publication number | Publication date |
---|---|
CN105612529A (zh) | 2016-05-25 |
KR20160044485A (ko) | 2016-04-25 |
WO2015026385A1 (fr) | 2015-02-26 |
JP2016535898A (ja) | 2016-11-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3036678A1 (fr) | Procédé et appareil d'association préservant la confidentialité tenant compte de l'utilité dans une optique de collusion et de composition | |
US20160203333A1 (en) | Method and apparatus for utility-aware privacy preserving mapping against inference attacks | |
Ye et al. | Heterogeneous federated learning: State-of-the-art and research challenges | |
KR20160044553A (ko) | 가산성 잡음을 통한 유틸리티-인식 프라이버시 보호 매핑을 위한 방법 및 장치 | |
Zhou et al. | Kernelized probabilistic matrix factorization: Exploiting graphs and side information | |
Shen et al. | Epicrec: Towards practical differentially private framework for personalized recommendation | |
Salamatian et al. | How to hide the elephant-or the donkey-in the room: Practical privacy against statistical inference for large data | |
US20160210463A1 (en) | Method and apparatus for utility-aware privacy preserving mapping through additive noise | |
Shen et al. | Privacy-preserving personalized recommendation: An instance-based approach via differential privacy | |
US11106809B2 (en) | Privacy-preserving transformation of continuous data | |
US20150235051A1 (en) | Method And Apparatus For Privacy-Preserving Data Mapping Under A Privacy-Accuracy Trade-Off | |
WO2022160623A1 (fr) | Procédé d'apprentissage par agrégation de consensus d'enseignants basé sur une technologie de confidentialité différentielle à réponse aléatoire | |
US20160006700A1 (en) | Privacy against inference attacks under mismatched prior | |
EP3036677A1 (fr) | Procédé et appareil permettant un mappage utilitaire préservant la vie privée contre les attaques d'interférence | |
WO2015157020A1 (fr) | Procédé et appareil de mise en correspondance de préservation de confidentialité éparse | |
CN107609421A (zh) | 隐私保护协同Web服务质量预测的基于邻域的协同过滤方法 | |
Asad et al. | CEEP-FL: A comprehensive approach for communication efficiency and enhanced privacy in federated learning | |
WO2022237175A1 (fr) | Procédé et appareil de traitement de données de graphe, dispositif, support d'enregistrement et produit programme | |
Zhou et al. | Differentially private distributed learning | |
US20160203334A1 (en) | Method and apparatus for utility-aware privacy preserving mapping in view of collusion and composition | |
Weng et al. | Practical privacy attacks on vertical federated learning | |
Yang | Improving privacy preserving in modern applications | |
Jiang | Differentially Private Data Publishing | |
Qian et al. | FDP-FL: differentially private federated learning with flexible privacy budget allocation | |
Feng et al. | MPLDP: Multi-level Personalized Local Differential Privacy Method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20160315 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN |
|
18W | Application withdrawn |
Effective date: 20190426 |