CN109886030B - Privacy minimum exposure method facing service combination - Google Patents
- Publication number
- CN109886030B (application CN201910084983.4A)
- Authority
- CN
- China
- Prior art keywords
- privacy
- data
- chain
- user
- service
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Storage Device Security (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The invention relates to a privacy minimum exposure method for service composition, which ensures that the user's private information is exposed as little as possible, so that the composed service uses only a minimal set of the user's privacy data while still providing the application. An optimized exposure of the privacy data is obtained in two stages: in the first stage, discretization of the privacy data together with sensitivity analysis and service-availability analysis; in the second stage, acquisition of the privacy data. The purpose of protecting the user's private information is thereby achieved.
Description
Technical Field
The invention belongs to the technical field of privacy protection for service compositions, and in particular relates to a privacy minimum exposure method for service composition.
Background
Privacy was originally proposed as a non-functional attribute, a personal right to be "let alone". With the rapid development of the information industry, the scope of privacy has expanded into the software and network industries. With the popularization of social networks, the growing size and complexity of software, the wide use of mobile intelligent terminals, and users' increasing demands on software performance and reliability, non-functional attributes have become one of the important factors by which users and vendors evaluate software quality, and the ability to protect private information has become a main such factor for networks and software. With the emergence of virtualization and function-isolation theory and technology, represented by software-defined services and software-defined networks, the proportion of system functionality realized in software grows daily. To meet users' functional requirements and the evolving demands of privacy law, application scenarios, and context, the software model must also evolve. In an evolving environment, maintaining consistency between user privacy requirements and the software model, the correctness of the model after evolution, and the satisfaction of the privacy policy by the software model at runtime have become the difficulties of information security research under this new computing paradigm.
In the field of information systems and software engineering, privacy protection is defined as an individual's ability to control how others collect, expose, and maintain information about them. In a cloud computing environment, it is defined as the ability of a cloud service user to control how Personal Sensitive Information (PSI) is collected, used, exposed, and maintained by the cloud service provider. In the context of big data, cloud computing serves users as a new computing mode and is a focus of research. Beyond the advantages of traditional Web services, cloud computing provides services on demand and supports ubiquitous network access, location-independent resource pools, rapid resource expansion, and pay-per-use charging, and it exhibits characteristics such as service outsourcing, virtualization, distribution, and multi-tenancy. These characteristics improve the quality of cloud services and reduce waste of computing resources; for example, outsourcing of services can improve service capacity and the level of specialization through service composition. At the same time, cloud computing is a multi-party, collaborative, transparently interacting, and evolving platform: the user's privacy data is exchanged transparently with each collaborating party at the SaaS layer, and after the interaction completes it is stored and used by the cloud service participants, so the user loses control over the data. Consequently, after a service composition evolves, the user's privacy data held by participants who have left the composition is easily disclosed. A typical event is the new unified privacy policy Google implemented on March 1, 2012, over which it was sued by users in the United States and investigated in the European Union, where implementation was suspended.
According to the analysis of the American Electronic Privacy Information Center (EPIC), Google's new privacy policy did not address the use of privacy data by outsourced services or the configuration and management of privacy data transmission between services, could not correctly reflect users' privacy requirements, and potentially conflicted with local laws. In 2016, data of 320,000 users of the United States cable television company Time Warner Cable was stolen, more than 65 million Tumblr account email addresses were leaked, more than 167 million LinkedIn accounts were publicly sold on black markets, and Yahoo disclosed that at least 500 million user accounts had been stolen, the stolen contents including users' names, email addresses, telephone numbers, birthdays, passwords, and so on.
Research on privacy-enhancing technology for the composite-service computing process is still at the stage of conceptual models; for example, Professor Anupam Datta of Carnegie Mellon University protects private information using logical reasoning methods at different stages of the software development cycle. Research on privacy protection under the evolution of service composition processes is likewise in its initial stage: Qun Ni of Purdue University proposed a role-based privacy-aware model. Li Duan et al. proposed a privacy exposure recommendation method based on a privacy cost model, building a composite-service model from the privacy recommendation strategy required by the service and the attributes of the user, and ensuring that the user's privacy exposure cost is minimal. Rocky Slavin et al. proposed a privacy-policy violation detection framework for Android applications, using a privacy-policy ontology to represent a set of mappings from API interfaces to privacy policies, and demonstrated the validity of the method on real applications.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide a privacy minimum exposure method for service composition. The invention describes and analyzes privacy data discretely and measures its sensitivity and service availability, thereby protecting the security of the user's privacy data.
The invention discloses a privacy minimum exposure method facing service combination, which comprises the following steps:
step 1: judging whether the privacy data input by the user are discrete or continuous privacy data; if continuous, performing step 2; if discrete, performing step 6;
step 2: constructing a privacy ontology tree according to the relations between the privacy data;
step 3: detecting whether the privacy data contain key privacy data; if so, traversing the privacy ontology tree with the key privacy data as root node; if the key privacy data have child nodes, searching the child-node set, forming the child-node set into an inner discrete data chain, replacing the key privacy data with the inner discrete data chain, and deleting the off-chain data from the inner discrete data chain to obtain a new data chain, then performing step 5; if the key privacy data have no child nodes, inspecting the privacy data, searching for exposure chains, decomposing the existing exposure chains, and performing step 4; if no key privacy data are contained, searching for exposure chains, decomposing the existing exposure chains, and performing step 4;
step 4: searching the privacy ontology tree with the elements of the exposure chain as root nodes, searching for their child-node sets, forming each child-node set into an inner discrete data chain, replacing the elements of the exposure chain with the inner discrete data chains, and deleting the off-chain data from the inner discrete data chains to obtain a new data chain;
step 5: discretizing the new data chain;
step 6: acquiring the user's privacy sensitivities, and sorting the privacy sensitivities in ascending order to obtain the corresponding ordering of the privacy data;
step 7: constructing a state space tree according to the applicability metric value corresponding to each privacy datum, to obtain the privacy data set with minimum service availability; the constraint function for constructing the state space tree is that a node is expanded only if s + w(k+1) ≤ m and s + r ≥ m, where s is the sum of the applicability metric values already selected and r the sum of those not yet decided;
where m is the sum of the applicability metric values of the privacy items in the minimal privacy data set with which the composite service can provide the application to the user, W = {w1, w2, w3, ..., wn} is the set of applicability metric values of the privacy data, with wi ≤ wi+1, and X = {x1, x2, x3, ..., xn} is the state of the service-applicability privacy set, where xi ∈ {1, 0};
step 8: each time selecting the privacy datum with minimum sensitivity, intersecting it with the privacy data set of minimum service availability, and merging the intersection results to obtain the minimal privacy data set meeting the user's requirements.
Further, in step 6, if the user has a privacy requirement, the privacy sensitivity is defined, according to the user's privacy requirement, as a real number in the interval [0,1], where 0 denotes the weakest sensitivity and 1 the strongest; if the user has no privacy requirement, the privacy sensitivity is obtained from the frequency with which the user uses the privacy data.
Further, in step 5, the last privacy datum in the new data chain is deleted to obtain the discretized data chain.
Further, if the user has no privacy requirement, the sensitivity value of a privacy datum in the data chain is obtained from formula (1):
where μ is the number of uses of the privacy datum per ten thousand words, α is a coefficient, and k is a constant.
Further, after step 5, the discretized privacy data obtained in step 5 are assigned values to obtain a discretized privacy data chain instance, and whether the discretized privacy data meet the user's privacy requirements is judged from this instance; if they do, step 6 is performed; if not, the method returns to step 1.
Advantageous effects: compared with the prior art, the method exposes only the user's minimal privacy data set, which prevents leakage of the user's private information and protects the security of the user's sensitive privacy data.
Drawings
FIG. 1 is a schematic diagram of a privacy ontology tree;
FIG. 2 is a schematic diagram of a process of private data decomposition;
FIG. 3 is a schematic diagram of a privacy data minimal exposure framework.
Detailed Description
The invention is further illustrated below with reference to the figures and examples.
The privacy minimum exposure method for service composition ensures that the user's private information is exposed as little as possible, so that the composed service uses only a minimal set of the user's privacy data while providing the application. An optimized exposure of the privacy data is obtained in two stages: in the first stage, discretization of the privacy data together with sensitivity analysis and service-availability analysis; in the second stage, acquisition of the privacy data. The purpose of protecting the user's private information is thereby achieved. The method specifically comprises the following steps:
Definition 1 (privacy data exposure chain, DC — disclosure chain): among the privacy data to be protected by the user, a partially ordered set defined according to the user's sensitivity, where PD is the privacy data set, i.e. PD = {pd1, pd2, ..., pdi, ..., pdn}, for example {name, IDcode, address, phoneNumber, ...}. If the elements of the set jointly satisfy the disclosure condition, DC is called an exposure chain, for example {name, address}, {name, phoneNumber}. The number of elements |DC| of the partially ordered set is called the length of the exposure chain.
Definition 2 (discrete/continuous privacy data, DP/CP): discrete privacy data means that no combination of any two privacy data in the privacy data set leads to the exposure of the user's hidden private information, i.e. the set contains no exposure chain; for example {name, zipcode, province, age, career}. Continuous privacy data means that some combination of privacy data in the set may lead to the exposure of the user's private information, i.e. the set contains an exposure chain; for example:
{name, IDcard, address, phoneNumber, zipcode, age}.
Definition 3 (key privacy data, KP): a privacy item that can by itself determine the identity information of the user. If there exists in the privacy data set a privacy datum whose combination with any other privacy datum is continuous privacy data, this datum is called key privacy data; for example IDcard. Conversely, a privacy item that cannot by itself determine the identity information of the user is non-key privacy data (NKP), i.e. NKP = PD \ KP; for example {name, address, phoneNumber, zipcode, age}.
Definition 4 (combined discrete data chain, DPC): a chain composed of discrete data sets is called a combined discrete data chain. A combined discrete data chain must satisfy one of the following two conditions:
(1) in the discrete data set, the combination of any one discrete datum with continuous or discrete privacy data does not derive continuous privacy data; such a discrete data chain is called an outer combined discrete data chain.
(2) if the chain length of the privacy data is a, the set formed by the privacy data chain combined with the key privacy data can derive continuous privacy data, but no single element of the set combined with the key privacy data derives continuous privacy data; such a discrete data chain is called an inner combined discrete data chain.
For example: {name, zipcode, province, age, career} and {country, province, city, district, street} are both discrete data chains.
Proof: by the definition of the key privacy data, its combination with each element of the chain yields continuous privacy data, while no single element does so by itself; the chain is therefore an inner combined discrete data chain.
According to the definition of continuous privacy data (Definition 2), continuous privacy data necessarily contain an exposure chain and may contain key privacy data.
Protecting continuous privacy data means decomposing it into discrete privacy data. Since an exposure chain is necessarily contained, the parent data in the exposure chain that possess an inner discrete data chain are decomposed, converting the data into outer discrete data.
Detect the exposure chains present in the continuous privacy data; search the privacy ontology tree with an element of the exposure chain as root node and find its set of child nodes; form the child nodes into an inner discrete data chain i-dpc; replace the privacy datum pd_i with the inner discrete data chain i-dpc; delete the off-chain data from the inner discrete data chain. The original exposure chain is thereby replaced by a non-exposure chain.
Because key privacy data may be contained, the privacy ontology tree is traversed with the key privacy data as root node; if the key privacy data have no child nodes, the privacy data set is inspected, exposure chains are found, and the existing exposure chains are decomposed.
If the key privacy data have child nodes, the privacy ontology tree is searched with the key privacy data kpd as root node and the child-node set is found; the child nodes form an inner discrete data chain i-dpc; the key privacy data kpd are replaced with the inner discrete data chain i-dpc; the off-chain data are deleted from the inner discrete data chain; the original exposure chain is replaced with a non-exposure chain.
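The decomposition procedure above can be sketched as follows. The ontology encoding and the function name are our illustration; following the worked example later in the text, only Address is decomposed here, and the last child datum is dropped so the spliced inner chain is discrete:

```python
# Illustrative sketch: the privacy ontology tree as a parent -> children
# mapping; decomposition splices an element's child nodes (the inner
# discrete data chain i-dpc) in its place.
ONTOLOGY = {
    "Name": ["FirstName", "SecondName", "LastName"],
    "Address": ["Country", "Province", "City", "Street", "Community"],
}

def decompose(chain, element):
    children = ONTOLOGY.get(element)
    if not children:
        return chain                      # no child nodes: cannot decompose
    i = chain.index(element)
    # splice in the inner discrete data chain minus its last datum (discretize)
    return chain[:i] + children[:-1] + chain[i + 1:]

chain = ["Name", "Address", "PhoneNumber", "Postcode", "Age"]
chain = decompose(chain, "Address")
print(chain)
# ['Name', 'Country', 'Province', 'City', 'Street', 'PhoneNumber', 'Postcode', 'Age']
```

This reproduces the items of the discrete privacy data chain instance obtained in the example section.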
The invention performs sensitivity analysis and service-availability analysis of the privacy data as follows:
Definition 5 (privacy sensitivity, PS): privacy sensitivity is the degree of sensitivity of a user to personal privacy data. Let P = {p0, p1, ..., pn−1} be the user's privacy data chain; the sensitivities can be expressed as SV = {sv0, sv1, ..., svn−1}, where svi denotes the sensitivity of datum pi, 0 ≤ i ≤ n−1. The privacy sensitivity of a user falls into two cases:
(1) when the user has a privacy requirement, defining the sensitivity degree of the privacy information of the user as any real number in a [0,1] interval according to the privacy requirement of the user, wherein 0 represents the weakest sensitivity, and 1 represents the strongest sensitivity;
(2) when the user has no privacy requirement, the sensitivity is obtained from the frequency with which the user uses the privacy data, i.e. it can be obtained by the following relation, formula (1):
where μ is the number of uses of the privacy datum per ten thousand words, α is a coefficient, and k is a constant.
Theorem 1: if the user's privacy data are selected in ascending order of sensitivity, namely:
1/sv0 ≥ 1/sv1 ≥ ... ≥ 1/svn−1,
and the privacy data can be decomposed, then whenever the sensitivity of the privacy data set required by the user does not exceed the selected sensitivity, the privacy data sets selected in this order are necessarily optimal data sets satisfying the user's privacy requirements.
Proof: let X = (x0, x1, ..., xn−1), 0 ≤ xi ≤ 1, 0 ≤ i < n, be the optimal privacy data set satisfying the user's privacy requirements. If the service can be provided to the user without exposing any privacy data, i.e. xi = 0 for all i, then X is the optimal solution satisfying the user's privacy requirements. Otherwise, let r be the smallest index such that xr ≠ 1. From the in-order manner of selection, the solution of the privacy data set has the form:
X = (1, ..., 1, xr, 0, ..., 0), 0 ≤ xr ≤ 1
If X is not the optimal solution satisfying the user's privacy requirements, then there is another feasible solution Y = (y0, y1, ..., yk, ..., yn−1) that is optimal. Let k be the smallest index such that yk ≠ xk; then necessarily yk < xk, which can be shown in 3 cases:
(1) if k < r, then xk = 1 and yk ≠ xk, so yk < xk.
(2) if k = r, then xk is the maximum amount of the k-th privacy datum that, after decomposition, can be added while still satisfying the user's privacy data set, so yk > xk is impossible; since yk ≠ xk, necessarily yk < xk.
(3) if k > r, then xi = 0 for r < i < n, and if yk ≠ 0 the privacy data set would not satisfy the user's privacy requirements.
In conclusion, necessarily yk < xk.
Now replace yk in Y = (y0, y1, ..., yk, yk+1, ..., yn−1) with xk, obtaining a new privacy data set Z = (z0, z1, ..., zk, zk+1, ..., zn−1); note that zi = yi = xi for 0 ≤ i < k, and zk = xk after the replacement. Then Z is feasible and no worse than Y, contradicting the assumption that X is not optimal.
Therefore X = (x0, x1, ..., xn−1), 0 ≤ xi ≤ 1, 0 ≤ i < n, is the optimal privacy data set satisfying the user's privacy requirements.
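The selection that Theorem 1 justifies amounts to a single ascending sort by sensitivity. A minimal sketch using the sensitivity values of the worked example below (the variable names are ours, not from the patent):

```python
# Sensitivity values from the worked example; Theorem 1 says the optimal
# selection order is simply ascending sensitivity.
sv = {"LN": 0.3, "Postcode": 0.2, "PhoneNum": 0.8, "Street": 0.22,
      "City": 0.18, "Province": 0.1, "Country": 0.03, "Age": 0.6}
order = sorted(sv, key=sv.get)  # least sensitive first
print(order)
# ['Country', 'Province', 'City', 'Postcode', 'Street', 'LN', 'Age', 'PhoneNum']
```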
Definition 6 (service-applicability privacy set, PA — Service Availability): the minimal privacy data set with which the service can provide the application to the user; this set can be constructed using a state space tree. Let PA = {pA1, pA2, ..., pAn} be the user's privacy data set, and let W = {w1, w2, w3, ..., wn} be the applicability metric values set by the service provider for each item of user privacy information, assumed sorted in ascending order, i.e. wi ≤ wi+1. Let X = {x1, x2, x3, ..., xn} be the state of the service-applicability privacy set, where xi ∈ {1, 0}.
The constraint function for constructing the state space tree is: a node at level k with partial sum s = Σ(i≤k) wi·xi is expanded only if s + w(k+1) ≤ m and s + r ≥ m, where r is the sum of the applicability values not yet decided, and m is the applicability sum of the minimal privacy data set with which the composite service can provide the application to the user. The recursion terminates correctly at level k = n−1: by the preconditions of the recursive algorithm, on entering level n−1 there always holds s + w(n−1) ≤ m and s + r ≥ m with r = w(n−1); if either condition fails, level n−1 is never entered. Hence the function is guaranteed to terminate provided the initial conditions hold when the algorithm is invoked.
The following is the solution algorithm for the service-applicability privacy set. Lines 4–7: when s + wk = m, output a feasible solution, i.e. one minimal privacy data set that makes the service available. Lines 8–9: when s + wk + wk+1 < m, search the left subtree of the state space tree. Lines 10–13: when (s + r − wk ≥ m) && (s + wk+1 < m), search the right subtree of the state space tree. Lines 15–23: compute the value of r.
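The line-by-line description above is consistent with the classic sum-of-subsets backtracking scheme, which could be sketched as follows. This is a sketch under that assumption, not the patent's own listing; all names are ours, and integer weights (scaled by 100) avoid floating-point drift:

```python
# Assumed sketch of the state-space search: expand a node only while the
# partial sum s can still reach m and the remaining sum r keeps m attainable.
def availability_sets(w, m):
    """All index subsets of the ascending weight list w whose sum equals m."""
    n = len(w)
    solutions = []

    def backtrack(k, s, r, chosen):
        # node invariant: s + w[k] <= m and s + r >= m
        if s + w[k] == m:                                # feasible solution found
            solutions.append(chosen + [k])
        elif k + 1 < n and s + w[k] + w[k + 1] <= m:     # left subtree: take item k
            backtrack(k + 1, s + w[k], r - w[k], chosen + [k])
        if k + 1 < n and s + r - w[k] >= m and s + w[k + 1] <= m:
            backtrack(k + 1, s, r - w[k], chosen)        # right subtree: skip item k

    if w and sum(w) >= m and w[0] <= m:                  # initial conditions
        backtrack(0, 0, sum(w), [])
    return solutions

# weights from the worked example (x100), sorted ascending; m = 2.4 -> 240
names = ["Postcode", "Age", "Country", "Province", "LN", "City", "Street", "PhoneNum"]
w = [10, 13, 15, 20, 40, 50, 60, 90]
sols = [{names[i] for i in s} for s in availability_sets(w, 240)]
# the only feasible set on this data is {PhoneNum, LN, Street, City}
```

On the worked example's data this reproduces the minimal service-availability set obtained later in the text.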
Definition 7 (privacy data–service satisfiability, PS-A): the satisfiability measure between the privacy data and the service is obtained by computing the intersection of the privacy data set meeting the user's privacy requirements and the privacy data set making the service operational, i.e. PS-A = PS ∩ PA = {ps1, ps2, ..., psn} ∩ {pA1, pA2, ..., pAn}. There are 4 cases for the intersection result:
(1) the intersection is the applicability set, i.e. PS-A = PS ∩ PA = {pA1, pA2, ..., pAn}: this privacy data set satisfies the inputs and preconditions of the service, and the service is operational.
(2) the intersection is the privacy-sensitivity set, i.e. PS-A = PS ∩ PA = {ps1, ps2, ..., psn}: this privacy data set does not satisfy the inputs and preconditions of the service, and the service is not operational; in this case the elements of the privacy-sensitivity set are decomposed and the intersection is taken again.
(3) the intersection is a proper subset of the privacy-sensitivity set or of the applicability set: the privacy data set does not satisfy the inputs and preconditions of the service, and the service is not operational; in this case the elements of the complement of the intersection can be decomposed and the intersection taken again.
(4) the intersection is empty: this privacy data set does not satisfy the inputs and preconditions of the service, and the service is not operational.
Example:
Assume the service composer collects as the inputs and preconditions of all service participants the user's Name, home Address, PhoneNumber, Postcode, and Age, among others. The user Hao Wang designates his privacy data Name, Address, and PhoneNumber as an exposure chain, and Name as key privacy data. The privacy data set formed by the inputs and preconditions required by the service provider is therefore continuous privacy data. The user's privacy data can be protected by the following steps:
firstly, constructing a privacy ontology tree according to the relationship between privacy data, as shown in fig. 1;
Secondly, by matching, the privacy data Name, Address, and PhoneNumber corresponding to the exposure chain in the continuous privacy data are obtained; traversing the privacy ontology tree shows that Name has child nodes FirstName, SecondName, and LastName, and Address has child nodes Country, Province, City, Street, and Community.
Thirdly, the privacy data Name and Address are decomposed, i.e. replaced by their child nodes.
Fourthly, the newly formed privacy data chain is discretized, i.e. the last privacy datum of the inner discrete data chain is deleted, turning the continuous data chain into a discrete data chain.
Fifthly, the privacy data in the discrete data chain are assigned values, yielding a discrete privacy data chain instance. This is also the privacy data set exposed to the service provider after the user confirms the privacy data, namely:
Name(HAO WANG);Street(MOFAN);City(NANJING);Province(JIANGSU); Country(CHINA);PhoneNumber(+86-123456789);Postcode(210033);Age(30);
according to the exposure chain and the key privacy data defined by the user, the finally obtained privacy data chain meets the privacy requirements of the user.
Assume that, by formula (1) in Definition 5, the discretized privacy data have the sensitivity values (LN, Postcode, PhoneNum, Street, City, Province, Country, Age) = (0.3, 0.2, 0.8, 0.22, 0.18, 0.1, 0.03, 0.6). Sorting the sensitivity values in ascending order gives the corresponding ordering of the privacy data: (Country, Province, City, Postcode, Street, LN, Age, PhoneNum).
The corresponding service-applicability metric values are (LN, Postcode, PhoneNum, Street, City, Province, Country, Age) = (0.4, 0.1, 0.9, 0.6, 0.5, 0.2, 0.15, 0.13), with m = 2.4. Constructing a state space tree under the constraint function of Definition 6 yields the minimal service-availability privacy data set (PhoneNum, LN, Street, City).
The client each time selects the privacy datum with minimum sensitivity and intersects it with the minimal service-availability privacy data set; the union of the intersection results gives the minimal privacy data set satisfying the user's requirements: {City, Street, LN, PhoneNum}.
Intersection operations:
{City}∩{PhoneNum,LN,Street,City}={City}
{Street}∩{PhoneNum,LN,Street,City}={Street}
{LN}∩{PhoneNum,LN,Street,City}={LN}
{PhoneNum}∩{PhoneNum,LN,Street,City}={PhoneNum}
Union of the results: {City} ∪ {Street} ∪ {LN} ∪ {PhoneNum} = {City, Street, LN, PhoneNum}.
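Step 8 of the worked example, the per-item intersections followed by their union, can be sketched as follows (the variable names are ours; the data reproduce the example above):

```python
# Select least-sensitive privacy data first; intersect each pick with the
# minimal service-availability set and union the results.
sensitivity_order = ["Country", "Province", "City", "Postcode",
                     "Street", "LN", "Age", "PhoneNum"]
availability = {"PhoneNum", "LN", "Street", "City"}

result = set()
for item in sensitivity_order:
    result |= {item} & availability      # union of the intersections
    if result == availability:           # stop once the service is operational
        break
print(sorted(result))  # ['City', 'LN', 'PhoneNum', 'Street']
```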
Claims (5)
1. A privacy minimum exposure method for service composition, characterized in that the method comprises the following steps:
step 1: judging whether the privacy data input by the user are discrete privacy data or continuous privacy data; if continuous privacy data, performing step 2; if discrete privacy data, performing step 6;
step 2: constructing a privacy ontology tree according to the relations between the privacy data;
step 3: detecting whether the privacy data contain key privacy data; if so, traversing the privacy ontology tree with the key privacy data as root node; if the key privacy data contain child nodes, searching the child-node set, forming the child-node set into an inner discrete data chain, replacing the key privacy data with the inner discrete data chain, and at the same time deleting the off-chain data from the inner discrete data chain to obtain a new data chain, then performing step 5; if the key privacy data have no child nodes, detecting the privacy data, searching for exposure chains, decomposing the existing exposure chains, and performing step 4; if the privacy data do not contain key privacy data, searching for exposure chains, decomposing the existing exposure chains, and performing step 4;
step 4: searching the privacy ontology tree with the elements of the exposure chain as root nodes, searching for their child-node sets, forming each child-node set into an inner discrete data chain, replacing the elements of the exposure chain with the inner discrete data chains, and deleting the off-chain data from the inner discrete data chains to obtain a new data chain;
step 5: discretizing the new data chain;
step 6: acquiring the user's privacy sensitivities, and sorting the privacy sensitivities in ascending order to obtain the corresponding ordering of the privacy data;
step 7: constructing a state space tree according to the applicability metric value corresponding to each privacy datum, to obtain the privacy data set with minimum service availability; the constraint function for constructing the state space tree is that a node is expanded only if s + w(k+1) ≤ m and s + r ≥ m, where s is the sum of the applicability metric values already selected and r the sum of those not yet decided;
where m is the sum of the applicability metric values of each item of privacy information in the minimal privacy data set with which the composite service can provide applications for the user, W = {w1, w2, w3, ..., wn} is the set of applicability metric values of the privacy data, with wi ≤ wi+1, and X = {x1, x2, x3, ..., xn} is the state of the service-applicability privacy set, where xi ∈ {1, 0};
step 8: selecting each time the privacy datum with minimum sensitivity, intersecting it with the privacy data set of minimum service availability, and merging the intersection results to obtain the minimal privacy data set meeting the user's requirements.
2. The privacy minimum exposure method for service composition according to claim 1, characterized in that: in step 6, if the user has a privacy requirement, the privacy sensitivity is defined, according to the user's privacy requirement, as a real number in the interval [0,1], where 0 denotes the weakest sensitivity and 1 the strongest; if the user has no privacy requirement, the privacy sensitivity is obtained from the frequency with which the user uses the privacy data.
3. The privacy minimum exposure method for service composition according to claim 1, characterized in that: in step 5, the last privacy datum in the new data chain is deleted to obtain the discretized data chain.
4. The privacy minimum exposure method for service composition according to claim 2, characterized in that: if the user has no privacy requirement, the sensitivity value of a privacy datum in the data chain is obtained from formula (1):
where μ is the number of uses of the privacy datum per ten thousand words, α is a coefficient, and k is a constant.
5. The privacy minimum exposure method for service composition according to claim 1, characterized in that: after step 5, the discretized privacy data obtained in step 5 are assigned values to obtain a discretized privacy data chain instance, and whether the discretized privacy data meet the user's privacy requirements is judged from this instance; if they do, step 6 is performed; if not, the method returns to step 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910084983.4A CN109886030B (en) | 2019-01-29 | 2019-01-29 | Privacy minimum exposure method facing service combination |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910084983.4A CN109886030B (en) | 2019-01-29 | 2019-01-29 | Privacy minimum exposure method facing service combination |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109886030A CN109886030A (en) | 2019-06-14 |
CN109886030B (en) | 2021-06-11 |
Family
ID=66927206
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910084983.4A Active CN109886030B (en) | 2019-01-29 | 2019-01-29 | Privacy minimum exposure method facing service combination |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109886030B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111539012B (en) * | 2020-03-19 | 2021-07-20 | Chongqing Terminus Smart Technology Co., Ltd. | Privacy data distribution storage system and method of edge framework |
CN113127916B (en) * | 2021-05-18 | 2023-07-28 | Tencent Technology (Shenzhen) Co., Ltd. | Data set processing method, data processing method, device and storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104378370A (en) * | 2014-11-12 | 2015-02-25 | Nanjing University of Posts and Telecommunications | Secure use method of privacy data in cloud computation
CN106572111A (en) * | 2016-11-09 | 2017-04-19 | Nanjing University of Posts and Telecommunications | Big-data-oriented privacy information release exposure chain discovery method
CN108446568A (en) * | 2018-03-19 | 2018-08-24 | Northwest University | Histogram data publishing method with de-trending differential privacy protection
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB201517331D0 (en) * | 2015-10-01 | 2015-11-18 | Chase Information Technology Services Ltd And Cannings Nigel H | System and method for preserving privacy of data in a cloud |
- 2019-01-29: application CN201910084983.4A filed (CN); granted as patent CN109886030B, legal status Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104378370A (en) * | 2014-11-12 | 2015-02-25 | Nanjing University of Posts and Telecommunications | Secure use method of privacy data in cloud computation
CN106572111A (en) * | 2016-11-09 | 2017-04-19 | Nanjing University of Posts and Telecommunications | Big-data-oriented privacy information release exposure chain discovery method
CN108446568A (en) * | 2018-03-19 | 2018-08-24 | Northwest University | Histogram data publishing method with de-trending differential privacy protection
Non-Patent Citations (2)
Title |
---|
"Privacy Disclosure Checking Method Applied on Collaboration Interactions Among SaaS Services";Changbo Ke, Zhiqiu Huang, Xiaohui Cheng;《IEEE Access》;20170822;全文 * |
"Privacy-aware cloud service selection approach based on P-Spec policy models and privacy sensitivities";Yunfei Meng, Zhiqiu Huang, Yu Zhou, Changbo Ke;《https://doi.org/10.1016/j.future.2018.03.013》;20180312;全文 * |
Also Published As
Publication number | Publication date |
---|---|
CN109886030A (en) | 2019-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Serafino et al. | True scale-free networks hidden by finite size effects | |
WO2021046541A1 (en) | Graph outcome determination in domain-specific execution environment | |
US7523135B2 (en) | Risk and compliance framework | |
US8626835B1 (en) | Social identity clustering | |
US20150278699A1 (en) | Rules based data processing system and method | |
US20230161746A1 (en) | Data pipeline branching | |
US9830333B1 (en) | Deterministic data replication with conflict resolution | |
CN109257364B (en) | Multi-core mesh type multi-level cross-domain access control method based on cloud platform | |
CN114600420A (en) | Pruning entries in a tamper-resistant data storage device | |
CN110717076A (en) | Node management method, node management device, computer equipment and storage medium | |
CN109886030B (en) | Privacy minimum exposure method facing service combination | |
CN113626128B (en) | Audio-visual media micro-service third-party module access method, system and electronic equipment | |
CN107590189B (en) | Intelligent contract execution method, device, equipment and storage medium | |
Zhou et al. | Measuring inconsistency in dl-lite ontologies | |
Andročec et al. | Ontologies for platform as service APIs interoperability | |
Chen et al. | Decomposition of UML activity diagrams | |
CN109495460B (en) | Privacy policy dynamic updating method in combined service | |
CN111191050A (en) | Method and device for constructing knowledge graph body model | |
Ke et al. | Ontology-based privacy data chain disclosure discovery method for big data | |
CN115840738A (en) | Data migration method and device, electronic equipment and storage medium | |
CN113835780A (en) | Event response method and device | |
Mukunthan et al. | Multilevel Petri net‐based ticket assignment and IT management for improved IT organization support | |
Marín et al. | Devolved ontology in practice for a seamless semantic alignment within dynamic collaboration networks of smes | |
CN113779017A (en) | Method and apparatus for data asset management | |
Thirumaran et al. | Parallel analytic hierarchy process for web service discovery and composition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
EE01 | Entry into force of recordation of patent licensing contract | ||
Application publication date: 20190614 Assignee: NUPT INSTITUTE OF BIG DATA RESEARCH AT YANCHENG Assignor: NANJING University OF POSTS AND TELECOMMUNICATIONS Contract record no.: X2021980013920 Denomination of invention: A service composition oriented privacy minimum exposure method Granted publication date: 20210611 License type: Common License Record date: 20211202 |