WO2020087878A1 - Privacy information management method, apparatus and system - Google Patents

Privacy information management method, apparatus and system

Info

Publication number: WO2020087878A1
Authority: WIPO (PCT)
Prior art keywords: privacy, information, privacy protection, protection, algorithm
Application number: PCT/CN2019/083048
Other languages: English (en), French (fr)
Inventors: 李凤华, 李晖, 牛犇, 朱辉
Original Assignee: 中国科学院信息工程研究所, 西安电子科技大学
Application filed by 中国科学院信息工程研究所 and 西安电子科技大学
Publication of WO2020087878A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 - Protecting data
    • G06F21/62 - Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218 - Protecting access to data via a platform to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245 - Protecting personal data, e.g. for financial or medical purposes

Definitions

  • The present disclosure relates to, but is not limited to, the field of information application technology, and in particular to a privacy information management method, apparatus, and system.
  • Privacy information management methods in the related art manage only specific information types and specific information processing stages; they lack comprehensive management of links such as the fusion of privacy information, the design of privacy protection algorithms, and the evaluation of protection effects, and cannot support functions such as extended control or the tracking and tracing of privacy-violating behavior.
  • The privacy information management methods in the related art mostly focus on relatively isolated application scenarios and technical points.
  • A single protection scheme is proposed for the problems of a given application scenario; when faced with diverse privacy information, such schemes cannot provide adaptive privacy protection algorithms and parameter adjustment according to factors such as the type of privacy information, the application scenario, and the privacy protection requirements.
  • An embodiment of the present invention provides a privacy information management method, including any combination of one or more of the following steps:
  • Determine a privacy protection scheme according to any combination of one or more of the following, and use the determined privacy protection scheme to protect the privacy information: the format and/or type of the privacy information; the scene description information and/or privacy scene description information; the privacy operations supported by the privacy information; and the privacy protection requirements.
  • Determining the privacy information, determining the scene description information and/or privacy scene description information, determining the privacy operations supported by the privacy information, and determining the privacy protection requirements of the privacy information may include:
  • The privacy operations supported by the privacy information can be determined according to any combination of one or more of the following: the format and/or type of the privacy information; the semantics of the privacy information; and the privacy scene description information.
  • extracting the private information from the information may include:
  • At least one information component is extracted from the information vector as a privacy information component, and the extracted privacy information component constitutes the privacy information.
  • the extracting at least one information component from the information vector as the privacy information component may include:
  • An information component with a sensitivity greater than a sensitivity threshold is extracted from the information vector as the privacy information component, and the sensitivity may be determined according to the user's subjective sensitivity to the information component and the information entropy of the information component.
  • Determining the privacy information, determining the scene description information and/or privacy scene description information, determining the privacy operations supported by the privacy information, and determining the privacy protection requirements of the privacy information may include:
  • extracting at least one information component from the information vector as the privacy information component according to the scene description information, where the extracted privacy information components constitute the privacy information, and the privacy scene description information corresponding to the privacy information is extracted according to the scene description information;
  • the method may further include:
  • The privacy protection requirement refers to the degree to which the information holder wishes the privacy information to be protected in a specific application scenario, and may include any combination of one or more of the following: privacy protection expectations and constraints;
  • The privacy protection expectations may include any combination of one or more of the following: the probability that, after privacy protection, an attacker can infer the privacy information as it was before protection; the uncertainty of the privacy information after privacy protection; and the expected loss between the privacy information after privacy protection and the privacy information before privacy protection;
  • The constraints may include any one or more of the following: the correspondence between privacy scene description information, the privacy operations allowed by the information holder, and the privacy information; and the correspondence between privacy scene description information, the privacy operations not allowed by the information holder, and the privacy information.
  • the scene description information refers to the state information of the information
  • the privacy scene description information refers to the state information of the private information, which may include any combination of one or more of the following:
  • The privacy operation may include any combination of one or more of the following: read, write, encrypt, obfuscate, generalize, add noise, anonymize, sign, verify, compute digest, store, copy, paste, forward, cut, modify, delete.
  • Determining the privacy protection scheme may include:
  • classification of the privacy protection algorithm may include any combination of one or more of the following:
  • cryptography-based classification, generalization-based classification, access-control-based classification.
  • Determining the classification of the privacy protection algorithm, the privacy protection algorithm, and the parameter value range of the privacy protection algorithm may include:
  • Looking up, according to the format and/or type of the privacy information, the corresponding classification of a first privacy protection algorithm, the first privacy protection algorithm, and the value range of a first parameter of the first privacy protection algorithm; then, according to the privacy information, the scene description information and/or privacy scene description information, the constraints in the privacy protection requirements, and the privacy operations supported by the privacy information, determining the classification of a second privacy protection algorithm, the second privacy protection algorithm, and the value range of a second parameter of the second privacy protection algorithm from the found classification of the first privacy protection algorithm, the first privacy protection algorithm, and the value range of the first parameter.
  • Alternatively, determining the classification of the privacy protection algorithm, the privacy protection algorithm, and the parameter value range may include: when the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the value range of the first parameter corresponding to the format and/or type of the privacy information cannot be found, or when those found do not meet the constraints in the privacy protection requirements, designing a new privacy protection scheme according to any combination of one or more of the format and/or type of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements.
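The two-stage lookup described above (find candidate algorithms by the format/type of the privacy information, then filter by constraints and supported privacy operations) can be sketched as follows; the library contents, function names, and parameter ranges are illustrative assumptions, not the patent's prescribed data:

```python
# Illustrative algorithm library: each entry gives a classification, an
# algorithm, a parameter value range, and the privacy operations it supports.
ALGORITHM_LIBRARY = {
    "text": [
        {"classification": "generalization", "algorithm": "k-anonymity",
         "param_range": range(2, 51), "supports": {"modify", "copy"}},
        {"classification": "cryptography", "algorithm": "AES",
         "param_range": [128, 192, 256], "supports": {"encrypt"}},
    ],
}

def select_scheme(info_format, required_ops, constraints):
    """Stage 1: look up candidates by the format/type of the privacy info.
    Stage 2: keep candidates that support the required privacy operations
    and whose classification is not disallowed by the constraints."""
    candidates = ALGORITHM_LIBRARY.get(info_format, [])
    chosen = [c for c in candidates
              if required_ops <= c["supports"]
              and c["classification"] not in constraints["disallowed"]]
    return chosen  # an empty list means a new scheme must be designed

print(select_scheme("text", {"encrypt"}, {"disallowed": set()}))
```

An empty result corresponds to the fallback branch in the text: no matching first algorithm is found, so a new privacy protection scheme is designed.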
  • the privacy protection effect is set to evaluate the degree to which the privacy information protected by the privacy protection scheme is actually protected, and may include any combination of one or more of the following:
  • the probability that an attacker can infer the privacy information as it was before privacy protection, the uncertainty of the privacy information after privacy protection, and the amount of loss between the privacy information after privacy protection and the privacy information before privacy protection.
  • An embodiment of the present invention provides a privacy information management device, including any combination of one or more of the following modules:
  • The determination module is configured to determine the privacy information, the privacy operations supported by the privacy information, the privacy protection requirements of the privacy information, and the scene description information and/or privacy scene description information;
  • The privacy protection scheme decision and evaluation module is configured to determine the privacy protection scheme according to any combination of one or more of the following, and to use the determined scheme to protect the privacy information: the format and/or type of the privacy information; the scene description information and/or privacy scene description information; the privacy operations supported by the privacy information; and the privacy protection requirements.
  • An embodiment of the present invention proposes a privacy information management apparatus, including a processor and a computer-readable storage medium.
  • The computer-readable storage medium stores instructions which, when executed by the processor, implement any of the above privacy information management methods.
  • An embodiment of the present invention provides a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, any of the above-mentioned privacy information management methods is implemented.
  • An embodiment of the present invention provides a privacy information management system, including:
  • the first device is configured to determine the privacy information, the privacy operations supported by the privacy information, the privacy protection requirements of the privacy information, and the scenario description information and / or privacy scenario description information;
  • The second device is configured to determine a privacy protection scheme based on any combination of one or more of the following, and to use the determined scheme to protect the privacy information: the format and/or type of the privacy information; the scene description information and/or privacy scene description information; the privacy operations supported by the privacy information; and the privacy protection requirements.
  • Embodiments of the present invention include any combination of one or more of the following steps: determining the privacy information, determining the privacy operations supported by the privacy information, determining the privacy protection requirements of the privacy information, and determining the scene description information and/or privacy scene description information;
  • determining the privacy protection scheme according to any combination of one or more of the following, and using the scheme to protect the privacy information: the format and/or type of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements of the privacy information.
  • Embodiments of the present invention are directed to systems containing diversified privacy information: for the privacy protection requirements of different privacy information in different application scenarios, a corresponding privacy protection scheme is selected for protection, realizing personalized and automated management of different privacy information.
  • FIG. 1 is a flowchart of a privacy information management method according to an embodiment of the present invention.
  • FIG. 2 is a schematic flowchart of a privacy information management method according to an embodiment of the present invention.
  • FIG. 3 (b) is a schematic structural composition diagram of a determination module according to an embodiment of the present invention.
  • FIG. 3(c) is a schematic structural composition diagram of a determination module according to an embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of a privacy information management system according to an embodiment of the present invention.
  • this application proposes a privacy information management method, including:
  • Step 100: Determine the privacy information, the privacy operations supported by the privacy information, the privacy protection requirements of the privacy information, and the scene description information and/or privacy scene description information;
  • the privacy information, the privacy operations supported by the privacy information, the privacy protection requirements of the privacy information, and the scenario description information and / or privacy scenario description information may be determined in any of the following ways.
  • In the first way, the privacy information is extracted from the information, and the privacy scene description information is constructed according to any combination of one or more of the format and/or type of the privacy information, the environmental information in which the privacy information is located, and the semantics of the privacy information; the privacy protection requirements of the privacy information are determined according to any combination of one or more of the privacy information, the privacy scene description information, and the information holder's preferences; and the privacy operations supported by the privacy information are determined according to any combination of one or more of the format and/or type of the privacy information, the semantics of the privacy information, and the privacy scene description information.
  • extracting private information from the information includes:
  • The information is divided according to the format and/or type of the information and the semantics of the information to obtain an information vector; at least one information component is extracted from the information vector as a privacy information component, and the extracted privacy information components constitute the privacy information.
  • the private information component refers to the information component in the information whose sensitivity is greater than the sensitivity threshold.
  • Sensitivity generally refers, in the industry, to the degree of risk caused by improper use of information components or by unauthorized access to or modification of them.
  • the information vector is obtained by splitting the original information according to different sources, formats, or semantics.
  • For example, a piece of information includes a piece of voice information of user A, a photo of user B, and a piece of text information of user C.
  • The information vectors are: "user A's voice information", "user B's photo", "user C's text information".
  • The information component is a finer-grained split of the information vector. For example, if the information vector corresponding to user C's text information is "Xiaoming and Xiaohong go to the canteen to eat", the information components are: "Xiaoming", "Xiaohong", "go", "canteen", "eat".
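The vector/component split can be illustrated with a short sketch; the data layout and function names are assumptions, and whitespace tokenization stands in for a real word segmenter:

```python
def split_information(record):
    """Split raw information into an information vector (one entry per
    source/format) and then into finer-grained information components."""
    vector = [entry["content"] for entry in record]          # information vector
    components = {entry["source"]: entry["content"].split()  # components
                  for entry in record}
    return vector, components

record = [
    {"source": "user A", "format": "audio",
     "content": "user A's voice information"},
    {"source": "user C", "format": "text",
     "content": "Xiaoming and Xiaohong go to the canteen to eat"},
]
vector, components = split_information(record)
print(components["user C"])
```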
  • the method for calculating the sensitivity of the information component includes, but is not limited to, any combination of one or more of the following:
  • The information entropy of an information component is calculated as follows: each information component X_i has n_i different values x_{i,j}, and the frequency of occurrence f_{i,j} of each value in the entire database differs, with Σ_{j=1}^{n_i} f_{i,j} = 1. Replacing probability with frequency, the information entropy of each information component X_i is obtained as H(X_i) = -Σ_{j=1}^{n_i} f_{i,j} · log2 f_{i,j}.
  • w_i may be preset by the user, and is referred to as the weight of the information component.
  • For example, user v sets the weight of information components such as name, mobile phone number, and email address to 0.5, and sets the weight of information components such as address, ID number, and frontal photograph to 0.8. Combined with text, image, or video recognition technology, when an information component is read, the corresponding weight is set for it. The sensitivity of the information may be the maximum of the sensitivities of the information components it contains, or the average of the sensitivities of all its information components.
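A minimal sketch of the sensitivity computation, combining the entropy of a component with its user-preset weight and applying the sensitivity threshold; the multiplicative weight-times-entropy combination, the names, and the threshold value are illustrative assumptions, not the patent's prescribed formula:

```python
import math
from collections import Counter

def entropy(values):
    """Empirical information entropy H(X) = -sum_j f_j * log2 f_j,
    with value frequencies f_j standing in for probabilities."""
    freq = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in freq.values())

def sensitivity(values, weight):
    # Assumed combination: user-preset weight w_i times entropy H(X_i).
    return weight * entropy(values)

def extract_private_components(columns, weights, threshold):
    """columns: {component_name: list of observed values in the database}.
    Returns the components whose sensitivity exceeds the threshold."""
    return [name for name, vals in columns.items()
            if sensitivity(vals, weights.get(name, 0.0)) > threshold]

db = {
    "name": ["alice", "bob", "carol", "dave"],  # all distinct: H = 2 bits
    "city": ["beijing"] * 4,                    # constant: H = 0 bits
}
weights = {"name": 0.5, "city": 0.5}
print(extract_private_components(db, weights, threshold=0.5))
```

With these numbers, "name" (sensitivity 0.5 × 2 = 1.0) exceeds the threshold and is extracted as a privacy information component, while the constant "city" column does not.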
  • Alternatively, a machine learning method is used to build a model based on the user's historical preferences, learning from the user's privacy information in historical data and extracting the correspondence between it and sensitivity.
  • When new privacy information is input, the trained model will recognize its sensitivity.
  • The privacy protection requirement refers to the degree to which the information holder wishes the privacy information to be protected in a specific application scenario, including but not limited to any combination of one or more of the following: privacy protection expectations and constraints;
  • The privacy protection expectations include but are not limited to any combination of one or more of the following: the expected probability that, after privacy protection, an attacker can infer the privacy information as it was before protection; the expected uncertainty of the privacy information after privacy protection; and the expected loss between the privacy information after privacy protection and the privacy information before privacy protection;
  • The constraints include but are not limited to any one or more of the following: the correspondence between privacy scene description information, the privacy operations allowed by the information holder, and the privacy information; and the correspondence between privacy scene description information, the privacy operations not allowed by the information holder, and the privacy information.
  • manual marking, machine learning, and other methods may be used to determine the privacy protection requirements of private information.
  • different privacy protection requirements can be automatically analyzed by learning user behavior habits.
  • For example, user u uses manual marking and divides the privacy protection requirements into three levels.
  • Alternatively, a model is established based on the user's historical preferences through machine learning methods, learning from the user's privacy information and privacy scene description information in historical data and extracting the correspondence between them and the privacy protection requirements.
  • When new privacy information and privacy scene description information are input, the trained model will identify the privacy protection requirements.
  • The format and/or type of the privacy information refers to the format and/or type of one or more privacy information components, including but not limited to any combination of one or more of the following: text, pictures, audio, video.
  • Data integration technology may be used to process the format and/or type of the privacy information, the environmental information in which the privacy information is located, and the semantics of the privacy information; information of different sources, formats, and characteristics is consolidated according to the requirements of the privacy scene description information to obtain the privacy scene description information.
  • For example, if the stored contents of the privacy scene description information are time, position coordinates, privacy information format, and storage size, the corresponding information is read, e.g. "12:00 Beijing time, Zhongguancun subway station, text, 200 KB", and entered into the privacy scene description information in this format;
  • The privacy scene description information can also be constructed from any combination of one or more of the format and/or type of the privacy information, the environmental information in which the privacy information is located, and the semantics of the privacy information, together with a (possibly preset) correspondence to privacy scene description information. For example, taking the two dimensions of time and spatial location of the environmental information, the current time (12:00 Beijing time) and the release or generation location of one or more privacy information components (Zhongguancun subway station, Haidian District, Beijing) can be read to obtain the corresponding privacy scene description information.
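A minimal sketch of constructing privacy scene description information from environment readings, using the fields of the example above; the class and field names are assumptions for the sketch:

```python
from dataclasses import dataclass

@dataclass
class PrivacySceneDescription:
    """Fields follow the example in the text: time, location,
    privacy information format, and storage size."""
    time: str
    location: str
    info_format: str
    storage_size_kb: int

def build_scene_description(env):
    # Read the current environment values and enter them in the fixed format.
    return PrivacySceneDescription(
        time=env["time"],
        location=env["location"],
        info_format=env["format"],
        storage_size_kb=env["size_kb"],
    )

env = {"time": "12:00 Beijing time",
       "location": "Zhongguancun subway station",
       "format": "text", "size_kb": 200}
print(build_scene_description(env))
```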
  • the environment information where the private information is located includes any combination of one or more of the following:
  • the privacy operation includes any combination of one or more of the following:
  • The privacy operations supported by the privacy information can be analyzed and obtained through manual setting, automatic extraction, machine learning, and other methods under the given privacy information format and/or type, privacy information semantics, and privacy scene description information.
  • For example, the privacy operations supported by different privacy information are manually calibrated, e.g. text information containing an ID number can be modified, copied, pasted, and encrypted. As another example, keyword matching is used for automatic extraction: privacy operations are set for particular keywords, and when a relevant keyword is extracted, the corresponding privacy operations are obtained. As another example, a model is established from the user's historical preferences through machine learning, learning the format and/or type, semantics, and privacy scene description information of the user's privacy information in historical data and extracting the correspondence between them and the supported privacy operations; when a new privacy information format and/or type, semantics, and privacy scene description information are input, the trained model will identify the privacy operations supported by the privacy information.
  • The method of extracting the privacy information may use manual setting, automatic extraction, machine learning, and other methods. For example, the privacy information in the information is manually calibrated: when the scene description information read is 14:00-17:00 and the place is an entertainment venue, information related to the location is privacy information. As another example, automatic extraction uses keyword matching, and the information corresponding to the extracted keywords constitutes the privacy information. As another example, a model is established from the user's historical preferences through machine learning, learning the user's information and scene description information in historical data and extracting the correspondence between them and the privacy information; when new information and scene description information are input, the trained model will recognize the privacy information.
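The keyword-matching and scene-rule extraction described above can be sketched as follows; the keyword list, field names, and the hour-range rule are illustrative assumptions:

```python
def extract_private_info(components, keywords, scene):
    """Keyword matching plus a scene rule: between 14:00 and 17:00 at an
    entertainment venue, location-related components are also private."""
    private = [c for c in components if any(k in c for k in keywords)]
    hour = int(scene["time"].split(":")[0])
    if 14 <= hour < 17 and scene["place_type"] == "entertainment":
        private += [c for c in components if c in scene.get("locations", [])]
    return private

components = ["Xiaoming", "110101199001011234", "canteen", "KTV"]
scene = {"time": "15:30", "place_type": "entertainment",
         "locations": ["canteen", "KTV"]}
# A digits-only heuristic would suit ID numbers better; plain substrings here.
print(extract_private_info(components, keywords=["1990"], scene=scene))
```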
  • The privacy operations supported by the privacy information are determined according to any combination of one or more of the following: the format and/or type of the privacy information, the semantics of the privacy information, and the privacy scene description information; manual setting, automatic extraction, machine learning, and other methods are used to analyze and obtain the supported privacy operations under the given format and/or type, semantics, and privacy scene description information.
  • When the sources and data formats of the privacy information components differ, the method further includes: performing transformation operations on the privacy information components so that the dimensions of all the privacy information components are unified, and fusing the dimension-unified privacy information components to obtain the privacy information.
  • The format and/or type of the information refers to the format and/or type of one or more information vectors, including but not limited to any combination of one or more of the following: text, pictures, audio, video.
  • the scene description information refers to the state information of the information, including but not limited to any combination of one or more of the following:
  • The transformation operation refers to transforming privacy information components of different dimensions into dimensionally consistent privacy information components, which is convenient for comparison and measurement, including but not limited to at least one of the following: speech recognition, image-text conversion, video-text conversion.
  • For example, if the privacy information components are A's voice information, B's photo, and C's text information, the privacy information components can be unified into text information, or into picture format, for subsequent fusion.
  • Fusion refers to the operation of merging privacy information components: after the dimensions of the privacy information components are unified, all the dimension-unified privacy information components are merged together to obtain the privacy information.
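The transform-then-fuse steps above can be sketched as follows; the converter branches are placeholders standing in for real speech recognition and image-to-text technology, and all names are assumptions:

```python
def unify_dimension(component):
    """Transform each privacy information component into text (the chosen
    common dimension). The non-text branches are placeholders."""
    if component["format"] == "text":
        return component["content"]
    if component["format"] == "audio":
        return f"[transcript of {component['content']}]"   # speech recognition
    if component["format"] == "image":
        return f"[description of {component['content']}]"  # image-to-text
    raise ValueError(f"unsupported format: {component['format']}")

def fuse(components):
    # Fusion: merge all dimension-unified components into one privacy record.
    return " | ".join(unify_dimension(c) for c in components)

components = [
    {"format": "audio", "content": "A's voice clip"},
    {"format": "image", "content": "B's photo"},
    {"format": "text",  "content": "C's message"},
]
print(fuse(components))
```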
  • Step 110: Determine a privacy protection scheme according to any combination of one or more of the following: the format and/or type of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements; and use the determined privacy protection scheme to protect the privacy information.
  • determining the privacy protection scheme includes:
  • A privacy protection scheme is obtained by selecting one combination of a privacy protection algorithm classification, a privacy protection algorithm, and privacy protection algorithm parameter values.
  • the privacy protection algorithm classification includes, but is not limited to, any combination of one or more of the following: classification based on cryptography, classification based on generalization technology, and classification based on access control technology.
  • A privacy protection algorithm classification is a combination of one or more similar privacy protection algorithms.
  • A similar privacy protection algorithm refers to an algorithm derived with one or several specific technologies as its core: for example, the k-anonymity and l-diversity algorithms, or their combinations, are similar privacy protection algorithms under the generalization-based privacy protection algorithm classification.
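As a minimal illustration of one algorithm in the generalization-based classification, the following checks whether a table satisfies k-anonymity; the data and field names are invented for the example:

```python
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k):
    """A table is k-anonymous w.r.t. its quasi-identifiers if every
    combination of quasi-identifier values occurs at least k times."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
    return all(count >= k for count in groups.values())

# Already-generalized zip codes and age ranges (the quasi-identifiers).
rows = [
    {"zip": "100*", "age": "20-29", "disease": "flu"},
    {"zip": "100*", "age": "20-29", "disease": "cold"},
    {"zip": "200*", "age": "30-39", "disease": "flu"},
    {"zip": "200*", "age": "30-39", "disease": "asthma"},
]
print(is_k_anonymous(rows, ["zip", "age"], k=2))  # True
```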
  • Determining the classification of the privacy protection algorithm, the privacy protection algorithm, and the parameter value range of the privacy protection algorithm includes: looking up, according to the format and/or type of the privacy information, the corresponding classification of a first privacy protection algorithm, the first privacy protection algorithm, and the value range of a first parameter of the first privacy protection algorithm; then, according to the privacy information, the scene description information and/or privacy scene description information, the constraints in the privacy protection requirements, and the privacy operations supported by the privacy information, determining the classification of a second privacy protection algorithm, the second privacy protection algorithm, and the value range of a second parameter of the second privacy protection algorithm from the found classification of the first privacy protection algorithm, the first privacy protection algorithm, and the value range of the first parameter.
  • When the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the value range of the first parameter corresponding to the format and/or type of the privacy information cannot be found, or when those found do not meet the constraints in the privacy protection requirements, a new privacy protection scheme is designed according to any combination of one or more of the format and/or type of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements.
  • methods such as manual labeling and machine learning may be used to determine the correspondence between the format and/or type of privacy information and the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the parameter value range of the first privacy protection algorithm.
  • methods such as manual setting and machine learning can also be used to analyze and obtain, under a given format and/or type, semantics, privacy scene description information, and privacy protection requirements of the privacy information, the privacy protection algorithm classifications, privacy protection algorithms, and parameter value ranges supported by the privacy information.
  • the parameters of the privacy protection algorithm refer to the independent variable parameters and adjustment parameters in the privacy protection algorithm; selecting different independent variable parameters yields privacy protection algorithms with different privacy protection strengths. For example, the k in k-anonymity is an independent variable parameter; the privacy budget ε in differential privacy is an independent variable parameter; the key length in the RSA encryption algorithm is also an independent variable parameter.
  • for example, the correspondence between the format and/or type of privacy information and the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the parameter value range of the first privacy protection algorithm can be established to find the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm corresponding to the format and/or type of the privacy information; according to the correspondence between the scene description information and/or privacy scene description information and the privacy protection algorithm classifications, a second privacy protection algorithm classification is selected from the classification of the first privacy protection algorithm; according to the correspondence between the privacy operations supported by the privacy information, the privacy protection requirements, the privacy protection algorithm classifications, and the privacy protection algorithms, a second privacy protection algorithm that satisfies the privacy operations supported by the privacy information and the privacy protection requirements is selected from the first privacy protection algorithms under the second privacy protection algorithm classification; and according to the correspondence between the scene description information and/or privacy scene description information, the privacy protection requirements, and the parameter value ranges of the privacy protection algorithms, a second parameter value range of the second privacy protection algorithm is selected from the first parameter value range of the second privacy protection algorithm.
  • as another example, the correspondence between the format and/or type of privacy information, the privacy protection algorithm classifications, the privacy protection algorithms, and the parameter value ranges of the privacy protection algorithms can be established to find the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm corresponding to the format and/or type of the privacy information; then, according to the correspondence between the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, the privacy protection requirements, the privacy protection algorithm classifications, the privacy protection algorithms, and the parameter value ranges of the privacy protection algorithms, the classification of the second privacy protection algorithm, the second privacy protection algorithm, and the second parameter value range of the second privacy protection algorithm are selected from the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm.
  • when, in the preset correspondence, the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm corresponding to the format and/or type of the privacy information cannot be found, or when the found classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm do not meet the constraints in the privacy protection requirements, a new privacy protection scheme is designed according to any combination of one or more of the format and/or type of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements.
  • design here refers to constructing a new privacy protection algorithm based on the format and/or type of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements, for example combining existing algorithms to obtain a combined privacy protection scheme, or recombining the steps of existing privacy protection algorithms, e.g. first applying a k-anonymity algorithm and then encrypting each of the k anonymized results separately; the overall process can be regarded as a newly constructed algorithm.
  • the privacy protection effect is set to evaluate the degree to which the privacy information protected by the privacy protection scheme is actually protected, including any combination of one or more of the following:
  • the probability that an attacker infers the privacy information before privacy protection after privacy protection is applied, the uncertainty of the privacy information after privacy protection, and the amount of loss between the privacy information after privacy protection and the privacy information before privacy protection.
  • for location privacy information, the deviation can be defined as the distance between the position after privacy protection and the real position, and the loss ratio can be defined as the ratio of the number of POIs fed back to the user before privacy protection to the number of POIs fed back to the user after privacy protection;
  • for picture privacy information, the deviation can be defined as the difference between the RGB values of each pixel before and after privacy protection, and the loss ratio can be defined as the ratio of that difference to the RGB values of the pixels before privacy protection.
  • the privacy protection effect meets the privacy protection expectation in the privacy protection requirements including any combination of one or more of the following:
  • the probability, in the privacy protection effect, that an attacker infers the privacy information before privacy protection after privacy protection is applied is less than or equal to the expected probability of that inference in the privacy protection expectations;
  • the uncertainty of privacy information after privacy protection in the privacy protection effect is greater than or equal to the expectation of the uncertainty of privacy information after privacy protection in privacy protection expectations;
  • the amount of loss between the privacy information after privacy protection and the privacy information before privacy protection in the privacy protection effect is less than or equal to the expected amount of loss between the privacy information after privacy protection and the privacy information before privacy protection in the privacy protection expectations.
  • determining that the privacy protection effect does not meet the privacy protection expectations in the privacy protection requirements includes any combination of one or more of the following:
  • the probability, in the privacy protection effect, that an attacker infers the privacy information before privacy protection after privacy protection is applied is greater than the expected probability of that inference in the privacy protection expectations;
  • the uncertainty of the privacy information after the privacy protection in the privacy protection effect is less than the expectation of the uncertainty of the privacy information after the privacy protection in the privacy protection expectations;
  • the amount of loss between the privacy information after privacy protection and the privacy information before privacy protection in the privacy protection effect is greater than the expected loss amount between the privacy information after privacy protection and the privacy information before privacy protection in the privacy protection expectation.
  • one privacy protection algorithm and one parameter value may be selected first; the privacy protection effect is calculated and it is determined whether the privacy protection effect of the privacy protection algorithm meets the privacy protection expectations in the privacy protection requirements; when the privacy protection effects calculated for all parameter values corresponding to the currently selected privacy protection algorithm fail to meet the privacy protection expectations in the privacy protection requirements, the process ends, or a prompt that the privacy requirements are too high is output and no privacy protection scheme is output.
  • the embodiments of the present invention take into account that, due to differences in their mathematical foundations and algorithm parameter choices, different privacy protection algorithms have different privacy protection capabilities and thus satisfy different privacy protection requirements; since the number of privacy protection algorithms is limited, privacy protection algorithms can be classified according to their mathematical foundations. For example, following the foregoing example, the time-based access control technology classification can be mapped to privacy protection requirement level 3, while irreversible algorithms based on probabilistic ideas, such as data obfuscation and perturbation, and reversible algorithms based on cryptographic ideas, such as commutative encryption and homomorphic encryption, can be classified to privacy protection requirement level 2.
  • as shown in FIG. 3(a), another embodiment of the present application proposes a privacy information management device, including any combination of one or more of the following modules:
  • the determination module 301 is configured to determine the privacy information, the privacy operations supported by the privacy information, the privacy protection requirements of the privacy information, and the scene description information and / or privacy scene description information;
  • the privacy protection scheme decision evaluation module 302 is configured to determine the privacy protection scheme according to any combination of one or more of the following, and use the determined privacy protection scheme to protect the privacy information: the format and / or type of the privacy information; scene description information And / or privacy scenario description information; privacy operations supported by privacy information; privacy protection requirements.
  • the determination module 301 may use any of the following methods to determine privacy information, determine the privacy operations supported by the privacy information, determine the privacy protection requirements of the privacy information, and determine the scenario description information and / or privacy scenario description information.
  • the determination module 301 includes:
  • the first privacy information extraction unit 3011 is configured to extract privacy information from the information
  • the first privacy protection requirement determination unit 3012 is configured to determine the privacy protection requirement of privacy information according to any combination of one or more of privacy information, privacy scenario description information, and information holder's preferences;
  • the first privacy operation determination unit 3014 is configured to determine the privacy operation supported by the privacy information according to any combination of one or more of the format and / or type of the privacy information, the semantics of the privacy information, and the privacy scenario description information.
  • the determination module 301 includes:
  • the splitting unit 3015 is configured to split the information to obtain an information vector according to the format and / or type of the information and the semantics of the information;
  • the scene description information generating unit 3016 is configured to generate the scene description information according to the format and / or type of the information, the environment information in which the information is located, and the semantics of the information;
  • the second privacy information extraction unit 3017 is configured to extract at least one information component from the information vector as a privacy information component according to the scene description information, the extracted privacy information components constituting the privacy information, and to extract the privacy scene description information corresponding to the privacy information according to the scene description information. The method of extracting the privacy information may use manual setting, automatic extraction, machine learning, and the like. For example, the privacy information in the information is manually calibrated, e.g. when the read scene description information indicates a time of 14:00-17:00 and a location in an entertainment venue, the location-related information is privacy information. As another example, keyword matching is used for automatic extraction, and the information corresponding to the extracted keywords constitutes the privacy information. As another example, a model is built from the user's historical preferences by machine learning: it learns the user's information and scene description information in historical data and extracts the correspondence between them and the privacy information, so that when new information or scene description information is input, the trained model identifies the privacy information.
  • the second privacy protection requirement determination unit 3018 is configured to determine the privacy protection requirement of privacy information according to any combination of one or more of the privacy information, the privacy scenario description information, and the preference of the information holder.
  • the second privacy operation determination unit 3019 is configured to determine the privacy operation supported by the privacy information according to any combination of one or more of the format and / or type of the privacy information, the semantics of the privacy information, and the scene description information.
  • the methods of manual setting, automatic extraction, and machine learning can be used to analyze and obtain the privacy operations supported by the privacy information under the given privacy information format and / or type, semantics, and privacy scenario description information.
  • FIGS. 3(a), 3(b), and 3(c) each show only one realizable form of the privacy information management device proposed in this application; the device may also take other forms.
  • This application does not limit the number and order of the modules, and may be one module or any combination of multiple modules in the figure, or the modules may be arranged in other orders.
  • FIG. 4 another embodiment of the present application proposes a privacy information management system, including:
  • the first device 401 is configured to determine the privacy information, the privacy operations supported by the privacy information, the privacy protection requirements of the privacy information, and the scene description information and / or privacy scene description information;
  • the second device 402 is configured to determine a privacy protection scheme based on any combination of one or more of the following, and use the determined privacy protection scheme to protect the privacy information; the format and / or type of the privacy information; the scene description information and / or Privacy scenario description information; privacy operations supported by privacy information; privacy protection requirements.
  • the implementation process of the first device 401 and the second device 402 described above is the same as the implementation process of the privacy information management method in the foregoing embodiment, and details are not described herein again.
  • the first device 401 and the second device 402 may be any devices, for example, the first device 401 may be a terminal device, and the second device 402 may be a server.
  • the term computer storage medium includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storing information, such as computer-readable instructions, data structures, program modules, or other data.
  • Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technologies, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cartridges, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and can be accessed by a computer.
  • the communication medium generally contains computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transmission mechanism, and may include any information delivery medium.


Abstract

Embodiments of the present invention disclose a privacy information management method, device and system. The method includes: determining privacy information, the privacy operations supported by the privacy information, the privacy protection requirements of the privacy information, and scene description information and/or privacy scene description information; and determining a privacy protection scheme according to any combination of one or more of the following and using the privacy protection scheme to protect the privacy information: the format and/or type of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements.

Description

Privacy Information Management Method, Device and System

Technical Field
This document relates to, but is not limited to, the field of information application technology, and in particular to a privacy information management method, device and system.
Background
With the close integration and rapid development of information technology and mobile communication technology, and the continuous upgrading of smart terminal software and hardware, technologies such as the Internet, the mobile Internet, cloud computing, big data and the Internet of Things have developed and converged. The emergence and rapid development of new technologies and new service models have made it the norm for massive, heterogeneous personal user information to be exchanged across systems, across ecosystems, and even across national borders. However, as information flows expand, users' personal information is inevitably retained, intentionally or unintentionally, in different information systems during collection, storage, processing, release and exchange, and destruction, posing a great risk of personal privacy information leakage to users, enterprises and countries.
In view of the potential leakage risk of privacy information in information systems, privacy information of different types, from different sources, in different application scenarios and with different privacy protection requirements needs to be managed effectively. However, in the related art, privacy information management methods only manage specific information types and specific information processing stages; they lack comprehensive management of multiple links such as privacy information fusion, privacy protection algorithm design, and protection effect evaluation, and cannot support functions such as extended control and the tracking and tracing of privacy violations. Moreover, most privacy information management methods in the related art focus on relatively isolated application scenarios and technical points, proposing a single protection scheme for the problems in a given application scenario; when facing diverse privacy information, they cannot provide adaptive privacy protection algorithms and parameter adjustment for the privacy information according to factors such as privacy information type, application scenario, and privacy protection requirements.
Summary of the Invention
The following is an overview of the subject matter described in detail herein. This overview is not intended to limit the scope of protection of the claims.
An embodiment of the present invention proposes a privacy information management method, including any combination of one or more of the following steps:
determining privacy information, the privacy operations supported by the privacy information, and the privacy protection requirements of the privacy information, and determining scene description information and/or privacy scene description information;
determining a privacy protection scheme according to one or more of the following in combination, and using the determined privacy protection scheme to protect the privacy information: the format and/or type of the privacy information; the scene description information and/or privacy scene description information; the privacy operations supported by the privacy information; the privacy protection requirements.
The determining of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements of the privacy information may include:
extracting the privacy information from information, and constructing the privacy scene description information according to any combination of one or more of the following: the format and/or type of the privacy information; the environment information in which the privacy information is located; the semantics of the privacy information;
determining the privacy protection requirements of the privacy information according to any combination of one or more of the privacy information, the privacy scene description information, and the preferences of the information holder;
determining the privacy operations supported by the privacy information according to one or more of the following in combination: the format and/or type of the privacy information; the semantics of the privacy information; the privacy scene description information.
The extracting of the privacy information from the information may include:
splitting the information according to the format and/or type of the information and the semantics of the information to obtain an information vector;
extracting at least one information component from the information vector as a privacy information component, the extracted privacy information components constituting the privacy information.
The extracting of at least one information component from the information vector as a privacy information component may include:
extracting, from the information vector, information components whose sensitivity is greater than a sensitivity threshold as the privacy information components, where the sensitivity may be determined according to the user's subjective sensitivity to an information component and the information entropy of the information component.
The determining of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements of the privacy information may also include:
splitting the information according to the format and/or type of the information and the semantics of the information to obtain an information vector;
generating the scene description information according to the format and/or type of the information, the environment information in which the information is located, and the semantics of the information;
extracting at least one information component from the information vector as a privacy information component according to the scene description information, the extracted privacy information components constituting the privacy information, and extracting the privacy scene description information corresponding to the privacy information according to the scene description information;
determining the privacy protection requirements of the privacy information according to any combination of one or more of the privacy information, the privacy scene description information, and the preferences of the information holder;
determining the privacy operations supported by the privacy information according to one or more of the following in combination: the format and/or type of the privacy information; the semantics of the privacy information; the privacy scene description information.
When the sources and data formats of different privacy information components are not the same, the method may further include:
performing a transformation operation on the privacy information components so that the dimensions of all privacy information components are unified;
fusing the dimension-unified privacy information components to obtain the privacy information.
The privacy protection requirements refer to the degree to which the information holder wishes the privacy information to be protected in a specific application scenario, and may include any combination of one or more of the following: privacy protection expectations; constraints.
The privacy protection expectations may include any combination of one or more of the following: the expected probability that an attacker infers the privacy information before privacy protection after privacy protection is applied; the expected uncertainty of the privacy information after privacy protection; the expected amount of loss between the privacy information after privacy protection and the privacy information before privacy protection.
The constraints may include any combination of one or more of the following: the correspondence between the privacy scene description information, the privacy operations allowed by the information holder, and the privacy information; the correspondence between the privacy scene description information, the privacy operations not allowed by the information holder, and the privacy information.
The scene description information refers to the state information in which the information is located, and the privacy scene description information refers to the state information in which the privacy information is located; they may include any combination of one or more of the following:
information format and/or type, time, spatial location, device, interaction object, interaction channel, transmission mode, storage mode, semantics.
The privacy operations may include any combination of one or more of the following: read, write, encrypt, obfuscate, generalize, add noise, anonymize, sign, verify signature, compute digest, save, copy, paste, forward, cut, modify, delete.
The determining of the privacy protection scheme may include:
determining at least one group consisting of a privacy protection algorithm classification, a privacy protection algorithm, and a parameter value range of the privacy protection algorithm according to one or more of the following in combination: the format and/or type of the privacy information; the scene description information and/or privacy scene description information; the privacy operations supported by the privacy information; the privacy protection requirements;
determining at least one privacy protection scheme according to the determined at least one group of privacy protection algorithm classification, privacy protection algorithm, and parameter value range of the privacy protection algorithm;
selecting, from the determined at least one privacy protection scheme, one group consisting of a privacy protection algorithm classification, a privacy protection algorithm, and parameter values of the privacy protection algorithm to obtain one privacy protection scheme;
calculating the privacy protection effect of the privacy information protected by the selected privacy protection scheme; when it is determined that the privacy protection effect does not meet the privacy protection expectations in the privacy protection requirements, reselecting another group of privacy protection algorithm classification, privacy protection algorithm, and parameter values of the privacy protection algorithm to obtain another privacy protection scheme, until the privacy protection effect corresponding to the selected privacy protection scheme meets the privacy protection expectations in the privacy protection requirements.
The classification of privacy protection algorithms may include any combination of one or more of the following:
classification based on cryptography, classification based on generalization technology, and classification based on access control technology.
The determining of the privacy protection algorithm classification, the privacy protection algorithm, and the parameter value range of the privacy protection algorithm may include:
in a preset correspondence between the format and/or type of privacy information and the classification of a first privacy protection algorithm, the first privacy protection algorithm, and a first parameter value range of the first privacy protection algorithm, looking up the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm corresponding to the format and/or type of the privacy information; and determining the classification of a second privacy protection algorithm, the second privacy protection algorithm, and a second parameter value range of the second privacy protection algorithm from the found classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm, according to the privacy information, the scene description information and/or privacy scene description information, the constraints in the privacy protection requirements, and the privacy operations supported by the privacy information.
The determining of the privacy protection algorithm classification, the privacy protection algorithm, and the parameter value range of the privacy protection algorithm may also include:
when, in the preset correspondence, the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm corresponding to the format and/or type of the privacy information cannot be found, or when the found classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm do not meet the constraints in the privacy protection requirements, designing a new privacy protection scheme according to any combination of one or more of the format and/or type of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements.
The privacy protection effect is set to evaluate the degree to which the privacy information protected by the privacy protection scheme is actually protected, and may include any combination of one or more of the following:
the probability that an attacker infers the privacy information before privacy protection after privacy protection is applied; the uncertainty of the privacy information after privacy protection; the amount of loss between the privacy information after privacy protection and the privacy information before privacy protection.
An embodiment of the present invention proposes a privacy information management device, including any combination of one or more of the following modules:
a determination module, configured to determine privacy information, the privacy operations supported by the privacy information, and the privacy protection requirements of the privacy information, and to determine scene description information and/or privacy scene description information;
a privacy protection scheme decision and evaluation module, configured to determine a privacy protection scheme according to any combination of one or more of the following, and to use the determined privacy protection scheme to protect the privacy information: the format and/or type of the privacy information; the scene description information and/or privacy scene description information; the privacy operations supported by the privacy information; the privacy protection requirements.
An embodiment of the present invention proposes a privacy information management device, including a processor and a computer-readable storage medium, the computer-readable storage medium storing instructions that, when executed by the processor, implement any of the above privacy information management methods.
An embodiment of the present invention proposes a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, any of the above privacy information management methods is implemented.
An embodiment of the present invention proposes a privacy information management system, including:
a first device, configured to determine privacy information, the privacy operations supported by the privacy information, and the privacy protection requirements of the privacy information, and to determine scene description information and/or privacy scene description information;
a second device, configured to determine a privacy protection scheme according to any combination of one or more of the following, and to use the determined privacy protection scheme to protect the privacy information: the format and/or type of the privacy information; the scene description information and/or privacy scene description information; the privacy operations supported by the privacy information; the privacy protection requirements.
Embodiments of the present invention include any combination of one or more of the following steps: determining privacy information, the privacy operations supported by the privacy information, the privacy protection requirements of the privacy information, and scene description information and/or privacy scene description information; and determining a privacy protection scheme according to any combination of one or more of the following and using the privacy protection scheme to protect the privacy information: the format and/or type of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements of the privacy information. The embodiments of the present invention are oriented toward the construction of systems containing diverse privacy information; for the privacy protection requirements of different privacy information in different application scenarios, corresponding privacy protection schemes are selected for protection, realizing personalized and automated management of different privacy information.
Brief Description of the Drawings
FIG. 1 is a flowchart of a privacy information management method according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a privacy information management method according to an embodiment of the present invention;
FIG. 3(a) is a schematic structural diagram of a privacy information management device according to an embodiment of the present invention;
FIG. 3(b) is a schematic structural diagram of a determination module according to an embodiment of the present invention;
FIG. 3(c) is a schematic structural diagram of a determination module according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a privacy information management system according to an embodiment of the present invention.
Detailed Description
In the case of no conflict, the embodiments in this application and the features in the embodiments may be arbitrarily combined with each other.
The embodiments of the present invention will be described in detail below with reference to the accompanying drawings. It should be noted that, in the case of no conflict, the embodiments of the present invention and the features in the embodiments may be arbitrarily combined with each other.
Referring to FIG. 1, this application proposes a privacy information management method, including:
Step 100: determining privacy information, the privacy operations supported by the privacy information, the privacy protection requirements of the privacy information, and scene description information and/or privacy scene description information.
In an exemplary embodiment, the privacy information may be set to describe part or all of the content of the information that the information owner does not want to disclose.
In an exemplary embodiment, either of the following two ways may be used to determine the privacy information, the privacy operations supported by the privacy information, the privacy protection requirements of the privacy information, and the scene description information and/or privacy scene description information.
In the first way, the privacy information is extracted from information, and the privacy scene description information is constructed according to any combination of one or more of the format and/or type of the privacy information, the environment information in which the privacy information is located, and the semantics of the privacy information; the privacy protection requirements of the privacy information are determined according to any combination of one or more of the privacy information, the privacy scene description information, and the preferences of the information holder; and the privacy operations supported by the privacy information are determined according to any combination of one or more of the format and/or type of the privacy information, the semantics of the privacy information, and the privacy scene description information.
In an exemplary embodiment, extracting the privacy information from the information includes:
splitting the information according to the format and/or type of the information and the semantics of the information to obtain an information vector; and extracting at least one information component from the information vector as a privacy information component, the extracted privacy information components constituting the privacy information.
In an exemplary embodiment, a privacy information component refers to an information component whose sensitivity is greater than a sensitivity threshold. In the industry, sensitivity generally refers to the degree of risk caused by an information component being used improperly, or being accessed or modified without authorization.
In an exemplary embodiment, the information vector is obtained by splitting the original information by source, format or semantics. For example, if a piece of information contains a combination of "a voice segment of user A, a photo of user B, and a text segment of user C", the information vectors are: "the voice information of user A", "the photo of user B", and "the text information of user C". An information component is a finer-grained split of an information vector: for example, if the information vector corresponding to user C's text information is "Xiao Ming and Xiao Hong go to the canteen for dinner", the information components are "Xiao Ming", "and", "Xiao Hong", "go to", "the canteen", "for dinner".
In an exemplary embodiment, the methods for calculating the sensitivity of an information component include, but are not limited to, any combination of one or more of the following:
obtaining it according to a predefined sensitivity calculation method; obtaining it statistically from the historical sensitivity of the information component using machine learning methods.
For example, a predefined sensitivity calculation method computes the sensitivity S(X_i) of the i-th information component X_i according to the formula S(X_i) = w_i / H(X_i), where w_i is the user's subjective sensitivity to the i-th information component, and H(X_i) is the information entropy of the i-th information component.
In an exemplary embodiment, the information entropy of an information component is calculated as follows: each information component X_i has n_i different values x_{i,j}, and each value of each information component appears in the entire database with a different frequency f_{i,j}, where
Σ_{j=1}^{n_i} f_{i,j} = 1.
Substituting frequency for probability, the information entropy of each information component X_i is obtained as
H(X_i) = -Σ_{j=1}^{n_i} f_{i,j} log f_{i,j}.
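As a minimal sketch of the entropy-based sensitivity formula above (the helper names are hypothetical, not taken from the patent; base-2 logarithm is assumed), the computation can be written as:

```python
from collections import Counter
from math import log2

def entropy(values):
    """Shannon entropy H(X_i) of one information component, estimating
    each probability by the frequency f_{i,j} of the value in the database."""
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in Counter(values).values())

def sensitivity(values, w):
    """S(X_i) = w_i / H(X_i): a higher subjective weight w_i, or a lower
    entropy, yields a higher sensitivity.  A constant component has zero
    entropy, so its sensitivity is treated as unbounded here."""
    h = entropy(values)
    return float("inf") if h == 0 else w / h
```

A component taking two equally frequent values has entropy 1 bit, so with weight w_i = 0.8 its sensitivity is 0.8.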
In an exemplary embodiment, w_i may be preset by the user and is called the weight of the information component. For example, user v sets the weight of information components such as name, mobile phone number and email address to 0.5, and sets the weight of information components such as home address, ID number and frontal photo to 0.8; combined with text, image or video recognition technology, when an information component is read, the corresponding weight is set for it. The sensitivity of the information may be the maximum of the sensitivities of all information components of the information, or the average of those sensitivities. As another example, a model is built from the user's historical preferences by machine learning: it learns the user's privacy information in historical data and extracts the correspondence between it and sensitivity, so that when new privacy information is input, the trained model identifies the sensitivity of the privacy information.
In an exemplary embodiment, the privacy protection requirements refer to the degree to which the information holder wishes the privacy information to be protected in a specific application scenario, including but not limited to any combination of one or more of the following: privacy protection expectations; constraints.
In an exemplary embodiment, the privacy protection expectations include but are not limited to any combination of one or more of the following: the expected probability that an attacker infers the privacy information before privacy protection after privacy protection is applied; the expected uncertainty of the privacy information after privacy protection; the expected amount of loss between the privacy information after privacy protection and the privacy information before privacy protection.
In an exemplary embodiment, the constraints include but are not limited to any combination of one or more of the following: the correspondence between the privacy scene description information, the privacy operations allowed by the information holder, and the privacy information; the correspondence between the privacy scene description information, the privacy operations not allowed by the information holder, and the privacy information.
In an exemplary embodiment, methods such as manual labeling and machine learning may be used to determine the privacy protection requirements of the privacy information.
In an exemplary embodiment, different privacy protection requirements may be analyzed automatically by learning user behavior habits.
For example, user u adopts manual labeling and divides the privacy protection requirements into three levels:
1. All users are allowed to access the plaintext of all of user u's privacy information at any time;
2. User u's friends are allowed to access user u's noise-perturbed privacy information at any time; other users are allowed to access user u's encrypted privacy information at any time;
3. No user is allowed to access user u's privacy information between 21:00 and 04:00.
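A minimal sketch of this three-level example policy (all names are hypothetical; the patent does not prescribe an implementation, and the behavior of level 3 outside the lockout window is not specified, so this sketch falls back to the level-2 rule there):

```python
from datetime import time

def allowed_view(level, is_friend, now):
    """Which view of user u's privacy information a requester gets under
    the 3-level example policy; None means access is denied."""
    if level == 1:
        return "plaintext"                      # anyone, any time
    if level == 2:
        return "noise-perturbed" if is_friend else "encrypted"
    if level == 3:
        # Night-time lockout: deny everyone between 21:00 and 04:00.
        if now >= time(21, 0) or now < time(4, 0):
            return None
        # Outside the window the example is silent; assume the level-2 rule.
        return "noise-perturbed" if is_friend else "encrypted"
    raise ValueError("unknown privacy requirement level")
```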
As another example, a model is built from the user's historical preferences by machine learning: it learns the user's privacy information and privacy scene description information in historical data, and extracts the correspondence between them and the privacy protection requirements, so that when new privacy information and privacy scene description information are input, the trained model identifies the privacy protection requirements.
In an exemplary embodiment, the format and/or type of the privacy information refers to the format and/or type of one or more privacy information components, including but not limited to any combination of one or more of the following: text, picture, audio, video.
In an exemplary embodiment, data integration technology may be used to process the format and/or type of the privacy information, the environment information in which the privacy information is located, and the semantics of the privacy information, gathering information of different sources, formats and characteristics according to the requirements of the privacy scene description information to obtain the privacy scene description information. For example, if the stored contents of the privacy scene description information are time, location coordinates, privacy information format and storage size, the corresponding information is read, e.g. "Beijing time 12:00, Zhongguancun subway station, text, 200KB", and entered into the privacy scene description information in this format.
The privacy scene description information may also be constructed through a correspondence (which may be preset) between the privacy scene description information and any combination of one or more of the format and/or type of the privacy information, the environment information in which the privacy information is located, and the semantics of the privacy information. For example, taking the time and spatial location dimensions of the environment information, the current time (Beijing time 12:00) and the publishing or generation location (Zhongguancun subway station, Haidian District, Beijing) of one or more privacy information components in the privacy information may be read to obtain the corresponding privacy scene description information.
In an exemplary embodiment, the environment information in which the privacy information is located includes any combination of one or more of the following:
time, spatial location, device, transmission mode, storage mode.
In an exemplary embodiment, the privacy scene description information refers to the state information in which the privacy information is located, including but not limited to any combination of one or more of the following:
information format and/or type, time, spatial location, device, interaction object, interaction channel, transmission mode, storage mode, semantics.
In an exemplary embodiment, the privacy operations include any combination of one or more of the following:
read, write, encrypt, obfuscate, generalize, add noise, anonymize, sign, verify signature, compute digest, save, copy, paste, forward, cut, modify, delete.
In an exemplary embodiment, methods such as manual setting, automatic extraction and machine learning may be used to analyze and obtain the privacy operations supported by the privacy information under a given format and/or type, semantics, and privacy scene description information of the privacy information. For example, the privacy operations supported by different privacy information are manually calibrated, e.g. text information containing an ID number supports being modified, copied, pasted and encrypted. As another example, keyword matching is used for automatic extraction: privacy operations are set for the keywords to be extracted, and when a relevant keyword is extracted, the configured privacy operations are retrieved. As another example, a model is built from the user's historical preferences by machine learning: it learns the format and/or type, semantics and privacy scene description information of the user's privacy information in historical data, and extracts the correspondence between them and the privacy operations supported by the privacy information, so that when the format and/or type, semantics and privacy scene description information of new privacy information are input, the trained model identifies the privacy operations supported by the privacy information.
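The keyword-matching variant of this step can be sketched as follows (the keyword table and function names are hypothetical illustrations; the patent only states that configured privacy operations are retrieved for extracted keywords):

```python
# Hypothetical keyword-to-operations table, following the patent's example
# that text containing an ID number supports modify/copy/paste/encrypt.
KEYWORD_OPS = {
    "ID number": {"modify", "copy", "paste", "encrypt"},
    "phone":     {"encrypt", "delete"},
}

def supported_ops(text):
    """Union of the privacy operations configured for every keyword
    found in the given text; empty set when no keyword matches."""
    ops = set()
    for keyword, keyword_ops in KEYWORD_OPS.items():
        if keyword in text:
            ops |= keyword_ops
    return ops
```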
In the second way, as shown in FIG. 2, the information is split according to its format and/or type and its semantics to obtain an information vector; the scene description information is generated according to the format and/or type of the information, the environment information in which the information is located, and the semantics of the information; at least one information component is extracted from the information vector as a privacy information component according to the scene description information, the extracted privacy information components constituting the privacy information, and the privacy scene description information corresponding to the privacy information is extracted according to the scene description information; the privacy protection requirements of the privacy information are determined according to any combination of one or more of the privacy information, the privacy scene description information, and the preferences of the information holder; and the privacy operations supported by the privacy information are determined according to any combination of one or more of the format and/or type, the semantics, and the privacy scene description information of the privacy information.
In an exemplary embodiment, the method of extracting the privacy information may use manual setting, automatic extraction, machine learning, and the like. For example, the privacy information in the information is manually calibrated, e.g. when the read scene description information indicates a time of 14:00-17:00 and a location in an entertainment venue, the location-related information is privacy information. As another example, keyword matching is used for automatic extraction, and the information corresponding to the extracted keywords constitutes the privacy information. As another example, a model is built from the user's historical preferences by machine learning: it learns the user's information and scene description information in historical data and extracts the correspondence between them and the privacy information, so that when new information or scene description information is input, the trained model identifies the privacy information.
In an exemplary embodiment, different privacy protection requirements may be analyzed automatically by learning user behavior habits.
For example, user u adopts manual labeling and divides the privacy protection requirements into three levels:
1. All users are allowed to access the plaintext of all of user u's privacy information at any time;
2. User u's friends are allowed to access user u's noise-perturbed privacy information at any time; other users are allowed to access user u's encrypted privacy information at any time;
3. No user is allowed to access user u's privacy information between 21:00 and 04:00.
As another example, a model is built from the user's historical preferences by machine learning: it learns the user's privacy information and privacy scene description information in historical data, and extracts the correspondence between them and the privacy protection requirements, so that when new privacy information and privacy scene description information are input, the trained model identifies the privacy protection requirements.
In an exemplary embodiment, the privacy operations supported by the privacy information are determined according to any combination of one or more of the following: the format and/or type of the privacy information, the semantics of the privacy information, and the privacy scene description information. Methods such as manual setting, automatic extraction and machine learning may be used to analyze and obtain the privacy operations supported by the privacy information under a given format and/or type, semantics and privacy scene description information. For example, the privacy operations supported by different privacy information are manually calibrated, e.g. text information containing an ID number supports being modified, copied, pasted and encrypted. As another example, keyword matching is used for automatic extraction: privacy operations are set for the keywords to be extracted, and when a relevant keyword is extracted, the configured privacy operations are retrieved. As another example, a model is built from the user's historical preferences by machine learning: it learns the format and/or type, semantics and privacy scene description information of the user's privacy information in historical data, and extracts the correspondence between them and the privacy operations supported by the privacy information, so that when the format and/or type, semantics and privacy scene description information of new privacy information are input, the trained model identifies the privacy operations supported by the privacy information.
In an exemplary embodiment, when the sources and data formats of the privacy information components are not the same, the method further includes: performing a transformation operation on the privacy information components so that the dimensions of all privacy information components are unified, and fusing the dimension-unified privacy information components to obtain the privacy information.
In an exemplary embodiment, the format and/or type of the information refers to the format and/or type of one or more information vectors, including but not limited to any combination of one or more of the following: text, picture, audio, video.
In an exemplary embodiment, the environment information in which the information is located includes any combination of one or more of the following:
time, spatial location, device, transmission mode, storage mode.
In an exemplary embodiment, the scene description information refers to the state information in which the information is located, including but not limited to any combination of one or more of the following:
information format and/or type, time, spatial location, device, interaction object, interaction channel, transmission mode, storage mode, semantics.
In an exemplary embodiment, the transformation operation refers to unifying privacy information components with inconsistent dimensions into privacy information components with consistent dimensions, facilitating comparison and measurement, and includes but is not limited to at least one of the following: speech recognition, image-text conversion, video-text conversion. For example, when the privacy information components are A's voice information, B's photo and C's text information, the privacy information components may be unified into text information, or unified into a picture format, for subsequent fusion.
In an exemplary embodiment, fusion refers to the operation of merging the privacy information components: after the dimensions of the privacy information components are unified, all dimension-unified privacy information components are merged together to obtain the privacy information.
Step 110: determining a privacy protection scheme according to any combination of one or more of the following: the format and/or type of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements; and using the determined privacy protection scheme to protect the privacy information.
In an exemplary embodiment, determining the privacy protection scheme includes:
determining at least one group consisting of a privacy protection algorithm classification, a privacy protection algorithm, and a parameter value range of the privacy protection algorithm according to any combination of one or more of the following: the format and/or type of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements; determining at least one privacy protection scheme according to the determined at least one group of privacy protection algorithm classification, privacy protection algorithm, and parameter value range; and selecting, from the determined at least one privacy protection scheme, one group consisting of a privacy protection algorithm classification, a privacy protection algorithm, and parameter values of the privacy protection algorithm to obtain one privacy protection scheme;
calculating the privacy protection effect of the privacy information protected by the selected privacy protection scheme; when it is determined that the privacy protection effect does not meet the privacy protection expectations in the privacy protection requirements, reselecting another group of privacy protection algorithm classification, privacy protection algorithm, and parameter values of the privacy protection algorithm to obtain another privacy protection scheme, until the privacy protection effect corresponding to the selected privacy protection scheme meets the privacy protection expectations in the privacy protection requirements.
In an exemplary embodiment, the privacy protection algorithm classification includes but is not limited to any combination of one or more of the following: classification based on cryptography, classification based on generalization technology, classification based on access control technology.
In an exemplary embodiment, a privacy protection algorithm classification is a combination of one or more approximate privacy protection algorithms, where an approximate privacy protection algorithm refers to an algorithm derived with one or several specific technologies as its core: for example, k-anonymity and l-diversity algorithms, or their combinations, are approximate privacy protection algorithms under the generalization-based privacy protection algorithm classification.
In an exemplary embodiment, determining the privacy protection algorithm classification, the privacy protection algorithm and the parameter value range of the privacy protection algorithm includes:
in a preset correspondence between the format and/or type of privacy information and the classification of a first privacy protection algorithm, the first privacy protection algorithm, and a first parameter value range of the first privacy protection algorithm, looking up the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm corresponding to the format and/or type of the privacy information; and determining the classification of a second privacy protection algorithm, the second privacy protection algorithm, and a second parameter value range of the second privacy protection algorithm from the found classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm, according to the privacy information, the scene description information and/or privacy scene description information, the constraints in the privacy protection requirements, and the privacy operations supported by the privacy information.
When, in the preset correspondence, the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm corresponding to the format and/or type of the privacy information cannot be found, or when the found classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm do not meet the constraints in the privacy protection requirements, a new privacy protection scheme is designed according to any combination of one or more of the format and/or type of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements.
In an exemplary embodiment, methods such as manual labeling and machine learning may be used to determine the correspondence between the format and/or type of privacy information and the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the parameter value range of the first privacy protection algorithm.
In an exemplary embodiment, methods such as manual setting and machine learning may also be used to analyze and obtain, under a given format and/or type, semantics, privacy scene description information and privacy protection requirements of the privacy information, the privacy protection algorithm classifications, privacy protection algorithms and parameter value ranges supported by the privacy information. For example, the privacy protection algorithm classification and privacy protection scheme are set manually, e.g. it may be preset that privacy information of the statistical data type corresponds to the k-anonymity algorithm (k = 1-30) and the differential privacy algorithm (privacy budget ε = 0.1-1) under the generalization-based classification of privacy protection algorithms. As another example, a model is built from the user's historical preferences by machine learning: it learns the format and/or type, semantics, privacy scene description information and privacy protection requirements of the user's privacy information in historical data, and extracts the correspondence between them and the privacy protection algorithm classification, the privacy protection algorithm and the parameter value range, so that when the format and/or type, semantics, privacy scene description information and privacy protection requirements of new privacy information are input, the trained model identifies the privacy protection algorithm classification, the privacy protection algorithm and the parameter value range of the privacy protection algorithm.
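The preset lookup table in this example can be sketched as a simple mapping (table contents and names are hypothetical, mirroring the worked example of k = 1-30 and privacy budget 0.1-1):

```python
# Hypothetical preset correspondence: privacy-information format/type ->
# list of (algorithm classification, algorithm, (parameter, low, high)).
ALGORITHM_TABLE = {
    "statistical": [
        ("generalization", "k-anonymity",          ("k", 1, 30)),
        ("generalization", "differential-privacy", ("epsilon", 0.1, 1.0)),
    ],
}

def lookup(fmt):
    """First-stage lookup of candidate (classification, algorithm, range)
    groups for a format/type; None models the 'cannot be found' branch,
    which triggers designing a new privacy protection scheme."""
    return ALGORITHM_TABLE.get(fmt)
```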
In an exemplary embodiment, the parameters of a privacy protection algorithm refer to the independent variable parameters and adjustment parameters in the privacy protection algorithm; selecting different independent variable parameters yields privacy protection algorithms with different privacy protection strengths. For example, the k in k-anonymity is an independent variable parameter; the privacy budget ε in differential privacy is an independent variable parameter; the key length in the RSA encryption algorithm is also an independent variable parameter.
For example, the correspondence between the format and/or type of privacy information and the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the parameter value range of the first privacy protection algorithm can be established to find the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range corresponding to the format and/or type of the privacy information; according to the correspondence between the scene description information and/or privacy scene description information and the privacy protection algorithm classifications, a second privacy protection algorithm classification is selected from the classification of the first privacy protection algorithm; according to the correspondence between the privacy operations supported by the privacy information, the privacy protection requirements, the privacy protection algorithm classifications and the privacy protection algorithms, a second privacy protection algorithm that satisfies the privacy operations supported by the privacy information and the privacy protection requirements is selected from the first privacy protection algorithms under the second privacy protection algorithm classification; and according to the correspondence between the scene description information and/or privacy scene description information, the privacy protection requirements, and the parameter value ranges of the privacy protection algorithms, a second parameter value range of the second privacy protection algorithm is selected from the first parameter value range of the second privacy protection algorithm.
As another example, the correspondence between the format and/or type of privacy information, the privacy protection algorithm classifications, the privacy protection algorithms and the parameter value ranges can be established to find the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range corresponding to the format and/or type of the privacy information; then, according to the correspondence between the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, the privacy protection requirements, the privacy protection algorithm classifications, the privacy protection algorithms and the parameter value ranges, the classification of the second privacy protection algorithm, the second privacy protection algorithm and the second parameter value range of the second privacy protection algorithm are selected from the classification of the first privacy protection algorithm, the first privacy protection algorithm and the first parameter value range of the first privacy protection algorithm.
When, in the preset correspondence, the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm corresponding to the format and/or type of the privacy information cannot be found, or when the found classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm do not meet the constraints in the privacy protection requirements, a new privacy protection scheme is designed according to any combination of one or more of the format and/or type of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements.
Design here refers to constructing a new privacy protection algorithm according to the format and/or type of the privacy information, the scene description information and/or privacy scene description information, the privacy operations supported by the privacy information, and the privacy protection requirements, for example combining existing algorithms to obtain a combined privacy protection scheme, or recombining the steps of existing privacy protection algorithms, e.g. first applying a k-anonymity algorithm and then encrypting each of the k anonymized results separately; the overall process can be regarded as a newly constructed algorithm.
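A toy sketch of such a recombined pipeline, generalize first and then apply a cryptographic step to each result separately (all names are hypothetical; a SHA-256 digest stands in for encryption to keep the sketch dependency-free, and a real k-anonymity implementation would also verify that every generalized group contains at least k records):

```python
import hashlib

def generalize_age(age, width=10):
    """Toy generalization step in the spirit of k-anonymity:
    coarsen an exact age into a fixed-width range."""
    lo = (age // width) * width
    return f"{lo}-{lo + width - 1}"

def anonymize_then_digest(records):
    """Recombined scheme from the example: generalize each record,
    then apply a cryptographic transformation to each anonymized
    result separately."""
    out = []
    for _name, age in records:
        blob = generalize_age(age).encode()
        out.append(hashlib.sha256(blob).hexdigest())
    return out
```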
In an exemplary embodiment, the privacy protection effect is set to evaluate the degree to which the privacy information protected by the privacy protection scheme is actually protected, including any combination of one or more of the following:
the probability that an attacker infers the privacy information before privacy protection after privacy protection is applied; the uncertainty of the privacy information after privacy protection; the amount of loss between the privacy information after privacy protection and the privacy information before privacy protection.
For example, for location privacy information, the deviation may be defined as the distance between the position after privacy protection and the real position, and the loss ratio may be defined as the ratio of the number of POIs fed back to the user before privacy protection to the number of POIs fed back to the user after privacy protection. For picture privacy information, the deviation may be defined as the difference between the RGB values of each pixel before and after privacy protection, and the loss ratio may be defined as the ratio of that difference to the RGB values of the pixels before privacy protection.
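The location-privacy metrics above can be sketched directly (function and argument names are hypothetical; Euclidean distance is assumed for the deviation):

```python
from math import dist

def location_metrics(real_pos, protected_pos, pois_before, pois_after):
    """Deviation = distance between the protected and the real position;
    loss ratio = number of POIs fed back before protection divided by
    the number fed back after protection."""
    deviation = dist(real_pos, protected_pos)
    loss_ratio = pois_before / pois_after
    return deviation, loss_ratio
```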
In an exemplary embodiment, determining that the privacy protection effect satisfies the privacy protection expectation in the privacy protection requirement comprises any combination of one or more of the following:
the post-protection probability that an attacker infers the pre-protection privacy information, as measured in the privacy protection effect, is less than or equal to the expected probability in the privacy protection expectation;
the post-protection uncertainty of the privacy information, as measured in the privacy protection effect, is greater than or equal to the expected uncertainty in the privacy protection expectation;
the loss between the post-protection and pre-protection privacy information, as measured in the privacy protection effect, is less than or equal to the expected loss in the privacy protection expectation.
In an exemplary embodiment, determining that the privacy protection effect does not satisfy the privacy protection expectation in the privacy protection requirement comprises any combination of one or more of the following:
the post-protection probability that an attacker infers the pre-protection privacy information, as measured in the privacy protection effect, is greater than the expected probability in the privacy protection expectation;
the post-protection uncertainty of the privacy information, as measured in the privacy protection effect, is less than the expected uncertainty in the privacy protection expectation;
the loss between the post-protection and pre-protection privacy information, as measured in the privacy protection effect, is greater than the expected loss in the privacy protection expectation.
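The three satisfy/not-satisfy criteria above reduce to a single predicate comparing a measured effect against the expectation. A minimal sketch, with hypothetical dictionary keys standing in for the three quantities:

```python
def meets_expectation(effect, expectation):
    """Check a measured protection effect against the expectation:
    inference probability must be low enough, uncertainty high enough,
    loss small enough. Only criteria present in both dicts are checked."""
    checks = [
        ("inference_prob", lambda e, x: e <= x),  # prob of inferring original
        ("uncertainty",    lambda e, x: e >= x),  # post-protection uncertainty
        ("loss",           lambda e, x: e <= x),  # loss vs. original info
    ]
    return all(
        ok(effect[key], expectation[key])
        for key, ok in checks
        if key in expectation and key in effect
    )
```

Failing any one present criterion means the effect does not satisfy the expectation, which is what triggers reselection in the procedure below.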
In an exemplary embodiment, methods of reselecting another group of privacy protection algorithm classification, privacy protection algorithm, and parameter value include, but are not limited to, the following.
For example, one privacy protection algorithm and parameter may be selected first, its privacy protection effect computed, and the effect checked against the privacy protection expectation in the privacy protection requirement. When the effect does not satisfy the expectation, the current algorithm classification and algorithm are kept unchanged, another value within the parameter value range is selected, the effect is recomputed with the reselected parameter, and the check is repeated. When the effects computed for all parameter values of the currently selected algorithm fail to satisfy the expectation, the procedure ends, or a prompt that the privacy requirement is too high is output and no privacy protection scheme is output.
As another example, one privacy protection algorithm and parameter may be selected first, its privacy protection effect computed, and the effect checked against the privacy protection expectation. When the effect does not satisfy the expectation, the current algorithm classification and algorithm are kept unchanged and another parameter value within the range is tried, with the effect recomputed and rechecked each time. When the effects for all parameter values of the currently selected algorithm fail to satisfy the expectation, a different algorithm within the same classification, together with its parameters, is selected. When the effects obtained for all algorithms and their parameters under the currently selected classification fail to satisfy the expectation, the procedure ends, or a prompt that the privacy requirement is too high is output and no privacy protection scheme is output.
As yet another example, one privacy protection algorithm and parameter may be selected first, its privacy protection effect computed, and the effect checked against the privacy protection expectation. When the effect does not satisfy the expectation, the current classification and algorithm are kept unchanged and another parameter value within the range is tried, with the effect recomputed and rechecked each time. When all parameter values of the current algorithm fail, a different algorithm within the current classification, together with its parameters, is selected; when all algorithms and their parameters under the current classification fail, a different algorithm classification is selected. The above steps are repeated until the privacy protection effect of a privacy protection scheme under some classification satisfies the privacy protection expectation, whereupon that scheme is output.
When the privacy protection effects of all found privacy protection algorithms fail to satisfy the privacy protection expectation in the privacy protection requirement, the procedure ends, or a prompt that the privacy requirement is too high is output and no privacy protection scheme is output.
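The nested reselection procedure described above — exhaust parameter values, then algorithms within a classification, then classifications — can be sketched as a triple loop. This is an illustrative skeleton under assumed interfaces (`evaluate` and `meets` are caller-supplied stand-ins for effect computation and the expectation check), not the patented procedure itself:

```python
def search_scheme(classes, evaluate, meets):
    """Try each parameter value of each algorithm of each classification;
    return the first (classification, algorithm, parameter) whose protection
    effect meets the expectation, else None (privacy requirement too high).

    classes: {classification: {algorithm: [parameter values]}}
    evaluate(algorithm, param) -> measured protection effect
    meets(effect) -> bool, effect vs. the privacy protection expectation
    """
    for cls, algorithms in classes.items():
        for alg, params in algorithms.items():
            for p in params:
                effect = evaluate(alg, p)
                if meets(effect):
                    return (cls, alg, p)
    return None  # all candidates failed: report requirement too high

classes = {"generalization": {"k-anon": [2, 5, 10]}}
found = search_scheme(classes, lambda alg, p: p, lambda e: e >= 5)
```

Returning `None` corresponds to ending the procedure and prompting that the privacy requirement is too high.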
The embodiments of the present invention take into account that different privacy protection algorithms, owing to their different mathematical foundations and parameter choices, provide different privacy protection capability and thus satisfy different privacy protection requirements. Since the number of privacy protection algorithms is finite, the algorithms contained in the system can be classified by mathematical foundation. For example, following the earlier examples, the classification of time-based access control techniques may be mapped to privacy protection requirement level 3, while irreversible algorithms based on probabilistic ideas, such as data obfuscation and perturbation, and reversible algorithms based on cryptographic ideas, such as commutative encryption and homomorphic encryption, may be mapped to privacy protection requirement level 2.
Referring to FIG. 3(a), another embodiment of the present application provides a privacy information management apparatus, including any combination of one or more of the following modules:
a determination module 301, configured to determine the privacy information, the privacy operations supported by the privacy information, and the privacy protection requirement of the privacy information, and to determine the scenario description information and/or the privacy scenario description information;
a privacy protection scheme decision and evaluation module 302, configured to determine a privacy protection scheme according to any combination of one or more of the following, and to protect the privacy information with the determined scheme: the format and/or type of the privacy information; the scenario description information and/or the privacy scenario description information; the privacy operations supported by the privacy information; and the privacy protection requirement.
In an exemplary embodiment, the determination module 301 may determine the privacy information, the privacy operations it supports, its privacy protection requirement, and the scenario description information and/or privacy scenario description information in either of the following ways.
First way: referring to FIG. 3(b), the determination module 301 includes:
a first privacy information extraction unit 3011, configured to extract the privacy information from the information;
a first privacy protection requirement determination unit 3012, configured to determine the privacy protection requirement of the privacy information according to any combination of one or more of the privacy information, the privacy scenario description information, and the preference of the information holder;
a privacy scenario description information construction unit 3013, configured to construct the privacy scenario description information according to any combination of one or more of the format and/or type of the privacy information, the environment information of the privacy information, and the semantics of the privacy information; and
a first privacy operation determination unit 3014, configured to determine the privacy operations supported by the privacy information according to any combination of one or more of the format and/or type of the privacy information, the semantics of the privacy information, and the privacy scenario description information.
Second way: as shown in FIG. 3(c), the determination module 301 includes:
a splitting unit 3015, configured to split the information into an information vector according to the format and/or type of the information and the semantics of the information;
a scenario description information generation unit 3016, configured to generate the scenario description information according to the format and/or type of the information, the environment information of the information, and the semantics of the information;
a second privacy information extraction unit 3017, configured to extract at least one information component from the information vector as a privacy information component according to the scenario description information, the extracted privacy information components constituting the privacy information, and to extract from the scenario description information the privacy scenario description information corresponding to the privacy information. The privacy information may be extracted by manual configuration, automatic extraction, machine learning, and the like. For example, the privacy information may be labeled manually: when the scenario description information read indicates a time of 14:00-17:00 and a location in an entertainment venue, the location-related information is the privacy information. As another example, automatic extraction may use keyword matching, where the information corresponding to the matched keywords constitutes the privacy information. As yet another example, a model may be built from the user's historical preferences by machine learning: the model learns the correspondence between the information and scenario description information in the historical data and the privacy information; when new information and scenario description information are input, the trained model identifies the privacy information;
a second privacy protection requirement determination unit 3018, configured to determine the privacy protection requirement of the privacy information according to any combination of one or more of the privacy information, the privacy scenario description information, and the preference of the information holder; in an exemplary embodiment, the privacy protection requirement may be determined by methods such as manual labeling or machine learning; and
a second privacy operation determination unit 3019, configured to determine the privacy operations supported by the privacy information according to any combination of one or more of the format and/or type of the privacy information, the semantics of the privacy information, and the scenario description information; the privacy operations supported by the privacy information under a given format and/or type, semantics, and privacy scenario description information may be derived by manual configuration, automatic extraction, machine learning, and the like.
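The keyword-matching extraction mentioned for unit 3017 can be sketched in a few lines. This is an illustrative assumption: the keyword set, field names, and the choice to match on component names are all invented for the example:

```python
# Hypothetical sensitive-keyword set; in practice this would come from
# configuration or a learned model.
SENSITIVE_KEYWORDS = {"location", "phone", "id_number"}

def extract_privacy_components(info_vector):
    """Split an information vector (field name -> value) into privacy
    components (matched by keyword) and the remaining public components."""
    private = {k: v for k, v in info_vector.items() if k in SENSITIVE_KEYWORDS}
    public = {k: v for k, v in info_vector.items() if k not in SENSITIVE_KEYWORDS}
    return private, public

private, public = extract_privacy_components(
    {"location": "31.2N,121.5E", "name": "Alice", "phone": "555-0100"}
)
```

The extracted `private` components together constitute the privacy information passed to the later requirement-determination and scheme-selection steps.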
FIGS. 3(a), 3(b), and 3(c) each show only one possible realization of the privacy information management apparatus proposed in the present application; other forms are also possible. The present application places no restriction on the number or order of the modules: any combination of one or more of the modules in the figures may be used, and the modules may be arranged in other orders.
Referring to FIG. 4, another embodiment of the present application provides a privacy information management system, including:
a first device 401, configured to determine the privacy information, the privacy operations supported by the privacy information, and the privacy protection requirement of the privacy information, and to determine the scenario description information and/or the privacy scenario description information; and
a second device 402, configured to determine a privacy protection scheme according to any combination of one or more of the following, and to protect the privacy information with the determined scheme: the format and/or type of the privacy information; the scenario description information and/or the privacy scenario description information; the privacy operations supported by the privacy information; and the privacy protection requirement.
The implementation of the first device 401 and the second device 402 is the same as that of the privacy information management method in the foregoing embodiments and is not repeated here.
In an exemplary embodiment, the first device 401 and the second device 402 may be any devices; for example, the first device 401 may be a terminal device and the second device 402 may be a server.
Those of ordinary skill in the art will understand that all or some of the steps of the methods disclosed above, and the functional modules/units of the systems and apparatuses, may be implemented as software, firmware, hardware, and appropriate combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed jointly by several physical components. Some or all of the components may be implemented as software executed by a processor, such as a digital signal processor or microprocessor, or as hardware, or as an integrated circuit, such as an application-specific integrated circuit. Such software may be distributed on computer-readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). As is well known to those of ordinary skill in the art, the term computer storage medium includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information (such as computer-readable instructions, data structures, program modules, or other data). Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer. Furthermore, as is well known to those of ordinary skill in the art, communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery media.

Claims (15)

  1. A privacy information management method, comprising any combination of one or more of the following steps:
    determining privacy information, privacy operations supported by the privacy information, and a privacy protection requirement of the privacy information, and determining scenario description information and/or privacy scenario description information;
    determining a privacy protection scheme according to a combination of one or more of the following, and protecting the privacy information with the determined privacy protection scheme:
    a format and/or type of the privacy information;
    the scenario description information and/or
    the privacy scenario description information;
    the privacy operations supported by the privacy information;
    the privacy protection requirement.
  2. The privacy information management method according to claim 1, wherein determining the privacy information, determining the scenario description information and/or the privacy scenario description information, determining the privacy operations supported by the privacy information, and determining the privacy protection requirement of the privacy information comprise:
    extracting the privacy information from information, and constructing the privacy scenario description information according to any combination of one or more of the following:
    the format and/or type of the privacy information;
    environment information of the privacy information;
    semantics of the privacy information;
    determining the privacy protection requirement of the privacy information according to any combination of one or more of the privacy information, the privacy scenario description information, and a preference of an information holder;
    determining the privacy operations supported by the privacy information according to a combination of one or more of the following:
    the format and/or type of the privacy information;
    the semantics of the privacy information;
    the privacy scenario description information.
  3. The privacy information management method according to claim 2, wherein extracting the privacy information from the information comprises:
    splitting the information into an information vector according to the format and/or type of the information and the semantics of the information;
    extracting at least one information component from the information vector as a privacy information component, the extracted privacy information components constituting the privacy information.
  4. The privacy information management method according to claim 3, wherein extracting at least one information component from the information vector as a privacy information component comprises:
    extracting, from the information vector, information components whose sensitivity is greater than a sensitivity threshold as the privacy information components, wherein the sensitivity can be determined from the user's subjective sensitivity to the information component and the information entropy of the information component.
  5. The privacy information management method according to claim 1, wherein determining the privacy information, determining the scenario description information and/or the privacy scenario description information, determining the privacy operations supported by the privacy information, and determining the privacy protection requirement of the privacy information comprise:
    splitting the information into an information vector according to the format and/or type of the information and the semantics of the information;
    generating the scenario description information according to the format and/or type of the information, the environment information of the information, and the semantics of the information;
    extracting at least one information component from the information vector as a privacy information component according to the scenario description information, the extracted privacy information components constituting the privacy information, and extracting, from the scenario description information, the privacy scenario description information corresponding to the privacy information;
    determining the privacy protection requirement of the privacy information according to any combination of one or more of the privacy information, the privacy scenario description information, and the preference of the information holder;
    determining the privacy operations supported by the privacy information according to a combination of one or more of the following:
    the format and/or type of the privacy information;
    the semantics of the privacy information;
    the privacy scenario description information.
  6. The privacy information management method according to claim 3 or 5, wherein, when the privacy information components differ in source or data format, the method further comprises:
    transforming the privacy information components so that the dimensions of all the privacy information components are unified;
    fusing the dimension-unified privacy information components to obtain the privacy information.
  7. The privacy information management method according to claim 1, wherein the privacy protection requirement refers to the degree to which the information holder wishes the privacy information to be protected in a particular application scenario, and comprises a combination of any one or more of: a privacy protection expectation and a constraint condition;
    wherein the privacy protection expectation comprises a combination of any one or more of: an expected probability that an attacker infers the pre-protection privacy information after protection, an expected uncertainty of the privacy information after protection, and an expected loss between the post-protection privacy information and the pre-protection privacy information;
    wherein the constraint condition comprises a combination of any one or more of: a correspondence among the privacy scenario description information, the privacy operations permitted by the information holder, and the privacy information; and a correspondence among the privacy scenario description information, the privacy operations not permitted by the information holder, and the privacy information.
  8. The privacy information management method according to claim 1, wherein the scenario description information refers to state information of the information, and the privacy scenario description information refers to state information of the privacy information, comprising any combination of one or more of the following:
    information format and/or type, time, spatial location, device, interaction object, interaction path, transmission mode, storage mode, and semantics; or,
    wherein the privacy operations comprise any combination of one or more of: reading, writing, encryption, obfuscation, generalization, noise addition, anonymization, signing, signature verification, digest computation, saving, copying, pasting, forwarding, cutting, modification, and deletion.
  9. The privacy information management method according to claim 1, wherein determining the privacy protection scheme comprises:
    determining at least one group of a privacy protection algorithm classification, a privacy protection algorithm, and a parameter value range of the privacy protection algorithm according to a combination of one or more of the following:
    the format and/or type of the privacy information;
    the scenario description information and/or
    the privacy scenario description information;
    the privacy operations supported by the privacy information;
    the privacy protection requirement;
    determining at least one privacy protection scheme according to the determined at least one group of privacy protection algorithm classification, privacy protection algorithm, and parameter value range;
    selecting, from the determined at least one privacy protection scheme, one group of privacy protection algorithm classification, privacy protection algorithm, and parameter value to obtain one privacy protection scheme;
    computing the privacy protection effect of the post-protection privacy information corresponding to the selected privacy protection scheme, and, when it is determined that the privacy protection effect does not satisfy the privacy protection expectation in the privacy protection requirement, reselecting another group of privacy protection algorithm classification, privacy protection algorithm, and parameter value to obtain another privacy protection scheme, until the privacy protection effect corresponding to the selected privacy protection scheme satisfies the privacy protection expectation in the privacy protection requirement.
  10. The privacy information management method according to claim 9, wherein the classification of privacy protection algorithms comprises any combination of one or more of the following:
    a cryptography-based classification, a generalization-based classification, and an access-control-based classification.
  11. The privacy information management method according to claim 9, wherein determining the privacy protection algorithm classification, the privacy protection algorithm, and the parameter value range of the privacy protection algorithm comprises:
    in a preset correspondence among the format and/or type of the privacy information, the classification of a first privacy protection algorithm, the first privacy protection algorithm, and a first parameter value range of the first privacy protection algorithm, looking up the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range corresponding to the format and/or type of the privacy information; and determining, from the found classification, algorithm, and first parameter value range, the classification of a second privacy protection algorithm, the second privacy protection algorithm, and a second parameter value range of the second privacy protection algorithm, according to the scenario description information and/or the privacy scenario description information, the privacy information, the constraint conditions in the privacy protection requirement, and the privacy operations supported by the privacy information.
  12. The privacy information management method according to claim 9, wherein determining the privacy protection algorithm classification, the privacy protection algorithm, and the parameter value range of the privacy protection algorithm comprises:
    when, in the preset correspondence among the format and/or type of the privacy information, the classification of the first privacy protection algorithm, the first privacy protection algorithm, and the first parameter value range of the first privacy protection algorithm, no classification of the first privacy protection algorithm, first privacy protection algorithm, or first parameter value range corresponding to the format and/or type of the privacy information can be found, or the found classification, algorithm, and first parameter value range all fail to satisfy the constraint conditions in the privacy protection requirement, designing a new privacy protection scheme according to any combination of one or more of the following:
    the format and/or type of the privacy information;
    the scenario description information and/or
    the privacy scenario description information;
    the privacy operations supported by the privacy information;
    the privacy protection requirement.
  13. A privacy information management apparatus, comprising any combination of one or more of the following modules:
    a determination module, configured to determine privacy information, privacy operations supported by the privacy information, and a privacy protection requirement of the privacy information, and to determine scenario description information and/or privacy scenario description information;
    a privacy protection scheme decision and evaluation module, configured to determine a privacy protection scheme according to any combination of one or more of the following, and to protect the privacy information with the determined privacy protection scheme:
    the format and/or type of the privacy information;
    the scenario description information and/or
    the privacy scenario description information;
    the privacy operations supported by the privacy information;
    the privacy protection requirement.
  14. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the privacy information management method according to any one of claims 1 to 12.
  15. A privacy information management system, comprising:
    a first device, configured to determine privacy information, privacy operations supported by the privacy information, and a privacy protection requirement of the privacy information, and to determine scenario description information and/or privacy scenario description information;
    a second device, configured to determine a privacy protection scheme according to any combination of one or more of the following, and to protect the privacy information with the determined privacy protection scheme:
    the format and/or type of the privacy information;
    the scenario description information and/or
    the privacy scenario description information;
    the privacy operations supported by the privacy information;
    the privacy protection requirement.
PCT/CN2019/083048 2018-10-30 2019-04-17 Privacy information management method, apparatus and system WO2020087878A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811272632.8A CN109583228B (zh) 2018-10-30 2018-10-30 Privacy information management method, apparatus and system
CN201811272632.8 2018-10-30

Publications (1)

Publication Number Publication Date
WO2020087878A1 true WO2020087878A1 (zh) 2020-05-07

Family

ID=65920823

Family Applications (3)

Application Number Title Priority Date Filing Date
PCT/CN2019/083048 WO2020087878A1 (zh) 2018-10-30 2019-04-17 Privacy information management method, apparatus and system
PCT/CN2019/083050 WO2020087879A1 (zh) 2018-10-30 2019-04-17 Privacy information protection method, apparatus and system
PCT/CN2019/083045 WO2020087876A1 (zh) 2018-10-30 2019-04-17 Information circulation method, apparatus and system

Family Applications After (2)

Application Number Title Priority Date Filing Date
PCT/CN2019/083050 WO2020087879A1 (zh) 2018-10-30 2019-04-17 Privacy information protection method, apparatus and system
PCT/CN2019/083045 WO2020087876A1 (zh) 2018-10-30 2019-04-17 Information circulation method, apparatus and system

Country Status (2)

Country Link
CN (1) CN109583228B (zh)
WO (3) WO2020087878A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109583228B (zh) * 2018-10-30 2021-05-07 中国科学院信息工程研究所 一种隐私信息管理方法、装置和系统
US11115479B2 (en) * 2019-01-10 2021-09-07 Google Llc Enhanced online privacy
CN112926089B (zh) 2021-03-25 2023-03-17 Alipay (Hangzhou) Information Technology Co., Ltd. Privacy-protection-based data risk prevention and control method, apparatus and device
CN112989425B (zh) 2021-04-26 2021-08-13 Nanjing Audit University Differential-privacy-based credit data privacy protection method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150269383A1 (en) * 2014-01-22 2015-09-24 Object Security LTD Automated and adaptive model-driven security system and method for operating the same
CN107944299A (zh) 2017-12-29 2018-04-20 Xidian University Privacy information processing method, apparatus and system
CN108197453A (zh) 2018-01-19 2018-06-22 Institute of Information Engineering, Chinese Academy of Sciences Image privacy protection method and system
CN109583228A (zh) 2018-10-30 2019-04-05 Institute of Information Engineering, Chinese Academy of Sciences Privacy information management method, apparatus and system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100017870A1 (en) * 2008-07-18 2010-01-21 Agnik, Llc Multi-agent, distributed, privacy-preserving data management and data mining techniques to detect cross-domain network attacks
CN103391192B (zh) 2013-07-16 2016-09-21 State Grid Corporation of China Privacy-protection-based cross-security-domain access control system and control method therefor
CN104318171B (zh) 2014-10-09 2017-11-07 Institute of Information Engineering, Chinese Academy of Sciences Permission-label-based Android privacy data protection method and system
CN104375836B (zh) 2014-11-19 2018-08-17 Shenzhen Tencent Computer Systems Co., Ltd. Method and apparatus for displaying a lock-screen window
CN109583227B (zh) 2018-10-30 2020-08-07 Institute of Information Engineering, Chinese Academy of Sciences Privacy information protection method, apparatus and system
CN109347845B (zh) 2018-10-30 2020-08-07 Institute of Information Engineering, Chinese Academy of Sciences Information circulation method, apparatus and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LI, FENGHUA ET AL: "Research Progress of Photo Privacy-Preserving Mechanisms in Online Social Network", JOURNAL OF CYBER SECURITY, vol. 3, no. 2, 1 March 2018 (2018-03-01), pages 41 - 61, XP055700170, DOI: 10.19363/j.cnki.cn10-1380/tn.2018.03.04 *
NIU, BEN ET AL.: "Research on Scenario-based Mechanism in Privacy-aware Mobile Networks", CHINESE JOURNAL OF NETWORK AND INFORMATION SECURITY, vol. 1, no. 1, 1 December 2015 (2015-12-01), pages 31 - 42, XP055700163, DOI: 10.11959/j.issn.2096-109x.2015.00005 *

Also Published As

Publication number Publication date
CN109583228A (zh) 2019-04-05
WO2020087879A1 (zh) 2020-05-07
CN109583228B (zh) 2021-05-07
WO2020087876A1 (zh) 2020-05-07


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19880601

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19880601

Country of ref document: EP

Kind code of ref document: A1