CN110287711A - An analysis method for protecting user privacy - Google Patents

An analysis method for protecting user privacy

Info

Publication number
CN110287711A
CN110287711A (application CN201910493316.1A)
Authority
CN
China
Prior art keywords
user
data
information
initial data
desensitization
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910493316.1A
Other languages
Chinese (zh)
Other versions
CN110287711B (en)
Inventor
肖政宏
闫艺婷
王华嘉
周健烨
李旺
梁志鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Polytechnic Normal University
Original Assignee
Guangdong Polytechnic Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Polytechnic Normal University filed Critical Guangdong Polytechnic Normal University
Priority to CN201910493316.1A priority Critical patent/CN110287711B/en
Publication of CN110287711A publication Critical patent/CN110287711A/en
Application granted granted Critical
Publication of CN110287711B publication Critical patent/CN110287711B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/602Providing cryptographic facilities or services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes
    • G06F21/6254Protecting personal data, e.g. for financial or medical purposes by anonymising data, e.g. decorrelating personal data from the owner's identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present invention provides an analysis method for protecting user privacy, comprising the following steps: obtain the original data to be kept confidential and the associated user information, and establish a mapping between them; convert the original data from step 1 according to a desensitization rule; assign a corresponding viewing grade to the original data that requires confidentiality; a. if the grade condition is met, both the user information and the original data can be viewed; b. if the grade condition is not met, only the user information and the desensitized data can be viewed. The present invention encrypts data using a label-stacking scheme that nevertheless preserves a basis for judging the similarity of users, so data analysis and mining can still be carried out; moreover, the resulting similarity comparison between users is more specific and accurate than with simple labeling.

Description

An analysis method for protecting user privacy
Technical field
The invention belongs to the technical field of data processing, and more particularly relates to an analysis method for protecting user privacy.
Background technique
In the mobile-internet era, the privacy attribute of data grows ever stronger, and data is dual-natured: it carries a privacy attribute while also carrying a value attribute. The rise of the digital economy has made data a strong source of competitiveness; better development can be obtained through data, especially data held in social networks. Whether for scientific research or for commercial purposes, more and more social-network data needs to be published, which can lead to the leakage of users' private information. In today's era of massive data capture, processing, analysis, and sharing, data security has become an urgent, indispensable concern, yet it is systemic, conceptually complex, and broad in scope. Data privacy protection now reaches into every field and industry: user data must often be analyzed and mined to extract valuable information, and data mining has many success stories, essentially finding correlations between one thing and another so that, for example, product advertisements can be recommended. This, however, raises user-privacy leakage problems: the existence of a user, a user's sensitive attribute labels, the existence, weight, and category of relationships between users, and the structural properties of the social network all require protection. The current practice for protecting user privacy is to display fields that can identify a user (such as phone number, age, and ID-card number) in encrypted or masked form.
There are currently many methods for data privacy protection, including various data desensitization, encryption, and masking techniques, which can be classified and applied separately or jointly according to security requirements. Data replacement substitutes fabricated data for real values, which easily causes confusion and prevents massive data from being identified correctly. Truncation, encryption, and hiding discard or obscure data: truncation keeps only an incomplete prefix, which may fail to express the real meaning; encrypted values can be represented with '*' characters but may be restored by the corresponding cracking method; hiding makes the data lose its own meaning. Offsetting changes numeric data by a random shift, but the shift rule must be remembered, which is troublesome and makes the data hard to use. Studying the deficiencies of these methods shows that even if a particular user's data is encrypted or masked, an attacker who has collected enough labels can still deduce who the user is.
Common desensitization methods are implemented as follows:
Table 1: common data desensitization algorithm
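The common techniques surveyed above can be sketched in Python as follows. This is only an illustration of the generic methods (masking with '*', truncation, offsetting); the function names and parameters are our own, not taken from Table 1 or the patent.

```python
def mask_middle(value: str, keep: int = 3) -> str:
    """Cover all but the first and last `keep` characters with '*'."""
    if len(value) <= 2 * keep:
        return "*" * len(value)
    return value[:keep] + "*" * (len(value) - 2 * keep) + value[-keep:]

def truncate(value: str, length: int = 6) -> str:
    """Keep only a leading fragment (may fail to express the real meaning)."""
    return value[:length]

def offset(value: int, shift: int) -> int:
    """Shift numeric data; the shift rule must be remembered to undo it."""
    return value + shift

print(mask_middle("13912345678"))  # → 139*****678
```

As the background notes, each of these loses information: the masked value hides the middle digits, the truncated value may be ambiguous, and the offset is reversible only by whoever remembers the shift.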
Summary of the invention
The object of the present invention is to solve the above problems of the prior art and to provide a practicable data encryption method.
The present invention adopts the following technical scheme:
An analysis method for protecting user privacy, comprising the following steps:
Step 1. Obtain the original data to be analyzed and kept confidential together with the user information, and establish a mapping between them;
Step 2. Convert the original data from step 1 according to the desensitization rule;
Step 3. Assign a corresponding viewing grade to the original data that requires confidentiality (implementing a fine-grained access-control policy with a minimum-scope strategy). Grades are divided as needed; each grade is a step of a numeric interval, and the division of steps is then arranged in a stacked pattern.
Step 4. Match, within the grade range, the field information converted in step 2;
a. If the grade condition is met, both the user information and the original data can be viewed;
b. If the grade condition is not met, only the user information and the desensitized data can be viewed.
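Steps 1-4 can be sketched as follows. The data model (a `Field` record carrying raw value, masked value, and grade) is our own assumption for illustration; the patent does not prescribe a concrete representation.

```python
from dataclasses import dataclass

@dataclass
class Field:
    name: str
    raw: str     # original data (step 1)
    masked: str  # data after the desensitization conversion (step 2)
    grade: int   # viewing grade required for the plaintext (step 3)

def view(fields, clearance: int) -> dict:
    """Step 4: raw value when the grade condition is met, masked value otherwise."""
    return {f.name: (f.raw if clearance >= f.grade else f.masked)
            for f in fields}

record = [Field("income", "49000", "40000-50000", grade=2),
          Field("id_no", "4401234", "440****", grade=3)]
print(view(record, clearance=2))  # income in plaintext, id_no still masked
```

A requester with clearance 2 sees the income plaintext but only the masked ID number; clearance 3 would reveal both.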
Beneficial effects of the present invention:
By encrypting a user's attributes, the present invention makes it impossible for anyone to learn any attribute of the user, and therefore impossible to deduce who the user is. Although a label-stacking scheme is used for encryption, a basis for judging the similarity of users is preserved, so data analysis and mining can still be carried out. Moreover, the resulting user similarity is more specific and accurate than with simple labeling.
Detailed description of the invention
Fig. 1 is the program execution flow chart of the present invention;
Fig. 2 is the analysis result diagram of the encryption principle of the present invention.
Specific embodiment
To make the object, technical solutions, and advantages of the present invention clearer, the technical solutions of the present invention are described clearly and completely below. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
All of a user's labels are likewise encrypted, using irreversible encryption such as MD5, so that all of the user's information is encrypted and no one can identify any of its content. With such naive processing, however, much label information is lost. Take income: although 0-50,000 and 50,000-100,000 are different labels, they have a certain degree of association; simply making them unrecognizable causes that association to be lost.
As shown in Fig. 2, to compensate for this, labels are subdivided and overlapped. For example, income is divided into multiple overlapping labels: 0-50,000, 10,000-60,000, 20,000-70,000, 30,000-80,000, 40,000-90,000, and 50,000-100,000. In this way, even after the labels are converted to an unrecognizable form, the similarity of different users can be judged by their identical labels. For example, if user A's income is 49,000, user B's income is 51,000, and user C's income is 99,000, then user A shares 4 identical labels with user B, while user B shares only one label with user C. Although no specific label is known, it is known that the similarity between user A and user B is comparatively high. Superimposing labels on the data can thus protect user privacy, whereas plain labeling is discrete and may scramble the data: with non-overlapping labels 0-50,000 and 50,000-100,000, user A (49,000) and user B (51,000) are in fact quite similar, yet after labeling, A and B carry different labels while B and C carry the same label.
In the processed data, different users share identical labels, through which data analysis and mining can be carried out while the analyst cannot know what any specific label is.
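The overlapping-label scheme above can be sketched as follows. The label string format and the use of Python's `hashlib` MD5 are our own assumptions (the text names MD5 as an example of irreversible encryption but fixes no concrete encoding); the band boundaries follow the 0-50,000 through 50,000-100,000 example in the text.

```python
import hashlib

def income_labels(income: int) -> set:
    """Hashed labels of every overlapping 50k-wide band containing `income`."""
    labels = set()
    for lo in range(0, 60_000, 10_000):  # bands 0-50k, 10-60k, ..., 50-100k
        if lo <= income <= lo + 50_000:
            tag = f"income:{lo}-{lo + 50_000}"
            labels.add(hashlib.md5(tag.encode()).hexdigest())
    return labels

user_a = income_labels(49_000)
user_b = income_labels(51_000)
user_c = income_labels(99_000)

print(len(user_a & user_b))  # → 4 shared labels: A and B are similar
print(len(user_b & user_c))  # → 1 shared label: B and C are less similar
```

The analyst sees only opaque hashes, yet counting shared hashes reproduces exactly the 4-versus-1 similarity judgment from the example above.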
To better protect user privacy, the present invention implements a fine-grained access-control policy with a minimum-scope strategy; for example, a column-level (field-level) permission-control policy can be enabled on the data. Fine-grained permission control alone is not enough for user-privacy protection; it must be combined with a corresponding data-desensitization policy to protect user privacy well.
Embodiment: plaintext access permission on the employee income field
As shown in Fig. 1, with plaintext access permission on the income field, an employee can query the income of every employee within the scope of the permission, and subsequent rows cannot be constrained, which may cause information leakage. If a data-desensitization policy is enabled instead, the employee sees only the desensitized data and can hardly restore the original data.
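The combination argued above (a column-level grant plus a masking policy on top of it) can be sketched as follows. The grant string and binning rule are our own illustrative assumptions, not the patent's concrete policy language.

```python
def query_income(rows, grants, mask=True):
    """Column-level grant check, plus an optional masking rule on the result."""
    if "income:plaintext" not in grants:
        raise PermissionError("no access to the income column")
    if mask:  # desensitization policy enabled: return 10k-wide bins, not values
        return [f"{(v // 10_000) * 10_000}-{(v // 10_000 + 1) * 10_000}"
                for v in rows]
    return rows

salaries = [49_000, 51_000, 99_000]
print(query_income(salaries, {"income:plaintext"}))
# → ['40000-50000', '50000-60000', '90000-100000']
```

The grant alone would expose every row in plaintext; with the masking rule enabled, the same query yields only coarse ranges that are hard to invert.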
Table: example of stacked data-desensitization labels
Data desensitization is extremely important for enterprise privacy protection. To manage data assets more effectively, a novel protection mechanism of this kind can be applied to fields such as annual income, age, and ID-card number, and can also promote the wide application of desensitization technology.
Finally, it should be noted that the above embodiments are merely illustrative of the technical solutions of the present invention and are not limiting. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (1)

1. An analysis method for protecting user privacy, characterized by comprising the following steps:
Step 1. Obtain the original data to be analyzed and kept confidential together with the user information, and establish a mapping between them;
Step 2. Convert the original data from step 1 according to the desensitization design rule;
Step 3. Assign a corresponding viewing grade to the original data from step 1 that requires confidentiality; grades are divided as needed, each grade is a step of a numeric interval, and the division of steps is then set to a stacked pattern;
Step 4. Match, within the grade range, the field information converted in step 2;
a. If the grade condition is met, both the user information and the original data can be viewed;
b. If the grade condition is not met, only the user information and the converted information after desensitization can be viewed.
CN201910493316.1A 2019-06-06 2019-06-06 Analysis method for protecting user privacy Active CN110287711B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910493316.1A CN110287711B (en) 2019-06-06 2019-06-06 Analysis method for protecting user privacy

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910493316.1A CN110287711B (en) 2019-06-06 2019-06-06 Analysis method for protecting user privacy

Publications (2)

Publication Number Publication Date
CN110287711A true CN110287711A (en) 2019-09-27
CN110287711B CN110287711B (en) 2021-07-16

Family

ID=68003608

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910493316.1A Active CN110287711B (en) 2019-06-06 2019-06-06 Analysis method for protecting user privacy

Country Status (1)

Country Link
CN (1) CN110287711B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106991337A (en) * 2017-04-06 2017-07-28 北京数聚世界信息技术有限公司 The desensitization method and device of a kind of date of birth data
CN107145799A (en) * 2017-05-04 2017-09-08 山东浪潮云服务信息科技有限公司 A kind of data desensitization method and device
CN107943925A (en) * 2017-11-21 2018-04-20 华中师范大学 Fuzzy method for individual information in privacy information issue of anonymity system


Also Published As

Publication number Publication date
CN110287711B (en) 2021-07-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant