CN111259449A - Processing method of private data, cleaner and cloud storage system - Google Patents

Processing method of private data, cleaner and cloud storage system

Info

Publication number
CN111259449A
Authority
CN
China
Prior art keywords
data
signature
user
privacy
cleaner
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010058236.6A
Other languages
Chinese (zh)
Inventor
李良
李千目
练志超
Original Assignee
Shenzhen Bowei Chuangsheng Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Bowei Chuangsheng Technology Co ltd filed Critical Shenzhen Bowei Chuangsheng Technology Co ltd
Priority to CN202010058236.6A priority Critical patent/CN111259449A/en
Publication of CN111259449A publication Critical patent/CN111259449A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Storage Device Security (AREA)

Abstract

The application discloses a processing method of private data, a cleaner and a cloud storage system. The processing method comprises: receiving first data and a first signature uploaded by a user side, wherein the first data contains hidden information, the hidden information being the user privacy that has been hidden; hiding organization privacy in the first data to obtain second data and generating a second signature; and uploading the second data and the second signature to the cloud, wherein the second signature is used for data integrity auditing. With this method, the private data can be protected when the data are uploaded to the cloud, and whether the data are correctly stored in the cloud can be verified.

Description

Processing method of private data, cleaner and cloud storage system
Technical Field
The application relates to the technical field of data security, in particular to a processing method of private data, a cleaner and a cloud storage system.
Background
With the explosive growth of data, storing large amounts of data locally is a heavy burden for users. As a result, more and more organizations and individuals prefer to store their data in the cloud. With a cloud storage service, a user can keep data in the cloud and share it with others. Data sharing is one of the most common functions of cloud storage, allowing many users to share their data with each other. However, the shared data stored in the cloud may contain private information that should not be disclosed to others. Encrypting the entire shared file would hide the private information, but would also make the shared file unusable by others.
Furthermore, data stored in the cloud may also suffer corruption or loss due to unavoidable software bugs, hardware problems, and human errors, so the integrity of the stored data needs to be verified periodically.
Disclosure of Invention
The application provides a processing method of private data, a cleaner and a cloud storage system, aiming to solve the problem that private information in cloud systems cannot be protected in the prior art.
In order to solve the technical problem, the application provides a method for processing private data, which includes receiving first data and a first signature uploaded by a user side, wherein the first data includes hidden information, and the hidden information is user privacy which is hidden; hiding organization privacy in the first data to obtain second data and generating a second signature; and uploading the second data and the second signature to the cloud, wherein the second signature is used for data integrity audit.
In order to solve the technical problem, the present application proposes a cleaner for processing private data, which includes a processor, wherein the processor is configured to execute instructions to implement the method.
In order to solve the technical problem, the application provides a method for processing private data, which includes dividing an original file into n data blocks, and hiding the data blocks containing user privacy to generate first data and a first signature; the first data and the first signature are sent to the cleaner.
In order to solve the above technical problem, the present application provides a user side for processing private data, including a processor, where the processor is configured to execute instructions to implement the above method.
In order to solve the technical problem, the application provides a system for cloud storage of private data, which includes a user side, a cleaner, a storage module and a processing module, wherein the user side is used for generating first data and a first signature, and sending the first data and the first signature to the cleaner, and the first data includes hidden information of the privacy of a hidden user; the cleaner is used for receiving the first data and the first signature, generating second data and a second signature, and uploading the second data and the second signature to the cloud, wherein the second data comprises hidden information for hiding user privacy and organization privacy; and the cloud is used for receiving and storing the second data and the second signature, wherein the second signature is used for data integrity audit.
In order to solve the above technical problem, the present application provides a computer storage medium, wherein a computer program is stored in the computer storage medium, and the computer program implements the above method when being executed by a processor.
The application provides a processing method of private data. With this method, the private data are hidden twice: the user privacy is hidden first, forming the first data, which is sent to the cleaner; the cleaner then hides the organization privacy, forming the second data, which is uploaded to the cloud. In the end, both the user privacy and the organization privacy are hidden, and the user privacy is never exposed to the cleaner. In addition, each data transmission is accompanied by a signature, and the second signature can be used to verify the integrity of the data stored in the cloud, addressing the problem of data loss or damage. Therefore, the private data processing method provided by the application can effectively protect the private data uploaded to the cloud and verify that the data are correctly stored.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a schematic flow chart diagram illustrating an embodiment of a method for processing private data according to the present application;
FIG. 2 is a schematic structural diagram of an embodiment of a user side for processing private data according to the present application;
FIG. 3 is a schematic flow chart diagram illustrating another embodiment of a method for processing private data according to the present application;
FIG. 4 is a schematic diagram illustrating processing of private information according to an embodiment of the private data processing method of the present application;
FIG. 5 is a schematic structural diagram of an embodiment of a cleaner for processing private data according to the present application;
Fig. 6 is a schematic structural diagram of an embodiment of a cloud storage system according to the present application;
FIG. 7 is a schematic structural diagram of an embodiment of a computer storage medium according to the present application.
Detailed Description
In order to make those skilled in the art better understand the technical solution of the present application, the processing method of private data, the cleaner and the cloud storage system provided by the present application are described in detail below with reference to the accompanying drawings and specific embodiments.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an embodiment of a method for processing private data according to the present application.
S11: dividing an original file into n data blocks, and hiding the data blocks containing user privacy to generate first data and a first signature.
The user side obtains an original file that the user wants to upload to the cloud, where the original file contains user privacy and organization privacy. The user privacy may be the user's personal information, such as the user name, telephone number, identification number, or home address.
The user end divides the original file F into n data blocks (m_1, m_2, …, m_n), where m_i denotes the i-th data block of file F. Assume the user's identity ID is l bits long and is written ID = (ID_1, ID_2, …, ID_l) ∈ {0,1}^l. K_1 is the index set of the data blocks of the original file F that contain user privacy, and K_2 is the index set of the data blocks that contain organization privacy. To keep the user privacy from being exposed to the cleaner, the user side hides the data blocks whose indexes are in the set K_1, for example by turning the private content into scrambled, unreadable characters, and thereby forms the first data F*. The first data F* can likewise be divided into n data blocks (m*_1, m*_2, …, m*_n).
The user side also generates a first signature. In this embodiment, the user side first divides the original file into data blocks and then performs the hiding on those blocks; in other embodiments, the user side may instead perform the hiding on the original file first and then divide it into data blocks. For the user-privacy hiding algorithm and the first-signature generation algorithm, refer to the subsequent embodiments.
S12: the first data and the first signature are sent to the cleaner.
The user end sends the first data F*, on which the hiding has been completed, together with the generated first signature to the cleaner. The first data F* contains hidden information, and the hidden information is the user privacy that has been hidden.
In this way, the user side divides the original file into n data blocks and hides the data blocks containing user privacy to form the first data F*, which is sent to the cleaner, so that the user privacy is not exposed to the cleaner and the privacy security of the user is protected.
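As a concrete illustration of steps S11 and S12, the sketch below splits a file into blocks and blinds the user-privacy blocks before anything leaves the user side. It is a minimal sketch under stated assumptions, not the patent's algorithm: the block size, the keyed-hash blinding, and the per-block HMAC standing in for the identity-based signature σ_i are all illustrative.

```python
import hashlib
import hmac
import secrets

BLOCK_SIZE = 32  # illustrative block size in bytes


def split_into_blocks(data: bytes, block_size: int = BLOCK_SIZE) -> list[bytes]:
    """Split the original file F into n data blocks m_1..m_n."""
    return [data[i:i + block_size] for i in range(0, len(data), block_size)]


def hide_user_privacy(blocks: list[bytes], k1: set[int], user_key: bytes) -> list[bytes]:
    """Blind the blocks whose indexes are in K1 so the cleaner never sees user privacy.

    Here the blinding is a keyed hash of the block content; the patent derives the
    blinding from the user's identity-based private key instead.
    """
    hidden = []
    for i, block in enumerate(blocks, start=1):
        if i in k1:
            hidden.append(hmac.new(user_key, block, hashlib.sha256).digest())
        else:
            hidden.append(block)
    return hidden


def sign_blocks(blocks: list[bytes], signing_key: bytes, name: str) -> list[bytes]:
    """Toy per-block signature sigma_i; the patent uses identity-based signatures."""
    return [
        hmac.new(signing_key, f"{name}||{i}".encode() + block, hashlib.sha256).digest()
        for i, block in enumerate(blocks, start=1)
    ]


if __name__ == "__main__":
    original = b"patient: Alice; id: 12345; diagnosis: ...; hospital: General"
    k1 = {1}  # index set of blocks containing user privacy (assumed)
    user_key, signing_key = secrets.token_bytes(32), secrets.token_bytes(32)

    blocks = split_into_blocks(original)
    first_data = hide_user_privacy(blocks, k1, user_key)
    first_signatures = sign_blocks(first_data, signing_key, name="file-001")
    # {first_data, first_signatures} is what the user side sends to the cleaner.
```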
Based on the above processing method, the application provides a user side for processing private data. Referring to fig. 2, fig. 2 is a schematic structural diagram of a user side for processing private data according to an embodiment of the present application. The user terminal 200 comprises a processor 21, and the processor 21 is configured to execute instructions to implement the above method.
Please refer to fig. 3, and fig. 3 is a schematic flowchart illustrating another embodiment of a method for processing private data according to the present application.
S31: and receiving the first data and the first signature uploaded by the user terminal.
The cleaner receives the first data F* and the first signature uploaded by the user side. The first data F* contains hidden information, which is the user privacy that has been hidden.
S32: and hiding the organization privacy in the first data to obtain second data and generating a second signature.
The cleaner hides the organization privacy in the first data F*. The processing steps can be the same as those used for the user privacy, and the hidden information finally obtained is the scrambled content produced from the user privacy and the organization privacy. For a uniform format, the cleaner may further replace the hidden information with a wildcard, resulting in the second data F'. The second data F' consists of n data blocks (m'_1, m'_2, …, m'_n). The organization privacy may be organization information, such as the organization name. The wildcard is generally an asterisk (*) or a question mark (?). Refer to the following embodiments for the specific processing method.
To save computation, the cleaner can directly replace both the hidden information in the first data and the organization privacy with wildcards, instead of first converting the organization privacy into scrambled content and then replacing that with wildcards.
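A small sketch of this direct wildcard substitution, assuming the cleaner receives the index sets K1 and K2 along with the first data; the single-asterisk wildcard and the raw-bytes block representation are illustrative choices, not mandated by the patent.

```python
WILDCARD = b"*"  # the patent suggests an asterisk or a question mark as the wildcard


def sanitize_blocks(first_data: list[bytes], k1: set[int], k2: set[int]) -> list[bytes]:
    """Form the second data F': every block indexed in K1 (already-hidden user
    privacy) or K2 (organization privacy) is replaced directly with a wildcard."""
    hidden = k1 | k2
    return [WILDCARD if i in hidden else block
            for i, block in enumerate(first_data, start=1)]


# Example: blocks 1 and 3 hold user privacy, block 5 holds organization privacy.
second_data = sanitize_blocks([b"a", b"b", b"c", b"d", b"e"], k1={1, 3}, k2={5})
assert second_data == [b"*", b"b", b"*", b"d", b"*"]
```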
In addition, the cleaner may further generate the second signature by converting the first signature into a valid second signature, wherein the conversion formula is:

σ'_i = σ_i · β^(m'_i − m*_i) for i ∈ K_1 ∪ K_2, and σ'_i = σ_i otherwise,

wherein σ_i is the first signature; σ'_i is the second signature; β = u^r is a conversion value used to convert the signature; m'_i denotes the i-th data block in the second data; m*_i denotes the i-th data block in the first data; K_1 and K_2 are, respectively, the index set of data blocks containing user privacy and the index set of data blocks containing organization privacy; n is the number of data blocks of the original file; l is the length of the user identity; ID is the user identity; r_ID is a value selected randomly from the cryptographic set Z_p; r is a value selected randomly from Z*_p; G_1 is a multiplicative cyclic group of large prime order; g is an element of G_1; μ', μ_1, μ_2, …, μ_l, u, g_2 ∈ G_1; H(name||i) is a hash function, name is a random value selected by the user ID as an identifier, and || is a connector.
To facilitate understanding of how the private information is processed, please refer to fig. 4, which is a schematic diagram illustrating the processing of private information in an embodiment of the private data processing method of the present application. Φ is the first signature set and Φ' is the second signature set. Suppose m_1 and m_3 contain user privacy and m_5 contains organization privacy. The user side processes the original file F to obtain the first data F*, in which the contents of m*_1 and m*_3 are hidden; the cleaner processes the first data F* to obtain the second data F', in which m'_1, m'_3 and m'_5 are replaced with wildcards. The first signature σ_i corresponding to each data block in the first signature set Φ is converted into a second signature σ'_i, forming the second signature set Φ'.
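The following toy calculation illustrates why the conversion σ'_i = σ_i · β^(m'_i − m*_i) produces a valid signature on the new block value even though the cleaner never learns r. Integers modulo a prime stand in for the pairing group G_1, and the modulus, u, r, and the identity-dependent factor are illustrative stand-ins rather than the patent's parameters.

```python
# Toy stand-in for the multiplicative group G1: integers modulo a Mersenne prime.
P = 2**127 - 1          # toy modulus (assumed); the real scheme uses a pairing group
U = 7                   # stand-in for the public group element u
R = 123456789           # the user's secret exponent r; the cleaner only sees beta

identity_part = 987654321 % P      # stands in for sk_ID'' * H(name||i)^r
beta = pow(U, R, P)                # conversion value beta = u^r handed to the cleaner


def first_signature(m: int) -> int:
    """sigma_i = identity_part * u^(r*m), as formed by the user for block value m."""
    return identity_part * pow(U, R * m, P) % P


def convert_signature(sigma: int, m_star: int, m_prime: int) -> int:
    """sigma'_i = sigma_i * beta^(m'_i - m*_i); only beta is needed, never r."""
    exponent = (m_prime - m_star) % (P - 1)    # negative differences handled mod P-1
    return sigma * pow(beta, exponent, P) % P


m_star, m_prime = 42, 7            # hidden block value vs. wildcard-encoded value
sigma_prime = convert_signature(first_signature(m_star), m_star, m_prime)
assert sigma_prime == first_signature(m_prime)   # valid signature on the new block
```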
S33: and uploading the second data and the second signature to the cloud.
The cleaner uploads the second data and the second signature obtained by the above operations to the cloud, where the second signature can be used for integrity auditing of the data.
This embodiment provides a method for processing private data in which the first data, already containing hidden information, is processed again: the organization privacy is hidden and replaced with wildcards. In this way, neither the user privacy nor the organization privacy in the original file is exposed to the cloud, and the user privacy in the original file is not exposed to the cleaner, further protecting the privacy security of the user. Meanwhile, the first signature is converted into a second signature, which is a valid signature and can be used to verify the integrity of the data.
Based on the processing method, the application provides a cleaner for processing the private data. Referring to fig. 5, fig. 5 is a schematic structural diagram of an embodiment of a cleaner for processing private data according to the present application. The cleaner 500 includes a processor 51, and the processor 51 is configured to execute instructions to implement the above-described method.
Fig. 6 is a schematic structural diagram of an embodiment of the cloud storage system according to the present application.
The system comprises a user terminal 61, a cleaner 62 and a cloud terminal 63. The user terminal 61 is configured to generate first data and a first signature, and send the first data and the first signature to the cleaner 62, where the first data includes hidden information of the privacy of the hidden user; the cleaner 62 is configured to receive first data and a first signature sent by the user end 61, generate second data and a second signature, and upload the second data and the second signature to the cloud 63, where the second data includes hidden information for hiding user privacy and organizing privacy; the cloud 63 is configured to receive and store the second data and the second signature, where the second signature is used for data integrity audit. The user end 61 and the cleaner 62 in this embodiment may be the user end and the cleaner mentioned in the above embodiments, and specific steps may refer to the above embodiments, which are not described herein again. The cloud 63 is a device for cloud storage, and is a cloud computing system with data storage and management as a core. The user can be connected to the cloud end through any device capable of being connected with the network, and data can be conveniently accessed.
When the user side 61 needs to view the original file, the original file can be retrieved from the cloud 63. Specifically, the user side 61 sends a request to the cleaner 62; after receiving the request, the cleaner 62 downloads the second data corresponding to the original file from the cloud 63, restores the second data to the first data, and sends the first data to the user side 61; the user side 61 receives the first data and restores it to the original file according to the private key.
The system further comprises a private key generator 64 and a third party auditor 65. The user terminal 61 sends the user identity to the private key generator 64, and the private key generator 64 receives the user identity and generates a private key according to the user identity. The private key generator 64 returns the private key to the user terminal 61, and the user terminal 61 hides the original data according to the private key. The encryption of the private information in this embodiment is based on the user identity, so the private key generator 64 is responsible for generating the system public parameters and private keys for the user according to the user identity.
Specifically, the private key generator 64 runs the algorithm Setup(1^k), which takes the security parameter k as input and outputs the master key msk and the system public parameters pp.
The private key generator 64 runs the algorithm Extract(pp, msk, ID), taking the system public parameters pp, the master key msk and the user identity ID as inputs and outputting the user's private key sk_ID. The user side 61 can verify the validity of sk_ID and accepts it as its private key only if the verification passes.
The user side 61 runs the algorithm SigGen(F, sk_ID, ssk, name), taking the original file F, the user's private key sk_ID, the user's signing key ssk and the file identifier name as inputs, and outputting the first data F*, its corresponding signature set Φ and the file tag τ, which is used to guarantee the correctness of the file identifier name and certain verification values.
The cleaner 62 runs the algorithm Sanitization(F*, Φ), taking the first data F* and its signature set Φ as inputs and outputting the second data F' and its corresponding signature set Φ'.
The cloud 63 runs the algorithm ProofGen(F', Φ', chal), taking the second data F', the corresponding signature set Φ' and the audit application chal as inputs and outputting an audit certificate P that proves the cloud 63 really holds the second data F'.
The third-party auditor 65 runs the algorithm ProofVerify(chal, pp, P), taking the audit application chal, the system public parameters pp and the audit certificate P as inputs, to verify the correctness of the audit certificate P.
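To keep the division of labour among the parties in view, the skeleton below lists the six algorithms and who runs each one; the type names, fields and function signatures are assumptions made for illustration, and the cryptographic bodies are intentionally left unimplemented.

```python
from dataclasses import dataclass
from typing import NamedTuple

# Structural skeleton only: names and argument shapes are illustrative assumptions.


@dataclass
class PublicParams:          # pp, published by the private key generator
    description: bytes


@dataclass
class PrivateKey:            # sk_ID, tied to the user identity
    identity: str
    key_material: bytes


class SanitizedFile(NamedTuple):
    blocks: list[bytes]      # second data F'
    signatures: list[bytes]  # second signature set Phi'


def setup(security_parameter: int) -> tuple[PublicParams, bytes]:
    """Private key generator: output the public parameters pp and master key msk."""
    raise NotImplementedError


def extract(pp: PublicParams, msk: bytes, identity: str) -> PrivateKey:
    """Private key generator: derive the user's private key sk_ID from the identity."""
    raise NotImplementedError


def sig_gen(original_file: bytes, sk: PrivateKey, ssk: bytes, name: str
            ) -> tuple[list[bytes], list[bytes], bytes]:
    """User side: output the first data F*, the signature set Phi and the file tag tau."""
    raise NotImplementedError


def sanitization(first_data: list[bytes], signatures: list[bytes]) -> SanitizedFile:
    """Cleaner: hide the organization privacy and convert the signatures."""
    raise NotImplementedError


def proof_gen(stored: SanitizedFile, chal: list[tuple[int, int]]) -> bytes:
    """Cloud: answer the audit application chal with an audit certificate P."""
    raise NotImplementedError


def proof_verify(chal: list[tuple[int, int]], pp: PublicParams, proof: bytes) -> bool:
    """Third-party auditor: check the audit certificate P against the challenge."""
    raise NotImplementedError
```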
The third-party auditor 65 verifies the integrity of the data stored in the cloud 63 on behalf of the user side 61. The third-party auditor 65 sends the audit application chal to the cloud 63; the cloud 63 responds to chal by generating an audit certificate P and sending it to the third-party auditor 65; the third-party auditor 65 verifies the correctness of the audit certificate P, and if it is correct, the second data in the cloud 63 is complete. The verification formula involves the following quantities: e, a computable bilinear map; g^r, a verification value; g_1, a public variable; λ = Σ_{i∈I} m'_i v_i, a linear combination of the challenged second data blocks; and v_i, a random value generated in the audit application.
For the convenience of understanding the scheme of the present embodiment, the following describes the operation process of various algorithms in detail:
1) Setup(1^k) algorithm
The private key generator 64 issues the system parameter pp and saves the master key msk.
2) Extract (pp, msk, ID) algorithm
The user side 61 determines the user identity ID = (ID_1, ID_2, …, ID_l) and sends it to the private key generator 64. The private key generator 64 randomly selects r_ID and computes the private key sk_ID for the user identity. The private key generator 64 then sends the private key sk_ID to the user side 61. The user side 61 verifies the private key sk_ID: if the verification equation is not satisfied, the verification fails and the user side 61 does not accept sk_ID; if the verification succeeds, the user side 61 accepts the private key sk_ID.
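A toy rendering of this Extract step, using integers modulo a prime as a stand-in for G_1 and assuming a Waters-style identity key of the form sk_ID = (g^{r_ID}, g_2^x · (μ'·∏_{j:ID_j=1} μ_j)^{r_ID}), as in the identity-based auditing scheme of Shen et al. cited under the non-patent literature; the patent's own key formula is not reproduced here, all constants are illustrative, and the pairing-based validity check performed by the user side is omitted.

```python
import secrets

P = 2**127 - 1                    # toy prime modulus standing in for the group setting
G, G2 = 3, 5                      # stand-ins for the group elements g and g2
MU_PRIME = 11                     # stand-in for mu'
MU = [13, 17, 19, 23]             # stand-ins for mu_1..mu_l (identity length l = 4)
MASTER_X = secrets.randbelow(P - 2) + 2   # master secret x held by the key generator


def extract(identity_bits: list[int]) -> tuple[int, int]:
    """Toy Extract(pp, msk, ID): sk_ID = (g^r_ID, g2^x * (mu' * prod mu_j)^r_ID)."""
    r_id = secrets.randbelow(P - 2) + 2           # random value standing in for r_ID
    id_product = MU_PRIME
    for bit, mu_j in zip(identity_bits, MU):
        if bit == 1:                              # only positions with ID_j = 1 count
            id_product = id_product * mu_j % P
    sk1 = pow(G, r_id, P)
    sk2 = pow(G2, MASTER_X, P) * pow(id_product, r_id, P) % P
    return sk1, sk2


sk_id = extract([1, 0, 1, 1])     # ID = (1, 0, 1, 1), an assumed 4-bit identity
```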
3) SigGen(F, sk_ID, ssk, name) algorithm
The user side 61 randomly selects r ∈ Z*_p and computes g^r; it then selects a random value and computes a blind factor from it, uses the blind factor to compute the first data blocks (hiding the blocks whose indexes are in K_1), and generates the first signature set Φ = {σ_i}_{1≤i≤n}. It calculates the file tag τ = τ_0 || Sig_ssk(τ_0) and the conversion value β = u^r. Finally, the user side 61 sends {F*, Φ, τ, K_1} and β to the cleaner 62.
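Continuing the toy modulus used above, the sketch below shows the overall shape of SigGen, assuming per-block signatures of the form σ_i = sk''_ID · (H(name||i) · u^{m*_i})^r as in the cited Shen et al. scheme; the blinding of the K_1 blocks, the file tag τ and the signing key ssk are omitted, and all constants are illustrative.

```python
import hashlib
import secrets

P = 2**127 - 1                    # same toy modulus as in the earlier sketches
U = 7                             # stand-in for the group element u


def h(name: str, i: int) -> int:
    """Toy H(name||i): hash the file identifier and block index into the group."""
    digest = hashlib.sha256(f"{name}||{i}".encode()).digest()
    return int.from_bytes(digest, "big") % P


def sig_gen(blocks: list[int], sk2: int, name: str) -> tuple[list[int], int]:
    """Toy SigGen: sigma_i = sk_ID'' * (H(name||i) * u^m_i)^r, plus beta = u^r."""
    r = secrets.randbelow(P - 2) + 2
    signatures = [sk2 * pow(h(name, i) * pow(U, m, P) % P, r, P) % P
                  for i, m in enumerate(blocks, start=1)]
    beta = pow(U, r, P)           # conversion value later handed to the cleaner
    return signatures, beta


first_data_blocks = [5, 11, 8]    # toy integer encodings of the blocks of F*
signatures, beta = sig_gen(first_data_blocks, sk2=987654321, name="file-001")
```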
4) Sanitization(F*, Φ) algorithm
The cleaner 62 checks the validity of the file tag τ and parses τ_0, verifies the correctness of the first signature set Φ, verifies the correctness of the conversion value β, hides the data blocks of the first data whose indexes are in the set K_2, computes the second signature set Φ' = {σ'_i}_{1≤i≤n}, sends {F', Φ'} to the cloud 63, and sends the file tag τ to the third-party auditor 65.
The cleaner 62 verifies the first signature set Φ with a bilinear-map equation of the form e(σ_i, g) = e(g_1, g_2) · …; if it does not hold, the cleaner 62 considers the first signature invalid. The cleaner also validates the conversion value β against e(u, g^r). When these checks pass, the cleaner 62 processes the data blocks corresponding to the user privacy and the organization privacy, whose indexes are in the sets K_1 and K_2. In the SigGen algorithm, the data blocks whose indexes are in K_1 have already been hidden using the user ID, which leaves their content scrambled; to unify the format, the cleaner can replace the contents of these data blocks with wildcards. The cleaner 62 then turns the signatures of the data blocks in K_1 and K_2 into valid signatures on the second data F', using the conversion formula given above.
The cleaner 62 sends {F', Φ'} to the cloud 63, sends the file tag τ to the third-party auditor 65, and then deletes the information from its local storage, which ensures that the private information is not exposed to the cleaner.
5) ProofGen(F', Φ', chal) algorithm
The third-party auditor 65 verifies the file tag τ and does not perform the audit if the file tag is invalid. If it is correct, the auditor parses τ_0 and generates and sends the audit application chal = {(i, v_i)}_{i∈I}. When the cloud 63 receives the audit application chal, it generates and returns the audit certificate P = {λ, σ}. The third-party auditor 65 then verifies the correctness of the audit certificate P using the bilinear-map verification described above. If the verification equation holds, the second data stored in the cloud 63 is complete; if not, the second data stored in the cloud 63 is incomplete.
The system proposed in this embodiment is correct in three respects: the correctness of the private key, the correctness of the first data and the first signature, and the correctness of the audit.
The correctness of the private key means that when the private key generator 64 sends a correct private key to the user side 61, the private key passes the verification performed by the user side 61. Specifically, by deriving the left-hand side from the right-hand side using the properties of the bilinear map, the correctness of the private key can be proven.
the correctness of the first data and the first signature means that when the user terminal 61 sends the first data and the first signature to the cleaner 62, the first data and the first signature can be verified by the cleaner 62. Specifically, based on the nature of the bilinear map, the correctness of the first data and the first signature is as follows:
Figure BDA0002373527150000101
the correctness of the audit means that when the cloud 63 correctly stores the second data of the user, the generated data can be verified by the third party auditor 65. In particular, by deriving the left side from the right side, based on the nature of the bilinear map, the correctness of the audit can be justified:
Figure BDA0002373527150000102
in addition, the system of the present application may be used in an electronic health record where the sensitive information includes two parts, one being user privacy (patient's privacy information), such as patient name and patient ID number, and the other being organization privacy (hospital's privacy information), such as hospital's name. When the electronic health record is uploaded to the cloud for research purposes, the privacy information can be replaced by wildcards by using the above mode. In general, the scrubber may be considered an administrator of an electronic health record information system in a hospital. The user privacy should not be exposed to the cleaner, and all private information should not be exposed to the cloud and sharing users. Before a doctor sends electronic health information of a patient to a cleaner, privacy information of the patient is processed, the doctor generates a signature for the processed electronic health information of the patient and sends the signature to the cleaner, the cleaner cleans data blocks corresponding to organization privacy in the electronic health information, generally, the data blocks are replaced by wildcards, the signatures of the data blocks are converted into effective signatures of effective electronic health records, and remote data integrity auditing can still be effectively executed. The cleaner stores the electronic health information subjected to the secondary processing in the electronic health information system. Furthermore, when a doctor needs the electronic health information, the doctor sends a request to the cleaner, the cleaner downloads the electronic health information from the electronic health information system, the organization privacy information is restored and sent to the doctor, the doctor does not need to operate, the user side can restore the user privacy information according to the private key, and the doctor can check the complete electronic health information of the patient.
The cleaner can also have the function of information management. The cleaner can process the electronic health record information in batches and upload the processed data to the cloud end at a fixed time.
In addition, the system of the present application can be applied to other information management systems, such as human resource profile management and household data system management.
In simulation runs of this embodiment, the time spent on private key generation and on private key verification is the same, close to 0.31 s; the time spent on signature generation is 1.476 s; and the time spent on signature verification and on first-data processing is 2.318 s and 0.061 s, respectively.
The present application further provides a computer storage medium, as shown in fig. 7, fig. 7 is a schematic structural diagram of an embodiment of the computer storage medium of the present application. The computer storage medium 700 stores therein a computer program 71, and the computer program 71 realizes the above-described data processing method when executed by a processor. Further, the computer storage medium may be various media that can store program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic tape, or an optical disk.
This embodiment provides a system for storing private data in the cloud. The system hides the original data containing private information twice: the user side hides the user privacy of the original file to form the first data, and the cleaner processes the organization privacy of the first data to form the second data and uploads it to the cloud. The data stored in the cloud can be shared and used by others while the private information remains protected. In addition, remote data integrity auditing can still be performed effectively on the cloud data.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (11)

1. A method for processing private data, comprising:
receiving first data and a first signature uploaded by a user side, wherein the first data comprises hidden information, and the hidden information is user privacy which is hidden;
hiding organization privacy in the first data to obtain second data and generating a second signature;
and uploading the second data and the second signature to a cloud, wherein the second signature is used for data integrity audit.
2. The method of claim 1, wherein the hiding the organization privacy in the first data into second data comprises:
replacing hidden information in the first data and the organization privacy with wildcards.
3. The method of claim 2, wherein generating the second signature comprises:
converting the first signature into the second signature, wherein the conversion formula is:

σ'_i = σ_i · β^(m'_i − m*_i) for i ∈ K_1 ∪ K_2, and σ'_i = σ_i otherwise,

wherein σ_i is the first signature; σ'_i is the second signature; β = u^r is a conversion value used to convert the signature; m'_i denotes the i-th data block in the second data; m*_i denotes the i-th data block in the first data; K_1 and K_2 are, respectively, the index set of data blocks containing user privacy and the index set of data blocks containing organization privacy; n is the number of data blocks of the original file; l is the length of the user identity; ID is the user identity; r_ID is a value selected randomly from the cryptographic set Z_p; r is a value selected randomly from Z*_p; G_1 is a multiplicative cyclic group of large prime order; g is an element of G_1; μ', μ_1, μ_2, …, μ_l, u, g_2 ∈ G_1; H(name||i) is a hash function, name is a random value selected by the user ID as an identifier, and || is a connector.
4. A cleaner for private data processing, characterized in that the cleaner comprises a processor for executing instructions to implement the method of any of claims 1-3.
5. A method for processing private data, comprising:
dividing an original file into n data blocks, and hiding the data blocks containing user privacy to generate first data and a first signature;
and sending the first data and the first signature to a cleaner.
6. A user terminal for private data processing, the user terminal comprising a processor for executing instructions to implement the method of claim 5.
7. A system for cloud storage of private data, the system comprising:
the client is used for generating first data and a first signature and sending the first data and the first signature to the cleaner, wherein the first data comprises hidden information of the privacy of a hidden user;
the cleaner is used for receiving the first data and the first signature, generating second data and a second signature, and uploading the second data and the second signature to a cloud end, wherein the second data comprises hidden information for hiding the user privacy and the organization privacy;
the cloud is used for receiving and storing the second data and the second signature, wherein the second signature is used for data integrity auditing.
8. The system of claim 7, further comprising:
the private key generator is used for receiving the user identity sent by the user side, generating system public parameters and a private key according to the user identity and sending the private key to the user side;
and the third party auditor is used for verifying the integrity of the data stored in the cloud.
9. The system of claim 8, further comprising:
the cleaner downloads the second data from the cloud, restores the second data into the first data, and sends the first data to the user side;
and the user side restores the first data into the original file according to the private key.
10. The system of claim 8, further comprising:
the third party auditor is used for sending an audit application to the cloud end, receiving an audit certificate generated by the cloud end responding to the audit application, verifying whether the audit certificate is correct or not, and if so, indicating the integrity of the second data;
wherein the verification formula checks the audit certificate by means of a computable bilinear map e, wherein g^r is a verification value; g_1 is a public variable; λ = Σ_{i∈I} m'_i v_i is a linear combination of the second data blocks; and v_i is a random value generated in the audit application.
11. A computer storage medium, characterized in that a computer program is stored in the computer storage medium, which computer program, when being executed by a processor, carries out the method according to any one of claims 1-3 and claim 5.
CN202010058236.6A 2020-01-19 2020-01-19 Processing method of private data, cleaner and cloud storage system Pending CN111259449A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010058236.6A CN111259449A (en) 2020-01-19 2020-01-19 Processing method of private data, cleaner and cloud storage system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010058236.6A CN111259449A (en) 2020-01-19 2020-01-19 Processing method of private data, cleaner and cloud storage system

Publications (1)

Publication Number Publication Date
CN111259449A true CN111259449A (en) 2020-06-09

Family

ID=70945357

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010058236.6A Pending CN111259449A (en) 2020-01-19 2020-01-19 Processing method of private data, cleaner and cloud storage system

Country Status (1)

Country Link
CN (1) CN111259449A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113221133A (en) * 2021-04-09 2021-08-06 联想(北京)有限公司 Data transmission method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108400981A (en) * 2018-02-08 2018-08-14 陕西师范大学 The public cloud auditing system and method for lightweight and secret protection in smart city
CN109117672A (en) * 2018-08-24 2019-01-01 青岛大学 Carry out the hiding cloud storage Data Audit method of sensitive information
CN110049054A (en) * 2019-04-24 2019-07-23 电子科技大学 The plaintext shared data auditing method and system for supporting privacy information hiding

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108400981A (en) * 2018-02-08 2018-08-14 陕西师范大学 The public cloud auditing system and method for lightweight and secret protection in smart city
CN109117672A (en) * 2018-08-24 2019-01-01 青岛大学 Carry out the hiding cloud storage Data Audit method of sensitive information
CN110049054A (en) * 2019-04-24 2019-07-23 电子科技大学 The plaintext shared data auditing method and system for supporting privacy information hiding

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WENTING SHEN et al.: "Enabling Identity-Based Integrity Auditing and Data Sharing With Sensitive Information Hiding for Secure Cloud Storage" *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113221133A (en) * 2021-04-09 2021-08-06 联想(北京)有限公司 Data transmission method and device

Similar Documents

Publication Publication Date Title
Shen et al. Enabling identity-based integrity auditing and data sharing with sensitive information hiding for secure cloud storage
Yang et al. Lightweight and privacy-preserving delegatable proofs of storage with data dynamics in cloud storage
Yu et al. Strong key-exposure resilient auditing for secure cloud storage
CN111914027B (en) Block chain transaction keyword searchable encryption method and system
US10521616B2 (en) Remote re-enrollment of physical unclonable functions
Yu et al. Enhanced privacy of a remote data integrity-checking protocol for secure cloud storage
Garg et al. RITS-MHT: Relative indexed and time stamped Merkle hash tree based data auditing protocol for cloud computing
Yu et al. Enabling cloud storage auditing with key-exposure resistance
US11895231B2 (en) Adaptive attack resistant distributed symmetric encryption
US8266439B2 (en) Integrity verification of pseudonymized documents
US8661247B2 (en) Computer implemented method for performing cloud computing on data being stored pseudonymously in a database
Zhou et al. Multicopy provable data possession scheme supporting data dynamics for cloud-based electronic medical record system
Barsoum et al. Provable possession and replication of data over cloud servers
US8195951B2 (en) Data processing system for providing authorization keys
Fu et al. DIPOR: An IDA-based dynamic proof of retrievability scheme for cloud storage systems
CN111339570B (en) Method, device, equipment and medium for verifying integrity of cloud storage file
US20230359631A1 (en) Updatable private set intersection
Zhang et al. A general framework to design secure cloud storage protocol using homomorphic encryption scheme
CN109117672A (en) Carry out the hiding cloud storage Data Audit method of sensitive information
CN115473703A (en) Identity-based ciphertext equivalence testing method, device, system and medium for authentication
CN111259449A (en) Processing method of private data, cleaner and cloud storage system
Zhang et al. Efficient integrity verification scheme for medical data records in cloud-assisted wireless medical sensor networks
EP3395032B1 (en) Method for providing a proof-of-retrievability
Xu et al. A generic integrity verification algorithm of version files for cloud deduplication data storage
Yarava et al. Efficient and Secure Cloud Storage Auditing Based on the Diffie-Hellman Key Exchange.

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20201229

Address after: 910 / F, building B, Xiangzhu garden, 29 nongxuan Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong 518000

Applicant after: Li Liang

Address before: 518000 901, Shenzhen International Culture Building, Futian Road, Futian District, Shenzhen City, Guangdong Province

Applicant before: SHENZHEN BOWEI CHUANGSHENG TECHNOLOGY Co.,Ltd.

WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200609