CN116011014A - Privacy computing method and privacy computing system - Google Patents

Privacy computing method and privacy computing system

Info

Publication number
CN116011014A
CN116011014A (application CN202310034612.1A)
Authority
CN
China
Prior art keywords: data, calculation, training, privacy, encryption
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310034612.1A
Other languages
Chinese (zh)
Inventor
阮安邦
魏明
邵革健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Octa Innovations Information Technology Co Ltd
Original Assignee
Beijing Octa Innovations Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Octa Innovations Information Technology Co Ltd filed Critical Beijing Octa Innovations Information Technology Co Ltd
Priority to CN202310034612.1A priority Critical patent/CN116011014A/en
Publication of CN116011014A publication Critical patent/CN116011014A/en
Pending legal-status Critical Current

Abstract

The invention discloses a privacy computing method comprising the following steps: S1, assembling at least two participants; S2, computing directly on multiple ciphertexts after encryption; S3, the message sender sending one of several messages to the receiver; S4, performing a zero-knowledge proof on the acquired content; S5, building a model with a federated learning system; S6, training a machine learning model on the aligned data; S7, training locally on each participant's own dataset.

Description

Privacy computing method and privacy computing system
Technical Field
The invention belongs to the technical field of privacy computing and particularly relates to a privacy computing method. The invention also relates to a privacy computing system.
Background
Privacy computing refers to a set of techniques that enable data analysis and computation while protecting the data itself from external leakage. Current privacy computing involves the joint innovation of three major technical systems: artificial intelligence algorithms, distributed systems and underlying hardware, and cryptographic protocol design. Compared with traditional ways of using data, the encryption mechanisms of privacy computing strengthen data protection and reduce the risk of data leakage.
The patent with application number 202111022108.7 discloses a privacy computing method comprising the following steps: each participant splits its single private number into N addends, keeps one addend, and distributes the remaining addends to the other participants for storage; a neural network model is built to fit the objective function, and layers 0 and 1 of the model are published to all participants; each participant obtains the input formula of the layer-1 neurons; the addends corresponding to the stored private numbers are substituted into the input formulas of the layer-1 neurons, and the signed calculation results are broadcast; all broadcast results are summed to obtain the neuron inputs; after the outputs of all layer-1 neurons are obtained, the neural network model continues to be evaluated to obtain its output, i.e., the value of the objective function. That invention has the following substantial effects: high computational efficiency, low communication-bandwidth requirements, and a wide range of applications.
Although this scheme improves the efficiency of privacy computation and has low communication-bandwidth requirements, the security of its data is not high, there is a risk of data and information leakage, the method is not very general, and its privacy protection is not convenient to combine with other technologies.
Disclosure of Invention
The invention aims to remedy the defects of the prior art. Its security rests on cryptography and is proven by rigorous cryptographic theory; it does not rely on trust in any participant, operator, system, hardware, or software. Each party retains absolute control over the data it owns, and the underlying data and information are guaranteed not to leak. At the same time, the computation is highly accurate and supports programmable general-purpose computation; for each party, private data is not directly revealed and the amount of parameter-training data is not increased. The proposed privacy computing method and system can be used for privacy computation on their own or combined with other technologies to protect privacy, and are an important technical means for big-data, high-performance, general privacy-computing scenarios such as secure and trusted cloud computing, large-scale confidential data collaboration, and privacy-preserving deep learning.
In order to achieve the above purpose, the present invention provides the following technical solution: a privacy computing method comprising the steps of:
S1, assembling at least two participants, dividing each participant's secret into N parts, each participant retaining one part while the split parts are managed by different participants, and encrypting the secret to form ciphertext;
S2, computing directly on multiple ciphertexts after encryption, encrypting and scrambling each gate in the computation circuit to ensure that neither the original data nor intermediate results can leak during encrypted computation; the two parties compute in turn on their respective inputs and decrypt to obtain the final correct result, and the decrypted result is compared with the plaintext so that the decrypting party obtains the final decrypted result;
S3, the message sender sends one of several messages to be sent to the receiver, but the receiver cannot obtain any data beyond the acquired content, and the sender does not learn which content was selected;
S4, performing a zero-knowledge proof on the acquired content, whereby the prover can convince the verifier that an assertion is correct without providing the verifier with any useful information;
S5, building a model with a federated learning system, using an encryption-based user-sample alignment technique to identify the users shared by both parties, on the premise that A and B do not disclose their respective data and without exposing the non-overlapping users, so that modeling can be performed on the features of these shared users;
S6, after the shared user group is determined, training a machine learning model on these data, with encrypted training performed through a third-party collaborator C to ensure the confidentiality of the data during training;
S7, each participant trains locally on its own dataset; the data never leave the local environment during training, and only information such as model gradients and weights is uploaded to a central server for aggregation and splitting, so private data are not directly revealed and the amount of training data is not increased, thereby completing the training task.
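The splitting described in S1 can be sketched as additive N-out-of-N secret sharing. The modulus and share count below are illustrative assumptions, not values fixed by the invention:

```python
import random

# Additive N-out-of-N secret sharing over a prime field.
PRIME = 2**61 - 1  # illustrative field modulus

def share(secret, n):
    """Split `secret` into n shares; any n-1 of them reveal nothing."""
    parts = [random.randrange(PRIME) for _ in range(n - 1)]
    parts.append((secret - sum(parts)) % PRIME)
    return parts

def reconstruct(parts):
    return sum(parts) % PRIME

s = 123456
assert reconstruct(share(s, 4)) == s

# The sharing is additively homomorphic: summing two sharings
# share-by-share yields a sharing of the sum of the secrets.
sa, sb = share(10, 3), share(32, 3)
assert reconstruct([(x + y) % PRIME for x, y in zip(sa, sb)]) == 42
```

The homomorphic property is why shares, once distributed as in S1, can be computed on directly without first reassembling the secret.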
Preferably, in S2, a homomorphic encryption technique can be used to compute on multiple ciphertexts first and decrypt only the result, so that each ciphertext need not be decrypted individually and high computation cost is avoided; homomorphic encryption also lets a party without the key compute on ciphertext, and since the ciphertext computation need not pass through the key holder, communication cost can be reduced, computation tasks can be offloaded, and the computation cost of each party can be balanced.
Preferably, in S3, the message transmission comprises the following steps: 1) the sender generates two public/private key pairs and sends the two public keys puk0 and puk1 to the receiver; 2) the receiver generates a random number, encrypts it with one of the two received public keys, and sends the resulting ciphertext to the sender; 3) the sender decrypts the received ciphertext with both of its private keys, obtaining two results k0 and k1, XORs these two results with the two messages to be sent, and sends the two outcomes e0 and e1 to the receiver; 4) the receiver XORs the received e0 and e1 with its own true random number; only one of the two results is the real data, and the other is a random value.
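The four-step exchange above is a classic 1-out-of-2 oblivious transfer. A toy sketch using textbook RSA with tiny fixed primes and no padding (all parameter choices here are illustrative assumptions and insecure in practice):

```python
import random

def make_keypair(p, q, e=65537):
    """Textbook RSA key pair from two primes (toy sizes)."""
    n, phi = p * q, (p - 1) * (q - 1)
    return (n, e), (n, pow(e, -1, phi))

def oblivious_transfer(m0, m1, choice):
    # 1) Sender generates two key pairs; receiver sees only the public keys.
    pub0, prv0 = make_keypair(10007, 10009)
    pub1, prv1 = make_keypair(10037, 10039)
    pub = (pub0, pub1)[choice]
    # 2) Receiver encrypts a random k under the public key of its choice.
    k = random.randrange(2, pub[0])
    c = pow(k, pub[1], pub[0])
    # 3) Sender decrypts c under BOTH private keys: one result equals k,
    #    the other is an unrelated value, and the sender cannot tell which.
    k0 = pow(c, prv0[1], prv0[0])
    k1 = pow(c, prv1[1], prv1[0])
    e0, e1 = m0 ^ k0, m1 ^ k1        # XOR-mask the two messages
    # 4) Receiver unmasks only the chosen message with its real k.
    return (e0, e1)[choice] ^ k

assert oblivious_transfer(111, 222, 0) == 111
assert oblivious_transfer(111, 222, 1) == 222
```

The sender never learns `choice`, and the receiver recovers only the chosen message, matching the guarantees stated for S3.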
Preferably, in S6, the encrypted training process comprises the following steps: a) the collaborator C distributes a public key to A and B for encrypting the data to be exchanged during training; b) A and B interact in encrypted form to compute the intermediate results of the gradient; c) A and B each compute their own gradient values from the decrypted intermediate information, then upload the computed gradient values X1, X2, X3 and X4 to C, and C computes the new model parameters from these gradient values; d) C sends the four new parameters back to A and B respectively, i.e., updates the models of A and B, for a new round of iteration.
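The encrypted gradient exchange in steps a) to c) can rest on an additively homomorphic scheme such as Paillier; the patent does not fix a particular cryptosystem, so the toy sketch below (tiny primes, an illustrative assumption) shows only the principle: C holds the private key, while A and B combine encrypted gradient contributions without seeing each other's plaintexts.

```python
import random
from math import gcd

# Textbook Paillier with tiny primes: an illustrative toy, not secure.
def paillier_keygen(p=10007, q=10009):
    n = p * q
    phi = (p - 1) * (q - 1)
    return n, (phi, pow(phi, -1, n), n)   # public n; private (phi, mu, n)

def encrypt(n, m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    # with generator g = n + 1, g^m = 1 + m*n (mod n^2)
    return ((1 + m * n) * pow(r, n, n * n)) % (n * n)

def decrypt(prv, c):
    phi, mu, n = prv
    return ((pow(c, phi, n * n) - 1) // n * mu) % n

n, prv = paillier_keygen()
grad_a, grad_b = 1234, 5678                # hypothetical gradient values
c = (encrypt(n, grad_a) * encrypt(n, grad_b)) % (n * n)
assert decrypt(prv, c) == grad_a + grad_b  # ciphertext product = plaintext sum
```

Multiplying ciphertexts adds the underlying plaintexts, so C can aggregate A's and B's encrypted contributions and decrypt only the combined gradient.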
The invention also provides a privacy computing system comprising a secure multi-party computation module, a federated learning module, a confidential computing module, and a differential privacy module. The secure multi-party computation module means that the participants jointly complete a computing task using their private data without revealing that data to one another. The federated learning module takes a central server as the central node and realizes update iterations of an artificial-intelligence model by exchanging network information with multiple local servers participating in training. The confidential computing module is a hardware-based technology that isolates data, specific functions, and applications from the operating system, hypervisor or virtual-machine manager, and other privileged processes, so that data are stored in a trusted execution environment and cannot be inspected or operated on from outside, even with a debugger. The differential privacy module removes individual characteristics while retaining statistical characteristics, so as to protect user privacy.
The secure multi-party computation module comprises a secret sharing unit, a homomorphic encryption unit, an oblivious transfer unit, a zero-knowledge proof unit, and a garbled circuit; the federated learning module comprises an encrypted sample alignment unit, an encrypted model training unit, and an incentive unit; the core functions of the confidential computing module are to protect the confidentiality, the integrity, and the security of data.
Preferably, the secret sharing unit splits the secret in an appropriate way, each split share being managed by a different participant; a single participant cannot recover the secret, and only several participants cooperating together can recover it, so the secret remains fully protected even when any tolerated subset of participants is compromised. The homomorphic encryption unit allows computation directly on encrypted ciphertext, and the result, once decrypted, is identical to the result of the same computation on the plaintext; it can compute on multiple ciphertexts before decryption, so that each ciphertext need not be decrypted individually and high computation cost is avoided. The oblivious transfer unit is a privacy-preserving two-party communication protocol: the message sender transmits one of several messages to the receiver, and the sender cannot determine which message the receiver obtained.
Preferably, the zero-knowledge proof unit means that the prover can convince the verifier that an assertion is correct without providing the verifier with any useful information; the garbled circuit is a Boolean circuit for secure two-party computation that encrypts and scrambles each gate in the computation circuit, ensuring that neither the original data nor intermediate results can leak during encrypted computation.
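A zero-knowledge proof of the kind the unit describes can be sketched with a non-interactive Schnorr proof of knowledge of a discrete logarithm (Fiat-Shamir heuristic). The tiny group parameters below are illustrative assumptions, not part of the patent:

```python
import hashlib
import random

# Toy group: P = 2Q + 1 with Q prime; G = 4 generates the order-Q subgroup.
P, Q, G = 2039, 1019, 4

def prove(x):
    """Prove knowledge of x such that y = G^x mod P, without revealing x."""
    y = pow(G, x, P)
    r = random.randrange(Q)
    t = pow(G, r, P)                                   # commitment
    c = int.from_bytes(hashlib.sha256(f"{y}:{t}".encode()).digest(), "big") % Q
    s = (r + c * x) % Q                                # response
    return y, t, s

def verify(y, t, s):
    c = int.from_bytes(hashlib.sha256(f"{y}:{t}".encode()).digest(), "big") % Q
    return pow(G, s, P) == (t * pow(y, c, P)) % P      # check G^s == t * y^c

y, t, s = prove(777 % Q)
assert verify(y, t, s)
assert not verify(y, t, (s + 1) % Q)                   # a forged response fails
```

The verifier learns only that the prover knows x, never x itself: the response s is blinded by the fresh random r.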
Preferably, the encrypted sample alignment unit uses an encryption-based user-sample alignment technique to identify the users common to both parties, on the premise that A and B do not disclose their respective data and without exposing the users who do not overlap, so that modeling can be performed on the features of these users; the encrypted model training unit trains a machine learning model on these data after the common user group is determined; to ensure the confidentiality of the data during training, encrypted training is performed through the third-party collaborator C; the incentive unit serves to continuously encourage more institutions to join the data federation.
The technical effects and advantages of the invention are as follows:
The invention is based on cryptographic security, with security proven by rigorous cryptographic theory, and does not rely on trust in any participant, operator, system, hardware, or software. Each participant retains absolute control over the data it owns, and the underlying data and information cannot leak. The computation is highly accurate and supports programmable general-purpose computation; for each participant, private data is not directly revealed and the amount of parameter-training data is not increased. The invention can be used independently for privacy computation or combined with other technologies to protect privacy, and is an important technical means especially for big-data, high-performance, general privacy-computing scenarios such as secure and trusted cloud computing, large-scale confidential data collaboration, and privacy-preserving deep learning.
Drawings
FIG. 1 is a flow chart of the present invention;
fig. 2 is a system block diagram of the present invention.
Detailed Description
The present invention will be described in further detail below with reference to the drawings and embodiments, in order to make its objects, technical solutions, and advantages clearer. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. All other embodiments obtained by those skilled in the art from the embodiments of the invention without inventive effort fall within the scope of the invention.
Referring to figs. 1-2, the present invention provides a privacy computing method comprising the following steps:
S1, assembling at least two participants, dividing each participant's secret into N parts, each participant retaining one part while the split parts are managed by different participants, and encrypting the secret to form ciphertext;
S2, computing directly on multiple ciphertexts after encryption, encrypting and scrambling each gate in the computation circuit to ensure that neither the original data nor intermediate results can leak during encrypted computation; the two parties compute in turn on their respective inputs and decrypt to obtain the final correct result, and the decrypted result is compared with the plaintext so that the decrypting party obtains the final decrypted result.
When the receiver is to evaluate the encrypted circuit, the sender sends the key corresponding to its own input to the receiver: for example, if Alice's input is 0 she sends K0x, and if it is 1 she sends K1x. Meanwhile, the keys related to the receiver's input, namely K0y and K1y, are delivered through the oblivious transfer module: the receiver selects the key matching its own input, but the sender does not know which key the receiver selected. The receiver then tries to decrypt each row of the encrypted table with the received Kx and Ky; only one row decrypts successfully, and the corresponding Kz is extracted. The receiver sends Kz to the sender, and the sender learns whether the computation result is 0 or 1 by checking whether Kz equals K0z or K1z.
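The row-by-row decryption described above can be sketched for a single garbled AND gate. The random labels below play the role of the keys K0x/K1x, K0y/K1y, and K0z/K1z; the label size and hash-based row encryption are illustrative assumptions:

```python
import hashlib
import itertools
import os
import random

def H(ka, kb):
    # 24-byte pad: 16 bytes for the output label + 8 redundancy bytes
    return hashlib.sha256(ka + kb).digest()[:24]

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def garble_and():
    # one random 16-byte label per wire (x, y, z) per bit value (0, 1)
    labels = {w: (os.urandom(16), os.urandom(16)) for w in "xyz"}
    table = [xor(labels["z"][bx & by] + b"\x00" * 8,
                 H(labels["x"][bx], labels["y"][by]))
             for bx, by in itertools.product((0, 1), repeat=2)]
    random.shuffle(table)          # hide which row matches which input pair
    return labels, table

def evaluate(kx, ky, table):
    pad = H(kx, ky)
    for row in table:
        cand = xor(row, pad)
        if cand.endswith(b"\x00" * 8):   # redundancy marks the one valid row
            return cand[:16]             # the output-wire label
    raise ValueError("no row decrypted")
```

Holding exactly one label per input wire, the evaluator can open exactly one of the four rows, learning the output label but neither input bit.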
A homomorphic encryption technique can be used to compute on multiple ciphertexts first and decrypt only the result, so that each ciphertext need not be decrypted individually and high computation cost is avoided; homomorphic encryption also lets a party without the key compute on ciphertext, and since the ciphertext computation need not pass through the key holder, communication cost can be reduced, computation tasks can be offloaded, and the computation cost of each party can be balanced.
S3, the message sender sends one of several messages to be sent to the receiver, but the receiver cannot obtain any data beyond the acquired content, and the sender does not learn which content was selected.
The message transmission comprises the following steps: 1) the sender generates two public/private key pairs and sends the two public keys puk0 and puk1 to the receiver; 2) the receiver generates a random number, encrypts it with one of the two received public keys, and sends the resulting ciphertext to the sender; 3) the sender decrypts the received ciphertext with both of its private keys, obtaining two results k0 and k1, XORs these two results with the two messages to be sent, and sends the two outcomes e0 and e1 to the receiver; 4) the receiver XORs the received e0 and e1 with its own true random number; only one of the two results is the real data, and the other is a random value.
S4, performing a zero-knowledge proof on the acquired content, whereby the prover can convince the verifier that an assertion is correct without providing the verifier with any useful information;
S5, building a model with a federated learning system, using an encryption-based user-sample alignment technique to identify the users shared by both parties, on the premise that A and B do not disclose their respective data and without exposing the non-overlapping users, so that modeling can be performed on the features of these shared users;
S6, after the shared user group is determined, training a machine learning model on these data, with encrypted training performed through the third-party collaborator C to ensure the confidentiality of the data during training.
The encrypted training process comprises the following steps: a) the collaborator C distributes a public key to A and B for encrypting the data to be exchanged during training; b) A and B interact in encrypted form to compute the intermediate results of the gradient; c) A and B each compute their own gradient values from the decrypted intermediate information, then upload the computed gradient values X1, X2, X3 and X4 to C, and C computes the new model parameters from these gradient values; d) C sends the four new parameters back to A and B respectively, i.e., updates the models of A and B, for a new round of iteration.
S7, each participant trains locally on its own dataset; the data never leave the local environment during the whole training process, and only information such as model gradients and weights is uploaded to a central server for aggregation and splitting, so private data are not directly revealed and the amount of training data is not increased, thereby completing the training task.
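The local-training and central-aggregation loop of S7 corresponds to federated averaging: only model updates, never raw data, reach the server. A minimal sketch on a one-parameter least-squares model (the model, client data, and learning rate are illustrative assumptions):

```python
def local_step(w, data, lr=0.1):
    """One gradient step of least squares y = w*x, using local data only."""
    grad = sum(2.0 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def fed_avg_round(global_w, clients):
    updates = [local_step(global_w, d) for d in clients]  # runs on each client
    return sum(updates) / len(updates)                    # server only averages

clients = [[(1.0, 2.0), (2.0, 4.0)],     # client 1: slope exactly 2.0
           [(1.0, 2.1), (3.0, 6.3)]]     # client 2: slope exactly 2.1
w = 0.0
for _ in range(200):
    w = fed_avg_round(w, clients)
# w converges to a compromise between the two client optima
```

The server sees only the per-round weights, matching the claim that each participant's data never leave the local environment.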
The invention also provides a privacy computing system comprising a secure multi-party computation module, a federated learning module, a confidential computing module, and a differential privacy module. The secure multi-party computation module means that the participants jointly complete a computing task using their private data without revealing that data to one another. The federated learning module takes a central server as the central node and realizes update iterations of an artificial-intelligence model by exchanging network information with multiple local servers participating in training. The confidential computing module is a hardware-based technology that isolates data, specific functions, and applications from the operating system, hypervisor or virtual-machine manager, and other privileged processes, so that data are stored in a trusted execution environment and cannot be inspected or operated on from outside, even with a debugger. The differential privacy module removes individual characteristics while retaining statistical characteristics, so as to protect user privacy.
The secure multi-party computation module comprises a secret sharing unit, a homomorphic encryption unit, an oblivious transfer unit, a zero-knowledge proof unit, and a garbled circuit; the federated learning module comprises an encrypted sample alignment unit, an encrypted model training unit, and an incentive unit; the core functions of the confidential computing module are to protect the confidentiality, the integrity, and the security of data.
Protecting the confidentiality of data: the data in memory are encrypted, so that even if an attacker steals the memory contents, the data are not revealed;
protecting the integrity of data: a measurement value guarantees the integrity of data and code, and any change to the data or code in use causes the measurement value to change;
protecting the security of data: a confidential-computing application has a smaller TCB (trusted computing base) than an ordinary application, which means a smaller attack surface and therefore greater security.
The secret sharing unit splits the secret in an appropriate way, each split share being managed by a different participant; a single participant cannot recover the secret, and only several participants cooperating together can recover it, so the secret remains fully protected even when any tolerated subset of participants is compromised. The homomorphic encryption unit allows computation directly on encrypted ciphertext, and the result, once decrypted, is identical to the result of the same computation on the plaintext; it can compute on multiple ciphertexts before decryption, so that each ciphertext need not be decrypted individually and high computation cost is avoided. The oblivious transfer unit is a privacy-preserving two-party communication protocol: the message sender transmits one of several messages to the receiver, and the sender cannot determine which message the receiver obtained.
The zero-knowledge proof unit means that the prover can convince the verifier that an assertion is correct without providing the verifier with any useful information; the garbled circuit is a Boolean circuit for secure two-party computation that encrypts and scrambles each gate in the computation circuit, ensuring that neither the original data nor intermediate results can leak during encrypted computation.
Assuming two data owners A and B, the encrypted sample alignment unit uses an encryption-based user-sample alignment technique to identify the users common to both parties, on the premise that A and B do not disclose their respective data and without exposing the users who do not overlap, so that modeling can be performed on the features of these users; the encrypted model training unit trains a machine learning model on these data after the common user group is determined; to ensure the confidentiality of the data during training, encrypted training is performed through the third-party collaborator C; the incentive unit serves to continuously encourage more institutions to join the data federation.
In summary, the security of the invention is based on cryptography and is supported by rigorous cryptographic theory; it does not rely on trust in any participant, operator, system, hardware, or software. Each participant retains absolute control over the data it owns, so the underlying data and information cannot leak; at the same time the computation is highly accurate and supports programmable general-purpose computation, and for each participant private data is not directly revealed and the amount of parameter-training data is not increased. The invention can be used independently for privacy computation or combined with other technologies to protect privacy, and is an important technical means especially for big-data, high-performance, general privacy-computing scenarios such as secure and trusted cloud computing, large-scale confidential data collaboration, and privacy-preserving deep learning.
Finally, it should be noted that the foregoing description covers only preferred embodiments of the present invention. Although the invention has been described in detail with reference to these embodiments, those skilled in the art may still modify the described solutions or substitute equivalents for some of their elements, and any modification, equivalent substitution, or improvement made without departing from the spirit and principles of the invention shall fall within its scope of protection.

Claims (8)

1. A privacy computing method comprising the steps of:
S1, assembling at least two participants, dividing each participant's secret into N parts, each participant retaining one part while the split parts are managed by different participants, and encrypting the secret to form ciphertext;
S2, computing directly on multiple ciphertexts after encryption, encrypting and scrambling each gate in the computation circuit to ensure that neither the original data nor intermediate results can leak during encrypted computation; the two parties compute in turn on their respective inputs and decrypt to obtain the final correct result, and the decrypted result is compared with the plaintext so that the decrypting party obtains the final decrypted result;
S3, the message sender sends one of several messages to be sent to the receiver, but the receiver cannot obtain any data beyond the acquired content, and the sender does not learn which content was selected;
S4, performing a zero-knowledge proof on the acquired content, whereby the prover can convince the verifier that an assertion is correct without providing the verifier with any useful information;
S5, building a model with a federated learning system, using an encryption-based user-sample alignment technique to identify the users shared by both parties, on the premise that A and B do not disclose their respective data and without exposing the non-overlapping users, so that modeling can be performed on the features of these shared users;
S6, after the shared user group is determined, training a machine learning model on these data, with encrypted training performed through a third-party collaborator C to ensure the confidentiality of the data during training;
S7, each participant trains locally on its own dataset; the data never leave the local environment during training, and only information such as model gradients and weights is uploaded to a central server for aggregation and splitting, so private data are not directly revealed and the amount of training data is not increased, thereby completing the training task.
2. The privacy computing method as defined in claim 1, wherein: in S2, a homomorphic encryption technique can be used to compute on multiple ciphertexts first and decrypt only the result, so that each ciphertext need not be decrypted individually and high computation cost is avoided; homomorphic encryption also lets a party without the key compute on ciphertext, and since the ciphertext computation need not pass through the key holder, communication cost can be reduced, computation tasks can be offloaded, and the computation cost of each party can be balanced.
3. The privacy computing method as defined in claim 1, wherein: in S3, the message transmission comprises the following steps: 1) the sender generates two public/private key pairs and sends the two public keys puk0 and puk1 to the receiver; 2) the receiver generates a random number, encrypts it with one of the two received public keys, and sends the resulting ciphertext to the sender; 3) the sender decrypts the received ciphertext with both of its private keys, obtaining two results k0 and k1, XORs these two results with the two messages to be sent, and sends the two outcomes e0 and e1 to the receiver; 4) the receiver XORs the received e0 and e1 with its own true random number; only one of the two results is the real data, and the other is a random value.
4. The privacy computing method as defined in claim 1, wherein: in S6, the encrypted training process comprises the following steps: a) the collaborator C distributes a public key to A and B for encrypting the data to be exchanged during training; b) A and B interact in encrypted form to compute the intermediate results of the gradient; c) A and B each compute their own gradient values from the decrypted intermediate information, then upload the computed gradient values X1, X2, X3 and X4 to C, and C computes the new model parameters from these gradient values; d) C sends the four new parameters back to A and B respectively, i.e., updates the models of A and B, for a new round of iteration.
5. A privacy computing system based on the privacy computing method according to any one of claims 1-4, characterized in that: the system comprises a multi-party secure computation module, a federated learning module, a confidential computing module and a differential privacy module; the multi-party secure computation module enables the participants to jointly complete a computation task using their private data without revealing that data; the federated learning module takes a central server as the central node and realizes update iterations of an artificial-intelligence model by exchanging network information with a plurality of local servers participating in training; the confidential computing module is a hardware-based technology that isolates data, specific functions, application programs, the operating system, the hypervisor or the virtual machine manager from other processes, so that the data is kept in a trusted execution environment and cannot be inspected or manipulated from outside, even with a debugger; the differential privacy module removes individual characteristics while retaining statistical characteristics, so as to protect user privacy;
the multi-party secure computation module comprises a secret sharing unit, a homomorphic encryption unit, an oblivious transfer unit, a zero-knowledge proof unit and a garbled circuit unit; the federated learning module comprises an encrypted sample alignment unit, an encrypted model training unit and an effect incentive unit; the core functions of the confidential computing module are to protect the confidentiality, integrity and security of data.
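The differential privacy module's idea — retaining statistics while hiding individuals — can be illustrated with the Laplace mechanism. The claim names no concrete mechanism, so this is one standard realization under assumed sensitivity and epsilon parameters.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    # Noise scale b = sensitivity / epsilon: smaller epsilon gives
    # stronger privacy at the cost of more noise on each released value.
    b = sensitivity / epsilon
    u = rng.random() - 0.5               # uniform on [-0.5, 0.5)
    # Inverse-CDF sampling of the Laplace(0, b) distribution
    return true_value - b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
```

A single noisy answer hides any one individual's contribution, while averages over many queries still track the true statistic.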
6. A privacy computing system as defined in claim 5, wherein: the secret sharing unit splits a secret in a suitable manner, each share being held by a different participant; a single participant cannot recover the secret, and only a sufficient number of participants cooperating together can reconstruct it, so the secret remains fully protected even when participants within the tolerated range are compromised; the homomorphic encryption unit allows computation directly on encrypted ciphertexts, the decrypted result being identical to the result of the same computation on the plaintexts, so that multiple ciphertexts can be combined before decryption, avoiding the high computational cost of decrypting each ciphertext individually; the oblivious transfer unit is a privacy-preserving two-party communication protocol in which the message sender transmits one of the messages it holds to the receiver while remaining unable to determine which message the receiver obtained.
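The secret sharing unit's property can be sketched with additive sharing over a modulus; the claim fixes no scheme, so this n-of-n sketch (threshold schemes such as Shamir's behave analogously) uses an illustrative modulus.

```python
import random

Q = 2**61 - 1  # a prime modulus, chosen here only for illustration

def share(secret, n_parties, rng=random):
    """Split `secret` into n shares; all n are needed to reconstruct."""
    shares = [rng.randrange(Q) for _ in range(n_parties - 1)]
    # Final share is chosen so that the shares sum to the secret mod Q;
    # any proper subset of shares is uniformly random and reveals nothing.
    shares.append((secret - sum(shares)) % Q)
    return shares

def reconstruct(shares):
    return sum(shares) % Q
```

Adding two parties' share vectors element-wise yields shares of the sum, which is the basic building block of multi-party secure computation.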
7. A privacy computing system as defined in claim 5, wherein: the zero-knowledge proof unit allows a prover to convince a verifier that a certain assertion is correct without providing the verifier with any other useful information; the garbled circuit unit represents the two-party secure computation as a Boolean circuit and encrypts and shuffles each gate of the circuit, ensuring that neither the original data nor the intermediate values of the computation can leak during the encrypted evaluation.
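A zero-knowledge proof can be sketched with a Schnorr-style sigma protocol: the prover convinces the verifier it knows x with y = g^x mod p without revealing x. The claim names no concrete proof system, and the tiny group below (p = 23, subgroup order q = 11) is purely illustrative; real deployments use groups of roughly 256-bit order.

```python
import random

p, q, g = 23, 11, 2          # g generates the order-q subgroup of Z_p*

def prove(x, rng):
    """One honest run: commitment t, random challenge c, response s."""
    y = pow(g, x, p)                   # public statement: y = g^x mod p
    r = rng.randrange(q)
    t = pow(g, r, p)                   # prover's commitment
    c = rng.randrange(q)               # verifier's random challenge
    s = (r + c * x) % q                # response; r masks x, so s leaks nothing
    return y, t, c, s

def verify(y, t, c, s):
    # Accept iff g^s == t * y^c (mod p), which holds exactly when
    # s = r + c*x for the committed r and the claimed x.
    return pow(g, s, p) == (t * pow(y, c, p)) % p
```

A prover who does not know x can only guess the response, so each run catches cheating with probability 1 - 1/q, and repetition drives the soundness error down.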
8. A privacy computing system as defined in claim 5, wherein: assuming two data owners A and B, the encrypted sample alignment unit uses an encryption-based user sample alignment technique to identify the common users of both parties without A or B disclosing its own data and without exposing the users who do not overlap, so that modelling can combine the features of these common users; after the common user group has been determined, the encrypted model training unit trains a machine learning model on this data; to guarantee the confidentiality of the data during training, encrypted training is carried out with the help of a third-party collaborator C; the effect incentive unit serves to continuously incentivize more organizations to join the data federation.
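Encrypted sample alignment can be sketched as Diffie-Hellman-style private set intersection: each party masks hashed user IDs with its own secret exponent, exchanges the masked values, and compares the doubly-masked results, so only the overlap is learned. The claim does not name the concrete protocol, and the modulus and IDs below are illustrative assumptions.

```python
import hashlib
import random

P = 2**127 - 1  # a Mersenne prime used as a toy modulus

def h(uid):
    # Hash a user ID into the multiplicative group mod P.
    return int.from_bytes(hashlib.sha256(uid.encode()).digest(), "big") % P

def psi(ids_a, ids_b, rng):
    a = rng.randrange(2, P - 1)          # A's secret exponent
    b = rng.randrange(2, P - 1)          # B's secret exponent
    # A sends {H(u)^a}, B sends {H(v)^b}; each side raises the other's
    # values to its own exponent, so matches meet at H(.)^(a*b) mod P
    # because exponentiation commutes: (H^a)^b == (H^b)^a.
    masked_a = {pow(h(u), a, P): u for u in ids_a}
    masked_b = [pow(h(v), b, P) for v in ids_b]
    double_a = {pow(m, b, P): u for m, u in masked_a.items()}
    double_b = {pow(m, a, P) for m in masked_b}
    return sorted(u for d, u in double_a.items() if d in double_b)
```

Non-overlapping users stay hidden behind the other party's secret exponent, which is exactly the property claim 8 requires of the alignment step.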
CN202310034612.1A 2023-01-10 2023-01-10 Privacy computing method and privacy computing system Pending CN116011014A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310034612.1A CN116011014A (en) 2023-01-10 2023-01-10 Privacy computing method and privacy computing system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310034612.1A CN116011014A (en) 2023-01-10 2023-01-10 Privacy computing method and privacy computing system

Publications (1)

Publication Number Publication Date
CN116011014A true CN116011014A (en) 2023-04-25

Family

ID=86019055

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310034612.1A Pending CN116011014A (en) 2023-01-10 2023-01-10 Privacy computing method and privacy computing system

Country Status (1)

Country Link
CN (1) CN116011014A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116757275A (en) * 2023-06-07 2023-09-15 京信数据科技有限公司 Knowledge graph federal learning device and method
CN116842578A (en) * 2023-08-31 2023-10-03 武汉大数据产业发展有限公司 Privacy computing platform, method, electronic equipment and medium in data element transaction


Similar Documents

Publication Publication Date Title
CN107911216B (en) Block chain transaction privacy protection method and system
CN110138802B (en) User characteristic information acquisition method, device, block chain node, network and storage medium
CN110719159A (en) Multi-party privacy set intersection method for resisting malicious enemies
CN109194523A (en) The multi-party diagnostic model fusion method and system, cloud server of secret protection
CN116011014A (en) Privacy computing method and privacy computing system
CN101309137B (en) Uni-directional function tree multicast key management method based on cipher sharing
CN106301788A (en) A kind of group key management method supporting authenticating user identification
CN110233826B (en) Privacy protection method based on data confusion among users and terminal data aggregation system
CN108880801A (en) The distributed nature base encryption method of fine granularity attribute revocation is supported on a kind of lattice
CN109547199A (en) A kind of method that multi-party joint generates SM2 digital signature
CN105978689A (en) Anti-key-exposure cloud data safe sharing method
JP2023552263A (en) Redistribution of secret sharing
Kohlweiss et al. Accountable metadata-hiding escrow: A group signature case study
CN106169996A (en) Multi-area optical network key management method based on key hypergraph and identification cipher
CN117201132A (en) Multi-committee attribute base encryption method capable of achieving complete decentralization and application of multi-committee attribute base encryption method
CN113300835A (en) Encryption scheme receiver determining method and active secret sharing method
Zhong et al. MPC-based privacy-preserving serverless federated learning
CN116132012A (en) Trusted privacy data comparison method, storage device and intelligent terminal thereof
Li et al. An efficient privacy-preserving bidirectional friends matching scheme in mobile social networks
Zhou et al. A survey of security aggregation
CN110321722B (en) DNA sequence similarity safe calculation method and system
Shen et al. Verifiable Privacy-Preserving Federated Learning Under Multiple Encrypted Keys
CN109218016B (en) Data transmission method and device, server, computer equipment and storage medium
He et al. Efficient group key management for secure big data in predictable large‐scale networks
CN112769766B (en) Safe aggregation method and system for data of power edge internet of things based on federal learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination