CN115438381A - Privacy set intersection method, device, equipment and medium based on equal strategies - Google Patents


Info

Publication number
CN115438381A
Authority
CN
China
Prior art keywords
intersection
groups
sample
screening conditions
elements
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211219632.8A
Other languages
Chinese (zh)
Inventor
李武璐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CCB Finetech Co Ltd
Original Assignee
CCB Finetech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CCB Finetech Co Ltd filed Critical CCB Finetech Co Ltd
Priority to CN202211219632.8A priority Critical patent/CN115438381A/en
Publication of CN115438381A publication Critical patent/CN115438381A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60: Protecting data
    • G06F21/62: Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218: Protecting access to data via a platform, e.g. using keys or access control rules, to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245: Protecting personal data, e.g. for financial or medical purposes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60: Protecting data
    • G06F21/602: Providing cryptographic facilities or services

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application provides a privacy set intersection method based on an equal policy, relating to the technical field of privacy computing. The method can be executed by a data demander and comprises the following steps: obtaining a first set to be matched according to the hash processing result of M groups of elements in a first sample set, where the M groups of elements are obtained by combining M first samples with M groups of screening conditions one by one, and each group of screening conditions comprises at least one condition; receiving a second set to be matched sent by a data provider; and obtaining the intersection between the first set to be matched and the second set to be matched, where each group of sample features in the intersection is equal to the screening conditions of the corresponding group among the M groups of screening conditions. The method achieves an equal-policy privacy set intersection effect in which the screening conditions are not leaked and the samples outside the intersection remain protected.

Description

Privacy set intersection method, device, equipment and medium based on equal strategies
Technical Field
The present application relates to the field of privacy computing technologies, and in particular, to a privacy set intersection method, apparatus, device, medium, and program product based on equal policies.
Background
Data-sharing interaction and cooperation helps data realize its value and enables related scenarios and organizations. As requirements for data security and privacy protection keep rising, potential safety hazards such as data leakage need to be avoided when organizations cooperate in data sharing. When a data service involving list/sample sharing among organizations is deployed, technologies such as privacy set intersection are mostly adopted to extract and screen the samples common to both parties while protecting the data outside the sample intersection from leaking; such technologies are therefore widely applied in scenarios such as joint marketing, federated learning, advertisement placement, and customer acquisition.
The traditional privacy set intersection technology can only perform intersection screening on a single dimension of the sample set, and cannot use the other features contained in a sample, together with judgment conditions on them, as the screening basis. Taking a sample screening scenario between a data demander and a data provider as an example, when the data demander wants to intersect its local sample set with the samples in the counterpart's sample set that satisfy the constraint "degree = master", it can only send the additional screening condition to the data provider.
Disclosure of Invention
Before the inventive concept of the present application was formed, the inventors found that directly sending the screening condition to the data provider cannot effectively protect the privacy of the screening condition. For example, in industries such as banking, securities, or the internet, data related to core judgment rules would be leaked, which cannot fully meet the application security requirements of the relevant organizations.
In view of the foregoing, the present application provides a privacy set intersection method, apparatus, device, medium, and program product based on an equal policy, which can protect privacy screening conditions from being leaked.
One aspect of the embodiments of the present application provides a privacy set intersection method based on an equal policy, which is executed by a data demander, and includes: obtaining a first to-be-matched set according to a hash processing result of M groups of elements in the first sample set, wherein the M groups of elements are obtained by combining M first samples and M groups of screening conditions one by one, each group of screening conditions comprises at least one condition, and M is an integer greater than or equal to 2; receiving a second set to be matched sent by a data provider, wherein the data provider is configured to: obtaining a second set to be matched according to the Hash processing result of N groups of elements in a second sample set, wherein the N groups of elements are obtained by combining N second samples and N groups of sample characteristics one by one, the characteristic quantity of each group of sample characteristics is equal to the condition quantity of each group of screening conditions, and N is an integer greater than or equal to 1; and acquiring an intersection between the first set to be matched and the second set to be matched, wherein the characteristics of each group of samples in the intersection are equal to the screening conditions of the corresponding group in the M groups of screening conditions.
According to an embodiment of the present application, before performing hash processing on the M groups of elements in the first sample set, the method further includes obtaining the M groups of elements, specifically including: combining M first samples that differ pairwise with M groups of screening conditions whose content is identical across groups, one by one; or combining M identical first samples with M groups of screening conditions whose content differs pairwise across groups, one by one.
According to an embodiment of the application, before combining the M first samples that are the same between any two samples with the M sets of screening conditions that differ in content between any two sets of screening conditions one to one, the method further comprises: acquiring a preset number of accessory contents of each sample feature to be retrieved of the first sample, wherein the preset number of accessory contents comprise part or all feature values of the first sample feature; and taking the predetermined number of accessory contents of each sample feature to be retrieved as the M groups of screening conditions.
According to an embodiment of the application, after obtaining the intersection, the method further includes: determining a first corresponding relation between each group of elements in the intersection and M groups of elements in the first set to be matched; determining a set of feature attachments of the first sample according to the first correspondence, the set of feature attachments including at least one attachment content.
According to an embodiment of the present application, if the M first samples different between any two samples are combined with the M sets of screening conditions having the same content between any two sets of screening conditions one by one, after obtaining the intersection, the method further includes: determining a second corresponding relation between each group of elements in the intersection and M groups of elements in the first set to be matched; and determining an intersection sample set between the data demand side and the data supply side according to the second corresponding relation.
Another aspect of the embodiments of the present application provides a privacy set intersection method based on an equal policy, which is performed by a data provider, and includes: obtaining a second set to be matched according to the hash processing result of N groups of elements in a second sample set, wherein the N groups of elements are obtained by combining N second samples with N groups of sample characteristics one by one, each group of sample characteristics comprises at least one characteristic, and N is an integer greater than or equal to 1; sending the second to-be-matched set to a data demander, wherein the data demander is configured to execute the privacy set intersection method of any one of claims 1 to 5.
Another aspect of the embodiments of the present application provides an equal policy-based privacy set intersection apparatus, used by a data demander, and including: a first hash module, configured to obtain a first set to be matched according to the hash processing result of M groups of elements in a first sample set, where the M groups of elements are obtained by combining M first samples with M groups of screening conditions one by one, each group of screening conditions comprises at least one condition, and M is an integer greater than or equal to 2; a set receiving module, configured to receive a second set to be matched sent by a data provider, where the data provider is configured to: obtain a second set to be matched according to the hash processing result of N groups of elements in a second sample set, where the N groups of elements are obtained by combining N second samples with N groups of sample features one by one, the feature quantity of each group of sample features is equal to the condition quantity of each group of screening conditions, and N is an integer greater than or equal to 1; and a privacy intersection module, configured to obtain the intersection between the first set to be matched and the second set to be matched, where each group of sample features in the intersection is equal to the screening conditions of the corresponding group among the M groups of screening conditions.
The privacy set intersection apparatus for the data demander comprises modules respectively used for executing the steps of the method described above.
Another aspect of the embodiments of the present application provides an equal policy-based privacy set intersection apparatus, for a data provider, including: a second hash module, configured to obtain a second set to be matched according to the hash processing result of N groups of elements in a second sample set, where the N groups of elements are obtained by combining N second samples with N groups of sample features one by one, each group of sample features includes at least one feature, and N is an integer greater than or equal to 1; a set sending module, configured to send the second set to be matched to a data demander, where the data demander is configured to execute the privacy set intersection method as described in any one of the above.
Another aspect of an embodiment of the present application provides an electronic device, including: one or more processors; a storage device to store one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method as described above.
Yet another aspect of the embodiments of the present application provides a computer-readable storage medium having stored thereon executable instructions, which when executed by a processor, cause the processor to perform the method as described above.
Yet another aspect of embodiments of the present application provides a computer program product comprising a computer program which, when executed by a processor, implements a method as described above.
One or more of the above embodiments have the following beneficial effects: by using a cryptographic technique combining feature re-encoding with privacy set intersection, the data demander and the data provider each embed the screening conditions or sample features during a preprocessing stage for re-encoding, obtaining the first set to be matched and the second set to be matched, and then perform privacy intersection to obtain the intersection. This finally achieves the equal-policy privacy set intersection effect in which the screening conditions are protected from leaking and the samples outside the intersection remain secure. Compared with traditional privacy set intersection, no extra performance loss is introduced; the process is private, effective, efficient, novel, and practical, and has strong application potential for sample intersection and screening in relevant high-sensitivity scenarios.
Drawings
The foregoing and other objects, features and advantages of the application will be apparent from the following description of embodiments of the application with reference to the accompanying drawings in which:
fig. 1 schematically shows a flow chart of a privacy set intersection method based on equal policy according to an embodiment of the present application;
FIG. 2 schematically illustrates a flow diagram of a privacy set intersection method based on equal policy according to another embodiment of the present application;
FIG. 3 schematically illustrates a flow chart for determining screening conditions according to an embodiment of the present application;
FIG. 4 schematically illustrates a flow chart for determining a set of feature attachments according to an embodiment of the present application;
FIG. 5 schematically illustrates a flow chart for determining an intersection sample set according to an embodiment of the present application;
FIG. 6 is a flow chart schematically illustrating interaction between a data demander and a data provider according to an embodiment of the present application;
FIG. 7 is a block diagram schematically illustrating a privacy set intersection apparatus for a data demander according to an embodiment of the present application;
fig. 8 is a block diagram schematically illustrating a privacy set intersection apparatus for a data provider according to an embodiment of the present application; and
fig. 9 schematically illustrates a block diagram of an electronic device adapted to implement equal policy based privacy set intersection according to an embodiment of the present application.
Detailed Description
In order to facilitate understanding of technical solutions of the embodiments of the present application, some technical terms related to the present application are first described.
Hash function: also called a digest function, a hash function converts an input of any length into a fixed-length pseudo-random value, and has security properties such as randomness, irreversibility, and collision resistance. Common hash functions include SHA3, SHA256, and SM3.
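A minimal illustration of the fixed-length, deterministic behaviour of such a hash function, using Python's standard hashlib (the inputs are arbitrary examples):

```python
import hashlib

# SHA-256 maps an input of any length to a fixed 32-byte (64 hex chars) digest.
d_short = hashlib.sha256(b"ID_1").hexdigest()
d_long = hashlib.sha256(b"a much longer input of arbitrary length " * 10).hexdigest()

assert len(d_short) == len(d_long) == 64               # fixed-length output
assert d_short == hashlib.sha256(b"ID_1").hexdigest()  # deterministic
assert d_short != d_long                               # distinct inputs, distinct digests
```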
Random elliptic point generating function (hash to point): an elliptic curve random element generation algorithm H_p with hash-function-like properties. Given an input τ of arbitrary length, the output H_p(τ) is an element of an elliptic curve group G. It likewise has pseudo-randomness, irreversibility, and collision resistance, and is widely applied in public-key cryptographic protocols.
Private set intersection (PSI): common privacy set intersection protocols include technical routes based on RSA blind signatures, elliptic curves, and extended oblivious transfer (OT extension).
Multi-policy privacy set intersection (Ns-PSI): a special privacy set intersection protocol that adds a privacy policy (an intersection trigger condition or requirement) to the intersection process. It takes the two parties' sample sets and the privacy policy as input and outputs the intersection sample set satisfying the policy, without revealing the private information in the policy during intersection, thereby achieving a more accurate and effective intersection computation and better protecting each party's privacy.
Private information retrieval (PIR): the querying party performs a query and obtains the query result, while the data provider, during the query and result-return process, can obtain neither any information about the query conditions nor the plaintext output of the query result. (For example: does Zhang San belong to set A, with the returned content being belongs / does not belong.)
Private information retrieval with attachment (PIR-P): on the basis of a PIR algorithm, the querying party performs a query with attachment against the data provider and obtains the query result; during the query and result-return process, the data provider can obtain neither any query condition information nor the plaintext output of the query result and the attachment information.
Forward security: also known as forward secrecy; a security property of communication protocols in cryptography, meaning that leakage of a long-term master key does not cause leakage of past session keys.
Attachment (accessory): when a sample is used as the primary retrieval value, the features returned along with a hit sample are the attachment.
Hereinafter, embodiments of the present application will be described with reference to the accompanying drawings. It is to be understood that such description is merely illustrative and not intended to limit the scope of the present application. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the application. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present application.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B, and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.).
For example, a data demander P_A and a data provider P_B carry out data-sharing cooperation. The sample set of the data demander P_A is A = {ID_1, ID_2, …, ID_m}, and the sample-and-feature set of the data provider P_B is B = {(ID'_k, x_k)}_k, where ID'_k is a sample and x_k is its feature. In addition, the data demander P_A provides an extra privacy judgment screening condition a, and the goal of intersection under the equal policy is: the data demander P_A obtains all intersection samples whose feature is equal to a, that is, C = A ∩_{=a} B = {ID_k}_k satisfying ID_k ∈ A ∩ B and x_k = a, where ∩_{=a} is recorded as the sample intersection symbol under the equal policy.
The above scenario can be partially addressed with traditional privacy set intersection technology, but some privacy would be revealed. The scheme is as follows:
Step 1: the data demander P_A and the data provider P_B each prepare their own initial sample sets A and B;
Step 2: the data demander sends the screening condition (x_i = a) to the data provider; upon receipt, the data provider performs an initial screening over its data sample set and extracts the sample subset B' satisfying x_i = a;
Step 3: the two parties run a privacy set intersection algorithm over the sets A and B', with P_A obtaining A ∩ B'.
The above conventional privacy set intersection method leaks the screening condition, which is unfavorable for the privacy protection of the data demander. In practical scenarios, the screening conditions that, for example, a commercial bank proposes to a counterparty often contain core parameters of the bank's risk decisions; once these are exposed, other organizations can modify and embellish their data according to the bank's bottom-line risk decision rules, damaging the bank's interests.
The embodiments of the present application provide a privacy set intersection method, apparatus, device, medium, and program product based on an equal policy, which can effectively protect the privacy of the data demander's sample set and screening conditions; the data provider cannot identify any effective information about the screening conditions, achieving a higher level of privacy protection and safer, more accurate, and more effective sample intersection and sharing.
Specifically, for the data demander P_A, the equal-policy privacy set intersection scheme provided by the embodiments of the present application lets the data demander correctly obtain the intersection sample set whose corresponding sample features equal the screening value (under the equal policy), while being unable to reverse-deduce any sample or feature data beyond the intersection result. For the data provider P_B, the scheme guarantees that P_B can neither obtain the screening condition a nor independently obtain the intersection result (obtaining the intersection result could allow reverse deduction of the screening threshold value), achieving a data-sharing cooperation in which no effective information can be acquired from P_B's viewpoint (P_B learns neither the intersection set, nor the number of intersection elements, nor the privacy screening condition), and realizing an accurate intersection with the maximum degree of privacy.
The privacy set intersection method based on the equal policy according to the embodiment of the present application will be described in detail below with reference to fig. 2 to 6.
Fig. 1 schematically shows a flowchart of a privacy set intersection method based on equal policy according to an embodiment of the present application.
As shown in fig. 1, the privacy set intersection method based on the equal policy of this embodiment is performed by a data demander, and includes operations S110 to S130.
In operation S110, a first set to be matched is obtained according to a hash processing result of M groups of elements in the first sample set, where M groups of elements are obtained by combining M first samples with M groups of screening conditions one by one, each group of screening conditions includes at least one condition, and M is an integer greater than or equal to 2;
According to an embodiment of the present application, before performing hash processing on the M groups of elements in the first sample set, the method further includes obtaining the M groups of elements by: combining M first samples that differ pairwise with M groups of screening conditions whose content is identical across groups, one by one; or combining M identical first samples with M groups of screening conditions whose content differs pairwise across groups, one by one.
In some embodiments, M first samples that differ pairwise are combined one by one with M groups of screening conditions whose content is identical across groups, to obtain the M groups of elements. In other words, the M groups of screening conditions are M copies of the same screening condition; for example, the first samples are several identification numbers, and every screening condition is "degree = master" (e.g., a master-degree identifier). For instance, the data demander P_A determines the screening condition a of the equal-policy privacy intersection and then constructs the first sample set {(ID_1, a), (ID_2, a), …, (ID_M, a)}, where (ID_m, a) is one group of elements, 1 ≤ m ≤ M, and each group (ID_m, a) contains the screening condition a.
In other embodiments, M identical first samples are combined one by one with M groups of screening conditions whose content differs pairwise across groups, to obtain the M groups of elements. In other words, the M first samples are M copies of the same sample; for example, the first sample is Zhang San's identification number, and the screening conditions are credit scores from 0 to 100. For instance, the data demander P_A constructs the first sample set {(ID, a_1), (ID, a_2), …, (ID, a_M)}, where (ID, a_m) is one group of elements and each group contains the same sample ID.
The sample ID may be, for example, an identification number, an employee number, a company credit code, a mobile phone number, a bank card number, or a terminal IMEI number, i.e., any identifier that binds one-to-one with the identified object. The screening condition or sample feature may be user feature information such as account balance, credit score, age, academic degree, or ethnicity.
The one-by-one combination of the M first samples with the M groups of screening conditions can be realized as binary tuples. Combining the original samples with the screening conditions generates the first sample set, and the hash value of each group of elements in the set is computed with a hash function H: c_i = H(ID_i, a), i = 1, 2, …, m, finally obtaining the hash processing result S_1 = {c_1, c_2, …, c_m}.
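A sketch of this preprocessing on the demander's side (the separator and variable names are assumptions, and any collision-resistant hash could play the role of H):

```python
import hashlib

def hash_group(sample: str, condition: str) -> str:
    """c_i = H(ID_i, a): hash one (sample, screening condition) group of elements."""
    # A separator avoids ambiguity between e.g. ("ab", "c") and ("a", "bc").
    return hashlib.sha256(f"{sample}\x1f{condition}".encode()).hexdigest()

ids = ["ID1", "ID2", "ID3"]        # M pairwise-different first samples
a = "degree=master"                # one screening condition repeated for all M groups
S1 = [hash_group(ID, a) for ID in ids]

assert len(S1) == len(ids)
assert len(set(S1)) == len(ids)    # distinct samples yield distinct hash values
```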
In operation S120, a second set to be matched sent by a data provider is received, where the data provider is configured to: obtaining a second set to be matched according to the Hash processing result of N groups of elements in the second sample set, wherein the N groups of elements are obtained by combining N second samples with N groups of sample characteristics one by one, the characteristic quantity of each group of sample characteristics is equal to the condition quantity of each group of screening conditions, and N is an integer greater than or equal to 1;
illustratively, the combination of N second samples and N sets of sample features may also be implemented in a binary manner. Data provider P B In a similar way, for a second set of samples of itself
Figure BDA0003875425060000092
By reaction with P A The same hash function is processed to obtain a set S 2 I.e. by
Figure BDA0003875425060000093
Figure BDA0003875425060000094
Wherein S can be directly reacted with 2 As the second set to be matched, it can also send S 2 And further processing according to the adopted privacy set intersection algorithm to obtain a new second set to be matched. Likewise, S can be directly substituted 1 As the first to beMatching set, also for S 1 And further processing according to the adopted privacy set intersection algorithm to obtain a new first set to be matched.
In operation S130, an intersection between the first to-be-matched set and the second to-be-matched set is obtained, where each group of sample features in the intersection is equal to the screening condition of the corresponding group in the M groups of screening conditions.
Illustratively, the two parties use a privacy set intersection algorithm to compute the intersection information of the sets S_1 and S_2, with the data demander obtaining C = {c''_1, c''_2, …, c''_l} (assuming there are l elements in the intersection). The data demander P_A then outputs the intersection samples of the equal-policy privacy set intersection through the element correspondence between the intersection result C and its local hash set to be matched S_1 = {c_1, c_2, …, c_m}. The equal policy here means that the data demander P_A obtains the intersection of the samples in the local sample set and the counterpart's sample set that both satisfy the constraint "degree = master" (for example only).
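Putting operations S110 to S130 together, a toy end-to-end run (with a plaintext set intersection standing in for a real, forward-secure PSI protocol, and all data invented):

```python
import hashlib

def h(sample: str, cond: str) -> str:
    return hashlib.sha256(f"{sample}\x1f{cond}".encode()).hexdigest()

# Demander P_A: M different samples, each combined with the same condition a.
A_ids, a = ["ID1", "ID2", "ID3"], "degree=master"
S1 = {h(ID, a): ID for ID in A_ids}      # keep the correspondence c_i -> ID_i

# Provider P_B: N samples, each combined with its own feature.
B = [("ID2", "degree=master"), ("ID3", "degree=bachelor"), ("ID4", "degree=master")]
S2 = {h(ID, x) for ID, x in B}

# Intersection C, then map back through the correspondence to the intersection samples.
C = S1.keys() & S2
intersection_samples = {S1[c] for c in C}
assert intersection_samples == {"ID2"}   # ID2 is shared AND its feature equals a
```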
In some embodiments, the intersection may be computed with a hash collision library. However, the hash-collision approach is suitable only for a limited number of intersections, because it does not ensure forward security. Specifically, in the direct hash-collision approach the data provider P_B sends S_2 directly to the data demander P_A, and P_A then performs the intersection computation locally. Although no original data of P_B is revealed during transmission (the elements of S_2 are all hash values, so no plaintext inversion attack arises), a risk remains afterwards: if P_A stores S_2 long term, it can collect further IDs and corresponding feature information in subsequent business, beyond what was hit in this transaction, and match them in real time against the hash values in S_2 (e.g., P_A later collects a new ID and its corresponding feature x, locally computes c = H(ID, x), and compares it with the elements of S_2). In this way P_A learns additional information about P_B's sample set, acquiring more than a single privacy set intersection would allow. For P_B this constitutes information leakage and fails the cryptographic requirement of forward security.
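The forward-security failure of the hash-collision approach can be made concrete with a short sketch (names and the concrete hash are illustrative assumptions):

```python
import hashlib

def H(sample: str, feature: str) -> str:
    return hashlib.sha256(f"{sample}|{feature}".encode()).hexdigest()

# P_B's hashed set, sent in the clear to P_A under the hash-collision approach.
S2 = {H("ID7", "master"), H("ID8", "bachelor")}

# P_A stores S2 long term. Later, from other business, it learns a NEW
# (ID, feature) pair and can test membership against the stored hashes:
new_id, new_feature = "ID7", "master"
leaked = H(new_id, new_feature) in S2
print(leaked)  # True: P_A learns ID7 is in P_B's set, beyond the one-time PSI
```

This extra membership test is exactly the information acquisition beyond a single intersection that the forward-secure protocols of the next paragraph rule out.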
In other embodiments, to meet the forward-security requirement, a privacy set intersection cryptographic protocol satisfying forward security is used. With the hash collision library excluded, a protocol may be selected from technical routes including, but not limited to, those based on RSA, on elliptic curves, or on extended oblivious transfer. The present application does not depend on the choice of a specific privacy set intersection algorithm, and cannot enumerate all algorithms one by one; any protocol that is secure under the semi-honest adversary model and satisfies forward security meets the application requirements of this scheme.
According to the embodiments of the present application, a cryptographic technique combining feature recoding with privacy set intersection is used: the data demander and the data provider each embed the screening conditions or the sample features during a preprocessing (recoding) stage to obtain the first and second sets to be matched, and then perform privacy set intersection to obtain the intersection. This achieves an equal-policy privacy set intersection in which the screening conditions are kept secret and the samples outside the intersection are protected, with no performance loss compared with conventional privacy set intersection. The approach is private, effective, efficient, innovative, and practical, and has strong application potential for sample intersection and screening in highly sensitive scenarios.
Fig. 2 schematically shows a flow chart of a privacy set intersection method based on equal policy according to another embodiment of the present application.
As shown in fig. 2, the privacy set intersection method based on the equal policy of this embodiment is performed by a data provider, and includes operations S210 to S220.
In operation S210, a second set to be matched is obtained according to hash processing results of N groups of elements in the second sample set, where N groups of elements are obtained by combining N second samples with N groups of sample features one by one, each group of sample features includes at least one feature, and N is an integer greater than or equal to 1;
in operation S220, the second set to be matched is sent to the data demander, the data demander being configured to perform operations S110 to S130.
According to the embodiments of the present application, the data demander and the data provider cooperate in the recoding process (that is, each performs hash processing to obtain its own set to be matched). The method therefore offers strong privacy and high availability, supports intersection scenarios with feature screening conditions (the equal policy), has high technical and application value, and supports forward security.
Taking FIG. 3 and FIG. 4 as an example, in the scenario where M identical first samples (the same between any two) are combined one by one with M groups of screening conditions whose contents differ between any two groups (a query scenario with privacy-protected attachments), an attached query is performed and its result obtained; yet during the query and the return of the result, the data provider can learn neither the query conditions nor the plaintext of the query result and its attachments. (For example, Zhang San is queried, and Zhang San's ethnicity and academic degree are returned as attachments.)
Fig. 3 schematically shows a flow chart for determining screening conditions according to an embodiment of the present application.
Before combining M first samples that are the same between any two samples with M sets of screening conditions that differ in content between any two sets of screening conditions one by one, as shown in fig. 3, the embodiment determines the screening conditions including operations S310 to S320.
In operation S310, a predetermined number of attached contents of each to-be-retrieved sample feature of the first sample are obtained, where the predetermined number of attached contents include some or all feature values of the first sample feature;
illustratively, an attached privacy query may be performed on the first sample for one or more sample features. For example, the data demander P_A computes the set A = {(ID, a_1), …, (ID, a_m)}, wherein a_1, …, a_m are all the possible attachment contents of the feature, obtained by exhaustive enumeration (e.g., all credit scores are 0, 1, …, 100; all academic degrees are elementary school, junior middle school, high school, university, …, doctor).

It is to be understood that exhaustive enumeration is only one of the ways; a sufficient number of attachment contents may also be determined according to a predetermined number. For example, the credit score (the feature to be retrieved) takes values 0, 1, …, 100; each of these feature values is one attachment content, and "some or all of the feature values" may comprise, for example, a predetermined number of 10, 20, or all 101 of them.
In operation S320, a predetermined number of attached contents of each sample feature to be retrieved are used as M sets of filtering conditions.
Further to this example, the M samples may be 101 copies of Zhang San's ID card number, with a_1, …, a_m corresponding to 0, 1, …, 100, respectively. Each group of screening conditions then comprises a single condition, namely one credit score.
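A sketch of this exhaustive recoding for the credit-score example (M = 101 copies of one ID, one per possible score; the hash function and all names are illustrative stand-ins):

```python
import hashlib

def H(sample: str, attachment: str) -> str:
    return hashlib.sha256(f"{sample}|{attachment}".encode()).hexdigest()

# Demander: one queried ID, combined with every possible credit score 0..100.
queried_id = "ID_zhangsan"
conditions = [str(score) for score in range(101)]   # M = 101 groups
S1 = [H(queried_id, a) for a in conditions]

# Provider holds the true score for that ID, say 73, among other samples.
S2 = {H("ID_zhangsan", "73"), H("ID_other", "50")}

# Exactly one element of S1 can collide: the one encoding the true score,
# so the position of the hit reveals the attachment content.
hits = [conditions[i] for i, c in enumerate(S1) if c in S2]
print(hits)  # ['73']: the attached query returns the score
```

The position-to-attachment mapping used in the last step is precisely the "first correspondence" of operations S410–S420 below.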
Fig. 4 schematically shows a flow chart of determining a set of feature attachments according to an embodiment of the application.
After the intersection is obtained, as shown in fig. 4, determining the feature attachment set according to this embodiment includes operations S410 to S420.
In operation S410, a first correspondence between each group of elements in the intersection and M groups of elements in the first set to be matched is determined;
the elements in the intersection of this embodiment also belong to the first set to be matched, and the first correspondence includes that each group of elements in the intersection is respectively located at a specific position in the first set to be matched.
In operation S420, a feature attachment set of the first sample is determined according to the first corresponding relationship, the feature attachment set including at least one attachment content.
For ease of understanding, the set A = {(ID, a_1), (ID, a_2), …, (ID, a_m)} and the first set to be matched S_1 = {c_1, c_2, …, c_m} are listed again. For example, if it is determined from the first correspondence that c_1 and c_2 are the elements in the intersection, then the attachment contents in the same order, a_1 and a_2, form the feature attachment set, each of a_1 and a_2 being one attachment content.
Compared with current mainstream privacy set intersection algorithm protocols, the embodiments of the present application can determine the attached feature screening conditions, support a sample intersection mechanism with attached feature screening conditions (the equal policy), and support a more accurate and more effective mode of sample matching and data sharing.
Fig. 5 schematically shows a flow chart of determining an intersection sample set according to an embodiment of the application.
If M first samples different between any two samples are combined with M sets of screening conditions having the same content between any two sets of screening conditions one by one, after obtaining the intersection, as shown in fig. 5, determining the intersection sample set in this embodiment includes operations S510 to S520.
In operation S510, a second correspondence between each group of elements in the intersection and M groups of elements in the first set to be matched is determined;
the elements in the intersection of this embodiment also belong to the first set to be matched, and the second correspondence includes that each group of elements in the intersection is located at a specific position in the first set to be matched.
In operation S520, an intersection sample set between the data demander and the data provider is determined according to the second correspondence.
For ease of understanding, the set A = {(ID_1, a), (ID_2, a), …, (ID_m, a)} and the first set to be matched S_1 = {c_1, c_2, …, c_m} are listed again. For example, if it is determined from the second correspondence that c_1 and c_2 are the elements in the intersection, then the IDs in the same order, ID_1 and ID_2, form the intersection sample set.
According to the embodiments of the present application, in the process of obtaining the intersection sample set, the data demander cannot distinguish, for any element of S_1 \ C (a sample not in the final output set), whether it is absent because the sample is not in the sample intersection or because it does not satisfy the judgment condition (x_i ≠ a). The data provider, for its part, learns neither any intersection information nor the screening conditions, truly realizing a data supply mode of providing data only and acquiring no information. In addition, forward security is satisfied: the information and data communicated in this task cannot cause any information leakage in the future.
The computation of S_1 ∩ S_2 using the elliptic-curve-based privacy set intersection algorithm is described below with reference to FIG. 6.
Fig. 6 schematically shows a flowchart of interaction between a demander and a provider according to an embodiment of the present application.
As shown in fig. 6, the calculating of the intersection by the demander and the provider using the elliptic curve-based privacy set intersection algorithm in this embodiment may include operations S610 to S690.
In operation S610, the data demander and the data provider each perform data preparation.
In some embodiments, the data demander may determine the M elements of the first sample set according to the requirement of determining a sample intersection, e.g. A = {(ID_1, a), (ID_2, a), …, (ID_m, a)}. In other embodiments, the data demander may determine them according to the requirement of a privacy query with attachments, e.g. A = {(ID, a_1), (ID, a_2), …, (ID, a_m)}.
It will be appreciated that the data provider P_B's sample-and-feature set is B = {(ID_k, x_k)}_k, wherein ID_k is a sample and x_k is its feature.
In some embodiments, for the multi-feature case, i.e. where a data provider sample carries multidimensional features and sample matching must be performed against multiple screening conditions during the intersection, consider the two-dimensional feature mode as an example: A = {ID_1, ID_2, …, ID_m}, and the data provider P_B's sample-and-feature set is B = {(ID_k, x_k, y_k)}_k, wherein ID_k is a sample and x_k, y_k are its two features. The computation task of the two parties is to obtain the intersection of the A and B sets under equal screening conditions, i.e. C = A ∩_{=a,b} B = {ID_k}_k such that ID_k ∈ A ∩ B with x_k = a and y_k = b. For such a policy requirement, only multidimensional extension coding is needed in the recoding of the data preprocessing stage: the data demander computes A = {(ID_1, a, b), …, (ID_m, a, b)}, or, for the attached-query scenario, computes A = {(ID, a_1, b_1), …, (ID, a_m, b_m)}, after which the intersection task under the dual-feature equal policy can be completed according to the present scheme.
For the multidimensional feature mode in general, by a similar method, for the data provider's samples with an r-dimensional feature set, the screening condition of each feature is recoded in the data preprocessing stage to compute the first sample set, e.g. A = {(ID_i, a_1, a_2, …, a_r)}_i, thereby realizing the equal-policy privacy set intersection scheme for r features. It is to be understood that, in the scenario of a privacy query with attachments, the first sample set can likewise describe multidimensional attachments, e.g. a_11, a_12, …, a_1r, which is not repeated here.
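The multidimensional extension can be sketched as follows: the only change relative to the single-feature case is that each hash input concatenates all r screening conditions (two here), so equality must hold on every feature at once (hash function, names, and data are illustrative assumptions):

```python
import hashlib

def H(*parts: str) -> str:
    return hashlib.sha256("|".join(parts).encode()).hexdigest()

# Demander requires degree = "master" AND city = "Beijing" for each ID.
ids = ["ID1", "ID2"]
a, b = "master", "Beijing"
S1 = [H(i, a, b) for i in ids]

# Provider's samples with two-dimensional features (ID_k, x_k, y_k).
B = [("ID1", "master", "Beijing"), ("ID2", "master", "Shanghai")]
S2 = {H(i, x, y) for i, x, y in B}

# A sample matches only if BOTH features equal the screening conditions.
matches = [ids[k] for k, c in enumerate(S1) if c in S2]
print(matches)  # ['ID1']: ID2 fails because only one of its features matches
```

Extending to r features means passing r condition values to `H`, with no other change to the protocol.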
In operation S620, the data demander and the data provider perform data preprocessing on A and B, respectively.

The data demander obtains the hash processing result S_1 = {c_1, c_2, …, c_m}, and the data provider obtains the hash processing result S_2 = {d_1, d_2, …, d_n}.
In operation S630, the data demander and the data provider both perform an initialization operation.

The two parties negotiate to determine an elliptic curve E and a random point generating function H_p(·), and determine an elliptic curve generator g, completing the initialization process (the group operation of the elliptic curve is written multiplicatively here, so scalar multiplication is expressed as exponentiation). The two parties respectively generate their own matching private keys sk_A and sk_B, wherein sk_A is the matching private key of the data demander P_A and sk_B is the matching private key of the data provider P_B. The two parties then respectively compute their matching public keys pk_A = g^{sk_A} and pk_B = g^{sk_B} and publish them. Optionally, the two parties compute a shared key K for this task according to a key agreement mechanism; for example, using a Diffie-Hellman-based key agreement algorithm, K = g^{sk_A·sk_B} (P_A computes K = pk_B^{sk_A}, and P_B computes K = pk_A^{sk_B}).
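The optional Diffie-Hellman key agreement of the initialization step can be sketched as follows; a multiplicative group modulo a Mersenne prime stands in for the elliptic curve E (toy, illustrative parameters only — a deployment would use the negotiated curve):

```python
import secrets

# Toy group: exponentiation mod p stands in for curve scalar multiplication.
p = 2**127 - 1   # a Mersenne prime
g = 3            # stands in for the negotiated generator

sk_A = secrets.randbelow(p - 2) + 1   # P_A's matching private key
sk_B = secrets.randbelow(p - 2) + 1   # P_B's matching private key
pk_A = pow(g, sk_A, p)                # matching public keys, published
pk_B = pow(g, sk_B, p)

K_A = pow(pk_B, sk_A, p)              # P_A computes pk_B^{sk_A}
K_B = pow(pk_A, sk_B, p)              # P_B computes pk_A^{sk_B}
assert K_A == K_B                     # both equal g^{sk_A*sk_B}: the shared key K
```

The same commutativity of exponentiation, applied to hashed set elements rather than to g, is what makes the blinded comparison in operations S640–S680 work.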
In operation S640, for S_1 = {c_1, c_2, …, c_m}, the data demander computes S'_1 = {H_p(c_1)^{sk_A}, …, H_p(c_m)^{sk_A}} and sends it to the data provider.

In this example, S'_1 may be regarded as the first set to be matched.
In operation S650, for S_2 = {d_1, d_2, …, d_n}, the data provider computes S'_2 = {H_p(d_1)^{sk_B}, …, H_p(d_n)^{sk_B}} and sends S'_2 to the data demander.

In this example, S'_2 may be regarded as the second set to be matched.
In operation S660, after receiving S'_1, the data provider computes S''_1 = {(H_p(c_1)^{sk_A})^{sk_B}, …, (H_p(c_m)^{sk_A})^{sk_B}} and sends S''_1 to the data demander.
In operation S670, after receiving S'_2, the data demander computes S''_2 = {(H_p(d_1)^{sk_B})^{sk_A}, …, (H_p(d_n)^{sk_B})^{sk_A}}, which does not need to be transmitted.
In operation S680, after receiving S''_1, the data demander compares and matches it with the locally computed S''_2 to obtain the intersection C'' = S''_1 ∩ S''_2.
In operation S690, the data demander determines the intersection C = S_1 ∩ S_2 according to the element correspondence between S''_1 and S_1.
After operation S690, operations S410 to S420 may be performed to determine a feature accessory set, or operations S510 to S520 may be performed to determine an intersection sample set.
In particular, although operations S610 to S690 are described in sequence, this order need not be followed strictly in an actual implementation; for example, operations S640 and S650 may be performed simultaneously.
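Under stated assumptions (a multiplicative group modulo a Mersenne prime standing in for the elliptic-curve group, SHA-256 as the hash, and toy single-feature data), the interaction S610–S690 can be sketched end to end:

```python
import hashlib
import secrets

p = 2**127 - 1                 # toy group; stands in for the curve E
q = p - 1                      # group order; exponents reduced modulo it

def H(*parts: str) -> str:
    return hashlib.sha256("|".join(parts).encode()).hexdigest()

def Hp(code: str) -> int:
    """Random point generating function: map a hash code to a group element."""
    e = int.from_bytes(hashlib.sha256(code.encode()).digest(), "big")
    return pow(3, e % q, p)

# S610/S620: data preparation and preprocessing (single-feature equal policy).
A = [("ID1", "master"), ("ID2", "master"), ("ID3", "master")]
B = [("ID1", "master"), ("ID2", "bachelor")]
S1 = [H(i, a) for i, a in A]
S2 = [H(i, x) for i, x in B]

# S630: each party generates its matching private key.
sk_A = secrets.randbelow(q - 1) + 1
sk_B = secrets.randbelow(q - 1) + 1

# S640/S650: each side blinds its own set and sends it across.
S1p = [pow(Hp(c), sk_A, p) for c in S1]   # S'_1, sent to P_B
S2p = [pow(Hp(d), sk_B, p) for d in S2]   # S'_2, sent to P_A

# S660/S670: each side exponentiates the other's blinded set once more.
S1pp = [pow(v, sk_B, p) for v in S1p]     # S''_1, returned to P_A
S2pp = [pow(v, sk_A, p) for v in S2p]     # S''_2, stays with P_A

# S680/S690: double-blinded values are equal iff the original codes were,
# so P_A matches S''_1 against S''_2 and maps hits back through S_1 to A.
hits = set(S1pp) & set(S2pp)
intersection = [A[i][0] for i, v in enumerate(S1pp) if v in hits]
print(intersection)  # ['ID1']: only ID1 matches on both ID and feature
```

Neither party ever sees the other's unblinded hash set, which is what distinguishes this flow from the hash-collision approach and yields the forward security discussed above.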
Compared with the traditional hash-collision-library mode of sample intersection, this embodiment provides forward security: the communication contents of the current intersection task pose no leakage risk to future data. Moreover, by recoding the screening conditions and computing the sets to be matched in the data preprocessing stage, privacy set intersection under the equal policy is completed by directly invoking an existing privacy set intersection technique, so the running efficiency equals that of conventional privacy set intersection. Furthermore, flexible feature extensibility is supported: the scheme meets the equal-policy intersection requirement in the single-feature mode, supports dynamic expansion of the number of features, and meets the equal-policy requirement in the multi-feature mode. The scheme therefore combines forward security, sample privacy, and screening-condition privacy, ensuring data security and privacy protection throughout the intersection task to the greatest extent.
Based on the privacy set intersection method based on the equal strategy, the application also provides privacy set intersection devices for data demanders or data providers respectively. The apparatus will be described in detail below with reference to fig. 7 and 8.
Fig. 7 is a block diagram schematically illustrating a structure of a privacy set intersection apparatus for a data demander according to an embodiment of the present application.
As shown in fig. 7, the privacy set intersection apparatus 700 based on the equal policy of this embodiment includes a first hashing module 710, a set receiving module 720, and a privacy intersection module 730.
The first hashing module 710 may perform operation S110 to obtain a first set to be matched according to a hash processing result on M groups of elements in the first sample set, where the M groups of elements are obtained by combining M first samples with M groups of filtering conditions one by one, each group of filtering conditions includes at least one condition, and M is an integer greater than or equal to 2;
the set receiving module 720 may perform operation S120 to receive a second set to be matched sent by a data provider, where the data provider is configured to: obtaining a second set to be matched according to the Hash processing result of N groups of elements in the second sample set, wherein the N groups of elements are obtained by combining N second samples with N groups of sample characteristics one by one, the characteristic quantity of each group of sample characteristics is equal to the condition quantity of each group of screening conditions, and N is an integer greater than or equal to 1;
the privacy intersection module 730 may perform operation S130, where each group of sample features in the intersection is equal to the screening conditions of the corresponding group in the M groups of screening conditions, to obtain an intersection between the first to-be-matched set and the second to-be-matched set.
According to an embodiment of the present application, the privacy set intersection apparatus 700 may further include an element obtaining module, configured to, before performing hash processing on M groups of elements in the first sample set, obtain M groups of elements, specifically including: combining M first samples different between any two samples with M groups of screening conditions with the same content between any two groups of screening conditions one by one; or combining the M first samples which are the same between any two samples with the M groups of screening conditions which are different in content between any two groups of screening conditions one by one.
According to an embodiment of the present application, the privacy set intersection apparatus 700 may further include a screening condition determining module, configured to perform operations S310 to S320 before the M first samples that are the same between any two samples are combined with the M sets of screening conditions that have different contents between any two sets of screening conditions one by one, which is not described herein again.
According to an embodiment of the present application, the privacy set intersection apparatus 700 may further include a feature attachment set module, which may perform operations S410 to S420, which are not described herein again.
According to an embodiment of the present application, the privacy set intersection apparatus 700 may further include an intersection sample set module, which may perform operations S510 to S520, which are not described herein again.
It is to be understood that the privacy set intersection apparatus 700 includes modules for performing the respective steps of any one of the embodiments described above with reference to fig. 1-6.
Fig. 8 is a block diagram schematically illustrating a structure of a privacy set intersection apparatus for a data provider according to an embodiment of the present application.
As shown in fig. 8, the privacy set intersection apparatus 800 based on the equal policy of this embodiment includes a second hash module 810 and a set transmission module 820.
The second hashing module 810 may perform operation S210, and is configured to obtain a second set to be matched according to a hash processing result on N groups of elements in the second sample set, where the N groups of elements are obtained by combining N second samples and N groups of sample features one by one, each group of sample features includes at least one feature, and N is an integer greater than or equal to 1;
the set sending module 820 may perform operation S220 to send the second to-be-matched set to a data demander, where the data demander is configured to perform any one of the embodiments of fig. 1 to 6 as described above.
It should be noted that the implementation, solved technical problems, implemented functions, and achieved technical effects of each module/unit/subunit and the like in the apparatus part embodiment are respectively the same as or similar to the implementation, solved technical problems, implemented functions, and achieved technical effects of each corresponding step in the method part embodiment, and are not described herein again.
According to the embodiments of the present application, any plurality of modules in the privacy set intersection apparatus 700 or 800 may be combined into one module to be implemented, or any one of the modules may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of other modules and implemented in one module.
According to an embodiment of the present application, at least one module in the privacy set intersection apparatus 700 or 800 may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system in a package, or an Application Specific Integrated Circuit (ASIC), or may be implemented by hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or by any one of, or a suitable combination of, the three implementations of software, hardware, and firmware. Alternatively, at least one module of the privacy set intersection apparatus 700 or 800 may be implemented at least in part as a computer program module, which, when executed, may perform a corresponding function.
Fig. 9 schematically illustrates a block diagram of an electronic device adapted to implement equal policy based privacy set intersection according to an embodiment of the present application.
As shown in fig. 9, an electronic device 900 according to an embodiment of the present application includes a processor 901, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 902 or a program loaded from a storage section 908 into a random access memory (RAM) 903. The processor 901 can include, for example, a general-purpose microprocessor (e.g., a CPU), an instruction set processor and/or related chipset(s), and/or a special-purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 901 may also include on-board memory for caching purposes. The processor 901 may comprise a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the application.

In the RAM 903, various programs and data necessary for the operation of the electronic device 900 are stored. The processor 901, the ROM 902, and the RAM 903 are connected to each other by a bus 904. The processor 901 performs various operations of the method flows according to embodiments of the present application by executing programs in the ROM 902 and/or the RAM 903. It is noted that the programs may also be stored in one or more memories other than the ROM 902 and the RAM 903. The processor 901 may also perform various operations of the method flows according to the embodiments of the present application by executing programs stored in the one or more memories.
According to an embodiment of the application, the electronic device 900 may also include an input/output (I/O) interface 905, the input/output (I/O) interface 905 also being connected to the bus 904. The electronic device 900 may also include one or more of the following components connected to the I/O interface 905: an input portion 906 including a keyboard, a mouse, and the like; an output section 907 including components such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage portion 908 including a hard disk and the like; and a communication section 909 including a network interface card such as a LAN card, a modem, or the like. The communication section 909 performs communication processing via a network such as the internet. The drive 910 is also connected to the I/O interface 905 as necessary. A removable medium 911 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 910 as necessary, so that a computer program read out therefrom is mounted into the storage section 908 as necessary.
The present application also provides a computer-readable storage medium, which may be contained in the device/apparatus/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the application.
According to embodiments of the present application, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example but not limited to: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. For example, according to embodiments of the application, the computer-readable storage medium may include the ROM 902 and/or the RAM 903 described above and/or one or more memories external to the ROM 902 and the RAM 903.
Embodiments of the present application also include a computer program product comprising a computer program comprising program code for performing the method illustrated by the flow chart. When the computer program product runs in a computer system, the program code is used for causing the computer system to realize the method provided by the embodiment of the application.
The computer program performs the above-mentioned functions defined in the system/apparatus of the embodiment of the present application when being executed by the processor 901. According to embodiments of the present application, the above-described systems, apparatuses, modules, units, etc. may be implemented by computer program modules.
In one embodiment, the computer program may be hosted on a tangible storage medium such as an optical storage device, a magnetic storage device, or the like. In another embodiment, the computer program may also be transmitted, distributed in the form of a signal on a network medium, and downloaded and installed through the communication section 909 and/or installed from the removable medium 911. The computer program containing program code may be transmitted using any suitable network medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
In such an embodiment, the computer program may be downloaded and installed from a network via the communication section 909 and/or installed from the removable medium 911. The computer program, when executed by the processor 901, performs the above-described functions defined in the system of the embodiment of the present application. According to embodiments of the present application, the above-described systems, devices, apparatuses, modules, units, etc. may be implemented by computer program modules.
According to embodiments of the present application, program code for carrying out the computer programs provided by the embodiments may be written in any combination of one or more programming languages; in particular, these computer programs may be implemented using a high-level procedural and/or object-oriented programming language, and/or assembly/machine language. The programming languages include, but are not limited to, Java, C++, Python, the "C" language, and the like. The program code may execute entirely on the user computing device, partly on the user device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It will be appreciated by a person skilled in the art that the features described in the various embodiments and/or claims of the present application may be combined in various ways, even if such combinations are not explicitly described in the present application. In particular, the features recited in the various embodiments and/or claims may be combined without departing from the spirit and teachings of the present application, and all such combinations are intended to fall within the scope of the present application.
The embodiments of the present application are described above. However, these examples are for illustrative purposes only and are not intended to limit the scope of the present application. Although the embodiments are described separately above, this does not mean that the measures in the embodiments cannot be used advantageously in combination. The scope of the application is defined by the appended claims and equivalents thereof. Various alternatives and modifications can be devised by those skilled in the art without departing from the scope of the present application, and such alternatives and modifications are intended to be within the scope of the present application.

Claims (11)

1. A privacy set intersection method based on an equality policy, executed by a data demander, the method comprising the following steps:
obtaining a first set to be matched according to a hash processing result of M groups of elements in a first sample set, wherein the M groups of elements are obtained by combining M first samples one-to-one with M groups of screening conditions, each group of screening conditions comprises at least one condition, and M is an integer greater than or equal to 2;
receiving a second set to be matched sent by a data provider, wherein the data provider is configured to: obtain the second set to be matched according to a hash processing result of N groups of elements in a second sample set, wherein the N groups of elements are obtained by combining N second samples one-to-one with N groups of sample features, the number of features in each group of sample features is equal to the number of conditions in each group of screening conditions, and N is an integer greater than or equal to 1;
and acquiring an intersection between the first set to be matched and the second set to be matched, wherein each group of sample features in the intersection is equal to the corresponding group among the M groups of screening conditions.
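As an illustrative sketch only (not the patented implementation), the matching step of claim 1 can be modeled by hashing each (sample, screening-condition-group) pair on both sides and intersecting the resulting digests; all names and values below are hypothetical:

```python
import hashlib

def hash_element(sample: str, conditions: tuple) -> str:
    """Hash one (sample, group-of-conditions) pair into a single digest."""
    payload = "|".join((sample,) + conditions)
    return hashlib.sha256(payload.encode()).hexdigest()

# Data demander: M first samples, each combined one-to-one with a group of
# screening conditions (hypothetical values).
first_samples = ["alice", "bob", "carol"]
screening_conditions = [("age>=18",), ("age>=18",), ("age>=18",)]
first_set = {hash_element(s, c) for s, c in zip(first_samples, screening_conditions)}

# Data provider: N second samples, each combined one-to-one with a group of
# sample features; the feature count per group equals the condition count.
second_samples = ["bob", "dave"]
sample_features = [("age>=18",), ("age<18",)]
second_set = {hash_element(s, f) for s, f in zip(second_samples, sample_features)}

# An element matches only when both the sample and its feature group equal
# the corresponding sample and screening-condition group.
intersection = first_set & second_set
```

In this sketch only hash digests cross the trust boundary; a deployed protocol would additionally rely on keyed or oblivious hashing so that digests cannot be brute-forced offline.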
2. The method according to claim 1, wherein before hashing the M groups of elements in the first sample set, the method further comprises obtaining the M groups of elements, specifically comprising:
combining the M first samples, any two of which differ, one-to-one with the M groups of screening conditions, any two of which have the same content; or
combining the M first samples, any two of which are the same, one-to-one with the M groups of screening conditions, any two of which differ in content.
3. The method according to claim 2, wherein before combining the M first samples, any two of which are the same, one-to-one with the M groups of screening conditions, any two of which differ in content, the method further comprises:
acquiring a predetermined number of attachment contents of each sample feature to be retrieved of the first sample, wherein the predetermined number of attachment contents comprise some or all of the feature values of the first sample feature;
and taking the predetermined number of attachment contents of each sample feature to be retrieved as the M groups of screening conditions.
4. The method according to claim 3, wherein after acquiring the intersection, the method further comprises:
determining a first correspondence between each group of elements in the intersection and the M groups of elements in the first set to be matched;
and determining a feature attachment set of the first sample according to the first correspondence, wherein the feature attachment set comprises at least one attachment content.
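One way to read claims 3-4: the same first sample is repeated, paired with each candidate attachment content as its screening conditions, and the correspondence from intersecting digests back to candidates reveals which feature value the provider actually holds. A hypothetical sketch (all names and values invented for illustration):

```python
import hashlib

def hash_element(sample: str, conditions: tuple) -> str:
    """Hash one (sample, group-of-conditions) pair into a single digest."""
    payload = "|".join((sample,) + conditions)
    return hashlib.sha256(payload.encode()).hexdigest()

# Same first sample repeated, combined one-to-one with distinct candidate
# attachment contents of the feature to be retrieved (hypothetical values).
sample = "alice"
candidates = [("beijing",), ("shanghai",), ("shenzhen",)]
digest_to_candidate = {hash_element(sample, c): c for c in candidates}

# The provider's set contains the sample paired with its actual feature value.
provider_set = {hash_element("alice", ("shanghai",))}

# Mapping intersecting digests back through the first correspondence yields
# the feature attachment set of the first sample.
intersection = set(digest_to_candidate) & provider_set
feature_attachment_set = {digest_to_candidate[d] for d in intersection}
```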
5. The method according to claim 2, wherein, if the M first samples, any two of which differ, are combined with the M groups of screening conditions, any two of which have the same content, the method further comprises, after acquiring the intersection:
determining a second correspondence between each group of elements in the intersection and the M groups of elements in the first set to be matched;
and determining an intersection sample set between the data demander and the data provider according to the second correspondence.
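Under the first variant of claim 2 (distinct samples, identical condition groups), the correspondence step of claim 5 can be pictured as keeping a reverse map from each digest to the originating first sample; a minimal sketch with hypothetical data:

```python
import hashlib

def hash_element(sample: str, conditions: tuple) -> str:
    """Hash one (sample, group-of-conditions) pair into a single digest."""
    payload = "|".join((sample,) + conditions)
    return hashlib.sha256(payload.encode()).hexdigest()

# Distinct first samples, all paired with the same group of screening conditions.
first_samples = ["alice", "bob", "carol"]
common_conditions = ("region=cn",)
digest_to_sample = {hash_element(s, common_conditions): s for s in first_samples}

# Digests remaining after the intersection step (here, the provider holds "bob").
intersection = {hash_element("bob", common_conditions)}

# The second correspondence maps each intersecting digest back to its first
# sample, giving the intersection sample set of the two parties.
intersection_samples = {digest_to_sample[d] for d in intersection}
```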
6. A privacy set intersection method based on an equality policy, executed by a data provider, the method comprising the following steps:
obtaining a second set to be matched according to a hash processing result of N groups of elements in a second sample set, wherein the N groups of elements are obtained by combining N second samples one-to-one with N groups of sample features, each group of sample features comprises at least one feature, and N is an integer greater than or equal to 1;
and sending the second set to be matched to a data demander, wherein the data demander is configured to execute the privacy set intersection method according to any one of claims 1 to 5.
7. A privacy set intersection apparatus based on an equality policy, for a data demander, comprising:
a first hash module, configured to obtain a first set to be matched according to a hash processing result of M groups of elements in a first sample set, wherein the M groups of elements are obtained by combining M first samples one-to-one with M groups of screening conditions, each group of screening conditions comprises at least one condition, and M is an integer greater than or equal to 2;
a set receiving module, configured to receive a second set to be matched sent by a data provider, wherein the data provider is configured to: obtain the second set to be matched according to a hash processing result of N groups of elements in a second sample set, wherein the N groups of elements are obtained by combining N second samples one-to-one with N groups of sample features, the number of features in each group of sample features is equal to the number of conditions in each group of screening conditions, and N is an integer greater than or equal to 1;
and a privacy intersection module, configured to acquire an intersection between the first set to be matched and the second set to be matched, wherein each group of sample features in the intersection is equal to the corresponding group among the M groups of screening conditions.
8. A privacy set intersection apparatus based on an equality policy, for a data provider, comprising:
a second hash module, configured to obtain a second set to be matched according to a hash processing result of N groups of elements in a second sample set, wherein the N groups of elements are obtained by combining N second samples one-to-one with N groups of sample features, each group of sample features comprises at least one feature, and N is an integer greater than or equal to 1;
and a set sending module, configured to send the second set to be matched to a data demander, wherein the data demander is configured to execute the privacy set intersection method according to any one of claims 1 to 5.
9. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method according to any one of claims 1 to 6.
10. A computer readable storage medium having stored thereon executable instructions which, when executed by a processor, cause the processor to perform the method according to any one of claims 1 to 6.
11. A computer program product comprising a computer program which, when executed by a processor, carries out the method according to any one of claims 1 to 6.
CN202211219632.8A 2022-09-30 2022-09-30 Privacy set intersection method, device, equipment and medium based on equal strategies Pending CN115438381A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211219632.8A CN115438381A (en) 2022-09-30 2022-09-30 Privacy set intersection method, device, equipment and medium based on equal strategies

Publications (1)

Publication Number Publication Date
CN115438381A true CN115438381A (en) 2022-12-06

Family

ID=84250605

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211219632.8A Pending CN115438381A (en) 2022-09-30 2022-09-30 Privacy set intersection method, device, equipment and medium based on equal strategies

Country Status (1)

Country Link
CN (1) CN115438381A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115795547A (en) * 2022-12-09 2023-03-14 京信数据科技有限公司 Method, device, terminal and computer storage medium for querying data

Similar Documents

Publication Publication Date Title
WO2020024995A1 (en) Privacy transaction method and system, and device
Galal et al. Trustee: full privacy preserving vickrey auction on top of ethereum
EP3598336A1 (en) Information processing device and information processing method
CN108898021B (en) Threat information processing method, system and computing device based on block chain
US20180025767A1 (en) Method and system for determining market estimates with market based measures
US11900366B2 (en) System and method for securing crypto-asset transactions
CN111222178B (en) Data signature method and device
CN111027981B (en) Method and device for multi-party joint training of risk assessment model for IoT (Internet of things) machine
CN114500093B (en) Safe interaction method and system for message information
CN113420049B (en) Data circulation method, device, electronic equipment and storage medium
WO2022156594A1 (en) Federated model training method and apparatus, electronic device, computer program product, and computer-readable storage medium
US20230281671A1 (en) Decentralized privacy-preserving rewards with cryptographic black box accumulators
CN115618380A (en) Data processing method, device, equipment and medium
CN115438381A (en) Privacy set intersection method, device, equipment and medium based on equal strategies
US20140337239A1 (en) Method and system for obtaining offers from sellers using privacy-preserving verifiable statements
US11720900B1 (en) Systems and methods for adaptive learning to replicate peak performance of human decision making
CN116432040B (en) Model training method, device and medium based on federal learning and electronic equipment
EP4320540A1 (en) Privacy secure batch retrieval using private information retrieval and secure multi-party computation
CN115618381A (en) Data processing method, device, equipment and storage medium
US9553787B1 (en) Monitoring hosted service usage
CN115603905A (en) Data sharing method, device, equipment and storage medium
CN116894727A (en) Data processing method and device based on block chain and related equipment
CN112182594A (en) Data encryption method and device
CN111461178B (en) Data processing method, system and device
CN113626881B (en) Object evaluation method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination