CN111831979A - Method and device for analyzing data privacy protection protocol - Google Patents

Publication number
CN111831979A
CN111831979A
Authority
CN
China
Prior art keywords
data
operator
flow graph
privacy protection
data flow
Prior art date
Legal status
Granted
Application number
CN202010640122.2A
Other languages
Chinese (zh)
Other versions
CN111831979B (en)
Inventor
徐世真
王鲲鹏
朱晓芳
田天
朱军
Current Assignee
Beijing Real AI Technology Co Ltd
Original Assignee
Beijing Real AI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Real AI Technology Co Ltd filed Critical Beijing Real AI Technology Co Ltd
Priority to CN202010640122.2A priority Critical patent/CN111831979B/en
Publication of CN111831979A publication Critical patent/CN111831979A/en
Application granted granted Critical
Publication of CN111831979B publication Critical patent/CN111831979B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/10Protecting distributed programs or content, e.g. vending or licensing of copyrighted material ; Digital rights management [DRM]
    • G06F21/12Protecting executable software
    • G06F21/14Protecting executable software against software analysis or reverse engineering, e.g. by obfuscation


Abstract

The invention provides a method and an apparatus for analyzing a data privacy protection protocol, relating to the field of computer technology. The method comprises the following steps: acquiring the program code of a linear regression algorithm; converting the program code with a dataflow-graph generation tool to obtain the dataflow graph corresponding to the linear regression algorithm, wherein the dataflow graph includes a series of operators; for any operator in the dataflow graph, judging whether the current operator satisfies a preset data leakage condition; and if so, determining that the operator carries a data leakage risk and that the program code does not satisfy the data privacy protection protocol. The method and apparatus improve the efficiency of analyzing data privacy protection protocols and reduce the difficulty of updating data privacy protection projects.

Description

Method and device for analyzing data privacy protection protocol
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and an apparatus for analyzing a data privacy protection protocol.
Background
Data privacy protection generally means that the features or labels held by the party to which the data belongs (the authorized party) must not be transmitted in plaintext to any party to which the data does not belong (an unauthorized party). In machine learning scenarios that require multiple participants, such as intelligent analysis and intelligent decision making, the parties must not only exchange information to complete model training, but must also ensure data security and protect data privacy. The main data privacy protection technique in current machine learning scenarios is federated learning (also known as shared learning).
However, in the federated learning framework FATE and its variants, the data privacy protection protocol must be derived by manually analyzing the calculation formulas, and this manual analysis is labor-intensive and inefficient. More importantly, whenever an algorithm such as linear regression is modified (for example, by adding or deleting a regularization term), the calculation formulas must be re-analyzed to determine whether the data still satisfy the data privacy protection requirements, which makes a large number of data privacy protection projects difficult to update. Manual analysis of data privacy protection protocols therefore suffers from high cost, low efficiency, and difficulty of updating data privacy protection projects.
Disclosure of Invention
To solve the above technical problem or at least partially solve the above technical problem, the present disclosure provides an analysis method of a data privacy protection protocol.
The present disclosure provides a method for analyzing a data privacy protection protocol, including: acquiring the program code of a linear regression algorithm; converting the program code with a dataflow-graph generation tool to obtain the dataflow graph corresponding to the linear regression algorithm, wherein the dataflow graph includes a series of operators; for any operator in the dataflow graph, judging whether the current operator satisfies a preset data leakage condition; and if so, determining that the operator carries a data leakage risk and that the program code does not satisfy the data privacy protection protocol.
Further, the dataflow graph generation tool includes: the Google-JAX computing framework; the step of converting the program code based on the data flow graph generation tool to obtain the data flow graph corresponding to the linear regression algorithm includes: adopting the Google-JAX computing framework to construct data type conversion from a Python program with a Numpy style to a JAX intermediate layer; and converting the program code according to the data type conversion to obtain a data flow graph corresponding to the linear regression algorithm.
Further, the operators include: operands, operator names, and output results, and the operands are used to represent data input by different parties.
Further, the data leakage condition is: when an operand of an operator contains preset sensitive information and, in a subsequent calculation, the executing party of the operator differs from the source party of the operand, the operator is determined to carry a data leakage risk.
Further, the method further comprises: for an operator with a risk of data leakage, encrypting an operand of the operator by using a local key; and sending the encrypted operand to the executing party.
Further, after the step of determining that the program code does not satisfy the data privacy protection protocol, the method further comprises: and carrying out privacy transformation on the data flow graph to obtain a new data flow graph meeting a data privacy protection protocol.
Further, the method further comprises: and visualizing the new data flow graph, and taking the visualized new data flow graph as an external verification interface of a data privacy protection protocol.
The present disclosure also provides an analysis apparatus for a data privacy protection protocol, including: an acquisition module for acquiring the program code of a linear regression algorithm; a conversion module for converting the program code with a dataflow-graph generation tool to obtain the dataflow graph corresponding to the linear regression algorithm, wherein the dataflow graph includes a series of operators; a judging module for judging, for any operator in the dataflow graph, whether the current operator satisfies a preset data leakage condition; and a determining module for determining, when the condition is satisfied, that the operator carries a data leakage risk and that the program code does not satisfy the data privacy protection protocol.
The present disclosure also provides an electronic device, including: a processor and a storage device; the storage device has stored thereon a computer program which, when executed by the processor, performs the above-described analysis method of the data privacy protection protocol.
The present disclosure also provides a computer-readable storage medium, on which a computer program is stored, wherein the computer program is executed by a processor to perform the steps of the above method for analyzing a data privacy protection protocol.
Compared with the prior art, the technical scheme provided by the embodiment of the disclosure has the following advantages:
the embodiments of the present disclosure provide a method and an apparatus for analyzing a data privacy protection protocol, which first acquire the program code of a linear regression algorithm and then convert the program code into a dataflow graph with a dataflow-graph generation tool. Unlike a calculation formula, the dataflow graph is an intermediate-layer expression between the calculation formulas and the underlying calculation logic that describes the intermediate calculation process and the data dependencies, thereby providing the basis for the subsequent judgment. On the one hand, whether the program code satisfies the data privacy protection protocol is determined by automatically judging whether the operators in the dataflow graph satisfy the preset data leakage condition, which avoids analyzing the calculation formulas manually and effectively improves analysis efficiency. On the other hand, when the linear regression algorithm is changed, the calculation formulas need not be re-analyzed; whether the data satisfy the privacy protection requirements can be determined by analyzing the dataflow graph, which effectively reduces the difficulty of updating data privacy protection projects.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
In order to more clearly illustrate the embodiments or technical solutions in the prior art of the present disclosure, the drawings used in the description of the embodiments or prior art will be briefly described below, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
Fig. 1 is a flowchart of an analysis method of a data privacy protection protocol according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram illustrating an implementation scenario of a data privacy protection protocol according to an embodiment of the disclosure;
FIG. 3 is pseudo code according to an embodiment of the disclosure;
FIG. 4 is a schematic diagram of a dataflow graph according to an embodiment of the present disclosure;
fig. 5 is a block diagram illustrating an analysis apparatus of a data privacy protection protocol according to an embodiment of the disclosure.
Detailed Description
In order that the above objects, features and advantages of the present disclosure may be more clearly understood, aspects of the present disclosure will be further described below. It should be noted that the embodiments and features of the embodiments of the present disclosure may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure, but the present disclosure may be practiced in other ways than those described herein; it is to be understood that the embodiments disclosed in the specification are only a few embodiments of the present disclosure, and not all embodiments.
Currently, federated learning is the main data privacy protection technology. To allow multi-party data to participate in information interaction for model training while ensuring data security and data privacy, a homomorphic encryption algorithm is used at the bottom layer. "Encryption" here means an asymmetric encryption algorithm: the private key is kept locally and the public key is published externally; data is encrypted before being transmitted externally, so that an unauthorized party cannot obtain the plaintext. "Homomorphic" means that simple calculations such as addition and subtraction can be performed directly on multiple ciphertexts, and the corresponding plaintext result can then be obtained by decryption. The participants of federated learning encrypt their respective data with a homomorphic encryption algorithm and complete the machine learning training process on the ciphertexts. Note that only ciphertexts encrypted under the same key can be calculated together; ciphertexts encrypted under different keys, as well as some complex operations, do not support homomorphic computation.
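As an illustration of the additive homomorphism described above, the following is a minimal, toy Paillier-style sketch. The small fixed primes, variable names, and API are choices of this sketch, not the patent's implementation, and real deployments use far larger keys:

```python
import random
from math import gcd

# Toy Paillier keypair over small fixed primes. The primes, names, and API
# are illustrative choices of this sketch; real systems use >=2048-bit moduli.
P, Q = 293, 433
N = P * Q
N2 = N * N
LAM = (P - 1) * (Q - 1) // gcd(P - 1, Q - 1)  # lambda = lcm(P-1, Q-1)
G = N + 1                                      # standard simple generator

def encrypt(m: int) -> int:
    """Encrypt plaintext m (0 <= m < N) under the public key (N, G)."""
    while True:
        r = random.randrange(1, N)
        if gcd(r, N) == 1:
            break
    return (pow(G, m, N2) * pow(r, N, N2)) % N2

def decrypt(c: int) -> int:
    """Decrypt ciphertext c with the private key LAM."""
    def L(u: int) -> int:
        return (u - 1) // N
    mu = pow(L(pow(G, LAM, N2)), -1, N)        # modular inverse (Python 3.8+)
    return (L(pow(c, LAM, N2)) * mu) % N

# Additive homomorphism: multiplying ciphertexts adds the plaintexts.
c_sum = (encrypt(17) * encrypt(25)) % N2
print(decrypt(c_sum))  # 42
```

Multiplying two ciphertexts yields a ciphertext of the plaintext sum, which is exactly the "simple calculation between ciphertexts" the scheme relies on; note that this only holds when both ciphertexts were produced under the same key.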
Federated learning based on homomorphic encryption requires careful analysis of the execution logic of each machine learning algorithm to obtain its data privacy protection protocol. The data privacy protection protocol here specifies, for each stage of the calculation, which variables are communicated, whether they are encrypted before communication, and with which key.
However, in the federated learning framework FATE and its variants, the manual analysis approach to the data privacy protection protocol suffers from high cost, low efficiency, and difficulty of updating the data privacy protection projects.
In order to solve the above problem, embodiments of the present disclosure provide an analysis method and an analysis device for a data privacy protection protocol, which can improve analysis efficiency and reduce update difficulty of data privacy protection items. For the convenience of understanding the present embodiment, a detailed description is first given of an analysis method of a data privacy protection protocol disclosed in the embodiments of the present disclosure.
The first embodiment is as follows:
referring to a flow chart of an analysis method of a data privacy protection protocol shown in fig. 1, the method mainly includes the following steps:
step S102, program codes of the linear regression algorithm are obtained.
In practical applications, the program code for training the linear regression algorithm is executed by two different parties, and the code executed by each party differs, so that the two parties complete the training of the linear regression algorithm without revealing their respective portions of the program code. The program code can be divided by participant into X data and Y data, where the X data represents the program code portions executed by the X-side participant and the Y data represents the program code portions executed by the Y-side participant. In this embodiment, the program code of the linear regression algorithm is obtained as complete program code containing both the X data and the Y data.
Step S104, converting the program code based on the data flow graph generating tool to obtain a data flow graph corresponding to a linear regression algorithm; wherein the dataflow graph includes a series of operators, the operators including: operands, operator names, and output results, and the operands are used to represent data input by different parties.
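The operator structure named in step S104 (operands, operator name, output result) can be sketched in memory as follows; the field names and the party tag are assumptions of this sketch, not the patent's data layout:

```python
from dataclasses import dataclass

# A hypothetical in-memory form of one dataflow-graph operator, following the
# three parts named in step S104: operands, operator name, output result.
# Field names and the "party" tag are assumptions of this sketch.
@dataclass
class Operator:
    name: str          # e.g. "dot", "add", "mul"
    operands: list     # input variable names, e.g. ["c", "w"]
    output: str        # output variable name, e.g. "e"
    party: str = "X"   # participant that supplies/executes this operator

# The dataflow graph is simply an ordered series of such operators.
graph = [
    Operator("dot", ["c", "w"], "e", party="X"),
    Operator("sub", ["e", "d"], "h", party="Y"),
]
print([op.output for op in graph])  # ['e', 'h']
```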
In this embodiment, the dataflow-graph generation tool is a generation tool having a front-end interface and a preset dataflow-graph format, where the front-end interface is used to acquire the program code; an example is the Google-JAX computing framework.
And step S106, aiming at any operator in the data flow graph, judging whether the current operator meets a preset data leakage condition.
The data leakage condition is: when an operand of an operator contains preset sensitive information and, in a subsequent calculation, the executing party of the operator differs from the source party of the operand, the operator is determined to carry a data leakage risk. Based on this condition, if the current operator satisfies it, the following step S108 is performed; if not, it is determined that the operator carries no data leakage risk.
And step S108, determining that the operator has a data leakage risk, and determining that the program code does not meet a data privacy protection protocol.
If an operator carries a data leakage risk, the dataflow graph containing that operator carries the risk as well; and since the dataflow graph is obtained by converting the program code, the program code necessarily carries the same risk, so it can be determined that the program code does not satisfy the data privacy protection protocol.
Of course, if it is determined that each operator does not satisfy the preset data leakage condition, each operator does not have a data leakage risk, and thus it can be determined that the program code and the data flow graph corresponding to the linear regression algorithm satisfy the data privacy protection protocol.
According to the above analysis method for the data privacy protection protocol, the program code of the linear regression algorithm can first be acquired, and the program code is then converted into a dataflow graph with a dataflow-graph generation tool. Unlike a calculation formula, the dataflow graph is an intermediate-layer expression between the calculation formulas and the underlying calculation logic that describes the intermediate calculation process and the data dependencies, thereby providing the basis for the subsequent judgment. On the one hand, whether the program code satisfies the data privacy protection protocol is determined by automatically judging whether the operators in the dataflow graph satisfy the preset data leakage condition, which avoids analyzing the calculation formulas manually and effectively improves analysis efficiency. On the other hand, when an algorithm such as linear regression is changed, the calculation formulas need not be re-analyzed; whether the data satisfy the privacy protection requirements can be determined by analyzing the dataflow graph, which effectively reduces the difficulty of updating data privacy protection projects.
In order to facilitate understanding of the analysis method of the data privacy protection protocol, the present embodiment will be described in detail.
For the program code of the linear regression algorithm, this embodiment provides one possible application scenario: in an existing open-source federated learning framework, the data privacy protection protocol is executed as shown in fig. 2 and comprises steps one to ten (the calculation expressions involved are existing formulas used only as examples, so the meaning of each letter in them is not explained here), together with the mathematical expression corresponding to each step. The data privacy protection protocol is implemented in a preset programming language (e.g., C++/Python), yielding the program code of the linear regression algorithm. In this scenario, the two parties corresponding to the program code are party A and party B respectively. It is to be understood that the scenario shown in fig. 2 is only an example and should not be construed as a limitation.
Taking Google-JAX computing framework as a data flow graph generation tool as an example, the embodiment provides an implementation manner of converting the program code into the data flow graph in step S104, and may refer to the following steps 1 and 2:
step 1, adopting a Google-JAX computing framework to construct data type conversion from a Python program of a Numpy-like style to JAX intermediate layer representation (JAXPR).
In a specific implementation, pseudo code as shown in fig. 3 can be constructed based on the Google-JAX computing framework; the data type conversion implemented in this way can be understood as a "value_and_secure" computing primitive that supplements the linear-regression loss function written by the user.
And 2, converting the program codes according to the data type conversion to obtain a data flow graph corresponding to the linear regression algorithm.
This supplementary definition triggers generation of the JAX intermediate-layer representation; that is, the program code is converted and the dataflow graph is obtained. Taking the calculation expressions of the linear regression algorithm shown in fig. 2 as an example, the converted JAX intermediate layer is shown in fig. 4. The intermediate-layer representation describes the intermediate calculation process and the data dependencies; in other words, it fully expresses a dataflow graph containing a series of operators.
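To illustrate the idea of tracing Numpy-style Python into an operator-per-line intermediate representation, the following is a minimal stand-in tracer; it mimics the shape of a jaxpr but is not the JAX API, and all names in it are inventions of this sketch:

```python
# A minimal tracer mimicking how a jaxpr-style intermediate representation is
# produced from Numpy-flavoured Python: every arithmetic operation on a traced
# value is recorded as one operator (name, operands, output). These names are
# inventions of this sketch, not the JAX API.
class Tracer:
    def __init__(self, name, graph):
        self.name, self.graph = name, graph

    def _record(self, op, other):
        out = f"v{len(self.graph)}"
        rhs = other.name if isinstance(other, Tracer) else repr(other)
        self.graph.append((op, [self.name, rhs], out))
        return Tracer(out, self.graph)

    def __mul__(self, other): return self._record("mul", other)
    def __sub__(self, other): return self._record("sub", other)

def trace(fn, *arg_names):
    """Run fn on symbolic tracers and return the recorded operator list."""
    graph = []
    fn(*[Tracer(n, graph) for n in arg_names])
    return graph

def residual(x, w, y):
    # a squared-error-style building block: prediction minus label
    return x * w - y

for op in trace(residual, "x", "w", "y"):
    print(op)  # ('mul', ['x', 'w'], 'v0') then ('sub', ['v0', 'y'], 'v1')
```

Running the user's function once on tracer objects, rather than on real arrays, is what lets the tool obtain the complete operator series without executing the numerical computation.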
The Google-JAX computing framework is used as the dataflow-graph generation tool in this embodiment because the JAX front end uses a Numpy-like programming interface, which fits the habits of Python machine-learning developers well. Of course, in other possible implementations, the dataflow-graph generation tool may also be a deep learning framework with a front-end interface and a dataflow graph (such as TensorFlow or PyTorch), or a tool whose front-end interface and dataflow-graph format are defined by the user. Understandably, the front-end interfaces and dataflow-graph formats of different generation tools differ to some degree.
For any operator in the dataflow graph, the data leakage risk of that operator is judged based on the preset data leakage condition, so as to analyze whether the program code satisfies the data privacy protection protocol. The data leakage condition is: when an operand of an operator contains preset sensitive information and, in a subsequent calculation, the executing party of the operator differs from the source party of the operand, the operator is determined to carry a data leakage risk. The condition can be understood as follows: an operator that computes on unencrypted, un-obfuscated source data and sends the result directly to the other party is determined to carry a data leakage risk.
For example, taking the dataflow graph corresponding to the linear regression algorithm shown in fig. 4, each row of the dataflow graph represents one operator, and an operator comprises operands, an operator name, and an output result. In the dataflow graph, the input data, i.e., the X data and the Y data, are represented by operand c and operand d respectively. Operators 1 to 3 of the dataflow graph calculate on the X data; operator 4 produces an operand h that results from a joint calculation of the X data and the Y data. Consequently, if operand h were sent directly to the X-data side, the Y-data information would leak. The calculation of the 24th operator depends on both h and X (see operators 21 to 23, where the operand bc is computed from h by multiplication with a constant and therefore directly reveals h).
Following this example, each operator can deduce whether it contains sensitive local input data information during its calculation; if the output operand contains sensitive information and a subsequent calculation on that operand must be performed on the other side, i.e., the data leakage condition is met, the operator carries a data leakage risk.
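The deduction just described can be sketched as a taint-propagation pass over the operator list; the tuple layout, the sensitive-input table, and the example graph below are assumptions of this sketch, not the patent's implementation:

```python
# A hedged sketch of the leakage check in steps S106/S108: an operand is
# tainted if it is sensitive input data or was computed from sensitive data.
# An operator leaks when a tainted operand of one party is consumed by an
# operator executed by the other party. All structures are assumptions.
SENSITIVE_INPUTS = {"d": "Y"}          # operand -> owning (source) party

def find_leaks(graph):
    """graph: list of (name, operands, output, executing_party)."""
    taint = dict(SENSITIVE_INPUTS)     # variable -> party whose data it carries
    leaks = []
    for name, operands, output, party in graph:
        for v in operands:
            src = taint.get(v)
            if src is not None and src != party:
                leaks.append((name, v, src, party))   # cross-party plaintext use
        # the output inherits the taint of any sensitive operand
        for v in operands:
            if v in taint:
                taint[output] = taint[v]
    return leaks

graph = [
    ("dot", ["c", "w"], "e", "X"),     # X-only computation: no leak
    ("sub", ["e", "d"], "h", "Y"),     # Y uses its own d: no leak
    ("mul", ["h", "c"], "bc", "X"),    # X consumes h, tainted by Y's d: leak
]
print(find_leaks(graph))  # [('mul', 'h', 'Y', 'X')]
```

An empty result over all operators would correspond to the case described above where the program code and its dataflow graph satisfy the data privacy protection protocol.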
For an operator with a risk of data leakage, the embodiment may further encrypt an operand of the operator by using a local key; and sending the encrypted operand to the executing party.
Illustratively, if the calculation of the 24th operator is performed on the X side, h must be homomorphically encrypted with the Y-side public key before being sent to the X side, and the matrix multiplication of the 24th operator must then be performed on the ciphertext.
According to the above embodiment, after the step of determining that the program code does not satisfy the data privacy protection protocol, the present embodiment may further include the following method:
and carrying out privacy transformation on the data flow graph to obtain a new data flow graph meeting a data privacy protection protocol. The method for privacy modification of the data flow graph may be various, for example, privacy modification of the data flow graph is performed in a manual modification manner, and the modification content may include: determining operand origin, determining execution position of the operator, and the like.
The privacy-transformed new dataflow graph satisfies the data privacy protection protocol; in this case, the embodiment can visualize the new dataflow graph and use the visualized new dataflow graph as an external verification interface for the data privacy protection protocol.
It can be understood that if it is determined that the program code satisfies the data privacy protection protocol through judgment, the data flow graph can be directly visualized to obtain a visualized data flow graph.
In the data privacy protection project, the visualized data flow graph or the visualized new data flow graph can enable the data privacy protection protocol to be analyzed and verified quickly.
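The visualization step can be sketched, for example, by rendering the (possibly privacy-transformed) dataflow graph as Graphviz DOT text that a human can inspect; the tuple layout below is this sketch's convention, not the patent's format:

```python
# A hedged sketch of the visualization step: render the dataflow graph as
# Graphviz DOT text, which can serve as a human-checkable verification
# artifact. The tuple layout is this sketch's convention.
def to_dot(graph):
    lines = ["digraph dataflow {"]
    for i, (name, operands, output, party) in enumerate(graph):
        node = f"op{i}"
        # one box per operator, labelled with its name and executing party
        lines.append(f'  {node} [shape=box, label="{name}@{party}"];')
        for v in operands:
            lines.append(f'  "{v}" -> {node};')
        lines.append(f'  {node} -> "{output}";')
    lines.append("}")
    return "\n".join(lines)

graph = [
    ("encrypt", ["h"], "enc_h", "Y"),
    ("mul", ["enc_h", "c"], "bc", "X"),
]
print(to_dot(graph))
```

The resulting text can be rendered with any standard DOT viewer, giving reviewers a quick way to analyze and verify the data privacy protection protocol.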
In summary, the analysis method for the data privacy protection protocol provided by this embodiment can automatically analyze, based on the converted dataflow graph, whether the program code satisfies the data privacy protection protocol, which avoids analyzing the calculation formulas manually, effectively reduces analysis cost, and improves analysis efficiency. When an algorithm such as linear regression is changed, whether the data satisfy the privacy protection requirements can be determined by analyzing the dataflow graph, which effectively reduces the difficulty of updating data privacy protection projects.
Example two:
the embodiment provides an analysis device for a data privacy protection protocol, which is used for implementing the analysis method for the data privacy protection protocol provided by the embodiment. Referring to fig. 5, the apparatus includes:
an obtaining module 502, configured to obtain a program code of a linear regression algorithm;
a conversion module 504, configured to convert the program code based on the dataflow graph generation tool to obtain a dataflow graph corresponding to the linear regression algorithm; wherein the dataflow graph includes a series of operators;
a determining module 506, configured to determine, for any operator in the dataflow graph, whether a current operator meets a preset data leakage condition;
and a determining module 508, configured to determine that the operator is at risk of data leakage if the operator is satisfied, and determine that the program code does not satisfy the data privacy protection protocol.
In one embodiment, the dataflow graph generation tool includes: the Google-JAX computing framework; the conversion module 504 is further configured to: adopting a Google-JAX computing framework to construct data type conversion from a Python program with a Numpy style to a JAX intermediate layer; and converting the program codes according to the data type conversion to obtain a data flow graph corresponding to the linear regression algorithm.
In one embodiment, the apparatus further comprises an encryption module (not shown) for: for an operator with a risk of data leakage, encrypting an operand of the operator by using a local key; and sending the encrypted operand to an executing party.
In one embodiment, the apparatus further comprises a retrofit module (not shown) for: and carrying out privacy transformation on the data flow graph to obtain a new data flow graph meeting a data privacy protection protocol.
In one embodiment, the apparatus further comprises a visualization module (not shown) for: and visualizing the new data flow graph, and taking the visualized new data flow graph as an external verification interface of the data privacy protection protocol.
The device provided by the embodiment of the present invention has the same implementation principle and technical effect as the method embodiment; for brevity, where the device embodiment does not mention a detail, reference may be made to the corresponding content in the method embodiment.
Example three:
an embodiment of the present invention provides an electronic device, including: a processor and a storage device; the storage device has stored thereon a computer program which, when executed by the processor, performs the method of analyzing a data privacy protection protocol as described in embodiment one.
An embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the method for analyzing a data privacy protection protocol in the first embodiment is executed.
It is noted that, in this document, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present disclosure, which enable those skilled in the art to understand or practice the present disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A method for analyzing a data privacy protection protocol, characterized by comprising:
acquiring program code of a linear regression algorithm;
converting the program code with a data flow graph generation tool to obtain a data flow graph corresponding to the linear regression algorithm, wherein the data flow graph comprises a series of operators;
for each operator in the data flow graph, determining whether the current operator meets a preset data leakage condition; and
if so, determining that the operator carries a data leakage risk and that the program code does not satisfy the data privacy protection protocol.
2. The method of claim 1, wherein the data flow graph generation tool comprises the Google-JAX computing framework;
and converting the program code with the data flow graph generation tool to obtain the data flow graph corresponding to the linear regression algorithm comprises:
using the Google-JAX computing framework to construct a data type conversion from a NumPy-style Python program to the JAX intermediate layer; and
converting the program code according to the data type conversion to obtain the data flow graph corresponding to the linear regression algorithm.
3. The method of claim 1, wherein each operator comprises operands, an operator name, and an output result, and the operands represent data input by different parties.
4. The method of claim 3, wherein the data leakage condition is: when an operand of the operator contains preset sensitive information and, in subsequent computation, the executing party of the operator differs from the source party of the operand, the operator is determined to carry a data leakage risk.
5. The method of claim 4, further comprising:
for an operator carrying a data leakage risk, encrypting the operand of the operator with a local key; and
sending the encrypted operand to the executing party.
6. The method of claim 1, further comprising, after determining that the program code does not satisfy the data privacy protection protocol:
performing a privacy transformation on the data flow graph to obtain a new data flow graph that satisfies the data privacy protection protocol.
7. The method of claim 6, further comprising:
visualizing the new data flow graph and using the visualized new data flow graph as an external verification interface for the data privacy protection protocol.
8. An apparatus for analyzing a data privacy protection protocol, comprising:
an acquisition module configured to acquire program code of a linear regression algorithm;
a conversion module configured to convert the program code with a data flow graph generation tool to obtain a data flow graph corresponding to the linear regression algorithm, wherein the data flow graph comprises a series of operators;
a judgment module configured to determine, for each operator in the data flow graph, whether the current operator meets a preset data leakage condition; and
a determination module configured to determine, when the condition is met, that the operator carries a data leakage risk and that the program code does not satisfy the data privacy protection protocol.
9. An electronic device, comprising: a processor and a storage device;
the storage device has stored thereon a computer program which, when executed by the processor, performs the method of any one of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, which, when executed by a processor, carries out the method of any one of claims 1 to 7.
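The conversion step of claim 2 — tracing a NumPy-style Python program into JAX's intermediate layer — can be sketched with `jax.make_jaxpr`, which traces a function and returns its dataflow graph (a jaxpr) as a series of operator equations. The linear-regression function and its shapes below are illustrative assumptions, not the patented code:

```python
import jax
import jax.numpy as jnp

def linear_regression(w, b, x):
    # NumPy-style Python program: y = x @ w + b
    return jnp.dot(x, w) + b

# Trace the program into a jaxpr: a dataflow graph whose equations
# are the "operators" (operands, primitive name, output results).
closed = jax.make_jaxpr(linear_regression)(
    jnp.ones((3,)), 0.0, jnp.ones((2, 3))
)

# Each equation exposes its primitive and operands — the granularity
# at which a leakage analysis would walk the graph.
for eqn in closed.jaxpr.eqns:
    print(eqn.primitive.name, [str(v) for v in eqn.invars])
```

In this representation, the matrix product appears as a `dot_general` primitive; an analysis pass iterates over `closed.jaxpr.eqns` rather than over Python source text.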
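The leakage condition of claim 4 can be illustrated with a minimal, hypothetical operator model (the `Operand`/`Operator` classes and the party names "A" and "B" below are assumptions for illustration, not the patent's data structures): an operator is flagged when one of its sensitive operands would be consumed by a party other than the operand's source.

```python
from dataclasses import dataclass

@dataclass
class Operand:
    name: str
    source_party: str   # the party that contributed this data
    sensitive: bool     # whether it carries preset sensitive information

@dataclass
class Operator:
    name: str
    operands: list
    executing_party: str  # the party that runs this operator

def has_leak_risk(op: Operator) -> bool:
    # Leak condition: a sensitive operand crosses a party boundary,
    # i.e. the executing party differs from the operand's source party.
    return any(
        opd.sensitive and opd.source_party != op.executing_party
        for opd in op.operands
    )

# A dot product run by party B over party A's sensitive features leaks:
x = Operand("x", source_party="A", sensitive=True)
w = Operand("w", source_party="B", sensitive=False)
dot = Operator("dot", [x, w], executing_party="B")
print(has_leak_risk(dot))  # → True
```

Running the same check over every operator in the graph yields the per-operator risk verdict of claim 1, and "any risky operator" implies the program code does not satisfy the protocol.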
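Claims 5 and 6 describe repairing a failing graph: each sensitive operand that crosses a party boundary is encrypted with a local key by its source party before being sent to the executor. The rewrite below is a structural sketch only — the `encrypt` node is a placeholder (a real system would apply an actual cipher, e.g. a homomorphic or symmetric scheme), and the classes are the same illustrative model as above:

```python
from dataclasses import dataclass

@dataclass
class Operand:
    name: str
    source_party: str
    sensitive: bool

@dataclass
class Operator:
    name: str
    operands: list
    executing_party: str

def privacy_transform(graph):
    """Insert an 'encrypt' operator, run by the operand's own party,
    in front of every sensitive operand that crosses party boundaries."""
    new_graph = []
    for op in graph:
        new_operands = []
        for opd in op.operands:
            if opd.sensitive and opd.source_party != op.executing_party:
                # Encryption happens locally, at the operand's source party.
                new_graph.append(
                    Operator("encrypt", [opd], executing_party=opd.source_party)
                )
                # Downstream, the executor only ever sees the ciphertext,
                # so the rewritten operand is no longer sensitive.
                new_operands.append(
                    Operand(f"enc({opd.name})", opd.source_party, False)
                )
            else:
                new_operands.append(opd)
        new_graph.append(Operator(op.name, new_operands, op.executing_party))
    return new_graph

x = Operand("x", "A", True)
w = Operand("w", "B", False)
graph = [Operator("dot", [x, w], executing_party="B")]
for op in privacy_transform(graph):
    print(op.executing_party, op.name, [o.name for o in op.operands])
```

Because the rewritten operands are no longer marked sensitive, re-running the claim 4 check over the new graph finds no risky operators — the transformed graph satisfies the protocol, which is what claim 7 then visualizes as a verification interface.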
CN202010640122.2A 2020-07-06 2020-07-06 Method and device for analyzing data privacy protection protocol Active CN111831979B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010640122.2A CN111831979B (en) 2020-07-06 2020-07-06 Method and device for analyzing data privacy protection protocol


Publications (2)

Publication Number Publication Date
CN111831979A true CN111831979A (en) 2020-10-27
CN111831979B CN111831979B (en) 2021-08-17

Family

ID=72900188

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010640122.2A Active CN111831979B (en) 2020-07-06 2020-07-06 Method and device for analyzing data privacy protection protocol

Country Status (1)

Country Link
CN (1) CN111831979B (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101271421A (en) * 2007-03-22 2008-09-24 北京邮电大学 Software fault test system and method
CN105787367A (en) * 2016-02-23 2016-07-20 华中科技大学 Patch security detecting method and system for software update
CN110365620A (en) * 2018-03-26 2019-10-22 中移(苏州)软件技术有限公司 A kind of stream data method for secret protection and device
CN110598443A (en) * 2019-09-12 2019-12-20 卓尔智联(武汉)研究院有限公司 Data processing device and method based on privacy protection and readable storage medium
CN110955898A (en) * 2019-12-12 2020-04-03 杭州安恒信息技术股份有限公司 Vulnerability auditing method and system of station building system and related device
CN111240982A (en) * 2020-01-09 2020-06-05 华东师范大学 Static analysis method for source code
CN111240687A (en) * 2020-01-09 2020-06-05 华东师范大学 Source code static analysis device


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112287396A (en) * 2020-12-24 2021-01-29 北京瑞莱智慧科技有限公司 Data processing method and device based on privacy protection
CN114861230A (en) * 2022-07-07 2022-08-05 支付宝(杭州)信息技术有限公司 Privacy protection method and device in terminal equipment
CN114861230B (en) * 2022-07-07 2022-11-01 支付宝(杭州)信息技术有限公司 Privacy protection method and device in terminal equipment

Also Published As

Publication number Publication date
CN111831979B (en) 2021-08-17

Similar Documents

Publication Publication Date Title
Gai et al. Blend arithmetic operations on tensor-based fully homomorphic encryption over real numbers
Li et al. Privacy-preserving machine learning with multiple data providers
CN110008717B (en) Decision tree classification service system and method supporting privacy protection
US20230087864A1 (en) Secure multi-party computation method and apparatus, device, and storage medium
CN105453481B (en) Calculating equipment including table network
JP2020532771A (en) High-precision privacy protection real-valued function evaluation
CN110166446B (en) Method for realizing geographical weighted average center based on safe multi-party calculation
KR102550812B1 (en) Method for comparing ciphertext using homomorphic encryption and apparatus for executing thereof
US20180204284A1 (en) Cryptographically secure financial instruments
CN111831979B (en) Method and device for analyzing data privacy protection protocol
JP2012163960A (en) Method and device for classification based upon tree using encryption technique
WO2020011200A1 (en) Cross-domain data fusion method and system, and storage medium
CN107004084A (en) Multiplicative masking for cryptographic operation
CN113761563B (en) Data intersection calculation method and device and electronic equipment
Fan et al. PPMCK: Privacy-preserving multi-party computing for K-means clustering
CN112905187A (en) Compiling method, compiling device, electronic equipment and storage medium
CN116484415A (en) Privacy decision tree reasoning method based on isomorphic encryption
Jang et al. Parallel quantum addition for Korean block ciphers
CN111859440B (en) Sample classification method of distributed privacy protection logistic regression model based on mixed protocol
El Mestari et al. Preserving data privacy in machine learning systems
Malik et al. A homomorphic approach for security and privacy preservation of Smart Airports
CN113055153B (en) Data encryption method, system and medium based on fully homomorphic encryption algorithm
JP5668549B2 (en) Confidential analysis processing method, program, and apparatus
CN117521102A (en) Model training method and device based on federal learning
Tillem et al. Privacy-preserving alpha algorithm for software analysis

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20201027

Assignee: Beijing Intellectual Property Management Co.,Ltd.

Assignor: Beijing Ruili Wisdom Technology Co.,Ltd.

Contract record no.: X2023110000073

Denomination of invention: Analysis Method and Device of a Data Privacy Protection Protocol

Granted publication date: 20210817

License type: Common License

Record date: 20230531