DATA PROCESSING METHOD, DEVICE AND SYSTEM FOR MACHINE LEARNING MODEL
Technical Field
The present specification relates to the field of computer technologies, and in particular, to a data processing method, device and system for a machine learning model.
Background Art
With the development of computer technologies, machine learning models are used increasingly, for example for risk prediction and evaluation. When a machine learning model is to be used, data needs to be collected for model training or model use. Usually, a data holder and a model holder may be different users. A data set owned by the data holder may contain sensitive information that cannot be revealed and cannot be used by the model holder. Likewise, the model parameters owned by the model holder cannot be directly used by the data holder. How to ensure the security and privacy of the data interaction during the training and use of the machine learning model is a technical problem urgently to be solved in the field.
Summary of the Disclosure
An objective of the present disclosure is to provide a data processing method, device and system for a machine learning model, so as to achieve the security and privacy of the data interaction of the machine learning model.
In one aspect, an embodiment of the present disclosure provides a data processing method for a machine learning model, comprising:
obtaining data information of a second terminal by adopting an oblivious transfer protocol or a homomorphic encryption algorithm;
inputting the data information into a tree model, calling a function for realizing comparison functionality by each level of the tree model, and obtaining a Hash value corresponding to each node of the tree model according to the data information, wherein the function for realizing comparison functionality generates two Hash vectors for Hash computation based on the inputted data information; and
generating node attribute value encryption information according to a Hash value corresponding to a leaf node of the tree model, and transmitting the node attribute value encryption information to the second terminal, so that the second terminal generates a second prediction value share corresponding thereto according to the node attribute value encryption information.
In another aspect, the present disclosure provides a data processing method for a machine learning model, comprising:
transmitting data information to a first terminal by adopting an oblivious transfer protocol or a homomorphic encryption algorithm, so that a tree model in the first terminal calls a function for realizing comparison functionality and obtains a Hash value corresponding to each node of the tree model according to the data information, wherein the function for realizing comparison functionality generates two Hash vectors for Hash computation based on the data information;
receiving node attribute value encryption information generated by the first terminal according to a Hash value corresponding to a leaf node of the tree model; and
decrypting the node attribute value encryption information to obtain a second prediction value share.
In another aspect, the present disclosure provides a data processing device for a machine learning model, comprising:
a first data transmission module configured to obtain data information of a second terminal by adopting an oblivious transfer protocol or a homomorphic encryption algorithm;
a first model computation module configured to input the data information into a tree model, call a function for realizing comparison functionality by each level of the tree model, and obtain a Hash value corresponding to each node of the tree model according to the data information, wherein the function for realizing comparison functionality generates two Hash vectors for Hash computation based on the inputted data information; and
a first prediction module configured to generate node attribute value encryption information according to a Hash value corresponding to a leaf node of the tree model, and transmit the node attribute value encryption information to the second terminal, so that the second terminal generates a second prediction value share corresponding thereto according to the node attribute value encryption information.
In another aspect, the present disclosure provides a data processing apparatus for a machine learning model, comprising at least one processor and a memory configured to store instructions executable by the processor, wherein the processor implements a method corresponding to a first terminal in the embodiments of the present disclosure when executing the instructions.
In another aspect, the present disclosure provides a data processing device for a machine learning model, comprising:
a second data transmission module configured to transmit data information to a first terminal by adopting an oblivious transfer protocol or a homomorphic encryption algorithm, so that a tree model in the first terminal calls a function for realizing comparison functionality and obtains a Hash value corresponding to each node of the tree model according to the data information, wherein the function for realizing comparison functionality generates two Hash vectors for Hash computation based on the data information;
a data reception module configured to receive node attribute value encryption information generated by the first terminal according to a Hash value corresponding to a leaf node of the tree model; and
a second prediction module configured to decrypt the node attribute value encryption information to obtain a second prediction value share.
In yet another aspect, the present disclosure provides a data processing apparatus for a machine learning model, comprising at least one processor and a memory configured to store instructions executable by the processor, wherein the processor implements a method corresponding to a second terminal in the embodiments of the present disclosure when executing the instructions.
In still another aspect, the present disclosure provides a data processing system for a machine learning model, comprising: a model client, a data client, a data transmission module and a comparison functionality module, wherein the model client comprises one or more tree models, the data client comprises data information, and the data transmission module comprises an oblivious transfer protocol or a homomorphic encryption algorithm;
the model client and the data client perform a data transmission by calling the data transmission module, and call the comparison functionality module to perform data processing for each level of the tree model in the model client; the model client is configured to execute a method corresponding to a first terminal in the embodiment of the present disclosure, and the data client is configured to execute a method corresponding to a second terminal in the embodiment of the present disclosure.
The data processing method, the data processing device, the data processing apparatus and the data processing system for the machine learning model provided by the present disclosure can transmit data of a data holder to a model holder by adopting an oblivious transfer protocol or a homomorphic encryption algorithm, and call a function for realizing comparison functionality by each level of a tree model, while ignoring a path choice of the data in the tree model. The oblivious transfer protocol or the homomorphic encryption algorithm is combined with the function for realizing comparison functionality, so as to guarantee that the model holder and the data holder do not reveal any information to each other, except a model output result, thereby ensuring the security and privacy of the data interaction.
Brief Description of the Drawings
In order to more clearly describe the technical solutions in the embodiments of the present disclosure or the prior art, the drawings to be used in the description of the embodiments or the prior art are briefly introduced as follows. Obviously, the drawings in the following description just illustrate some embodiments of the present disclosure, and those skilled in the art can obtain other drawings from them without paying any creative effort.
Fig. 1 is a flow schematic diagram of a data processing method for a machine learning model in an embodiment of the present disclosure;
Fig. 2 is a flow schematic diagram of a data processing method for a machine learning model in another embodiment of the present disclosure;
Fig. 3 is a module structure schematic diagram of a data processing device for a machine learning model in an embodiment of the present disclosure;
Fig. 4 is a module structure schematic diagram of a data processing device for a machine learning model in another embodiment of the present disclosure;
Fig. 5 is a structure schematic diagram of a data processing device for a machine learning model in still another embodiment of the present disclosure;
Fig. 6 is a structure schematic diagram of a data processing system for a machine learning model in an embodiment of the present disclosure; and
Fig. 7 is a block diagram of a hardware structure of a data processing server for a machine learning model in an embodiment of the present disclosure.
Detailed Description of the Preferred Embodiments
In order that those skilled in the art better understand the technical solutions in the present disclosure, the technical solutions in the embodiments of the present disclosure will be clearly and completely described as follows with reference to the drawings in the embodiments of the present disclosure. Obviously, those described are just a part rather than all of the embodiments of the present disclosure. Based on the embodiments of the present disclosure, any other embodiment obtained by those skilled in the art without any creative effort should fall within the protection scope of the present disclosure.
With the continuous development of computer and Internet technologies, more and more application scenarios use machine learning models for data training and processing. Machine learning models usually can be classified into supervised machine learning models and unsupervised machine learning models, and can be used for risk predictions such as disease prediction, fraud prediction, etc. The machine learning model in the embodiments of the present disclosure can mainly include tree models such as a decision tree model, a random forest model, a gradient boosted decision tree model, and the like. The tree model can be understood as a tree-based learning algorithm with a structure similar to that of a tree and having many nodes. Each node of the tree model may include the following five attributes: ‘leaf’, ‘threshold’, ‘attribute’, ‘value’ and ‘rotate’, wherein ‘leaf’ may indicate whether the current node is a leaf node, ‘threshold’ may indicate a threshold of the current node, ‘attribute’ may indicate an attribute of the current node, ‘value’ may indicate an attribute value of the attribute of the current node, and ‘rotate’ may indicate whether the two children of the current node are exchanged.
Data collection is usually required during the training and use of a model. The data holder and the model holder are often different users, and neither of them wishes his own sensitive data or model parameters to be obtained by the other party. The embodiment of the present disclosure provides a data processing method for a machine learning model, which can transmit data of a data holder to a model holder by adopting an oblivious transfer protocol or a homomorphic encryption algorithm, so as to ensure the privacy of a data transmission. In addition, a function for realizing comparison functionality is called by each level of a tree model, while a path choice of the data in the tree model is ignored, so as to guarantee that the model holder and the data holder do not reveal any information to each other, except a model output result, thereby ensuring the security and privacy of the data interaction.
The data processing method for the machine learning model of the present disclosure can be applied to a client or a server. The client may be an electronic device such as a smart phone, a tablet computer, a smart wearable device (a smart watch, virtual reality glasses, a virtual reality helmet, etc.), a smart vehicle-mounted device, and the like.
In the embodiment of the present disclosure, a first terminal may represent a client or server owning a model, and a second terminal may represent a client or server owning data.
Specifically, Fig. 1 is a flow schematic diagram of a data processing method for a machine learning model in an embodiment of the present disclosure. As illustrated in Fig. 1, a data processing method for a machine learning model provided by one embodiment of the present disclosure may represent a data processing procedure of a model holder, specifically comprising:
Step 102: obtaining data information of a second terminal by adopting an oblivious transfer protocol or a homomorphic encryption algorithm.
The oblivious transfer protocol may represent a privacy-protecting two-party communication protocol, enabling both parties involved in a communication to transmit messages in a manner of choice fuzzification, so that the service receiver obliviously obtains some of the messages input by the service sender, and the privacy of the receiver is protected from being acquired by the sender. In some embodiments of the present disclosure, an oblivious transfer protocol module may be defined to realize the following functions by cryptographic technology:
receiving and storing data information x = (x[1], …, x[n]) transmitted by a terminal 1; and
receiving information (transmission, i) transmitted by a terminal 2, and returning c = x[i] to the terminal 2, wherein the terminals 1 and 2 cannot obtain any other information.
It can be seen that the terminal 1 is a data holder, and by adopting the oblivious transfer protocol, the terminal 2 can obtain the data information assigned by the terminal 1, while the terminal 2 does not know other data information in the terminal 1.
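By way of illustration only, the information flow of this module can be sketched as follows (Python; the class and method names are hypothetical, and a trusted functionality stands in for the actual cryptographic protocol):

```python
class ObliviousTransfer:
    """Ideal-functionality model of the oblivious transfer module: a
    trusted party stands in for the actual cryptographic protocol so
    that the information flow can be checked."""

    def __init__(self):
        self._x = None

    def store(self, x):
        # Terminal 1 (the data holder) deposits x = (x[1], ..., x[n]).
        self._x = list(x)

    def transfer(self, i):
        # Terminal 2 requests index i and receives c = x[i] only;
        # terminal 1 learns nothing about i, and terminal 2 learns
        # nothing about the remaining entries.
        return self._x[i]

ot = ObliviousTransfer()
ot.store([10, 20, 30, 40])
assert ot.transfer(2) == 30   # terminal 2 obtains exactly the assigned entry
```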
Homomorphic encryption is a cryptographic technology based on the computational complexity theory of mathematical difficulty. Data subjected to homomorphic encryption is processed to obtain an output, and when this output is decrypted, the result is the same as the output obtained by processing the unencrypted raw data in the same way. The homomorphic encryption algorithm can thus protect privacy from the data processor himself, i.e., the detailed personal information being processed cannot be viewed, and only the final result of the processing is seen.
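A minimal sketch of this additive homomorphic property follows (Python; this toy masking scheme is for illustration only and is not a secure homomorphic encryption algorithm):

```python
# Toy additively masked "encryption" used only to illustrate the
# homomorphic property: operating on ciphertexts and then decrypting
# gives the same result as operating on the raw data directly.
N = 2 ** 32

def enc(m, k):
    return (m + k) % N

def dec(c, k):
    return (c - k) % N

k1, k2 = 123456, 987654
c = (enc(7, k1) + enc(35, k2)) % N   # the processor adds ciphertexts only
assert dec(c, k1 + k2) == 7 + 35     # yet the decrypted result is the plain sum
```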
In the embodiment of the present disclosure, the data holder and the model holder can perform a data transmission by adopting the oblivious transfer protocol or the homomorphic encryption algorithm, so as to enable the model holder to obtain the data information of the data holder, and guarantee that other data information will not be obtained, thereby ensuring the privacy of the data transmission.
Step 104: inputting the data information into a tree model, calling a function for realizing comparison functionality by each level of the tree model, and obtaining a Hash value corresponding to each node of the tree model according to the data information, wherein the function for realizing comparison functionality generates two Hash vectors for Hash computation based on the inputted data information.
During the specific implementation, a function for realizing comparison functionality may be pre-defined in some embodiments of the present disclosure; it may be understood as a component capable of realizing certain functionality and can be designed based on cryptographic algorithms. In the embodiment of the present disclosure, the function for realizing comparison functionality can realize the following: based on the data input by the model holder and the data holder, two Hash vectors are generated and returned to the model holder and the data holder, respectively. The model holder and the data holder can perform a Hash computation by using the received data to complete the data processing in the tree model.
In some embodiments of the present disclosure, after obtaining the data information of the data holder, the model holder may input the obtained data information into the tree model. When processing the inputted data information, the tree model may call the function for realizing comparison functionality during the data processing by each level of the tree model, and compute a Hash value corresponding to each node of the tree model according to the Hash vector returned by the function for realizing comparison functionality. The computation data of one level can be used to compute the Hash values of the nodes in the next level, until the computation for the last level of the tree model is completed.
It can be seen that in the embodiment of the present disclosure, when the tree model processes the inputted data information, a pre-defined function for realizing comparison functionality may be called by each level of the tree model to perform data processing for each node of the tree model, and it is unnecessary to compare the inputted data information with each node, which ignores the path choice of the data in the tree model, and ensures the privacy of the parameters of the tree model and the data transmission.
Step 106: generating node attribute value encryption information according to a Hash value corresponding to a leaf node of the tree model, and transmitting the node attribute value encryption information to the second terminal, so that the second terminal generates a second prediction value share corresponding thereto according to the node attribute value encryption information.
During the implementation, the Hash values of respective nodes of the tree model can be obtained after the computation for each level of the tree model is completed, and node attribute value encryption information can be generated according to the Hash value corresponding to the node (i.e., the leaf node) in the last level of the tree model. The node attribute value encryption information is transmitted to the data holder, i.e., the second terminal, which can obtain a second prediction value share by decryption.
Based on the above embodiment, in some embodiments of the present disclosure, generating node attribute value encryption information according to a Hash value corresponding to a leaf node of the tree model may comprise:
randomly generating a first prediction value share, and obtaining an attribute value of the leaf node of the tree model;
computing XOR information of the attribute value of the leaf node with the first prediction value share; and
encrypting corresponding XOR information of the leaf node with the Hash value corresponding to the leaf node, to obtain the node attribute value encryption information.
During the specific implementation, after the computation for each level of the tree model is completed, it is possible to randomly generate the first prediction value share, obtain the attribute value of the leaf node of the tree model, and perform an XOR computation for the attribute value of the leaf node and the first prediction value share to obtain the XOR information corresponding to the leaf node. The computed XOR information corresponding to the leaf node is encrypted with the Hash value corresponding to the leaf node, to obtain the node attribute value encryption information corresponding to the leaf node.
In which, the leaf node can represent a node having no children in the tree model.
The embodiment of the present disclosure provides a data processing method for a machine learning model, which can transmit data of a data holder to a model holder by adopting an oblivious transfer protocol or a homomorphic encryption algorithm, and call a function for realizing comparison functionality in each level of a tree model, while ignoring a path choice of the data in the tree model. The oblivious transfer protocol or the homomorphic encryption algorithm is combined with the function for realizing comparison functionality, so as to guarantee that the model holder and the data holder do not reveal any information to each other, except a model output result, thereby ensuring the security and privacy of the data interaction.
Based on the above embodiment, in one embodiment of the present disclosure, calling a function for realizing comparison functionality by each level of the tree model, and obtaining a Hash value corresponding to each node of the tree model according to the data information comprises:
generating first comparison data according to the data information, and calling the function for realizing comparison functionality according to second comparison data generated by the second terminal to obtain a first Hash vector;
generating Hash values of the nodes of the tree model according to the first Hash vector, and encrypting corresponding attribute information of nodes with the Hash values of the nodes to obtain node encryption information of each node;
transmitting the node encryption information to the second terminal, so that the second terminal decrypts the node encryption information according to a second Hash vector returned by the function for realizing comparison functionality, to obtain media data; and
performing a computation for a next level of the tree model according to the first comparison data regenerated from the media data and the second comparison data updated by the second terminal, until a computation for a last level of the tree model is completed.
During the specific implementation, a function for realizing comparison functionality may be pre-defined in the embodiment of the present disclosure, and it may be understood as a component capable of realizing certain functionality that can be designed based on cryptographic algorithms. In some embodiments of the present disclosure, two functions for realizing comparison functionalities, F_cmp and F_cmp*, can be provided and defined as follows to realize corresponding functionalities:
Definition of F_cmp:
receiving comparison data 1: (t_A, x_A, r_A) transmitted by the terminal 1 and comparison data 2: (t_B, x_B, r_B) transmitted by the terminal 2;
computing a comparison bit b from the comparison data; and
randomly generating a vector (L_0, L_1) and transmitting it to the terminal 1, and then transmitting the vector (L_b, b) to the terminal 2.
In which, the computation of b involves a Boolean operation, specifically an XOR operation; b is valued as 0 or 1; when b = 0, L_b is L_0, and when b = 1, L_b is L_1.
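This functionality may be sketched as follows (Python; the exact formula for the bit b is not reproduced above, so the comparison rule b = [x < t] ⊕ r, with x, t and r recombined from the two parties' inputs by XOR, is an assumption made only for illustration):

```python
import secrets

def f_cmp(t_a, x_a, r_a, t_b, x_b, r_b):
    """Ideal-functionality sketch of F_cmp.

    Assumption: the two parties' inputs recombine by XOR and the bit is
    b = [x < t] XOR r; the original comparison formula is not shown above."""
    x, t, r = x_a ^ x_b, t_a ^ t_b, r_a ^ r_b
    b = int(x < t) ^ r
    l0, l1 = secrets.token_bytes(16), secrets.token_bytes(16)
    # Terminal 1 receives (L_0, L_1); terminal 2 receives (L_b, b), so it
    # learns b but only one of the two labels.
    return (l0, l1), ((l0, l1)[b], b)

t1_out, (lb, b) = f_cmp(t_a=9, x_a=5, r_a=0, t_b=0, x_b=0, r_b=0)
assert b == 1                 # 5 < 9, with zero rotation shares
assert lb == t1_out[b]        # terminal 2 holds exactly one of the two labels
```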
Some other embodiments of the present disclosure further provide a definition of the comparison functionality F_cmp* to realize corresponding functionality:
receiving comparison data 1: (t_A, x_A, r_A) transmitted by the terminal 1 and comparison data 2: (t_B, x_B, r_B) transmitted by the terminal 2;
computing a comparison bit b from the comparison data; and
randomly generating a vector (L_0, L_1) and transmitting it to the terminal 1, and then transmitting the vector (L_b, b) to the terminal 2.
In which, the computation of b involves a Boolean operation, specifically an XOR operation; b is valued as 0 or 1; when b = 0, L_b is L_0, and when b = 1, L_b is L_1.
After the functions for realizing comparison functionalities are defined, the model holder and the data holder may generate the first comparison data, i.e., (t_A, x_A, r_A), and the second comparison data, i.e., (t_B, x_B, r_B), respectively, and may call the functions for realizing comparison functionalities to obtain the first Hash vector, i.e., (L_0, L_1), and the second Hash vector, i.e., (L_b, b), respectively. In which, the first comparison data of the model holder may be generated according to the attributes of respective nodes in the model and the obtained data information of the data holder; t_A may represent a threshold of a node, i.e., a node attribute ‘threshold’; x_A may represent the data information of the data holder obtained through the oblivious transfer protocol, or an attribute value of a node, i.e., a node attribute ‘value’; r_A may represent whether two children of a node are exchanged, i.e., a node attribute ‘rotate’. During the computation for the first level of the tree model, the second comparison data of the data holder, i.e., the second terminal, can be randomly generated, and the second comparison data for the computation for the second level of the tree model can be generated by using a result of the computation for the first level of the tree model, and so on in a similar fashion.
After F_cmp and F_cmp* are called, the model holder, i.e., the first terminal, can obtain the first Hash vector, and the second terminal can obtain the second Hash vector. The model holder may use the first Hash vector to compute Hash values corresponding to the respective nodes, encrypt the corresponding attribute information of the nodes according to the computed Hash values to obtain node encryption information of the respective nodes, and transmit the node encryption information to the second terminal. When the attribute information of a node is to be encrypted during the computation for one level of the tree model, the encryption may be carried out after performing a Boolean operation or an arithmetic operation on the attribute information of the node and a randomly generated value. In the embodiment of the present disclosure, three pieces of attribute information of the node, i.e., ‘threshold’, ‘attribute’ and ‘rotate’, may be processed and then encrypted. Of course, other attribute information of the node may also be encrypted based on the actual need. The second terminal may decrypt the received node encryption information with the second Hash vector to obtain the media data. The first comparison data and the second comparison data for the computation for the next level of the tree model may be generated based on the media data, and then the computation for the next level of the tree model is performed, until the computation for the last level of the tree model is completed.
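The level-by-level key derivation described above can be sketched as follows (Python; the label values, depth and path bits are hypothetical). The model holder derives keys for all nodes, while a party holding only (L_b, b) per level can derive the key of exactly one root-to-leaf path:

```python
import hashlib, secrets

def h(key: bytes, label: bytes) -> bytes:
    # K_{2i+c} = H(K_i, L_c): a node key chains the parent key with
    # one of the two per-level labels.
    return hashlib.sha256(key + label).digest()

depth = 3
labels = [(secrets.token_bytes(16), secrets.token_bytes(16))
          for _ in range(depth)]          # a hypothetical (L_0, L_1) per level

# Model holder: derives the keys of every node from the root key K_1.
k_model = {1: b"\x00" * 32}
for level in range(depth):
    l0, l1 = labels[level]
    for i in range(2 ** level, 2 ** (level + 1)):
        k_model[2 * i] = h(k_model[i], l0)
        k_model[2 * i + 1] = h(k_model[i], l1)

# Data holder: holds only (L_b, b) per level, so it can derive the key
# of exactly one root-to-leaf path while updating the pointer p = 2p + b.
path_bits = [1, 0, 1]                     # hypothetical comparison results
p, k_path = 1, b"\x00" * 32
for level, b in enumerate(path_bits):
    k_path = h(k_path, labels[level][b])
    p = 2 * p + b

assert k_path == k_model[p]               # the keys agree on the reached node
```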
In some embodiments of the present disclosure, if the model holder and the data holder adopt the oblivious transfer protocol, a symmetric encryption algorithm and the corresponding symmetric decryption algorithm may be used for the encryption and decryption of the data information of the data holder during the computation for each level of the tree model. In this single-key cryptosystem, the same key is used for both encryption and decryption of the information; such an encryption method is called symmetric encryption, also known as single-key encryption. If the model holder and the data holder adopt the homomorphic encryption algorithm, homomorphic encryption and homomorphic decryption may be used for the encryption and decryption of the data information of the data holder during the computation for each level of the tree model.
The embodiment of the present disclosure provides a data processing method for a machine learning model, calls a pre-defined function for realizing comparison functionality by each level of the tree model, while ignoring the path choice of the data in the tree model, so as to guarantee that the model holder and the data holder do not reveal any information to each other, except a model output result, thereby ensuring the security and privacy of the data interaction.
Based on the above embodiment, the data transfer protocol for the model holder and the data holder in the embodiment of the present disclosure, i.e., the data processing procedure of the decision tree model, is specifically introduced as follows:
1. For each i ∈ [1, 2^l), the first terminal, i.e., the model holder, randomly configures a value of T[i].rotate, and if T[i].rotate = 1, rotates the two children of the node; wherein i may represent a node number, l may represent a level number of the tree model, and T[i].rotate may represent the rotate attribute of the i-th node in the tree model, i.e., whether the children of the i-th node need to be exchanged.
2. The first terminal performs a parameter initialization, and the following parameters may be set specifically: [t]_A = T[1].threshold, [a]_A = T[1].attribute, [r]_A = T[1].rotate, [x]_A = 0, and K_1 = 0; wherein T[1].threshold may represent a threshold of the first node, T[1].attribute may represent an attribute of the first node, and T[1].rotate may represent whether the children of the first node are exchanged. [t]_A, [r]_A and [x]_A may represent the data used when calling the defined function for realizing comparison functionality, and K_1 may represent an initial Hash parameter value.
3. The second terminal, i.e., the data holder, performs a parameter initialization, and the following parameters may be set specifically: [t]_B = 0, [a]_B = 0, [r]_B = 0, K_1 = 0, and p = 1; wherein [t]_B and [r]_B may represent the data used when calling the defined function for realizing comparison functionality, K_1 may represent an initial Hash parameter value, and p may represent a pointer parameter.
4. For each level of the tree model, the following operations are performed, wherein l∈ [0, h] , and h represents a total number of the levels of the tree model:
(a) The second terminal randomly generates a parameter [x]_B and adds it into the data information x held by the second terminal, and performs a leftward cyclic shift by [a]_B on the data information added with [x]_B, i.e., moves the data information added with [x]_B by [a]_B positions leftward and cyclically, to obtain encrypted data information x′. The oblivious transfer protocol function is called, wherein the first terminal provides the parameter [a]_A, the second terminal provides the parameter x′, and the oblivious transfer protocol function may return a result of computation to the first terminal, the returned value being denoted as [x]_A.
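Step (a) can be sketched as follows (Python; it is an assumption here that [x]_B is added to every data entry and that the attribute index recombines additively as [a]_A + [a]_B):

```python
import secrets

def select_share(x, a_share_a, a_share_b):
    """Sketch of step (a): the second terminal masks its data with a random
    share [x]_B, cyclically shifts it leftward by [a]_B, and the first
    terminal obliviously retrieves position [a]_A of the shifted vector."""
    n = len(x)
    x_b = secrets.randbelow(2 ** 16)                    # random share [x]_B
    masked = [v + x_b for v in x]                       # add [x]_B to the data
    shifted = [masked[(i + a_share_b) % n] for i in range(n)]  # left cyclic shift
    x_a = shifted[a_share_a % n]   # obtained via the oblivious transfer function
    return x_a, x_b

x = [11, 22, 33, 44]
a_a, a_b = 3, 2        # index shares: the selected attribute is (3 + 2) mod 4 = 1
x_a, x_b = select_share(x, a_a, a_b)
# The two terminals now hold additive shares of x[1] = 22.
assert x_a - x_b == x[(a_a + a_b) % 4]
```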
(b) The function for realizing comparison functionality F_cmp is called, wherein the first terminal provides the first comparison data ([t]_A, [x]_A, [r]_A), and the second terminal provides the second comparison data ([t]_B, [x]_B, [r]_B). For the definition of F_cmp, please refer to the content of the foregoing embodiments, which is omitted herein. As can be seen from the definition in the above embodiment, the first terminal is equivalent to the terminal 1, the second terminal is equivalent to the terminal 2, and by calling F_cmp, the first Hash vector (L_0, L_1) may be returned to the first terminal, and the second Hash vector (L_b, b) may be returned to the second terminal.
(c) The first terminal randomly generates [t]_A, [a]_A and [r]_A, and for the nodes of each i ∈ [2^l, 2^(l+1)), computes Hash values K_(2i+c) = H(K_i, L_c) of the respective nodes according to the received first Hash vector (L_0, L_1), wherein H represents a Hash computation, and c ∈ {0, 1};
the node encryption information of the respective nodes is computed by using the Hash values of those nodes:
e_(2i+c) = Enc(K_(2i+c); (T[2i+c].threshold − [t]_A, T[2i+c].attribute − [a]_A, T[2i+c].rotate − [r]_A))
in the above equation, Enc may represent an encryption operation, wherein K_(2i+c) may represent a key, and the tuple (T[2i+c].threshold − [t]_A, T[2i+c].attribute − [a]_A, T[2i+c].rotate − [r]_A) is encrypted with K_(2i+c) to obtain the node encryption information;
and {e_i} is transmitted to the second terminal, wherein i ∈ [2^(l+1), 2^(l+2)).
(d) The second terminal updates the pointer parameter p = 2p + b, and computes, according to the second Hash vector:
K_p = H(K_(p/2), L_b)
([t]_B, [a]_B, [r]_B) = Dec(K_p; e_p)
wherein H may represent a Hash computation, Dec may represent a decryption computation, and e_p is decrypted with the key K_p, wherein [t]_A, [t]_B, [a]_A, [a]_B, [r]_A, and [r]_B satisfy the following condition:
[t]_A + [t]_B = T[p].threshold, [a]_A + [a]_B = T[p].attribute, and [r]_A + [r]_B = T[p].rotate.
5. The first terminal randomly generates a first prediction value share [R]_A, and searches for a node j for each i ∈ [2^h, 2^(h+1)), wherein the node j satisfies the condition: T[j].leaf = true, and the node j is a parent of the node i, i.e., to obtain the leaf node of the tree model. The node attribute value encryption information v_i is computed according to the first prediction value share [R]_A and the attribute value of the node j:
v_i = Enc(K_i; T[j].value ⊕ [R]_A)
wherein Enc may represent an encryption operation, T[j].value ⊕ [R]_A is encrypted with a key K_i, and the computed v_i is transmitted to the second terminal.
It should be noted that in the embodiment of the present disclosure, during the data processing, the tree model may be padded into a complete binary tree, i.e., some nodes may be dummy nodes. In the embodiment of the present disclosure, the leaf node of the tree model refers to an actual leaf node of the tree model, excluding the padded dummy nodes.
6. The second terminal performs a decryption operation on the received v_i to obtain a second prediction value share [R]_B = Dec(K_p; v_p), wherein Dec represents a decryption operation, and the key K_p is used for the decryption operation on v_p, wherein p represents the pointer parameter in step 4.
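Steps 5 and 6 can be sketched as follows, assuming the same illustrative XOR-pad cipher keyed by the leaf's Hash value (here K_leaf stands in for K_i, which equals K_p on the taken path; all concrete values are hypothetical):

```python
import hashlib
import secrets

def otp(key: bytes, msg: bytes) -> bytes:
    # illustrative XOR-pad cipher keyed by the leaf's Hash value
    pad = hashlib.sha256(key + b"leaf").digest()[: len(msg)]
    return bytes(a ^ b for a, b in zip(msg, pad))

leaf_value = 9                      # T[j].value at the reached leaf (hypothetical)
R_A = secrets.randbits(32)          # first prediction value share [R]_A
K_leaf = secrets.token_bytes(32)    # stands in for K_i (= K_p on the taken path)

# model holder: v = Enc(K_i; T[j].value XOR [R]_A)
v = otp(K_leaf, (leaf_value ^ R_A).to_bytes(4, "big"))

# data holder: [R]_B = Dec(K_p; v); XOR-pad enc and dec are the same operation
R_B = int.from_bytes(otp(K_leaf, v), "big")
assert R_A ^ R_B == leaf_value      # the two shares jointly encode the leaf value
```

Neither share alone reveals T[j].value, which is exactly the property the split prediction is meant to provide.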
Some other embodiments of the present disclosure may further provide a way to perform data processing by adopting a homomorphic encryption algorithm, specifically comprising:
1. For each i ∈ [1, 2^l), the first terminal, i.e., the model holder, randomly configures a value of T[i].rotate, and if T[i].rotate = 1, rotates the two children of the node; wherein i may represent a node number, l may represent a level number of the tree model, and T[i].rotate may represent the rotate attribute of the i-th node in the tree model, i.e., whether the children of the i-th node need to be exchanged.
2. The second terminal encrypts the held data information by adopting the homomorphic encryption algorithm, i.e., computes X[i] = Enc(x[i]), wherein x[i] may represent the i-th data in the data information held by the second terminal, and Enc may represent a homomorphic encryption, i.e., X[i] may represent the ciphertext information of x[i]. The encrypted data information X[i] is transmitted to the first terminal.
3. For any i ∈ T, T may represent the nodes of the tree model, and i ∈ T may represent each node of the tree model; the first terminal randomly defines the value of the parameter T[i].value, and computes the following information:

Add(X[T[i].attribute], T[i].value)

wherein X[i] = Enc(x[i]) may represent a homomorphic encryption, X[T[i].attribute] may represent the ciphertext of x[T[i].attribute], Add(X[T[i].attribute], T[i].value) may represent the ciphertext information of x[T[i].attribute] + T[i].value, T[i].attribute may represent the attribute of the i-th node, and T[i].value may represent the attribute value of the i-th node.

The first terminal transmits the computed Add(X[T[i].attribute], T[i].value) to the second terminal, and the second terminal decrypts the received data to obtain T', wherein T'[i].value + T[i].value = x[T[i].attribute].
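The Add operation in step 3 requires an additively homomorphic scheme. The disclosure does not fix one, but Paillier encryption is a common instantiation; the sketch below uses tiny fixed primes and no hardening, purely for illustration:

```python
import math
import random

def keygen(p=999983, q=1000003):
    # tiny fixed primes for illustration only; real use needs large random
    # primes and a vetted cryptographic library
    n = p * q
    g = n + 1
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    u = pow(g, lam, n * n)
    mu = pow((u - 1) // n, -1, n)   # inverse of L(g^lam mod n^2), L(u) = (u-1)//n
    return (n, g), (lam, mu)

def enc(pk, m):
    n, g = pk
    r = random.randrange(2, n)      # random blinding factor coprime to n
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def dec(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    u = pow(c, lam, n * n)
    return ((u - 1) // n * mu) % n

def add_const(pk, c, v):
    # Add(X, v): multiplying by g^v adds the plaintext constant v
    n, g = pk
    return (c * pow(g, v, n * n)) % (n * n)

pk, sk = keygen()
c = enc(pk, 123)
assert dec(pk, sk, c) == 123
assert dec(pk, sk, add_const(pk, c, 77)) == 200
```

Here dec(pk, sk, add_const(pk, enc(pk, m), v)) yields m + v without the ciphertext ever being opened, matching the relation Add(X[T[i].attribute], T[i].value) = Enc(x[T[i].attribute] + T[i].value) used above.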
4. The first terminal performs a parameter initialization, and the following parameters may be set specifically: [t]_A = T[1].threshold, <x>_A = T[1].value, [r]_A = T[1].rotate, and K_1 = 0; wherein T[1].threshold may represent the threshold of the first node, T[1].value may represent the attribute value of the first node, and T[1].rotate may represent whether the children of the first node are exchanged. [t]_A, [r]_A and <x>_A may represent the data used when calling the defined function for realizing comparison functionality, and K_1 may represent an initial Hash parameter value.
5. The second terminal, i.e., the data holder, performs a parameter initialization, and the following parameters may be set specifically: [t]_B = 0, <x>_B = T'[1].value, [r]_B = 0, K_1 = 0, and p = 1; wherein [t]_B, <x>_B and [r]_B may represent the data used when calling the defined function for realizing comparison functionality, K_1 may represent an initial Hash parameter value, and p may represent a pointer parameter.
6. For each level of the tree model, the following operations are performed, wherein l∈ [0, h] , and h represents a total number of the levels of the tree model:
(a) calling a function for realizing comparison functionality F_cmp*, wherein the first terminal provides the first comparison data ([t]_A, <x>_A, [r]_A), and the second terminal provides the second comparison data ([t]_B, <x>_B, [r]_B). For the definition of F_cmp*, please refer to the content of the foregoing embodiments, and it is omitted herein. As can be seen from the definition in the above embodiment, the first terminal is equivalent to the terminal 1, and the second terminal is equivalent to the terminal 2; by calling F_cmp*, the first Hash vector (L_0, L_1) may be returned to the first terminal, and the second Hash vector (L_b, b) may be returned to the second terminal.
(b) The first terminal randomly generates [t]_A, <x>_A and [r]_A, and for each node i ∈ [2^l, 2^(l+1)), computes the Hash value K_{2i+c} = H(K_i, L_c) of the respective node according to the received first Hash vector (L_0, L_1), wherein H represents a Hash computation, and c ∈ {0, 1};
the node encryption information of the respective nodes is computed by using the Hash values of those nodes:

e_{2i+c} = Enc(K_{2i+c}; T[2i+c].threshold - [t]_A, T[T[2i+c].attribute] - <x>_A, T[2i+c].rotate ⊕ [r]_A)

In the above equation, Enc may represent an encryption operation, K_{2i+c} may represent a key, and the node information, including T[T[2i+c].attribute] - <x>_A, is encrypted with K_{2i+c} to obtain the node encryption information e_{2i+c}; and {e_i} is transmitted to the second terminal, wherein i ∈ [2^(l+1), 2^(l+2)).
(c) The second terminal updates the pointer parameter p = 2p + b, and computes, according to the second Hash vector:

K_p = H(K_{p/2}, L_b)

([t]_B, <x>_B, [r]_B) = Dec(K_p; e_p)

wherein H may represent a Hash computation, Dec may represent a decryption computation, and e_p is decrypted with the key K_p, wherein [t]_A, [t]_B, <x>_A and <x>_B satisfy the following condition:
(d) The second terminal updates <x>_B as <x>_B = <x>_B + T'[T[p].attribute].value, wherein <x>_A + <x>_B = x[T[p].attribute].
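The bookkeeping in steps (c) and (d) keeps the two arithmetic shares summing to the attribute datum actually compared at node p. A minimal numeric sketch, with all concrete values hypothetical (in the protocol neither party ever sees x_attr in the clear):

```python
x_attr = 100             # x[T[p].attribute], the data holder's datum
mask = 37                # random T[p].value chosen by the model holder in step 3
t_prime = x_attr - mask  # T'[p].value decrypted by the data holder in step 3,
                         # so that T'[p].value + T[p].value = x[T[p].attribute]

x_A = mask               # model holder's share <x>_A
x_B = t_prime            # data holder's share <x>_B after the step (d) update

assert x_A + x_B == x_attr   # invariant: <x>_A + <x>_B = x[T[p].attribute]
```

Each share on its own is a uniformly masked value, so the invariant holds without either terminal learning the other's contribution.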
7. The first terminal randomly generates a first prediction value share [R]_A, and searches for a node j for each i ∈ [2^h, 2^(h+1)), wherein the node j satisfies the condition T[j].leaf = true, and the node j is a parent of the node i, i.e., the leaf node of the tree model is obtained. The node attribute value encryption information v_i is computed according to the first prediction value share [R]_A and the attribute value of the node j:

v_i = Enc(K_i; T[j].value ⊕ [R]_A)

wherein Enc may represent an encryption operation, and the XOR information of the attribute value of the node j with [R]_A is encrypted with the key K_i; and the computed v_i is transmitted to the second terminal.
8. The second terminal performs a decryption operation on the received v_i to obtain a second prediction value share [R]_B = Dec(K_p; v_p), wherein Dec represents a decryption operation, and the key K_p is used for the decryption operation on v_p, wherein p represents the pointer parameter in step 5.
It should be noted that in the above embodiment, [·] may represent a Boolean operation, and <·> may represent an arithmetic operation. For example, given a Boolean string x ∈ {0, 1}^n with a length of n, the operation [x] may represent a selection of a uniform random string t ∈ {0, 1}^n, and then x ⊕ t is computed; [x]_A = t and [x]_B = x ⊕ t, wherein [x]_A and [x]_B are the numerical shares of the first terminal and the second terminal, respectively. Similarly, if one x ∈ F is given, F may represent a set, <x> may represent a selection of a uniform random string t ∈ F, and then x - t ∈ F is computed; <x>_A = t and <x>_B = x - t, wherein <x>_A and <x>_B are the numerical shares of the first terminal and the second terminal, respectively.
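The two sharing modes can be sketched directly; the bit width and modulus below are illustrative parameters, not values fixed by the disclosure:

```python
import secrets

def bool_share(x: int, nbits: int):
    # Boolean ([.]) sharing: pick a uniform t, shares are t and x XOR t
    t = secrets.randbits(nbits)
    return t, x ^ t

def arith_share(x: int, modulus: int):
    # arithmetic (<.>) sharing over Z_modulus: pick a uniform t, shares are t and x - t
    t = secrets.randbelow(modulus)
    return t, (x - t) % modulus

xa, xb = bool_share(0b1011, 4)
assert xa ^ xb == 0b1011            # XOR recombines Boolean shares
ya, yb = arith_share(42, 2**16)
assert (ya + yb) % 2**16 == 42      # modular addition recombines arithmetic shares
```

In both modes a single share is uniformly distributed, so neither terminal learns x from its own share alone.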
In the embodiment of the present disclosure, the prediction values predicted with the tree model may be divided into two parts, i.e., the first prediction value share [R]_A and the second prediction value share [R]_B, which are stored in the first terminal and the second terminal, respectively. The above embodiment may illustrate the data processing procedure with one tree model of the model holder, whereby one first prediction value share [R]_A and one piece of node attribute value encryption information v_i may be obtained; the second terminal may obtain one second prediction value share [R]_B according to the [R]_A and v_i generated by the first terminal. If the first terminal has a plurality of tree models, a plurality of [R]_A and a plurality of v_i may be obtained, thereby further obtaining a plurality of second prediction value shares [R]_B, and the second terminal may linearly combine the plurality of second prediction value shares [R]_B to obtain the final model prediction value.
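Under one plausible reading of the share convention above (XOR recombination per tree, then linear superposition across trees), the ensemble combination looks as follows; all share values are hypothetical:

```python
# hypothetical shares for two trees; tree k's leaf value is R_A[k] XOR R_B[k]
R_A = [0b0101, 0b0011]     # first prediction value shares (model holder)
R_B = [0b0110, 0b0111]     # second prediction value shares (data holder)

per_tree = [a ^ b for a, b in zip(R_A, R_B)]   # recombine each tree's prediction
assert per_tree == [3, 4]

ensemble = sum(per_tree)   # linear superposition across the trees
assert ensemble == 7
```

The per-tree recombination requires both shares, so the final model value emerges only when the two terminals' contributions are brought together.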
The embodiment of the present disclosure provides a data processing method for a machine learning model, which combines an oblivious transfer protocol or a homomorphic encryption algorithm with a function for realizing comparison functionality, and transmits the data of a data holder to a model holder by adopting the oblivious transfer protocol or the homomorphic encryption algorithm, so that the model holder can only obtain the data assigned by the data holder. The function for realizing comparison functionality is called by each level of the tree model, while a path choice of the data in the tree model is ignored, so as to guarantee that the model holder and the data holder do not reveal any information to each other, except a model output result, thereby ensuring the security and privacy of the data interaction.
Based on the above embodiment, one embodiment of the present disclosure further provides a data processing method for a machine learning model, which is performed based on a data holder. Fig. 2 is a flow schematic diagram of a data processing method for a machine learning model in another embodiment of the present disclosure. As illustrated in Fig. 2, the data processing method for the machine learning model provided by the embodiment of the present disclosure may comprise:
Step 202: transmitting data information to a first terminal by adopting an oblivious transfer protocol or a homomorphic encryption algorithm, so that a tree model in the first terminal calls a function for realizing comparison functionality, and obtaining a Hash value corresponding to each node of the tree model according to the data information, wherein the function for realizing comparison functionality generates two Hash vectors for Hash computation based on the inputted data information.
Please refer to the introduction in the foregoing embodiments for the specific meanings of the oblivious transfer protocol and the homomorphic encryption algorithm, which are omitted herein. The data holder and the model holder transmit the data information of the data holder to the model holder by adopting the oblivious transfer protocol or the homomorphic encryption algorithm. The model holder performs data processing for the nodes of each level of the tree model according to the received data, and during the data processing for each level, calls the function for realizing comparison functionality to compute the Hash values corresponding to the respective nodes. For the definition of the function for realizing comparison functionality, please refer to the content of the foregoing embodiments, and it is omitted herein.
Step 204: receiving node attribute value encryption information generated by the first terminal according to a Hash value corresponding to a leaf node of the tree model.
During the specific implementation, the first terminal, i.e., the model holder, performs data processing for each level of the tree model according to the received data information, until the computation for the last level of the tree model is completed, and the model holder randomly generates the first prediction value share. The Hash value corresponding to the leaf node in the last level of the tree model is obtained, and the node attribute value encryption information is computed according to the Hash value corresponding to the leaf node, the attribute value of the leaf node and the first prediction value share. The specific computation mode of the node attribute encryption information may refer to the content of the foregoing embodiments, and it is omitted herein.
Step 206: decrypting the node attribute value encryption information to obtain a second prediction value share.
During the specific implementation, the first terminal may transmit the computed node attribute value encryption information to the second terminal, and the second terminal may decrypt the received node attribute value encryption information with the corresponding Hash value of the nodes to obtain the second prediction value share.
Based on the above embodiment, when the model holder has a plurality of tree models, one first prediction value share and one piece of node attribute value encryption information may be correspondingly generated according to each tree model. The second terminal, i.e., the data holder, may generate a plurality of second prediction value shares according to the first prediction value shares and the node attribute value encryption information generated according to the respective tree models, and the second terminal may linearly superpose the second prediction value shares to obtain the model prediction information.
The embodiment of the present disclosure provides a data processing method for a machine learning model, which combines an oblivious transfer protocol or a homomorphic encryption algorithm with a function for realizing comparison functionality, and transmits the data of a data holder to a model holder by adopting the oblivious transfer protocol or the homomorphic encryption algorithm, so that the model holder can only obtain the data assigned by the data holder. The function for realizing comparison functionality is called by each level of the tree model, while a path choice of the data in the tree model is ignored, so as to guarantee that the model holder and the data holder do not reveal any information to each other, except a model output result, thereby ensuring the security and privacy of the data interaction.
Based on the above embodiment, in one embodiment of the present disclosure, after transmitting the data information to the first terminal, the method may further comprise:
initialization-generating second comparison data, and calling the function for realizing comparison functionality according to first comparison data generated by the first terminal, to obtain a second Hash vector;
receiving node encryption information generated by the first terminal according to the first Hash vector, and decrypting the node encryption information according to the second Hash vector to obtain media data; and
performing a computation for a next level of the tree model according to the second comparison data regenerated from the media data and the first comparison data updated by the first terminal, until a computation for a last level of the tree model is completed.
During the specific implementation, before the data processing for the first level of the tree model, the second terminal may initialization-generate the second comparison data, and the first terminal may generate the first comparison data according to the obtained data information of the second terminal. The specific generation modes of the first comparison data and the second comparison data may refer to the content of the foregoing embodiments, and they are omitted herein. The first comparison data and the second comparison data may be used to call the function for realizing comparison functionality to obtain the first Hash vector and the second Hash vector, respectively. The first terminal may perform data processing for the node attributes of the tree model according to the first Hash vector, to obtain the node encryption information of each node and the Hash value corresponding to each node, wherein the data processing mode for the node attributes may refer to the content of the foregoing embodiments, and it is omitted herein. The second terminal may decrypt the node encryption information according to the obtained second Hash vector to obtain the media data. Based on the media data, the second terminal may update the second comparison data, and the first terminal may update the first comparison data. The updated first comparison data and second comparison data is used for the computation for a next level of the tree model, until the computation for a last level of the tree model is completed.
In the embodiment of the present disclosure, when the data is processed at the tree model, the pre-defined function for realizing comparison functionality is called by each level of the tree model to perform data processing for each node, while the path choice of the data in the tree model is ignored, so as to guarantee that the model holder and the data holder do not reveal any information to each other, except a model output result, thereby ensuring the security and privacy of the data interaction.
The above method embodiments of the present disclosure are all described in a progressive manner, and the same or similar portions of the embodiments can refer to each other. Each embodiment lays an emphasis on its distinctions from other embodiments. Refer to the descriptions of the method embodiment for the relevant portions.
Based on the data processing method for the machine learning model described above, one or more embodiments of the present disclosure further provide a data processing device for a machine learning model. The device may include a system (including a distributed system), software (an application), a module, a component, a client, etc. that uses the method in the embodiment of the present disclosure in combination with the necessary implementation hardware. Based on the same innovative conception, the device(s) provided by one or more embodiments of the present disclosure will be described in the following embodiments. Since the solution of the device to solve the problem is similar to that of the method, the implementation of the device in the embodiment of the present disclosure may refer to the implementation of the foregoing method, and the repeated content will be omitted. The term ‘unit’ or ‘module’ used below may implement a combination of software and/or hardware of predetermined functions. Although the devices described in the following embodiments are preferably implemented by software, an implementation by hardware or by a combination of software and hardware is also possible and conceivable.
Specifically, Fig. 3 is a module structure schematic diagram of a data processing device for a machine learning model provided by the present disclosure. As illustrated in Fig. 3, the data processing device for the machine learning model provided by the present disclosure is mainly applied to a model holder, and may comprise: a first data transmission module 31, a first model computation module 32 and a first prediction module 33, wherein
the first data transmission module 31 may be configured to obtain data information of a second terminal by adopting an oblivious transfer protocol or a homomorphic encryption algorithm;
the first model computation module 32 may be configured to input the data information into a tree model, call a function for realizing comparison functionality by each level of the tree model, and obtain a Hash value corresponding to each node of the tree model according to the data information, wherein the function for realizing comparison functionality generates two Hash vectors for Hash computation based on the inputted data information;
the first prediction module 33 may be configured to generate node attribute value encryption information according to a Hash value corresponding to a leaf node of the tree model, and transmit the node attribute value encryption information to the second terminal, so that the second terminal generates a second prediction value share corresponding thereto according to the node attribute value encryption information.
The embodiment of the present disclosure provides a data processing device for a machine learning model, which can transmit data of a data holder to a model holder by adopting an oblivious transfer protocol or a homomorphic encryption algorithm, and call a function for realizing comparison functionality in each level of a tree model, while ignoring a path choice of the data in the tree model. The oblivious transfer protocol or the homomorphic encryption algorithm is combined with the function for realizing comparison functionality, so as to guarantee that the model holder and the data holder do not reveal any information to each other, except a model output result, thereby ensuring the security and privacy of the data interaction.
Based on the above embodiment, the first model computation module is specifically configured to:
generate first comparison data according to the data information, and call the function for realizing comparison functionality according to second comparison data generated by the second terminal to obtain a first Hash vector;
generate Hash values of the nodes of the tree model according to the first Hash vector, and encrypt corresponding attribute information of the nodes with the Hash values of the nodes to obtain node encryption information of each node;
transmit the node encryption information to the second terminal, so that the second terminal decrypts the node encryption information according to a second Hash vector returned by the function for realizing comparison functionality, to obtain media data; and
perform a computation for a next level of the tree model according to the first comparison data regenerated from the media data and the second comparison data updated by the second terminal, until a computation for a last level of the tree model is completed.
In the embodiment of the present disclosure, the function for realizing comparison functionality is called by each level of the tree model, while a path choice of the data in the tree model is ignored, so as to guarantee that the model holder and the data holder do not reveal any information to each other, except a model output result, thereby ensuring the security and privacy of the data interaction.
Based on the above embodiment, the first model computation module encrypts the attribute information by adopting a symmetric encryption algorithm.
Based on the above embodiment, the first prediction module is specifically configured to:
randomly generate a first prediction value share, and obtain an attribute value of the leaf node of the tree model;
compute XOR information of the attribute value of the leaf node with the first prediction value share;
encrypt corresponding XOR information of the leaf node with the Hash value corresponding to the leaf node, to obtain the node attribute value encryption information.
Based on the combination of the oblivious transfer protocol or the homomorphic encryption and the function for realizing comparison functionality, the embodiment of the present disclosure performs data processing for each level of the tree model, while ignoring a path choice of the data in the tree model, so as to guarantee that the model holder and the data holder do not reveal any information to each other, except a model output result, thereby ensuring the security and privacy of the data interaction.
Based on the above embodiment, the first prediction module is further configured to:
if there are a plurality of tree models, generate and transmit a plurality of pieces of node attribute value encryption information to the second terminal, so that the second terminal obtains a plurality of second prediction value shares according to the plurality of pieces of node attribute value encryption information.
If there are a plurality of tree models, the embodiment of the present disclosure performs a linear superposition based on prediction results of the plurality of tree models to generate final prediction information, thereby improving the model output accuracy.
It should be noted that the device described above may include other embodiments according to the descriptions of the method embodiments. The specific implementations may refer to the descriptions of relevant method embodiments, and they are omitted herein.
The embodiment of the present disclosure further provides a data processing apparatus for a machine learning model, comprising at least one processor and a memory configured to store instructions executable by the processor, wherein the processor, when executing the instructions, implements the data processing method for the machine learning model of the above embodiment, e.g.:
obtaining data information of a second terminal by adopting an oblivious transfer protocol or a homomorphic encryption algorithm;
inputting the data information into a tree model, calling a function for realizing comparison functionality by each level of the tree model, and obtaining a Hash value corresponding to each node of the tree model according to the data information, wherein the function for realizing comparison functionality generates two Hash vectors for Hash computation based on the inputted data information; and
generating node attribute value encryption information according to a Hash value corresponding to a leaf node of the tree model, and transmitting the node attribute value encryption information to the second terminal, so that the second terminal generates a second prediction value share corresponding thereto according to the node attribute value encryption information.
The storage medium may include a physical device for storing information. Usually the information is digitized and then stored in an electrical, magnetic or optical medium. The storage medium may include: means for storing information by way of electrical energy, such as various memories, like RAM, ROM, etc.; means for storing information by way of magnetic energy, such as a hard disk, a floppy disk, a magnetic tape, a magnetic core memory, a magnetic bubble memory, and a U disk; and means for storing information optically, such as CD or DVD. Of course, there are also readable storage mediums of other ways, such as a quantum memory, a graphene memory, etc.
It should be noted that the above processing device may further include other embodiments according to the descriptions of the method embodiments. The specific implementations may refer to the descriptions of relevant method embodiments, and they are omitted herein.
Some embodiments of the present disclosure further provide a data processing device for a machine learning model, and apply it to a data holder. Fig. 4 is a module structure schematic diagram of a data processing device for a machine learning model in another embodiment of the present disclosure. As illustrated in Fig. 4, the data processing device for the machine learning model provided by the present disclosure may comprise: a second data transmission module 41, a data reception module 42, and a second prediction module 43, wherein
the second data transmission module 41 may be configured to transmit data information to a first terminal by adopting an oblivious transfer protocol or a homomorphic encryption algorithm, so that a tree model in the first terminal calls a function for realizing comparison functionality, and obtain a Hash value corresponding to each node of the tree model according to the data information, wherein the function for realizing comparison functionality generates two Hash vectors for Hash computation based on the inputted data information;
the data reception module 42 may be configured to receive node attribute value encryption information generated by the first terminal according to a Hash value corresponding to a leaf node of the tree model; and
the second prediction module 43 may be configured to decrypt the node attribute value encryption information to obtain a second prediction value share.
The embodiment of the present disclosure combines an oblivious transfer protocol or a homomorphic encryption algorithm with a function for realizing comparison functionality, and transmits the data of a data holder to a model holder by adopting the oblivious transfer protocol or the homomorphic encryption algorithm, so that the model holder can only obtain the data assigned by the data holder. The function for realizing comparison functionality is called by each level of the tree model, while a path choice of the data in the tree model is ignored, so as to guarantee that the model holder and the data holder do not reveal any information to each other, except a model output result, thereby ensuring the security and privacy of the data interaction.
Fig. 5 is a structure schematic diagram of a data processing device for a machine learning model in still another embodiment of the present disclosure. As illustrated in Fig. 5, based on the above embodiment, the device further comprises a second model computation module 51 configured to:
initialization-generate second comparison data after the data information is transmitted to the first terminal, and call the function for realizing comparison functionality according to first comparison data generated by the first terminal, to obtain a second Hash vector;
receive node encryption information generated by the first terminal according to the first Hash vector, and decrypt the node encryption information according to the second Hash vector to obtain media data; and
perform a computation for a next level of the tree model according to the second comparison data regenerated from the media data and the first comparison data updated by the first terminal, until a computation for a last level of the tree model is completed.
The embodiment of the present disclosure calls a pre-defined function for realizing comparison functionality in each level of a tree model, while ignoring a path choice of the data in the tree model, so as to guarantee that the model holder and the data holder do not reveal any information to each other, except a model output result, thereby ensuring the security and privacy of the data interaction.
Based on the above embodiment, the second prediction module is further configured to:
if the first terminal comprises a plurality of tree models, receive a plurality of pieces of node attribute value encryption information generated by the plurality of tree models;
generate a plurality of second prediction value shares according to the plurality of pieces of node attribute value encryption information; and
linearly superpose the plurality of second prediction value shares to obtain model prediction information.
If there are a plurality of tree models, the embodiment of the present disclosure performs a linear superposition based on prediction results of the plurality of tree models to generate final prediction information, thereby improving the model output accuracy.
It should be noted that the device described above may include other embodiments according to the descriptions of the method embodiments. The specific implementations may refer to the descriptions of relevant method embodiments, and they are omitted herein.
The embodiment of the present disclosure further provides a data processing apparatus for a machine learning model, comprising at least one processor and a memory configured to store instructions executable by the processor, wherein the processor, when executing the instructions, implements the data processing method for the machine learning model of the above embodiment, e.g.:
transmitting data information to a first terminal by adopting an oblivious transfer protocol or a homomorphic encryption algorithm, so that a tree model in the first terminal calls a function for realizing comparison functionality, and obtaining a Hash value corresponding to each node of the tree model according to the data information, wherein the function for realizing comparison functionality generates two Hash vectors for Hash computation based on the inputted data information;
receiving node attribute value encryption information generated by the first terminal according to a Hash value corresponding to a leaf node of the tree model;
decrypting the node attribute value encryption information to obtain a second prediction value share.
The storage medium may include a physical device for storing information. Usually, the information is digitized and then stored in an electrical, magnetic or optical medium. The storage medium may include: means for storing information by way of electrical energy, such as various memories like a RAM or a ROM; means for storing information by way of magnetic energy, such as a hard disk, a floppy disk, a magnetic tape, a magnetic core memory, a magnetic bubble memory or a USB flash drive; and means for storing information optically, such as a CD or a DVD. Of course, there are also readable storage media of other types, such as a quantum memory, a graphene memory, etc.
It should be noted that the above processing device may further include other embodiments according to the descriptions of the method embodiments. The specific implementations may refer to the descriptions of relevant method embodiments, and they are omitted herein.
Fig. 6 is a schematic structural diagram of a data processing system for a machine learning model in an embodiment of the present disclosure. As illustrated in Fig. 6, the data processing system for the machine learning model in the embodiment of the present disclosure may comprise a model client 61, a data client 62, a data transmission module 63 and a comparison functionality module 64, wherein the model client 61 comprises one or more tree models, the data client 62 comprises data information, and the data transmission module 63 comprises an oblivious transfer protocol or a homomorphic encryption algorithm;
the model client 61 and the data client 62 perform data transmission by calling the data transmission module 63, and call the comparison functionality module 64 to perform data processing for each level of the tree model in the model client 61, the comparison functionality module 64 being capable of realizing the functions of F_cmp and F_cmp* in the above embodiment. The model client 61 is configured to execute the method steps corresponding to the model holder in the above embodiment, and the data client 62 is configured to execute the method steps corresponding to the data holder in the above embodiment.
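The per-level processing that the comparison functionality module realizes can be sketched in plaintext as a toy function: at each level, the comparison of a feature value against a node threshold selects which of two candidate Hash values corresponds to the path actually taken. The secure two-party realization (via oblivious transfer or homomorphic encryption) is omitted, and the hash-chaining construction shown is an illustrative assumption, not the embodiment's concrete F_cmp.

```python
import hashlib

def node_hash(parent_hash: bytes, branch: int) -> bytes:
    """Chain a child node's Hash value from its parent's Hash value and the branch taken."""
    return hashlib.sha256(parent_hash + bytes([branch])).digest()

def toy_f_cmp(parent_hash: bytes, feature_value: float, threshold: float):
    """Return the two candidate Hash values for a level; the comparison
    outcome selects the one corresponding to the path actually taken."""
    left, right = node_hash(parent_hash, 0), node_hash(parent_hash, 1)
    taken = left if feature_value < threshold else right
    return (left, right), taken

# Walking two levels of a hypothetical tree:
root = b"root"
(_, _), h1 = toy_f_cmp(root, 0.3, 0.5)  # 0.3 < 0.5, so the left branch is taken
(_, _), h2 = toy_f_cmp(h1, 0.9, 0.5)    # 0.9 >= 0.5, so the right branch is taken
```

In the secure setting, neither party learns the comparison outcome in the clear; each party only obtains Hash material consistent with its own inputs, level by level, until a leaf Hash value is reached.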
The data processing system for the machine learning model provided by the present disclosure may be independent, or may be applied to various data processing and analysis systems. The system may include the data processing device of any of the above embodiments. The system may be a separate server, or may implement one or more of the methods or one or more of the devices in the embodiments of the present disclosure in a server cluster, a system (including a distributed system), software (an application), an actual operating device, a logic gate circuit device, a quantum computer, etc., in combination with the necessary implementation hardware. The system may include at least one processor and a memory for storing computer-executable instructions, and the processor implements the steps of the method in any one or more of the above embodiments when executing the instructions.
The method embodiments provided by the present disclosure may be executed in a mobile terminal, a computer terminal, a server or a similar computing device. Taking a server as an example, Fig. 7 is a block diagram of a hardware structure of a data processing server for a machine learning model in an embodiment of the present disclosure, and the server may execute a method corresponding to a data holder or a model holder. As illustrated in Fig. 7, a server 10 may comprise one or more processors 100 (only one is illustrated; including, but not limited to, a microprocessor (MCU), a programmable logic device (FPGA), etc.), a memory 200 configured to store data, and a transmission module 300 for a communication function. It will be appreciated by those skilled in the art that the structure illustrated in Fig. 7 is merely schematic and does not limit the structure of the electronic device. For example, the server 10 may comprise more or fewer components than those illustrated in Fig. 7, may further comprise other processing hardware such as a database, a multi-level cache or a GPU, or may have a configuration different from that illustrated in Fig. 7.
The memory 200 may be configured to store software programs and modules of application software, such as program instructions/modules corresponding to the data processing method for the machine learning model in the embodiment of the present disclosure; the processor 100 runs the software programs and modules stored in the memory 200 to execute various functional applications and data processing. The memory 200 may comprise a high-speed random access memory and may also comprise a non-volatile memory, such as one or more magnetic storage devices, a flash memory, or other non-volatile solid-state memory. In some examples, the memory 200 may further comprise memories disposed remotely relative to the processor 100, which may be connected to a computer terminal through a network. Examples of the network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The transmission module 300 is configured to receive or transmit data via a network. Specific examples of the network may include a wireless network provided by a communication provider of a computer terminal. In one example, the transmission module 300 comprises a Network Interface Controller (NIC), which can be connected to other network devices through a base station to communicate with the Internet. In one example, the transmission module 300 may be a Radio Frequency (RF) module configured to communicate wirelessly with the Internet.
The particular embodiments of the present disclosure are described as above, and other embodiments fall within the scope of the appended claims. In some cases, the actions or steps recited in the claims can be performed in an order different from that in the embodiments while still achieving the desired results. In addition, the processes depicted in the drawings do not necessarily require the illustrated particular order or consecutive order to achieve the desired results. In some embodiments, multitask processing and parallel processing are also possible or advantageous.
The methods or devices described in the above embodiments of the present disclosure may implement the service logic by a computer program recorded on a storage medium, which can be read and executed by a computer to achieve the effects of the solutions described in the embodiments of the present disclosure.
The data processing methods or devices provided in the embodiments of the present disclosure may be implemented by a processor executing corresponding program instructions in a computer, e.g., implemented on a PC side using the C++ language under a Linux or Windows operating system, implemented at a smart terminal using the programming languages of the Android or iOS system, or implemented based on the processing logic of a quantum computer, etc.
It should be noted that the device, the computer storage medium, and the system described above may further include other embodiments according to the descriptions of relevant method embodiments. For the specific implementations, please refer to the descriptions of corresponding method embodiments, which are omitted herein.
The embodiments of the present disclosure are all described in a progressive manner, and the same or similar portions of the embodiments can refer to each other. Each embodiment lays emphasis on its distinctions from the other embodiments. In particular, the hardware-plus-program embodiment is described simply since it is substantially similar to the method embodiment, and reference can be made to the descriptions of the method embodiment for the relevant portions.
The embodiments of the present disclosure are not limited to the industrial communication standards, the standard computer data processing and data storage rules, or the situations described in one or more embodiments of the present disclosure. An implementation scheme that follows some industrial standards, or that is self-defined or slightly amended based on the implementations described in the embodiments, can also achieve implementation effects that are the same as, equivalent or similar to, or expectably modified from, those of the above embodiments. The embodiments obtained by applying these amended or modified modes of data acquisition, storage, judgment, processing, etc. still fall within the scope of the optional implementation schemes of the embodiments of the present disclosure.
In the 1990s, an improvement to a technology could be clearly distinguished as an improvement to hardware (e.g., an improvement to a circuit structure such as a diode, a transistor or a switch) or an improvement to software (an improvement to a methodological flow). However, with the development of technologies, improvements to many methodological flows nowadays may be regarded as direct improvements to hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved methodological flow into a hardware circuit. Thus, an improvement to a methodological flow may also be implemented by a hardware entity module. For example, a Programmable Logic Device (PLD) (e.g., a Field Programmable Gate Array (FPGA)) is an integrated circuit whose logical functions are determined by the user's programming of the device. A designer 'integrates' a digital system onto a PLD by programming it himself, without asking a chip manufacturer to design and manufacture an Application Specific Integrated Circuit (ASIC) chip. Moreover, at present, instead of manually manufacturing integrated circuit chips, such programming is mostly implemented using 'logic compiler' software, which is similar to the software compiler used for program development, and the original code to be compiled must be written in a specific programming language referred to as a Hardware Description Language (HDL). There are many kinds of HDLs, such as the Advanced Boolean Expression Language (ABEL), the Altera Hardware Description Language (AHDL), Confluence, the Cornell University Programming Language (CUPL), HDCal, the Java Hardware Description Language (JHDL), Lava, Lola, MyHDL, PALASM and the Ruby Hardware Description Language (RHDL), among which the Very-High-Speed Integrated Circuit Hardware Description Language (VHDL) and Verilog are currently the most commonly used.
It should also be apparent to those skilled in the art that a hardware circuit implementing a logical methodological flow can be easily obtained merely by slightly logically programming the methodological flow into an integrated circuit using the above hardware description languages.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or a processor, a computer readable medium storing computer readable program codes (e.g., software or firmware) executable by the (micro)processor, a logic gate, a switch, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of the controller include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20 and Silicon Labs C8051F320; the controller of a memory can further be implemented as a part of the control logic of the memory. As known to those skilled in the art, in addition to implementing the controller merely with computer readable program codes, the controller can be made to realize the same functions in the form of a logic gate, a switch, an ASIC, a programmable logic controller, an embedded microcontroller, etc. by logically programming the method steps. Thus, such a controller may be deemed a hardware component, while the means included therein for realizing various functions may also be deemed structures within the hardware component. Alternatively, the means for realizing various functions may even be deemed both software modules for implementing a method and structures within the hardware component.
The systems, devices, modules or units elaborated in the above embodiments specifically may be implemented by a computer chip or an entity, or a product having a certain function. A typical implementation apparatus is a computer. Specifically, the computer may be, for example, a personal computer, a laptop computer, a vehicle-mounted human-machine interaction device, a cellular phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device or any combination of these devices.
Although one or more embodiments of the present disclosure provide the method operation steps described in the embodiments or flowcharts, more or fewer operation steps may be included based on conventional or non-inventive means. The step execution order listed in the embodiments is only one of various possible step execution orders and does not represent the unique step execution order. In a practical device or terminal product, the steps may be executed sequentially or in parallel according to the methods illustrated in the embodiments or the drawings (e.g., by parallel processors, or in a multi-thread processing environment or even a distributed data processing environment). The terms 'comprise' and 'include', and any other variants thereof, are intended to cover non-exclusive inclusions, so that a process, method, product or device comprising a series of elements comprises not only those elements but also other elements not explicitly listed, or further comprises elements inherent to such process, method, product or device. Where there is no further limitation, this does not exclude other identical or equivalent elements existing in the process, method, product or device comprising the elements. The terms 'first', 'second', etc. are used to denote names rather than any particular order.
For convenience of description, the above device is described with its functions divided into various functional modules. Of course, when implementing the present disclosure, the functions of the modules may be realized in one or more pieces of software and/or hardware, or a module realizing a function may be implemented by a combination of a plurality of submodules or subunits, and the like. The device embodiments described above are merely illustrative; e.g., the unit partitioning is only a logical function partitioning, and other partitioning modes are possible in an actual implementation. For example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling or communication connection illustrated or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The present disclosure is described with reference to a flow diagram and/or a block diagram of the method, device (system) and computer program product according to the embodiments of the present disclosure. It should be appreciated that each flow and/or block in the flow diagram and/or the block diagram and a combination of flows and/or blocks in the flow diagram and/or the block diagram can be realized by computer program instructions. Those computer program instructions can be provided to a general computer, a dedicated computer, an embedded processor or a processor of other programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce means for realizing specified functions in one or more flows in the flow diagram and/or one or more blocks in the block diagram.
The computer program instructions may also be stored in a computer readable memory which is capable of guiding the computer or other programmable data processing device to work in a specific mode, so that the instructions stored in the computer readable memory generate a product including instructing means for realizing the functions specified in one or more flows in the flowchart and one or more blocks in the block diagram.
The computer program instructions may also be loaded to the computer or other programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to generate computer-implemented processing; thus the instructions executed on the computer or other programmable device provide steps for realizing the functions specified in one or more flows in the flowchart and/or one or more blocks in the block diagram.
In a typical configuration, the computing device comprises one or more processors (CPUs) , an input/output interface, a network interface and a memory.
The memory may take the form of a volatile memory such as a Random-Access Memory (RAM), and/or a non-volatile memory such as a Read-Only Memory (ROM) or a flash RAM, among computer readable media. The memory is an example of the computer readable medium.
The computer readable medium includes permanent and non-permanent, removable and non-removable media, which can realize information storage by any method or technology. The information may be computer readable instructions, data structures, program modules or other data. Examples of the computer storage medium include, but are not limited to, a phase-change memory (PRAM), a static random access memory (SRAM), a dynamic random access memory (DRAM), other types of random access memory (RAM), a read-only memory (ROM), an electrically-erasable programmable read-only memory (EEPROM), a flash memory or other memory technologies, a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD) or other optical storages, magnetic cassette tapes, magnetic diskettes or other magnetic storage devices, or any other non-transmission medium, which can be used to store information accessible to a computing device. According to the definitions herein, the computer readable medium does not include any transitory computer readable media, such as a modulated data signal and a carrier wave.
Those skilled in the art should appreciate that any embodiment of the present disclosure can be provided as a method, a system or a computer program product. Therefore, the present disclosure can take the form of a full hardware embodiment, a full software embodiment, or an embodiment combining software and hardware. Moreover, the present disclosure can take the form of a computer program product implemented on one or more computer usable storage mediums (including, but not limited to, a magnetic disc memory, CD-ROM, optical storage, etc. ) containing therein computer usable program codes.
One or more embodiments of the present disclosure may be described in the general context of computer executable instructions executed by a computer, e.g., program modules. In general, a program module includes routines, programs, objects, components, data structures, etc. that execute particular tasks or realize particular abstract data types. The present disclosure may also be put into practice in distributed computing environments where tasks are executed by remote processing devices connected through a communication network. In distributed computing environments, the program modules may be located in local and remote computer storage media including storage devices.
The embodiments of the present disclosure are all described in a progressive manner, and the same or similar portions of the embodiments can refer to each other. Each embodiment lays emphasis on its distinctions from the other embodiments. In particular, the system embodiment is described simply since it is substantially similar to the method embodiment, and reference can be made to the descriptions of the method embodiment for the relevant portions. In the descriptions of the present disclosure, the terms 'an (one) embodiment', 'some embodiments', 'example', 'specific example', 'some examples', etc. mean that the specific features, structures, materials or characteristics described with reference to the embodiment(s) or example(s) are included in at least one embodiment or example of the present disclosure. In the present disclosure, the schematic descriptions of these terms do not necessarily refer to the same embodiment or example. In addition, the described specific features, structures, materials or characteristics may be combined in a proper way in any one or more embodiments or examples. Moreover, where there is no contradiction, the different embodiments or examples described in the present disclosure and the features thereof may be combined and integrated by those skilled in the art.
The descriptions above are merely examples of one or more embodiments of the present disclosure and are not intended to limit one or more embodiments of the present disclosure. For those skilled in the art, one or more embodiments of the present disclosure may have various amendments and variations. Any amendment, equivalent substitution, improvement, etc. made within the spirit and principle of the present disclosure should fall within the scope of the claims.