CN109214193B - Data encryption and machine learning model training method and device and electronic equipment - Google Patents

Data encryption and machine learning model training method and device and electronic equipment

Info

Publication number
CN109214193B
Authority
CN
China
Prior art keywords
data
encrypted
encoder
self
hidden layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710542807.1A
Other languages
Chinese (zh)
Other versions
CN109214193A (en)
Inventor
陈超超
周俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Advanced New Technologies Co Ltd
Original Assignee
Advanced New Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Advanced New Technologies Co Ltd filed Critical Advanced New Technologies Co Ltd
Priority to CN201710542807.1A priority Critical patent/CN109214193B/en
Publication of CN109214193A publication Critical patent/CN109214193A/en
Application granted granted Critical
Publication of CN109214193B publication Critical patent/CN109214193B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/602 Providing cryptographic facilities or services
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Abstract

The embodiments of this specification disclose a data encryption method, a machine learning model training method, an apparatus, and an electronic device. The data encryption method comprises the following steps: inputting the data to be encrypted into a self-encoder for processing, and extracting the hidden layer data correspondingly generated by the self-encoder as the encrypted data corresponding to the data to be encrypted.

Description

Data encryption and machine learning model training method and device and electronic equipment
Technical Field
The present disclosure relates to the field of computer software technologies, and in particular, to a method and an apparatus for data encryption and machine learning model training, and an electronic device.
Background
With the advent of the information age, large companies in various industries have accumulated a great deal of valuable data, such as the financial data of banks, the e-commerce data of e-commerce platforms, the user social data of social application service providers, and the like.
Such a situation undoubtedly poses a greater challenge to data management. Whether the data is exposed to a company's internal employees or to its competitors and partners, if it is presented directly in the form of the original data, it often faces the risk of data leakage.
Data encryption techniques can generally be employed to protect the privacy of data and reduce the risk of data leakage. However, the data encryption techniques in the prior art are generally used for storage purposes, and encryption techniques used for storage must be bi-directional, i.e., data can be both encrypted and decrypted. Data encrypted by such techniques generally differs greatly from the corresponding original data and, if not decrypted, can hardly reflect the useful information in the original data. Therefore, there is a need for a technique that can encrypt original data such that, after the encrypted data is output to another party, the other party can use it without decryption.
Disclosure of Invention
The embodiment of the specification provides a data encryption method, a machine learning model training method, a device and electronic equipment, which are used for solving the following technical problems: there is a need for a technique that can encrypt original data and, after the encrypted data is output to another party, the other party can use the encrypted data without decryption.
In order to solve the above technical problem, the embodiments of the present specification are implemented as follows:
the data encryption method provided by the embodiment of the specification comprises the following steps:
inputting data to be encrypted into a self-encoder for processing;
acquiring neural network hidden layer data generated by the self-encoder in the processing process;
and obtaining encrypted data corresponding to the data to be encrypted according to the neural network hidden layer data.
An embodiment of this specification provides a data encryption apparatus, including:
the processing module is used for inputting the data to be encrypted into the self-encoder for processing;
the acquisition module is used for acquiring the neural network hidden layer data generated by the self-encoder in the processing process;
and the obtaining module is used for obtaining the encrypted data corresponding to the data to be encrypted according to the hidden layer data of the neural network.
The machine learning model training method provided by the embodiment of the specification comprises the following steps:
acquiring encrypted data, wherein the encrypted data is obtained by inputting corresponding data to be encrypted into a self-encoder for processing and according to neural network hidden layer data generated by the self-encoder in the processing process;
training a machine learning model using the encrypted data.
The embodiment of this specification provides a machine learning model training device, includes:
the acquisition module is used for acquiring encrypted data, wherein the encrypted data is obtained by inputting the corresponding data to be encrypted into a self-encoder for processing and according to the neural network hidden layer data generated by the self-encoder in the processing process;
and the training module is used for training a machine learning model by using the encrypted data.
An electronic device provided in an embodiment of the present specification includes:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to:
inputting data to be encrypted into a self-encoder for processing;
acquiring neural network hidden layer data generated by the self-encoder in the processing process;
and obtaining encrypted data corresponding to the data to be encrypted according to the neural network hidden layer data.
Another electronic device provided in an embodiment of this specification includes:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to:
acquiring encrypted data, wherein the encrypted data is obtained by inputting the corresponding data to be encrypted into a self-encoder for processing and according to the neural network hidden layer data generated by the self-encoder in the processing process;
and training a machine learning model using the encrypted data.
The embodiment of the specification adopts at least one technical scheme which can achieve the following beneficial effects: the data to be encrypted may be the original data itself or data obtained by formatting the original data, and the correspondingly obtained encrypted data may include useful information of the data to be encrypted, so that after such encrypted data is output to other parties, the other parties can use the encrypted data without decryption.
Drawings
In order to more clearly illustrate the embodiments of the present specification or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, it is obvious that the drawings in the following description are only some embodiments described in the present specification, and for those skilled in the art, other drawings can be obtained according to the drawings without any creative effort.
Fig. 1 is a schematic diagram of an overall architecture involved in a practical application scenario of the solution of the present specification;
fig. 2 is a schematic flow chart of a data encryption method provided in an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a self-encoder provided in an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of another self-encoder provided in an embodiment of the present disclosure;
fig. 5a is a schematic flowchart of training a self-encoder in a practical application scenario provided by the embodiment of the present disclosure;
fig. 5b is a schematic flowchart of a data encryption process performed based on a trained self-encoder in an actual application scenario provided in the embodiment of the present disclosure;
FIG. 6 is a schematic flow chart diagram illustrating a method for training a machine learning model according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a data encryption device corresponding to fig. 2 provided in an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a machine learning model training apparatus corresponding to fig. 6 provided in an embodiment of the present disclosure.
Detailed Description
The embodiment of the specification provides a data encryption method, a machine learning model training method, a device and electronic equipment.
In order to make those skilled in the art better understand the technical solutions in the present specification, the technical solutions in the embodiments of the present specification will be clearly and completely described below with reference to the drawings in the embodiments of the present specification, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any inventive step based on the embodiments of the present disclosure, shall fall within the scope of protection of the present application.
Fig. 1 is a schematic diagram of an overall architecture related to the solution of the present specification in a practical application scenario. In the overall architecture, three parts are mainly involved: the data to be encrypted, the equipment where the self-encoder is located and the encrypted data corresponding to the data to be encrypted. The data to be encrypted is input into the equipment where the self-encoder is located for processing, and the encrypted data corresponding to the data to be encrypted can be obtained.
Fig. 2 is a schematic flowchart of a data encryption method provided in an embodiment of the present disclosure. Possible execution subjects of the process include, but are not limited to, the following devices, which may serve as servers or terminals: personal computers, midrange computers, computer clusters, mobile phones, tablet computers, smart wearable devices, in-vehicle devices, and the like.
The flow in fig. 2 may include the following steps:
s202: and inputting the data to be encrypted into the self-encoder for processing.
In the embodiments of this specification, the self-encoder (autoencoder) is a neural network model; by training it, a nonlinear machine learning algorithm whose output equals its input (in practical applications, a certain error is allowed) can be obtained.
The intermediate states between the input and the output can contain useful information about the input; based on this principle, data encryption can be realized using the self-encoder and its intermediate states.
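To make this concrete, the following is a minimal sketch of such a self-encoder, not taken from the patent, assuming Python with PyTorch; the layer sizes (a 6-dimensional input and one lower-dimensional hidden layer, matching fig. 3) and the activation function are illustrative assumptions.

```python
import torch.nn as nn

class SelfEncoder(nn.Module):
    """Single-hidden-layer self-encoder: 6-dimensional input, 4-dimensional hidden layer."""

    def __init__(self, in_dim=6, hidden_dim=4):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.Sigmoid())
        self.decoder = nn.Linear(hidden_dim, in_dim)  # reconstructs the input

    def forward(self, x):
        hidden = self.encoder(x)               # neural network hidden layer data (intermediate state)
        reconstruction = self.decoder(hidden)  # trained to approximate x
        return reconstruction, hidden
```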
In the embodiment of the present specification, the data to be encrypted may be the original data itself described in the background art; the data to be encrypted may also be data obtained by performing corresponding preprocessing on the original data in order to adapt to the scheme of the present specification, where the preprocessing may be, for example, formatting processing, data cleaning processing, feature extraction processing, and the like.
S204: and acquiring the neural network hidden layer data generated by the self-encoder in the processing process.
In this embodiment, the self-encoder may include at least one neural network hidden layer. During processing, the input data to be encrypted passes in turn through the input layer, the hidden layers, and the output layer of the self-encoder. Each layer may include a plurality of nodes, and each node computes on the data coming from the nodes of the previous layer, for example by assigning a weight to each input and performing linear or nonlinear operations. The data obtained by the node computations of a hidden layer constitute the neural network hidden layer data, which reflects the intermediate state.
It should be noted that, in step S204, it is not necessary to acquire all the neural network hidden layer data generated from the encoder during the processing. For example, if the self-encoder includes a plurality of hidden layers, the neural network hidden layer data generated by only one of the hidden layers may be obtained.
S206: and obtaining encrypted data corresponding to the data to be encrypted according to the neural network hidden layer data.
In the embodiments of this specification, the neural network hidden layer data may be used directly as the encrypted data corresponding to the data to be encrypted, or it may be further processed to obtain the encrypted data. In the former case, step S206 need not include an action that is actually performed; it only expresses that the neural network hidden layer data obtained in step S204 is the encrypted data corresponding to the data to be encrypted. In the latter case, for example, a weighted calculation may be performed on the neural network hidden layer data of a plurality of hidden layers, and the result may be used as the encrypted data corresponding to the data to be encrypted.
When there are a plurality of hidden layers, the hidden layer data of any one of the hidden layers may be used as the obtained encrypted data. In a preferred embodiment, the hidden layer data of the hidden layer closest to the output layer may be used as the obtained encrypted data.
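For illustration, a hedged sketch of this extraction step follows, reusing the SelfEncoder sketched above; the function name encrypt is an assumption made for the example, not the patent's terminology.

```python
import torch

def encrypt(model, x):
    """Extract the hidden layer data of a trained self-encoder as the encrypted data."""
    model.eval()
    with torch.no_grad():      # no gradients are needed for encryption
        _, hidden = model(x)   # hidden layer data = intermediate state
    return hidden              # used directly as the encrypted data
```

When the self-encoder has several hidden layers, a weighted combination of their hidden layer data could be returned here instead, as described above.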
The encrypted data obtained by the method of fig. 2 can contain useful information of the data to be encrypted because, during the layer-by-layer processing of the data to be encrypted in the self-encoder, the data is transformed but its main characteristics (which constitute the useful information) are retained. Therefore, the encrypted data obtained in this way is meaningful for external presentation and has good practicability. External presentation here can be understood as output to a user of the data. The user of the data may train a model with the encrypted data, and the trained model can be used in practical scenarios such as risk assessment or credit prediction.
Based on the method of fig. 2, the present specification also provides some specific embodiments of the method, and further provides the following descriptions.
For convenience of understanding, the embodiments of the present specification provide a schematic structural diagram of an auto encoder, as shown in fig. 3.
The self-encoder in fig. 3 includes an input layer, an output layer, and a hidden layer. x1, x2, x3, x4, x5, and x6 represent the input data of each dimension of the self-encoder (the "+1" node denotes the bias term), and x'1, x'2, x'3, x'4, x'5, and x'6 represent the output data of each dimension of the self-encoder. It can be seen that in this self-encoder the dimension of the hidden layer is lower than that of the input layer, so the dimension of the neural network hidden layer data generated by the hidden layer while processing the input data is correspondingly lower than that of the input data. For convenience of calculation, the input data, the neural network hidden layer data, and the output data can generally be expressed in vector form.
In the embodiments of this specification, the original data is not necessarily represented in vector form; it is often represented as a data table, key-value pairs, or the like. In this case, the original data may first be vectorized and then input into the self-encoder for processing.
For example, for step S202, before the data to be encrypted is input into the self-encoder for processing, the following steps may also be performed: acquiring original data; formatting the original data to obtain a vector representing the original data; the data to be encrypted then includes the vector representing the original data.
Assume the original data is part of a user's information with 6 dimensions: age, hometown, assets, annual income, whether there is a car (house), and whether there is a loan. The information of each dimension can be extracted for each user as original data and formatted to obtain a 6-dimensional vector for that user; for example, the 6-dimensional vector of a certain user might be (20-30 years old, hometown, 1,000,000 in assets, 200,000 in annual income, has a car, no loan). For convenience of calculation, each dimension of information in this vector may further be mapped to a number according to a predetermined mapping rule before being input into the self-encoder.
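A hedged illustration of such a mapping rule follows; the bucket indices, the scaling by units of 10,000, the 0/1 flags, and the sample values are assumptions made for the example, not rules specified in this specification.

```python
# Illustrative mapping rules (assumptions, not the patent's actual rules).
AGE_BUCKETS = {"20-30": 0.0, "30-40": 1.0, "40-50": 2.0}
HOMETOWN_IDS = {"Hangzhou": 0.0, "Beijing": 1.0, "Shanghai": 2.0}

def format_record(record):
    """Map one raw user record to a 6-dimensional numeric vector."""
    return [
        AGE_BUCKETS[record["age"]],
        HOMETOWN_IDS[record["hometown"]],
        record["assets"] / 10000.0,         # assets in units of 10,000
        record["annual_income"] / 10000.0,  # annual income in units of 10,000
        1.0 if record["has_car"] else 0.0,
        1.0 if record["has_loan"] else 0.0,
    ]

vec = format_record({"age": "20-30", "hometown": "Hangzhou", "assets": 1_000_000,
                     "annual_income": 200_000, "has_car": True, "has_loan": False})
# vec == [0.0, 0.0, 100.0, 20.0, 1.0, 0.0]
```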
In the embodiments of this specification, the self-encoder used for data encryption may be an already trained self-encoder. The training process of the self-encoder includes: setting the number of hidden layers of the self-encoder and the number of nodes of each hidden layer; and inputting sample data into the self-encoder and training it with the goal that its output data equals the corresponding input data.
The target, i.e., a function for which the input equals the output, can be expressed as h_{W,b}(x) ≈ x, where x represents the input and h_{W,b}(x) represents the output. The symbol "≈" rather than "=" is used because a certain error is allowed in practical applications; requiring the input only to be substantially equal to the output is advantageous for shortening the training time.
Further, the input layer and the output layer of the self-encoder may have the same structure, for example containing the same number of nodes; such a symmetrical structure helps speed up training convergence. After training is completed, the input of each node of the input layer is substantially equal to the output of the corresponding node of the output layer. Of course, even if the input layer and the output layer have different structures, it is still possible to successfully learn the above function h_{W,b}(x).
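A minimal training sketch under the assumptions of the earlier code (PyTorch, mean-squared-error reconstruction loss) follows; the optimizer, learning rate, and number of epochs are illustrative choices, not values given in this specification.

```python
import torch

def train_self_encoder(model, samples, epochs=100, lr=1e-2):
    """Train the self-encoder so that its output approximates its input, h_{W,b}(x) ≈ x."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = torch.nn.MSELoss()
    for _ in range(epochs):
        reconstruction, _ = model(samples)
        loss = loss_fn(reconstruction, samples)  # drive the output toward the input
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return model
```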
In this embodiment, when the self-encoder includes only one hidden layer, for step S204, at least a part of the hidden layer data generated by the hidden layer in the process may be acquired.
When the self-encoder comprises a plurality of hidden layers, one or more hidden layers can be determined in advance to serve as target hidden layers; or, according to the specific situation of the neural network hidden layer data generated by the hidden layers in the processing process, selecting one or more hidden layers as target hidden layers; further, for step S204, at least part of the neural network hidden layer data generated by the target hidden layer in the processing procedure may be acquired.
For the first approach in the previous paragraph: for example, the hidden layer with the lowest dimension may be selected as the target hidden layer; or, if a higher degree of encryption is desired, a hidden layer closer to the middle may be selected as the target hidden layer. For the second approach in the previous paragraph: for example, if some dimensions of the neural network hidden layer data of a certain hidden layer directly expose the corresponding part of the original information, that hidden layer can be excluded and the target hidden layer determined among the remaining hidden layers.
In fig. 3, since there is only one hidden layer, that hidden layer is the target hidden layer, and its dimension is set lower than that of the input layer. The advantage is that the features of the input data are condensed, so the useful information is extracted while the dimensionality and data volume of the encrypted data are reduced.
It should be noted that a target hidden layer dimension lower than that of the input layer is not a requirement that must be satisfied in the solution of this specification; the dimension of the target hidden layer may also be equal to or greater than that of the input layer. For example, fig. 4 is a schematic structural diagram of another self-encoder provided in an embodiment of this specification; its hidden layers comprise three layers, each with a dimension higher than that of the input layer, and one or more of the three layers may be used as the target hidden layer.
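The following sketch, again under the PyTorch assumption, shows a self-encoder with several hidden layers from which a target hidden layer can be selected; the layer sizes merely illustrate that hidden dimensions may exceed the input dimension, as in fig. 4, and the function and class names are assumptions.

```python
import torch
import torch.nn as nn

class MultiLayerSelfEncoder(nn.Module):
    """Self-encoder whose hidden dimensions exceed the input dimension, as in fig. 4."""

    def __init__(self, dims=(6, 8, 10, 8, 6)):   # input, three hidden layers, output
        super().__init__()
        self.layers = nn.ModuleList(
            nn.Linear(dims[i], dims[i + 1]) for i in range(len(dims) - 1)
        )

    def forward(self, x):
        hidden_states = []
        out = x
        for i, layer in enumerate(self.layers):
            out = layer(out)
            if i < len(self.layers) - 1:          # every layer except the output layer is hidden
                out = torch.sigmoid(out)
                hidden_states.append(out)         # hidden layer data of this hidden layer
        return out, hidden_states

def encrypt_with_target(model, x, target_index=-1):
    """Select a target hidden layer; -1 picks the one closest to the output layer."""
    model.eval()
    with torch.no_grad():
        _, hidden_states = model(x)
    return hidden_states[target_index]
```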
According to the above description, the embodiment of the present specification further provides a schematic diagram of an implementation scheme of the data encryption method in a practical application scenario, as shown in fig. 5a and fig. 5 b.
FIG. 5a is a schematic diagram of a process for training a self-encoder, which has been described above; fig. 5b is a schematic flow chart of data encryption based on the trained auto-encoder.
In fig. 5b, the data to be encrypted is data containing multiple dimensions of "age", "hometown" and "asset", and may be specifically represented as a multidimensional vector, such as a 6-dimensional vector in the above example. In training the self-encoder, a vector having the same structure as the data to be encrypted may be used as sample data.
The process in fig. 5b mainly comprises the following steps:
inputting the data to be encrypted in a vector form into a trained self-encoder for processing; and acquiring hidden layer data generated in a self-encoder in the processing process as encrypted data corresponding to the data to be encrypted. The hidden layer data can preferably be represented as a vector having dimensions lower than those of the data to be encrypted, the resulting encrypted data being able to contain useful information of the data to be encrypted.
Assuming that the above-mentioned 6-dimensional vector is input into the self-encoder of fig. 3, a 4-dimensional vector can be obtained as the corresponding encrypted data, for example (0.21, 0.43, 0.23, 0.98). Each component of the 4-dimensional vector can reflect, to some extent, the main characteristics of one or more dimensions of the 6-dimensional vector, which is useful for external presentation purposes.
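Putting the illustrative pieces together, a hedged end-to-end usage sketch follows; all function and variable names, including raw_records, come from the earlier sketches or are assumed here, not from the patent.

```python
import torch

# raw_records: an assumed list of user records in the format expected by format_record.
samples = torch.tensor([format_record(r) for r in raw_records], dtype=torch.float32)

model = train_self_encoder(SelfEncoder(in_dim=6, hidden_dim=4), samples)
encrypted = encrypt(model, samples)   # one 4-dimensional encrypted vector per record
```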
In the embodiment of the present specification, the encrypted data may be used for training the machine learning model instead of the corresponding data to be encrypted, in addition to being used for external presentation. Based on such a concept, an embodiment of the present specification further provides a machine learning model training method, as shown in fig. 6, and fig. 6 is a schematic flow chart of the machine learning model training method.
The flow in fig. 6 may include the following steps:
s602: and acquiring encrypted data, wherein the encrypted data is obtained by inputting the corresponding data to be encrypted into a self-encoder for processing and according to the neural network hidden layer data generated by the self-encoder in the processing process.
S604: training a machine learning model using the encrypted data.
Because the encrypted data contains the main information of the corresponding data to be encrypted, training the machine learning model with the encrypted data can achieve an effect close to that of training with the data to be encrypted itself, while the data to be encrypted does not need to be exposed during training, which helps protect its privacy.
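As an illustration of this step, a hedged sketch using scikit-learn follows; the choice of logistic regression and of a risk label as the prediction target are assumptions made for the example, not part of this specification.

```python
from sklearn.linear_model import LogisticRegression

def train_downstream_model(encrypted_vectors, labels):
    """Train an ordinary classifier directly on the encrypted vectors, without decryption."""
    model = LogisticRegression(max_iter=1000)
    model.fit(encrypted_vectors, labels)
    return model

# e.g. train_downstream_model(encrypted.numpy(), risk_labels), where risk_labels is assumed.
```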
The data encryption method and the machine learning model training method provided by the embodiment of the present specification are described above, and based on the same idea, the embodiment of the present specification further provides corresponding apparatuses, as shown in fig. 7 and fig. 8.
Fig. 7 is a schematic structural diagram of a data encryption apparatus corresponding to fig. 2 provided in an embodiment of the present specification, where a dashed box represents an optional module, and the apparatus may be located on an execution body of the flow in fig. 2, and includes:
the processing module 701 inputs data to be encrypted into the self-encoder for processing;
an obtaining module 702, configured to obtain the neural network hidden layer data generated by the self-encoder in the processing process;
the obtaining module 703 is configured to obtain, according to the neural network hidden layer data, encrypted data corresponding to the data to be encrypted.
Optionally, the encrypted data is used to train a machine learning model.
Optionally, the apparatus further comprises:
a formatting module 704, configured to obtain original data before the processing module 701 inputs the data to be encrypted into the self-encoder for processing, and format the original data to obtain a vector representing the original data; the data to be encrypted includes the vector representing the original data.
Optionally, the self-encoder into which the data to be encrypted is input is a trained self-encoder.
Optionally, the obtaining module 702 obtains the neural network hidden layer data generated by the self-encoder in the processing process, specifically including:
the obtaining module 702 determines a target hidden layer among hidden layers included in the self-encoder, and obtains the neural network hidden layer data generated by the target hidden layer in the processing process.
Optionally, the dimension of the target hidden layer is lower than the dimension of the input layer of the self-encoder.
Optionally, the obtaining module 702 obtains the neural network hidden layer data generated by the self-encoder in the processing process, specifically including:
the obtaining module 702 obtains the neural network hidden layer data generated by the hidden layer closest to the output layer in the self-encoder during the processing.
Optionally, the obtaining module 703 obtains, according to the neural network hidden layer data, encrypted data corresponding to the data to be encrypted, which specifically includes:
optionally, the neural network hidden layer data is a vector.
Fig. 8 is a schematic structural diagram of a machine learning model training apparatus corresponding to fig. 6 provided in an embodiment of the present specification, where the apparatus may be located on an execution body of the flowchart in fig. 6, and includes:
an obtaining module 801, configured to obtain encrypted data, where the encrypted data is processed by inputting data to be encrypted corresponding to the encrypted data into a self-encoder, and is obtained according to neural network hidden layer data generated by the self-encoder in the processing process;
a training module 802 for training a machine learning model using the encrypted data.
Based on the same idea, embodiments of the present specification further provide an electronic device corresponding to fig. 2, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to:
inputting data to be encrypted into a self-encoder for processing;
acquiring neural network hidden layer data generated by the self-encoder in the processing process;
and obtaining encrypted data corresponding to the data to be encrypted according to the neural network hidden layer data.
Based on the same idea, embodiments of the present specification further provide an electronic device corresponding to fig. 6, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to:
acquiring encrypted data, wherein the encrypted data is obtained by inputting the corresponding data to be encrypted into a self-encoder for processing and according to the neural network hidden layer data generated by the self-encoder in the processing process;
and training a machine learning model using the encrypted data.
Based on the same idea, the embodiments of the present specification further provide a non-volatile computer storage medium corresponding to fig. 2, storing computer-executable instructions configured to:
inputting data to be encrypted into a self-encoder for processing;
acquiring neural network hidden layer data generated by the self-encoder in the processing process;
and obtaining encrypted data corresponding to the data to be encrypted according to the neural network hidden layer data.
Based on the same idea, the embodiments of the present specification further provide a non-volatile computer storage medium corresponding to fig. 6, in which computer-executable instructions are stored, and the computer-executable instructions are configured to:
acquiring encrypted data, wherein the encrypted data is obtained by inputting corresponding data to be encrypted into a self-encoder for processing and according to neural network hidden layer data generated by the self-encoder in the processing process;
training a machine learning model using the encrypted data.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the embodiments of the apparatus, the electronic device, and the nonvolatile computer storage medium, since they are substantially similar to the embodiments of the method, the description is simple, and the relevant points can be referred to the partial description of the embodiments of the method.
The apparatus, the electronic device, the nonvolatile computer storage medium and the method provided in the embodiments of the present description correspond to each other, and therefore, the apparatus, the electronic device, and the nonvolatile computer storage medium also have similar advantageous technical effects to the corresponding method.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
In the 1990s, an improvement to a technology could clearly be distinguished as an improvement in hardware (for example, an improvement to a circuit structure such as a diode, a transistor, or a switch) or an improvement in software (an improvement to a method flow). However, as technology develops, many of today's improvements to method flows can be regarded as direct improvements to hardware circuit structures. Designers almost always obtain the corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement to a method flow cannot be realized by a hardware entity module. For example, a programmable logic device (PLD), such as a field programmable gate array (FPGA), is an integrated circuit whose logic functions are determined by the user programming the device. A designer "integrates" a digital system onto a single PLD by programming it, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually making integrated circuit chips, such programming is nowadays mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development, and the original code to be compiled must be written in a specific programming language called a hardware description language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language), among which VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used. It will also be apparent to those skilled in the art that a hardware circuit implementing a logical method flow can easily be obtained simply by slightly logic-programming the method flow in one of the above hardware description languages and programming it into an integrated circuit.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art will also appreciate that, in addition to implementing the controller purely as computer-readable program code, the same functionality can be implemented by logically programming the method steps so that the controller takes the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included in it for performing the various functions may also be regarded as structures within the hardware component. Alternatively, the means for performing the functions may be regarded both as software modules implementing the method and as structures within the hardware component.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functions of the various elements may be implemented in the same one or more software and/or hardware implementations of the present description.
As will be appreciated by one skilled in the art, the present specification embodiments may be provided as a method, system, or computer program product. Accordingly, embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
The description has been presented with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the description. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
This description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only an example of the present specification, and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (20)

1. A method of data encryption, comprising:
inputting data to be encrypted into a self-encoder for processing;
acquiring neural network hidden layer data generated by the self-encoder in the processing process;
obtaining encrypted data corresponding to the data to be encrypted according to the neural network hidden layer data; the encrypted data is used for being externally output to a user of the data, so that the user receiving the encrypted data can use the encrypted data without decryption; the using includes training a model with the encrypted data.
2. The method of claim 1, the encrypted data is used to train a machine learning model.
3. The method of claim 1, wherein before inputting the data to be encrypted into the self-encoder for processing, the method further comprises:
acquiring original data;
formatting the original data to obtain a vector representing the original data;
the data to be encrypted includes the vector representing the original data.
4. The method of claim 1, wherein the self-encoder into which the data to be encrypted is input is a trained self-encoder.
5. The method according to claim 4, wherein the obtaining of the neural network hidden layer data generated by the self-encoder in the processing process specifically includes:
determining a target hidden layer in all hidden layers contained in the self-encoder;
and acquiring the neural network hidden layer data generated by the target hidden layer in the processing process.
6. The method of claim 5, the target hidden layer having dimensions lower than dimensions of an input layer of the self-encoder.
7. The method according to claim 1, wherein the obtaining of the neural network hidden layer data generated by the self-encoder in the processing process specifically includes:
and acquiring hidden data of the neural network generated by the hidden layer closest to the output layer in the self-encoder in the processing process.
8. The method according to any one of claims 1 to 7, wherein the neural network hidden data is a vector.
9. A data encryption apparatus comprising:
the processing module is used for inputting the data to be encrypted into the self-encoder for processing;
the acquisition module is used for acquiring the neural network hidden layer data generated by the self-encoder in the processing process;
the obtaining module is used for obtaining encrypted data corresponding to the data to be encrypted according to the hidden layer data of the neural network; the encrypted data is used for being externally output to a user of the data, so that the user receiving the encrypted data can use the encrypted data without decryption; the using includes training a model with the encrypted data.
10. The apparatus of claim 9, the encrypted data is used to train a machine learning model.
11. The apparatus of claim 9, the apparatus further comprising:
the formatting module is used for acquiring original data before the processing module inputs the data to be encrypted into the self-encoder for processing, and formatting the original data to obtain a vector representing the original data; the data to be encrypted includes the vector representing the original data.
12. The apparatus of claim 9, wherein the self-encoder into which the data to be encrypted is input is a trained self-encoder.
13. The apparatus according to claim 12, wherein the obtaining module obtains the neural network hidden layer data generated by the self-encoder during the processing, specifically including:
the acquisition module determines a target hidden layer in all hidden layers included in the self-encoder and acquires the neural network hidden layer data generated by the target hidden layer in the processing process.
14. The apparatus of claim 13, the target hidden layer having dimensions lower than dimensions of an input layer of the self-encoder.
15. The apparatus according to claim 9, wherein the obtaining module obtains the neural network hidden layer data generated by the self-encoder during the processing, and specifically includes:
the acquisition module acquires the neural network hidden layer data generated by the hidden layer closest to the output layer in the self-encoder in the processing process.
16. The apparatus of any one of claims 9-15, wherein the neural network hidden data is a vector.
17. A machine learning model training method, comprising:
acquiring encrypted data, wherein the encrypted data is obtained by inputting corresponding data to be encrypted into a self-encoder for processing and according to neural network hidden layer data generated by the self-encoder in the processing process; the encrypted data is used for being externally output to a user of the data, so that the user receiving the encrypted data can use the encrypted data without decryption;
training a machine learning model using the encrypted data.
18. A machine learning model training apparatus, comprising:
the acquisition module is used for acquiring encrypted data, wherein the encrypted data is obtained by inputting the corresponding data to be encrypted into a self-encoder for processing and according to the neural network hidden layer data generated by the self-encoder in the processing process; the encrypted data is used for being externally output to a user of the data, so that the user receiving the encrypted data can use the encrypted data without decryption;
and the training module is used for training a machine learning model by using the encrypted data.
19. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to:
inputting data to be encrypted into a self-encoder for processing;
acquiring neural network hidden layer data generated by the self-encoder in the processing process;
obtaining encrypted data corresponding to the data to be encrypted according to the neural network hidden layer data; the encrypted data is used for being externally output to a user of the data, so that the user receiving the encrypted data can use the encrypted data without decryption; the using includes training a model with the encrypted data.
20. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to:
acquiring encrypted data, wherein the encrypted data is obtained by inputting the corresponding data to be encrypted into a self-encoder for processing and according to the neural network hidden layer data generated by the self-encoder in the processing process; the encrypted data is used for being externally output to a user of the data, so that the user receiving the encrypted data can use the encrypted data without decryption;
and training a machine learning model using the encrypted data.
CN201710542807.1A 2017-07-05 2017-07-05 Data encryption and machine learning model training method and device and electronic equipment Active CN109214193B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710542807.1A CN109214193B (en) 2017-07-05 2017-07-05 Data encryption and machine learning model training method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710542807.1A CN109214193B (en) 2017-07-05 2017-07-05 Data encryption and machine learning model training method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN109214193A CN109214193A (en) 2019-01-15
CN109214193B true CN109214193B (en) 2022-03-22

Family

ID=64993078

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710542807.1A Active CN109214193B (en) 2017-07-05 2017-07-05 Data encryption and machine learning model training method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN109214193B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111523673B (en) * 2019-02-01 2021-07-27 创新先进技术有限公司 Model training method, device and system
CN111260053A (en) * 2020-01-13 2020-06-09 支付宝(杭州)信息技术有限公司 Method and apparatus for neural network model training using trusted execution environments
CN111539012B (en) * 2020-03-19 2021-07-20 重庆特斯联智慧科技股份有限公司 Privacy data distribution storage system and method of edge framework
CN111415013B (en) * 2020-03-20 2024-03-22 矩阵元技术(深圳)有限公司 Privacy machine learning model generation and training method and device and electronic equipment
CN113723604B (en) * 2020-05-26 2024-03-26 杭州海康威视数字技术股份有限公司 Neural network training method and device, electronic equipment and readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1578457A (en) * 2003-07-17 2005-02-09 汤姆森许可贸易公司 Individual video encryption system and method
CN105718959A (en) * 2016-01-27 2016-06-29 中国石油大学(华东) Object identification method based on own coding
CN105786798A (en) * 2016-02-25 2016-07-20 上海交通大学 Natural language intention understanding method in man-machine interaction
CN106469336A (en) * 2016-09-21 2017-03-01 广东工业大学 Two-dimension code anti-counterfeit prediction meanss based on BP neural network and method

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060020563A1 (en) * 2004-07-26 2006-01-26 Coleman Christopher R Supervised neural network for encoding continuous curves
CN101350155A (en) * 2008-09-09 2009-01-21 无敌科技(西安)有限公司 Method and system for generating and verifying cipher through genus nerval network
CN102810154B (en) * 2011-06-02 2016-05-11 国民技术股份有限公司 A kind of physical characteristics collecting fusion method and system based on trusted module
CN104009836B (en) * 2014-05-26 2018-06-22 中国人民解放军理工大学 Encryption data detection method and system
CN105868678B (en) * 2015-01-19 2019-09-17 阿里巴巴集团控股有限公司 The training method and device of human face recognition model
CN106034146B (en) * 2015-03-12 2019-10-22 阿里巴巴集团控股有限公司 Information interacting method and system
CN105281973A (en) * 2015-08-07 2016-01-27 南京邮电大学 Webpage fingerprint identification method aiming at specific website category
CN105389770B (en) * 2015-11-09 2018-10-26 河南师范大学 Embedded, extracting method and device based on BP and the image watermark of RBF neural
CN105631296B (en) * 2015-12-30 2018-07-31 北京工业大学 A kind of safe face authentication system design method based on CNN feature extractors
CN105760932B (en) * 2016-02-17 2018-04-06 第四范式(北京)技术有限公司 Method for interchanging data, DEU data exchange unit and computing device
CN106203625B (en) * 2016-06-29 2019-08-02 中国电子科技集团公司第二十八研究所 A kind of deep-neural-network training method based on multiple pre-training
CN106355250B (en) * 2016-08-31 2019-04-30 天津南大通用数据技术股份有限公司 The optimization method and device of judgement private communication channel neural network based

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1578457A (en) * 2003-07-17 2005-02-09 汤姆森许可贸易公司 Individual video encryption system and method
CN105718959A (en) * 2016-01-27 2016-06-29 中国石油大学(华东) Object identification method based on own coding
CN105786798A (en) * 2016-02-25 2016-07-20 上海交通大学 Natural language intention understanding method in man-machine interaction
CN106469336A (en) * 2016-09-21 2017-03-01 广东工业大学 Two-dimension code anti-counterfeit prediction meanss based on BP neural network and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Video Encryption and Compression Technology Based on Neural Networks; Zhao Tingting; China Master's Theses Full-text Database, Information Science and Technology Series; 2010-07-15 (No. 07); pp. I138-977 *

Also Published As

Publication number Publication date
CN109214193A (en) 2019-01-15

Similar Documents

Publication Publication Date Title
CN109214193B (en) Data encryption and machine learning model training method and device and electronic equipment
CN113297396B (en) Method, device and equipment for updating model parameters based on federal learning
CN112200132A (en) Data processing method, device and equipment based on privacy protection
CN114429222A (en) Model training method, device and equipment
CN114417411A (en) End cloud development system, model processing method, device and equipment
CN108255471A (en) A kind of system configuration item configuration device based on configuration external member, method and apparatus
CN111158650A (en) Report template, report template and report generation method and device
CN111507726B (en) Message generation method, device and equipment
CN115238250B (en) Model processing method, device and equipment
CN111191090B (en) Method, device, equipment and storage medium for determining service data presentation graph type
CN109325127B (en) Risk identification method and device
CN113221717A (en) Model construction method, device and equipment based on privacy protection
CN113849837A (en) Training method, device and equipment of security model and data processing method
CN111209277A (en) Data processing method, device, equipment and medium
CN115828993A (en) Training method of transaction risk detection model, and transaction risk detection method and device
CN116630480B (en) Interactive text-driven image editing method and device and electronic equipment
CN114036571A (en) Data processing method, device and equipment based on privacy protection
CN113239851B (en) Privacy image processing method, device and equipment based on privacy protection
CN111160861B (en) Method, device and equipment for renewing service authority
CN113992429B (en) Event processing method, device and equipment
CN115017915B (en) Model training and task execution method and device
CN114037062A (en) Feature extraction method and device of multitask model
CN116011003A (en) Information recommendation method, device and equipment based on privacy protection
CN110929871A (en) Game decision method and system
CN117290724A (en) Service processing model training method and device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20191206

Address after: P.O. Box 31119, grand exhibition hall, hibiscus street, 802 West Bay Road, Grand Cayman, ky1-1205, Cayman Islands

Applicant after: Innovative advanced technology Co., Ltd

Address before: A four-storey 847 mailbox in Grand Cayman Capital Building, British Cayman Islands

Applicant before: Alibaba Group Holding Co., Ltd.

REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40003199

Country of ref document: HK

GR01 Patent grant
GR01 Patent grant