CN109325508A - Knowledge representation, machine learning model training and prediction method, apparatus and electronic device - Google Patents

Knowledge representation, machine learning model training and prediction method, apparatus and electronic device

Info

Publication number
CN109325508A
CN109325508A (application number CN201710637074.XA)
Authority
CN
China
Prior art keywords
data
knowledge
neural network
training
machine learning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710637074.XA
Other languages
Chinese (zh)
Inventor
方文静
周俊
李小龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Advanced New Technologies Co Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd
Priority to CN201710637074.XA
Publication of CN109325508A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computational Linguistics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

Embodiments of this specification disclose knowledge representation, machine learning model training and prediction methods, apparatuses and electronic devices. The knowledge representation method includes: inputting raw data into a deep neural network, and taking the features output by a hidden layer of the deep neural network as the knowledge representation corresponding to the raw data. Further, the knowledge representation can be used to train a machine learning model, or used by a trained machine learning model to perform data prediction.

Description

Knowledge representation, machine learning model training and prediction method, apparatus and electronic device
Technical field
This specification relates to the field of machine learning technology, and in particular to knowledge representation, machine learning model training and prediction methods, apparatuses and electronic devices.
Background
In the current Internet era, people's lives have changed dramatically. Under new models in which various kinds of data are used to deliver electronic information, many aspects of people's daily lives have become more convenient, such as payment, transportation and health management. At the same time, massive amounts of user-related data are generated as a result, which is what is commonly called big data.
In the prior art, machine learning models are commonly used tools in the big data field. When training a machine learning model, or when using a machine learning model to perform data prediction on raw data, the raw data is often input directly into the machine learning model for the corresponding processing operations.
Based on the prior art, a solution that can ensure the privacy of the raw data is desired.
Summary of the invention
Embodiments of this specification provide knowledge representation, machine learning model training and prediction methods, apparatuses and electronic devices, which are used to solve the following technical problem: a solution that can ensure the privacy of raw data is needed.
In order to solve the above technical problem, the embodiments of this specification are implemented as follows.
An embodiment of this specification provides a knowledge representation method, comprising:
inputting raw data into a deep neural network; and
obtaining features output by a hidden layer of the trained deep neural network for the raw data, as the knowledge representation corresponding to the raw data, where the knowledge representation is used to train a machine learning model, or is used by a trained machine learning model to perform data prediction.
An embodiment of this specification provides a machine learning model training method, comprising:
obtaining a knowledge representation corresponding to raw data, where the knowledge representation corresponding to the raw data is obtained from features output by a hidden layer of a deep neural network for the raw data; and
training a machine learning model by using the knowledge representation corresponding to the raw data.
An embodiment of this specification provides a machine learning model prediction method, comprising:
obtaining data to be predicted;
inputting the data to be predicted into a deep neural network used for knowledge representation for processing, where the deep neural network is obtained by training with training data;
obtaining features output by a hidden layer of the deep neural network for the data to be predicted, as the knowledge representation corresponding to the data to be predicted; and
inputting the knowledge representation corresponding to the data to be predicted into a machine learning model to obtain a prediction result, where the machine learning model is obtained by training with the knowledge representation of the training data produced by the deep neural network.
An embodiment of this specification provides a knowledge representation apparatus, comprising:
a first input module, which inputs raw data into a deep neural network; and
a first obtaining module, which obtains features output by a hidden layer of the trained deep neural network for the raw data, as the knowledge representation corresponding to the raw data, where the knowledge representation is used to train a machine learning model, or is used by a trained machine learning model to perform data prediction.
An embodiment of this specification provides a machine learning model training apparatus, comprising:
a second obtaining module, which obtains a knowledge representation corresponding to raw data, where the knowledge representation corresponding to the raw data is obtained from features output by a hidden layer of a deep neural network for the raw data; and
a second training module, which trains a machine learning model by using the knowledge representation corresponding to the raw data.
An embodiment of this specification provides a machine learning model prediction apparatus, comprising:
a second prediction obtaining module, which obtains data to be predicted;
a second prediction processing module, which inputs the data to be predicted into a deep neural network used for knowledge representation for processing, where the deep neural network is obtained by training with training data;
a second prediction extraction module, which obtains features output by a hidden layer of the deep neural network for the data to be predicted, as the knowledge representation corresponding to the data to be predicted; and
a second prediction output module, which inputs the knowledge representation corresponding to the data to be predicted into a machine learning model to obtain a prediction result, where the machine learning model is obtained by training with the knowledge representation of the training data produced by the deep neural network.
An embodiment of this specification provides an electronic device, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor, where
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to:
input raw data into a deep neural network; and
obtain features output by a hidden layer of the trained deep neural network for the raw data, as the knowledge representation corresponding to the raw data, where the knowledge representation is used to train a machine learning model, or is used by a trained machine learning model to perform data prediction.
An embodiment of this specification provides another electronic device, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor, where
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to:
obtain a knowledge representation corresponding to raw data, where the knowledge representation corresponding to the raw data is obtained from features output by a hidden layer of a deep neural network for the raw data; and
train a machine learning model by using the knowledge representation corresponding to the raw data.
An embodiment of this specification provides yet another electronic device, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor, where
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to:
obtain data to be predicted;
input the data to be predicted into a deep neural network used for knowledge representation for processing, where the deep neural network is obtained by training with training data;
obtain features output by a hidden layer of the deep neural network for the data to be predicted, as the knowledge representation corresponding to the data to be predicted; and
input the knowledge representation corresponding to the data to be predicted into a machine learning model to obtain a prediction result, where the machine learning model is obtained by training with the knowledge representation of the training data produced by the deep neural network.
At least one of the above technical solutions adopted in the embodiments of this specification can achieve the following beneficial effect: raw data is input into a deep neural network, the output of a hidden layer of the deep neural network is obtained as the knowledge representation, and the knowledge representation is used in place of the raw data, which helps to ensure the privacy of the raw data.
Brief description of the drawings
In order to more clearly illustrate the technical solutions in the embodiments of this specification or in the prior art, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some of the embodiments recorded in this specification, and for a person of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic diagram of an overall architecture involved in a solution of this specification in a practical application scenario;
Fig. 2 is a schematic flowchart of a knowledge representation method provided by an embodiment of this specification;
Fig. 3 is a schematic diagram of a deep neural network training process provided by an embodiment of this specification;
Fig. 4 is a schematic diagram of a knowledge representation acquisition process provided by an embodiment of this specification;
Fig. 5 is a schematic flowchart of a machine learning model training method provided by an embodiment of this specification;
Fig. 6 is a schematic diagram of a machine learning model training process provided by an embodiment of this specification;
Fig. 7 is a schematic flowchart of a machine learning model prediction method provided by an embodiment of this specification;
Fig. 8 is a schematic diagram of the principle of a machine learning model prediction method provided by an embodiment of this specification;
Fig. 9 is a schematic structural diagram of a knowledge representation apparatus provided by an embodiment of this specification;
Fig. 10 is a schematic structural diagram of a machine learning model training apparatus provided by an embodiment of this specification;
Fig. 11 is a schematic structural diagram of a machine learning model prediction apparatus provided by an embodiment of this specification.
Specific embodiments
Embodiments of this specification provide knowledge representation, machine learning model training and prediction methods, apparatuses and electronic devices.
In order to enable those skilled in the art to better understand the technical solutions in this specification, the technical solutions in the embodiments of this specification are described clearly and completely below with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are only some rather than all of the embodiments of the present application. Based on the embodiments of this specification, all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present application.
Fig. 1 is a schematic diagram of an overall architecture involved in a solution of this specification in a practical application scenario. The workflow of this overall architecture mainly includes: inputting raw data into a deep neural network device, and taking the features output by a hidden layer of the deep neural network as the knowledge representation corresponding to the raw data; further, the knowledge representation is sent over a network to a machine learning model device, where it is used to train a machine learning model or used by a trained machine learning model to perform data prediction.
The solution of this specification involves both the training of the deep neural network and the training of the machine learning model. To avoid confusion, in the following embodiments the training data used to train the machine learning model is referred to as the "first training data", and the training data used to train the deep neural network is referred to as the "second training data".
Based on this overall architecture, the solution of this specification is described in detail below.
An embodiment of this specification provides a knowledge representation method, as shown in Fig. 2, which is a schematic flowchart of the knowledge representation method. The process may include the following steps:
S202: input raw data into a deep neural network.
The deep neural network includes an input layer, an output layer and hidden layers.
The deep neural network may be built in advance, which makes it easier to respond quickly to requests from other business parties (for example, the party training the machine learning model, or other business parties that need to use the knowledge representation) for knowledge representations obtainable through the deep neural network.
Alternatively, the deep neural network may be built only after an explicit demand to use certain raw data, or its knowledge representation, has been received. For example, when a user of a machine learning model asks for raw data, the deep neural network is then built in order to obtain the knowledge representation corresponding to the raw data, and the knowledge representation, instead of the raw data, is delivered to that user.
The raw data may be business data (for example, pictures, text, or records in a business data table), and the business data may further be expressed as vectors to facilitate processing by the neural network.
S204: obtain features output by a hidden layer of the trained deep neural network for the raw data, as the knowledge representation corresponding to the raw data. The knowledge representation can be used to train a machine learning model, or used by a trained machine learning model to perform data prediction.
A deep neural network generally contains multiple hidden layers, and the features output by a hidden layer are obtained by nonlinearly transforming the raw data.
For any raw data that is input, the features output by one or more hidden layers can be obtained correspondingly. Since these features contain the main information of the raw data, the features output by the one or more hidden layers can be used as the knowledge representation corresponding to the raw data.
In practical applications, the dimensionality of the hidden layer can be smaller than the dimensionality of the raw data. This not only helps prevent sparse raw features from adversely affecting the subsequent learning process, but also helps to reduce the cost of storing the knowledge representation.
The owner of the raw data and the party training the machine learning model are often not the same party, and the raw data may contain information that cannot be disclosed externally. Therefore, the owner of the raw data can provide the knowledge representation corresponding to the raw data, as the first training data, to the training party, and the training party trains the machine learning model with this first training data instead of the raw data, which helps to protect the privacy of the raw data from the training party.
In addition, when a machine learning model is used to perform prediction on raw data, the knowledge representation corresponding to the raw data can be used in place of the raw data, so that the trained machine learning model performs prediction on the knowledge representation corresponding to the raw data. This is equally beneficial for protecting the privacy of the raw data during prediction.
Further, the hidden layer described in step S204 may specifically refer to a high hidden layer. Generally, a high hidden layer refers to one or more hidden layers close to the output layer. Preferably, the high hidden layer may be the last hidden layer of the deep neural network, that is, the hidden layer closest to the output layer.
Generally, the more hidden layers there are, the more times the input raw data is transformed. The resulting knowledge representation then differs more from the raw data, which is more conducive to protecting the privacy of the raw data.
In the embodiments of this specification, for step S202, the deep neural network is obtained by training, which may specifically include: inputting training data (specifically, the second training data) into a neural network and training it. The training data is in matrix form, where a row of the matrix represents a training sample and a column represents a feature or a label of each training sample.
The raw data can be used as the second training data for training the deep neural network. The second training data can be in matrix form, where a row of the second training data matrix represents a training sample and a column represents a feature or a label of each training sample; the knowledge representation corresponding to the second training data is a matrix whose number of columns is smaller than the number of columns of the second training data matrix.
The second training data may specifically be a second training data matrix composed of the features of the training samples. Generally, a row of the second training data matrix (which can be regarded as a row vector) represents a training sample, and a column represents a feature of the training samples.
For example, each row of the second training data matrix represents one training sample, each column represents one feature of the training samples, and the dimensionality of a sample is the number of features of that training sample. A sketch of such a matrix is given below.
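Purely as an illustrative sketch (not part of the claimed embodiments), a second training data matrix of this form can be written down in Python with NumPy; the sizes n and d and the synthetic values are assumptions made only for the example:

    import numpy as np

    n, d = 1000, 50  # n training samples, each with d features (assumed sizes)
    rng = np.random.default_rng(0)

    # Second training data matrix: n rows (samples) x d columns (features).
    X_second = rng.normal(size=(n, d)).astype("float32")
    # Label vector: one target output per training sample.
    y = rng.integers(0, 2, size=n).astype("float32")

    print(X_second.shape)  # (1000, 50)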
This is illustrated with reference to Fig. 3, which is a schematic diagram of a deep neural network training process provided by an embodiment of this specification.
In Fig. 3, the training data of the deep neural network is a second training data matrix with n rows and d columns; each row of the second training data matrix represents one training sample, and each column represents one feature of the training samples. The labels can be the target outputs of the machine learning problem corresponding to the second training data matrix; specifically for Fig. 3, the labels form an n-dimensional vector composed of the target outputs corresponding to the training samples contained in the second training data matrix.
In Fig. 3, the training process of the deep neural network may mainly include:
inputting the second training data matrix into a specified neural network for processing, where the neural network contains two hidden layers, and the hidden layer closest to the output layer serves as the high hidden layer; and
training the neural network according to the processing result of the neural network on the second training data matrix and the labels corresponding to the second training data matrix. The trained neural network can then serve as the above deep neural network used for knowledge representation.
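A minimal sketch of such a training process is given below, assuming a Keras-style API (tensorflow.keras) and continuing from the X_second and y of the earlier sketch; the layer sizes, the layer name "high_hidden", the loss and the number of epochs are illustrative assumptions, not a prescribed implementation:

    import tensorflow as tf
    from tensorflow import keras

    d, p = 50, 8  # d input features, p units in the high hidden layer (assumed, p < d)

    # Specified neural network with two hidden layers; the one closest to the
    # output layer ("high_hidden") serves as the high hidden layer.
    dnn = keras.Sequential([
        keras.layers.Input(shape=(d,)),
        keras.layers.Dense(32, activation="relu", name="hidden_1"),
        keras.layers.Dense(p, activation="relu", name="high_hidden"),
        keras.layers.Dense(1, activation="sigmoid", name="output"),
    ])

    # Train according to the network's processing result on the second training
    # data matrix and the corresponding labels.
    dnn.compile(optimizer="adam", loss="binary_crossentropy")
    dnn.fit(X_second, y, epochs=5, batch_size=32, verbose=0)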
Fig. 4 is a schematic diagram of a knowledge representation acquisition process provided by an embodiment of this specification. The process may include:
processing the raw data matrix with the deep neural network trained through the process of Fig. 3, and obtaining, from the features output by the high hidden layer of the deep neural network, a matrix with n rows and p columns (referred to as the knowledge representation matrix) as the knowledge representation corresponding to the raw data matrix.
The rows of the knowledge representation matrix correspond one-to-one to the rows of the raw data matrix, that is, each row corresponds to one piece of raw data. Here, the number of columns d of the raw data matrix is preferably larger than the number of columns p of the knowledge representation matrix; in this way, the knowledge representation is obtained by reducing the dimensionality of the raw data matrix, which has the advantages of low storage cost and good confidentiality.
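Continuing the same illustrative Keras-style sketch (building a sub-model whose output is a named hidden layer is one common way to read out intermediate features and is an assumption here, not a required implementation), the n x p knowledge representation matrix can be obtained as follows:

    # Sub-model whose output is the high hidden layer of the trained deep neural network.
    knowledge_extractor = keras.Model(
        inputs=dnn.inputs,
        outputs=dnn.get_layer("high_hidden").output,
    )

    # Apply it to a raw data matrix with n rows and d columns (the second training
    # data is reused here purely for illustration) to get the n x p representation.
    knowledge_repr = knowledge_extractor.predict(X_second, verbose=0)
    print(knowledge_repr.shape)  # (n, p), with p < d

Only knowledge_repr, not X_second itself, would then be handed to the party training the machine learning model.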
Based on the same idea, this specification further provides a machine learning model training method. Fig. 5 is a schematic flowchart of the machine learning model training method, and the process in Fig. 5 may include the following steps:
S502: obtain a knowledge representation corresponding to raw data, where the knowledge representation corresponding to the raw data is obtained from features output by a hidden layer of a deep neural network for the raw data;
S504: train a machine learning model by using the knowledge representation corresponding to the raw data.
The above machine learning model training method is described below with reference to Fig. 6.
Fig. 6 is a schematic diagram of a machine learning model training process provided by an embodiment of this specification. The training process may include: training a machine learning model using the knowledge representation obtained as shown in Fig. 4 as the first training data, to obtain the required machine learning model. In practical applications, the machine learning model could also be trained directly with the raw data corresponding to the knowledge representation; however, in the solution of this specification, it is preferable to use the knowledge representation as the first training data in place of the corresponding raw data, which avoids exposing the raw data directly to the party training the machine learning model and helps to protect the privacy of the training data.
For the knowledge representation obtained by processing raw data with the deep neural network, the knowledge representation can be used as the first training data in place of the original raw data for training the machine learning model. The advantage is that, since the knowledge representation is obtained by applying nonlinear, relatively complex transformations to the original raw data with the deep neural network, the original raw data is not exposed when the knowledge representation is used to train the machine learning model, which helps to improve the privacy of the original raw data and also helps to improve the security of the machine learning model training process.
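As a hedged illustration of this training step (the choice of scikit-learn and of a logistic regression model is an assumption made only for the example; any machine learning model could stand in its place), the knowledge representation matrix replaces the raw data as the first training data:

    from sklearn.linear_model import LogisticRegression

    # The training party only ever sees the knowledge representation (first training
    # data) and the labels, never the raw data itself.
    ml_model = LogisticRegression(max_iter=1000)
    ml_model.fit(knowledge_repr, y)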
Further, the hidden layer is specifically a high hidden layer, and the high hidden layer is the last hidden layer of the deep neural network, that is, the hidden layer closest to the output layer.
Further, the trained machine learning model can be used to perform data prediction.
The data prediction process may, for example, include the following steps:
obtaining data to be predicted; inputting the data to be predicted into the deep neural network for processing; obtaining features output by a hidden layer of the deep neural network for the data to be predicted, as the knowledge representation corresponding to the data to be predicted; and inputting the knowledge representation corresponding to the data to be predicted into the trained machine learning model to obtain a prediction result.
As explained above, training the machine learning model with the knowledge representation as the first training data helps to protect the privacy of the corresponding first training data. Based on the same idea, when the trained machine learning model is used for data prediction, the data to be predicted can first be converted into a knowledge representation, which is then used for the prediction; this likewise helps to protect the privacy of the data to be predicted.
Based on the same idea, this specification further provides a machine learning model prediction method, whose flowchart is shown in Fig. 7.
Fig. 7 is a schematic flowchart of a machine learning model prediction method provided by an embodiment of this specification. The method may include the following steps:
S702: obtain data to be predicted.
S704: input the data to be predicted into a deep neural network used for knowledge representation for processing, where the deep neural network is obtained by training with training data.
S706: obtain features output by a hidden layer of the deep neural network for the data to be predicted, as the knowledge representation corresponding to the data to be predicted.
S708: input the knowledge representation corresponding to the data to be predicted into a machine learning model to obtain a prediction result, where the machine learning model is obtained by training with the knowledge representation of the training data produced by the deep neural network.
To understand the above method more clearly, reference can be made to the schematic diagram shown in Fig. 8.
Fig. 8 is a schematic diagram of the principle of a machine learning model prediction method provided by an embodiment of this specification.
The required machine learning model is obtained based on the training processes of Fig. 3 and Fig. 4 and the machine learning model training shown in Fig. 6. Further, the main process of predicting the data to be predicted with the trained machine learning model may include:
inputting a data matrix to be predicted, and processing the data matrix to be predicted with the trained deep neural network; extracting the features output by the high hidden layer of the deep neural network as the knowledge representation, where the number of columns p of the knowledge representation is smaller than the number of columns d of the data matrix to be predicted; and then inputting the knowledge representation into the machine learning model (corresponding to the model shown in Fig. 8) to obtain the prediction result for the data to be predicted.
Converting the data to be predicted into the knowledge representation by means of the deep neural network reduces the dimensionality of the features of the data to be predicted, which helps to reduce the cost of storing the knowledge representation and also helps to improve the privacy protection of the data to be predicted.
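A minimal end-to-end sketch of this prediction flow, under the same illustrative assumptions as the earlier snippets (Keras-style extractor knowledge_extractor, scikit-learn model ml_model, synthetic data), might look as follows:

    import numpy as np

    # Data matrix to be predicted: m rows, d columns (synthetic values for illustration).
    m, d = 10, 50
    X_to_predict = np.random.default_rng(1).normal(size=(m, d)).astype("float32")

    # Step 1: convert the data to be predicted into its knowledge representation (m x p, p < d).
    repr_to_predict = knowledge_extractor.predict(X_to_predict, verbose=0)

    # Step 2: feed the knowledge representation, not the raw data, into the trained model.
    predictions = ml_model.predict(repr_to_predict)
    print(predictions)  # prediction results for the data to be predicted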
Based on the same idea, an embodiment of this specification further provides a knowledge representation apparatus, as shown in Fig. 9. The apparatus may include:
a first input module 101, which inputs raw data into a deep neural network; and
a first obtaining module 102, which obtains features output by a hidden layer of the trained deep neural network for the raw data, as the knowledge representation corresponding to the raw data, where the knowledge representation is used to train a machine learning model, or is used by a trained machine learning model to perform data prediction.
Further, the hidden layer is specifically a high hidden layer, and the high hidden layer is the last hidden layer of the deep neural network.
The high hidden layer can be the last of the multiple hidden layers in the deep neural network, that is, the hidden layer closest to the output layer.
Further, the deep neural network is obtained as follows, which may specifically include:
the first input module 101 inputs training data into a specified neural network and trains it; the training data is in matrix form, where a row of the matrix represents a training sample and a column represents a feature or a label of each training sample.
The knowledge representation corresponding to the raw data is a matrix whose number of columns is smaller than the number of columns of the training data matrix.
Since the knowledge representation is obtained by applying complex, nonlinear processing to the training data with the deep neural network, and is a matrix whose number of columns (the number of columns indicates the feature dimensionality) is smaller than the number of columns of the training data matrix, the knowledge representation is obtained by reducing the dimensionality of the training data, which has the advantages of low storage cost and good confidentiality.
Based on the same idea, an embodiment of this specification further provides a machine learning model training apparatus, as shown in Fig. 10.
Fig. 10 is a schematic structural diagram of a machine learning model training apparatus provided by an embodiment of this specification. The apparatus may specifically include:
a second obtaining module 201, which obtains a knowledge representation corresponding to raw data, where the knowledge representation corresponding to the raw data is obtained from features output by a hidden layer of a deep neural network for the raw data; and
a second training module 202, which trains a machine learning model by using the knowledge representation corresponding to the raw data.
The second obtaining module 201 replaces the raw data with the knowledge representation obtained through the deep neural network (corresponding to the first training data described above). This knowledge representation provides better privacy protection than the raw data and avoids exposing the raw data directly to the party training the machine learning model.
Further, the hidden layer is specifically a high hidden layer, and further, the high hidden layer is the last hidden layer of the deep neural network.
In this embodiment of this specification, after the machine learning model has been trained using the knowledge representation corresponding to the raw data, the trained machine learning model can be used to perform data prediction. An apparatus for performing data prediction may include the following modules:
a first prediction obtaining module, which obtains data to be predicted;
a first prediction processing module, which inputs the data to be predicted into the deep neural network for processing;
a first prediction extraction module, which obtains features output by a hidden layer of the deep neural network for the data to be predicted, as the knowledge representation corresponding to the data to be predicted; and
a first prediction output module, which inputs the knowledge representation corresponding to the data to be predicted into the trained machine learning model to obtain a prediction result.
The knowledge representation is obtained by applying complex, nonlinear processing to the data to be predicted with the deep neural network, and can therefore provide a good privacy protection effect.
Based on the same idea, an embodiment of this specification further provides a machine learning model prediction apparatus; specific reference can be made to Fig. 11.
Fig. 11 is a schematic structural diagram of a machine learning model prediction apparatus provided by an embodiment of this specification, which may specifically include:
a second prediction obtaining module 301, which obtains data to be predicted;
a second prediction processing module 302, which inputs the data to be predicted into a deep neural network for processing, where the deep neural network is obtained by training and is used to generate the knowledge representation corresponding to the data to be predicted;
a second prediction extraction module 303, which obtains features output by a hidden layer of the deep neural network for the data to be predicted, as the knowledge representation corresponding to the data to be predicted; and
a second prediction output module 304, which inputs the knowledge representation corresponding to the data to be predicted into a machine learning model to obtain a prediction result, where the machine learning model is obtained by training with the knowledge representation output by the deep neural network.
Further, the hidden layer is specifically a high hidden layer.
Further, the high hidden layer is the last of the multiple hidden layers in the deep neural network.
The high hidden layer can be the last of the multiple hidden layers in the deep neural network, or can be another hidden layer, such as the second-to-last hidden layer among the multiple hidden layers of the deep neural network. Generally, the more hidden layers there are, the better the effect of training the machine learning model.
Based on the same idea, an embodiment of this specification further provides an electronic device, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor, where
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to:
input raw data into a deep neural network; and
obtain features output by a hidden layer of the trained deep neural network for the raw data, as the knowledge representation corresponding to the raw data, where the knowledge representation is used to train a machine learning model, or is used by a trained machine learning model to perform data prediction.
Based on the same idea, an embodiment of this specification further provides another electronic device, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor, where
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to:
obtain a knowledge representation corresponding to raw data, where the knowledge representation corresponding to the raw data is obtained from features output by a hidden layer of a deep neural network for the raw data; and
train a machine learning model by using the knowledge representation corresponding to the raw data.
Based on the same idea, an embodiment of this specification further provides yet another electronic device, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor, where
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to:
obtain data to be predicted;
input the data to be predicted into a deep neural network used for knowledge representation for processing, where the deep neural network is obtained by training with training data;
obtain features output by a hidden layer of the deep neural network for the data to be predicted, as the knowledge representation corresponding to the data to be predicted; and
input the knowledge representation corresponding to the data to be predicted into a machine learning model to obtain a prediction result, where the machine learning model is obtained by training with the knowledge representation of the training data produced by the deep neural network.
The specific embodiments of this specification have been described above. Other embodiments fall within the scope of the appended claims. In some cases, the actions or steps recited in the claims can be performed in an order different from that in the embodiments and can still achieve the desired results. In addition, the processes depicted in the accompanying drawings do not necessarily require the particular order shown, or a sequential order, to achieve the desired results. In some implementations, multitasking and parallel processing are also possible or may be advantageous.
The embodiments in this specification are described in a progressive manner; for the same or similar parts among the embodiments, reference may be made to each other, and each embodiment focuses on the differences from the other embodiments. In particular, the embodiments of apparatuses, electronic devices and non-volatile computer storage media are described relatively simply because they are substantially similar to the method embodiments; for relevant parts, reference may be made to the description of the method embodiments.
The apparatuses, electronic devices and non-volatile computer storage media provided by the embodiments of this specification correspond to the methods, and therefore also have beneficial technical effects similar to those of the corresponding methods. Since the beneficial technical effects of the methods have been described in detail above, the beneficial technical effects of the corresponding apparatuses, electronic devices and non-volatile computer storage media are not repeated here.
In the 1990s, an improvement in a technology could clearly be distinguished as an improvement in hardware (for example, an improvement in circuit structures such as diodes, transistors and switches) or an improvement in software (an improvement in a method flow). However, with the development of technology, improvements of many method flows today can be regarded as direct improvements of hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be implemented with a hardware entity module. For example, a programmable logic device (PLD) (such as a field programmable gate array (FPGA)) is an integrated circuit whose logical functions are determined by the user's programming of the device. A designer programs to "integrate" a digital system onto a single PLD, without asking a chip manufacturer to design and fabricate a dedicated integrated circuit chip. Moreover, nowadays, instead of manually making integrated circuit chips, this kind of programming is mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development, and the source code before compilation also has to be written in a specific programming language, which is called a hardware description language (HDL). There is not only one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used. Those skilled in the art should also understand that a hardware circuit implementing a logical method flow can easily be obtained merely by slightly logically programming the method flow using the above hardware description languages and programming it into an integrated circuit.
A controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller or an embedded microcontroller. Examples of the controller include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20 and Silicone Labs C8051F320. A memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art also know that, in addition to implementing the controller purely by means of computer-readable program code, it is entirely possible to logically program the method steps so that the controller implements the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, such a controller may be regarded as a hardware component, and the devices included in it for implementing various functions may also be regarded as structures within the hardware component. Or even, the devices for implementing various functions may be regarded both as software modules implementing the method and as structures within the hardware component.
The systems, apparatuses, modules or units described in the above embodiments may be implemented by computer chips or entities, or by products having certain functions. A typical implementation device is a computer. Specifically, the computer may be, for example, a personal computer, a laptop computer, a cellular phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an e-mail device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above apparatuses are described by dividing their functions into various units. Of course, when one or more embodiments of this specification are implemented, the functions of the units may be implemented in one or more pieces of software and/or hardware.
Those skilled in the art should understand that the embodiments of this specification may be provided as a method, a system or a computer program product. Therefore, the embodiments of this specification may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the embodiments of this specification may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage and the like) containing computer-usable program code.
This specification is described with reference to flowcharts and/or block diagrams of the methods, devices (systems) and computer program products according to the embodiments of this specification. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or the other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or another programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operational steps are performed on the computer or the other programmable device to produce computer-implemented processing, and thus the instructions executed on the computer or the other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces and memory.
The memory may include a non-permanent memory, a random access memory (RAM) and/or a non-volatile memory in a computer-readable medium, such as a read-only memory (ROM) or a flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and information storage can be implemented by any method or technology. The information may be computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
It should also be noted that the terms "include", "comprise" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not expressly listed, or also includes elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article or device that includes the element.
This specification may be described in the general context of computer-executable instructions executed by a computer, such as program modules. Generally, program modules include routines, programs, objects, components, data structures and the like that perform particular tasks or implement particular abstract data types. This specification may also be practiced in distributed computing environments in which tasks are performed by remote processing devices connected through a communication network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including storage devices.
The embodiments in this specification are described in a progressive manner; for the same or similar parts among the embodiments, reference may be made to each other, and each embodiment focuses on the differences from the other embodiments. In particular, the system embodiments are described relatively simply because they are substantially similar to the method embodiments; for relevant parts, reference may be made to the description of the method embodiments.
The above descriptions are merely embodiments of this specification and are not intended to limit this application. For those skilled in the art, various modifications and changes may be made to this application. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of this application shall fall within the scope of the claims of this application.

Claims (17)

1. A knowledge representation method, comprising:
inputting raw data into a deep neural network; and
obtaining features output by a hidden layer of the trained deep neural network for the raw data, as a knowledge representation corresponding to the raw data, wherein the knowledge representation is used to train a machine learning model, or is used by a trained machine learning model to perform data prediction.
2. The method according to claim 1, wherein the hidden layer is specifically a high hidden layer.
3. The method according to claim 2, wherein the high hidden layer is the last hidden layer of the deep neural network.
4. The method according to claim 1, wherein the deep neural network is obtained as follows:
inputting training data into a neural network and training it, wherein the training data is in matrix form, a row of the matrix represents a training sample, and a column represents a feature or a label of each training sample.
5. A machine learning model training method, comprising:
obtaining a knowledge representation corresponding to raw data, wherein the knowledge representation corresponding to the raw data is obtained from features output by a hidden layer of a deep neural network for the raw data; and
training a machine learning model by using the knowledge representation corresponding to the raw data.
6. The method according to claim 5, wherein after the training of the machine learning model by using the knowledge representation corresponding to the raw data, the method further comprises:
obtaining data to be predicted;
inputting the data to be predicted into the deep neural network for processing;
obtaining features output by a hidden layer of the deep neural network for the data to be predicted, as a knowledge representation corresponding to the data to be predicted; and
inputting the knowledge representation corresponding to the data to be predicted into the trained machine learning model to obtain a prediction result.
7. A machine learning model prediction method, comprising:
obtaining data to be predicted;
inputting the data to be predicted into a deep neural network used for knowledge representation for processing, wherein the deep neural network is obtained by training with training data;
obtaining features output by a hidden layer of the deep neural network for the data to be predicted, as a knowledge representation corresponding to the data to be predicted; and
inputting the knowledge representation corresponding to the data to be predicted into a machine learning model to obtain a prediction result, wherein the machine learning model is obtained by training with the knowledge representation of the training data produced by the deep neural network.
8. A knowledge representation apparatus, comprising:
a first input module, which inputs raw data into a deep neural network; and
a first obtaining module, which obtains features output by a hidden layer of the trained deep neural network for the raw data, as a knowledge representation corresponding to the raw data, wherein the knowledge representation is used to train a machine learning model, or is used by a trained machine learning model to perform data prediction.
9. The apparatus according to claim 8, wherein the hidden layer is specifically a high hidden layer.
10. The apparatus according to claim 9, wherein the high hidden layer is the last hidden layer of the deep neural network.
11. The apparatus according to claim 8, wherein the deep neural network is obtained as follows:
the first input module inputs training data into a neural network and trains it, wherein the training data is in matrix form, a row of the matrix represents a training sample, and a column represents a feature or a label of each training sample.
12. A machine learning model training apparatus, comprising:
a second obtaining module, which obtains a knowledge representation corresponding to raw data, wherein the knowledge representation corresponding to the raw data is obtained from features output by a hidden layer of a deep neural network for the raw data; and
a second training module, which trains a machine learning model by using the knowledge representation corresponding to the raw data.
13. The apparatus according to claim 12, wherein after the machine learning model is trained by using the knowledge representation corresponding to the raw data, the apparatus further comprises:
a first prediction obtaining module, which obtains data to be predicted;
a first prediction processing module, which inputs the data to be predicted into the deep neural network for processing;
a first prediction extraction module, which obtains features output by a hidden layer of the deep neural network for the data to be predicted, as a knowledge representation corresponding to the data to be predicted; and
a first prediction output module, which inputs the knowledge representation corresponding to the data to be predicted into the trained machine learning model to obtain a prediction result.
14. A machine learning model prediction apparatus, comprising:
a second prediction obtaining module, which obtains data to be predicted;
a second prediction processing module, which inputs the data to be predicted into a deep neural network used for knowledge representation for processing, wherein the deep neural network is obtained by training with training data;
a second prediction extraction module, which obtains features output by a hidden layer of the deep neural network for the data to be predicted, as a knowledge representation corresponding to the data to be predicted; and
a second prediction output module, which inputs the knowledge representation corresponding to the data to be predicted into a machine learning model to obtain a prediction result, wherein the machine learning model is obtained by training with the knowledge representation of the training data produced by the deep neural network.
15. An electronic device, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor; wherein
the memory stores instructions executable by the at least one processor, and the instructions, when executed by the at least one processor, enable the at least one processor to:
input initial data into a deep neural network; and
obtain a feature output by a hidden layer of the trained deep neural network for the initial data, as the knowledge representation corresponding to the initial data, wherein the knowledge representation is used for training a machine learning model, or for performing data prediction with a trained machine learning model.
16. An electronic device, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor; wherein
the memory stores instructions executable by the at least one processor, and the instructions, when executed by the at least one processor, enable the at least one processor to:
obtain a knowledge representation corresponding to initial data, the knowledge representation corresponding to the initial data being obtained from a feature output by a hidden layer of a deep neural network for the initial data; and
train a machine learning model by using the knowledge representation corresponding to the initial data.
17. An electronic device, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor; wherein
the memory stores instructions executable by the at least one processor, and the instructions, when executed by the at least one processor, enable the at least one processor to:
obtain data to be predicted;
input the data to be predicted into a deep neural network used for knowledge representation for processing, the deep neural network being obtained by training with training data;
obtain a feature output by a hidden layer of the deep neural network for the data to be predicted, as the knowledge representation corresponding to the data to be predicted; and
input the knowledge representation corresponding to the data to be predicted into a machine learning model to obtain a prediction result, the machine learning model being trained by using the knowledge representations of the training data obtained with the deep neural network.
CN201710637074.XA 2017-07-31 2017-07-31 The representation of knowledge, machine learning model training, prediction technique, device and electronic equipment Pending CN109325508A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710637074.XA CN109325508A (en) 2017-07-31 2017-07-31 The representation of knowledge, machine learning model training, prediction technique, device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710637074.XA CN109325508A (en) 2017-07-31 2017-07-31 The representation of knowledge, machine learning model training, prediction technique, device and electronic equipment

Publications (1)

Publication Number Publication Date
CN109325508A 2019-02-12

Family

ID=65244748

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710637074.XA Pending CN109325508A (en) 2017-07-31 2017-07-31 The representation of knowledge, machine learning model training, prediction technique, device and electronic equipment

Country Status (1)

Country Link
CN (1) CN109325508A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150134583A1 (en) * 2013-11-14 2015-05-14 Denso Corporation Learning apparatus, learning program, and learning method
CN104298999A (en) * 2014-09-30 2015-01-21 西安电子科技大学 Hyperspectral feature leaning method based on recursion automatic coding
CN104331816A (en) * 2014-10-28 2015-02-04 常州大学 Knowledge learning and privacy protection based big-data user purchase intention predicating method
CN105426857A (en) * 2015-11-25 2016-03-23 小米科技有限责任公司 Training method and device of face recognition model
CN105718959A (en) * 2016-01-27 2016-06-29 中国石油大学(华东) Object identification method based on own coding
CN106682606A (en) * 2016-12-23 2017-05-17 湘潭大学 Face recognizing method and safety verification apparatus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhao Tingting: "Research on Video Encryption and Compression Technology Based on Neural Networks", China Excellent Master's and Doctoral Dissertations Full-text Database (Master's), Information Science and Technology Series *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113557536A (en) * 2019-04-25 2021-10-26 欧姆龙株式会社 Learning system, data generation device, data generation method, and data generation program
CN113557536B (en) * 2019-04-25 2024-05-31 欧姆龙株式会社 Learning system, data generation device, data generation method, and storage medium
CN112100645A (en) * 2019-06-18 2020-12-18 中国移动通信集团浙江有限公司 Data processing method and device
CN111429215A (en) * 2020-03-18 2020-07-17 北京互金新融科技有限公司 Data processing method and device
CN111429215B (en) * 2020-03-18 2023-10-31 北京互金新融科技有限公司 Data processing method and device
CN111461328A (en) * 2020-04-03 2020-07-28 南京星火技术有限公司 Neural network training method and electronic equipment
CN111461328B (en) * 2020-04-03 2024-04-30 南京星火技术有限公司 Training method of neural network
CN111753878A (en) * 2020-05-20 2020-10-09 济南浪潮高新科技投资发展有限公司 Network model deployment method, equipment and medium
CN114418128A (en) * 2022-03-25 2022-04-29 新华三人工智能科技有限公司 Model deployment method and device

Similar Documents

Publication Publication Date Title
CN109325508A (en) The representation of knowledge, machine learning model training, prediction technique, device and electronic equipment
CN108170667A (en) Term vector processing method, device and equipment
CN107957831A (en) A kind of data processing method, device and processing equipment for showing interface content
CN107391526A (en) A kind of data processing method and equipment based on block chain
CN109214193B (en) Data encryption and machine learning model training method and device and electronic equipment
CN107957989A (en) Term vector processing method, device and equipment based on cluster
CN109271587A (en) A kind of page generation method and device
CN110378400A (en) A kind of model training method and device for image recognition
CN108021610A (en) Random walk, random walk method, apparatus and equipment based on distributed system
CN108874765A (en) Term vector processing method and processing device
CN109389412A (en) A kind of method and device of training pattern
CN108537085A (en) A kind of barcode scanning image-recognizing method, device and equipment
CN108491468A (en) A kind of document processing method, device and server
CN107122632A (en) The encryption method and device of software installation bag
CN110119507A (en) Term vector generation method, device and equipment
CN106611401A (en) Method and device for storing image in texture memory
CN109241749A (en) Data encryption, machine learning model training method, device and electronic equipment
CN109800582A (en) Multiparty data processing method, device and the equipment that can be traced to the source
CN109739474A (en) A kind of processing method of service request, device, equipment and medium
CN107613046A (en) Filter pipe-line system, image processing method, device and electronic equipment
CN108255471A (en) A kind of system configuration item configuration device based on configuration external member, method and apparatus
CN107766703A (en) Watermark addition processing method, device and client
CN110502614A (en) Text hold-up interception method, device, system and equipment
CN107025259A (en) A kind of deployment method of details page, equipment and mobile terminal
CN109656946A (en) A kind of multilist relation query method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20191209

Address after: P.O. Box 31119, Grand Pavilion, Hibiscus Way, 802 West Bay Road, Grand Cayman, KY1-1205, Cayman Islands

Applicant after: Advanced New Technologies Co., Ltd.

Address before: Fourth Floor, One Capital Place, P.O. Box 847, Grand Cayman, Cayman Islands

Applicant before: ALIBABA GROUP HOLDING Ltd.

REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40004017

Country of ref document: HK

RJ01 Rejection of invention patent application after publication

Application publication date: 20190212