CN109740743A - Hierarchical neural network query recommendation method and device - Google Patents
- Publication number
- CN109740743A (application CN201910255010.2A)
- Authority
- CN
- China
- Prior art keywords
- neural network
- layer neural
- state vector
- session
- moment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The invention discloses a hierarchical neural network query recommendation method and device. The hierarchical neural network query recommendation method comprises the following steps: establishing two neural networks, namely a session-layer neural network for modeling the short-term query records of a user and a user-layer neural network for modeling the long-term query records of the user, wherein the state vector of the session-layer neural network at the current moment is used as the input of the user-layer neural network at the current moment, and the state vector of the user-layer neural network at the current moment is used as the input of the session-layer neural network at the next moment; and outputting query recommendation content for a query session according to the session-layer neural network and the user-layer neural network. The scheme provided by the invention can improve the efficiency and accuracy of query recommendation.
Description
Technical field
The present invention relates to the technical field of computer networks, and in particular to a hierarchical neural network query recommendation method and device.
Background art
In the prior art, there are various query recommendation methods. Query recommendation means that, during information retrieval, when the user has entered only part of a query keyword, the system predicts the user's query intent and recommends a group of candidate query words for the user to choose from. Query recommendation helps users revise the search keywords they input, so that they can find the required content more accurately, improving user satisfaction. Most traditional query recommendation methods are based on feature mining, using features such as the co-occurrence degree and the semantic similarity between queries. However, such features are usually defined and computed manually, so some hidden features are likely to remain undiscovered.
There is also a prior-art query recommendation method based on RNN (Recurrent Neural Networks), which can model time-series data well and thus predict the next likely query term. However, this method focuses only on the current short-term query session and does not take the user's historical query records into account, so the efficiency and accuracy of its query recommendation remain to be improved.
Summary of the invention
In view of this, an object of the present invention is to propose a hierarchical neural network query recommendation method and device that can improve the efficiency and accuracy of query recommendation.
According to an aspect of the present invention, a hierarchical neural network query recommendation method is provided, comprising:
establishing two neural networks, a session-layer neural network for modeling the user's short-term query records and a user-layer neural network for modeling the user's long-term query records, wherein the state vector of the session-layer neural network at the current moment is used as the input of the user-layer neural network at the current moment, and the state vector of the user-layer neural network at the current moment is used as the input of the session-layer neural network at the next moment;
and outputting query recommendation content for a query session according to the session-layer neural network and the user-layer neural network.
Preferably, the method further includes:
combining the values of the state vectors in the session-layer neural network with different weights into a new session-layer state vector;
and the outputting of query recommendation content for a query session according to the session-layer neural network and the user-layer neural network comprises:
outputting query recommendation content for the query session according to the session-layer neural network with the updated state vector and the user-layer neural network.
Preferably, using the state vector of the session-layer neural network at the current moment as the input of the user-layer neural network at the current moment, and using the state vector of the user-layer neural network at the current moment as the input of the session-layer neural network at the next moment, comprises:
using the state vector of the session-layer neural network at moment t as the input of the user-layer neural network at moment t, using the state vector of the user-layer neural network at moment t as the input of the session-layer neural network at moment t+1, and using the state vector of the session-layer neural network at moment t+1 as the input of the user-layer neural network at moment t+1, where t is greater than or equal to 0.
Preferably, using the state vector of the user-layer neural network at moment t as the input of the session-layer neural network at moment t+1, and using the state vector of the session-layer neural network at moment t+1 as the input of the user-layer neural network at moment t+1, comprises:
using the state vector of the user-layer neural network at moment t to initialize the first hidden-layer state vector of the session-layer neural network at moment t+1; and using the last hidden-layer state vector of the session-layer neural network at moment t+1 as the input of the user-layer neural network at moment t+1.
Preferably, using the state vector of the user-layer neural network at moment t to initialize the first hidden-layer state vector of the session-layer neural network at moment t+1 comprises:
using the state vector of the last hidden layer of the user-layer neural network at moment t to initialize the first hidden-layer state vector of the session-layer neural network at moment t+1.
Preferably, combining the values of the state vectors in the session-layer neural network with different weights into a new session-layer state vector comprises:
combining the values of the state vectors of all hidden layers in the session-layer neural network with different weights into a new session-layer state vector.
According to another aspect of the present invention, a hierarchical neural network query recommendation device is provided, comprising:
a hierarchical neural network establishing module, configured to establish two neural networks, a session-layer neural network and a user-layer neural network, wherein the state vector of the session-layer neural network at the current moment is used as the input of the user-layer neural network at the current moment, and the state vector of the user-layer neural network at the current moment is used as the input of the session-layer neural network at the next moment;
and a recommendation output module, configured to output query recommendation content for a query session according to the session-layer neural network and the user-layer neural network established by the hierarchical neural network establishing module.
Preferably, the device further includes:
an attention mechanism processing module, configured to combine the values of the state vectors in the session-layer neural network established by the hierarchical neural network establishing module with different weights into a new session-layer state vector;
wherein the recommendation output module outputs query recommendation content for the query session according to the session-layer neural network with the updated state vector and the user-layer neural network.
Preferably, the hierarchical neural network establishing module uses the state vector of the session-layer neural network at moment t as the input of the user-layer neural network at moment t, uses the state vector of the user-layer neural network at moment t as the input of the session-layer neural network at moment t+1, and uses the state vector of the session-layer neural network at moment t+1 as the input of the user-layer neural network at moment t+1, where t is greater than or equal to 0.
Preferably, the attention mechanism processing module combines the values of the state vectors of all hidden layers in the session-layer neural network with different weights into a new session-layer state vector.
It can be seen that the technical solution of the embodiments of the present invention establishes two neural networks: a session-layer neural network for modeling the user's short-term query records and a user-layer neural network for modeling the user's long-term query records, wherein the state vector of the session-layer neural network at the current moment is used as the input of the user-layer neural network at the current moment, and the state vector of the user-layer neural network at the current moment is used as the input of the session-layer neural network at the next moment. When a query session is received, query recommendation content can be output for it according to the session-layer neural network and the user-layer neural network. The session-layer neural network can model the user's short-term query records, the user-layer neural network can model the user's long-term query records, and the inputs and outputs of the two networks establish an association. In this way, not only the current short-term query records but also the user's long-term query records are taken into account, and the state vectors of the two neural networks are linked closely together and considered jointly, improving the efficiency and accuracy of query recommendation.
Further, the embodiments of the present invention may introduce an attention mechanism, combining the values of the state vectors in the session-layer neural network with different weights into a new session-layer state vector, so that the attention mechanism captures the user's preference information better.
Brief description of the drawings
The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description of exemplary embodiments of the disclosure with reference to the accompanying drawings, in which the same reference numerals generally denote the same components.
Fig. 1 is a schematic flowchart of a hierarchical neural network query recommendation method according to an embodiment of the present invention;
Fig. 2 is another schematic flowchart of a hierarchical neural network query recommendation method according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of an NQS model according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of an HNQS model according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of an AHNQS model according to an embodiment of the present invention;
Fig. 6 shows the performance of each model on query sessions of different lengths under the Recall metric according to an embodiment of the present invention;
Fig. 7 shows the performance of each model on query sessions of different lengths under the MRR metric according to an embodiment of the present invention;
Fig. 8 is a schematic block diagram of a hierarchical neural network query recommendation device according to an embodiment of the present invention;
Fig. 9 is a schematic block diagram of another hierarchical neural network query recommendation device according to an embodiment of the present invention.
Detailed description of the embodiments
To make the objectives, technical solutions and advantages of the present invention clearer, the present invention is described in further detail below with reference to specific embodiments and the accompanying drawings.
Although preferred embodiments of the disclosure are shown in the drawings, it should be understood that the disclosure may be implemented in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided so that the disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The present invention provides a hierarchical neural network query recommendation method, specifically a hierarchical neural network query recommendation method based on an attention mechanism, which can improve the efficiency and accuracy of query recommendation.
Embodiments of the technical solution of the present invention are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of a hierarchical neural network query recommendation method according to an embodiment of the present invention. The method can be applied in a hierarchical neural network query recommendation device.
Referring to Fig. 1, the method comprises:
In step 101, two neural networks are established: a session-layer neural network for modeling the user's short-term query records and a user-layer neural network for modeling the user's long-term query records, wherein the state vector of the session-layer neural network at the current moment is used as the input of the user-layer neural network at the current moment, and the state vector of the user-layer neural network at the current moment is used as the input of the session-layer neural network at the next moment.
In this step, the state vector of the session-layer neural network at moment t may be used as the input of the user-layer neural network at moment t, the state vector of the user-layer neural network at moment t as the input of the session-layer neural network at moment t+1, and the state vector of the session-layer neural network at moment t+1 as the input of the user-layer neural network at moment t+1, where t is greater than or equal to 0.
The state vector of the user-layer neural network at moment t may be used to initialize the first hidden-layer state vector of the session-layer neural network at moment t+1, and the last hidden-layer state vector of the session-layer neural network at moment t+1 is used as the input of the user-layer neural network at moment t+1.
Specifically, the state vector of the last hidden layer of the user-layer neural network at moment t may be used to initialize the first hidden-layer state vector of the session-layer neural network at moment t+1.
In step 102, query recommendation content is output for a query session according to the session-layer neural network and the user-layer neural network.
It can be seen that the technical solution of the embodiments of the present invention establishes two neural networks: a session-layer neural network for modeling the user's short-term query records and a user-layer neural network for modeling the user's long-term query records, wherein the state vector of the session-layer neural network at the current moment is used as the input of the user-layer neural network at the current moment, and the state vector of the user-layer neural network at the current moment is used as the input of the session-layer neural network at the next moment. When a query session is received, query recommendation content can be output for it according to the session-layer neural network and the user-layer neural network. The session-layer neural network can model the user's short-term query records, the user-layer neural network can model the user's long-term query records, and the inputs and outputs of the two networks establish an association. In this way, the present solution considers not only the current short-term query records but also the user's long-term query records, and links the state vectors of the two neural networks closely together for joint consideration, improving the efficiency and accuracy of query recommendation.
Fig. 2 is another schematic flowchart of a hierarchical neural network query recommendation method according to an embodiment of the present invention. Fig. 2 describes the solution of the present invention in more detail than Fig. 1. The method can be applied in a hierarchical neural network query recommendation device.
The present invention proposes a hierarchical neural network query recommendation method based on an attention mechanism. The method involves two neural networks: one is called the session-layer neural network (or simply the session layer), and the other is called the user-layer neural network (or simply the user layer). The session-layer neural network targets the user's short-term query records and can be used to model them; the user-layer neural network targets the user's long-term query records and can be used to model them. A typical session-layer neural network may include an input layer, hidden layers, an output layer, and so on, where the input layer is the coded representation of each query in the query session, and the state vectors of the hidden layers are calculated from the input layer.
The state vector of the session-layer neural network (which characterizes the short-term query session) can be used as the input of the user-layer neural network; likewise, the state vector of the user-layer neural network (which characterizes the user) can be used as the input of the session-layer neural network. Specifically, the state vector of the session-layer neural network at the current moment can be used as the input of the user-layer neural network at the current moment, and the state vector of the user-layer neural network at the current moment as the input of the session-layer neural network at the next moment. For example, the state vector of the session-layer neural network at moment t can be used as the input of the user-layer neural network at moment t, the state vector of the user-layer neural network at moment t as the input of the session-layer neural network at moment t+1, and the state vector of the session-layer neural network at moment t+1 as the input of the user-layer neural network at moment t+1. The state vector of the user-layer neural network at moment t can be used to initialize the first hidden-layer state vector of the session-layer neural network at moment t+1, and the last hidden-layer state vector of the session-layer neural network at moment t+1 is used as the input of the user-layer neural network at moment t+1, where t is greater than or equal to 0. Here, a state vector is the value of the state variables at a certain moment, called the state of the system at that moment. The value of the state variables at moment t=0 is called the original or initial state of the system, also referred to as the initial state vector; for example, the original state of the present system can refer to the vector value of the first hidden layer of the session-layer neural network (i.e. the state vector at moment t=0).
In addition, the present invention uses an attention mechanism to better capture the user's preference information. With the rise of deep learning, neural networks based on attention mechanisms have become a hot topic of recent neural network research. The attention mechanism is a strategy that was first proposed in the field of visual images. Its idea is to raise the weight of useful information, so that the task processing system focuses more on finding the information in the input data that is significantly relevant to the current output, thereby improving the quality of the output. The attention mechanism is also a resource allocation model: it imitates how the human brain works by concentrating more resources on important content. The present invention has found that, within a query session, different queries express different user intents and carry different weights in expressing the user's interests; for example, a query clicked by the user may reveal the user's query intent better. The present invention uses the attention mechanism to assign different weights to the different queries in a session, so that the hidden vectors of the session can be combined with different weights into a new state vector of the query session.
According to experiments conducted on the AOL (America Online) search log dataset, the method of the present invention outperforms the existing RNN-based query recommendation method, improving by 21.86% on the MRR metric (Mean Reciprocal Rank: for each evaluated query, the reciprocal of the rank at which the correct answer appears in the result list is taken as its accuracy, and the values are averaged over all queries) and by 22.99% on the Recall metric; for short query sessions in particular, the improvement is more pronounced.
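The two evaluation metrics can be computed as follows. This is a generic sketch of Recall@k and MRR@k for query recommendation, not code from the patent; the cutoff k and the representation of the ranked list are assumptions:

```python
def recall_at_k(ranked, target, k=10):
    """Recall@k for one query: 1 if the ground-truth next query
    appears among the top-k recommendations, else 0."""
    return 1.0 if target in ranked[:k] else 0.0

def mrr_at_k(ranked, target, k=10):
    """Reciprocal rank of the ground-truth query within the top-k
    recommendations (0 if it is absent)."""
    for rank, query in enumerate(ranked[:k], start=1):
        if query == target:
            return 1.0 / rank
    return 0.0
```

Averaging these per-query values over a test set yields dataset-level Recall and MRR figures of the kind reported above.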
Referring to Fig. 2, the method comprises:
In step 201, two neural networks, a session-layer neural network and a user-layer neural network, are established, wherein the state vector of the session-layer neural network at the current moment is used as the input of the user-layer neural network at the current moment, and the state vector of the user-layer neural network at the current moment is used as the input of the session-layer neural network at the next moment.
The prior-art query recommendation method based on RNN (Recurrent Neural Networks) is mainly a query recommendation method based on a session-layer neural network. A typical session-layer neural network may include an input layer, hidden layers, an output layer, and so on; the input layer is the coded representation of each query in the query session; the state vectors of the hidden layers are calculated from the input layer, as shown in formula (1); the output layer is calculated from the hidden layers, as shown in formula (5). Here, a session refers to a sequence containing the queries in a period of time, such as query 1, query 2, ..., query n, while the session layer refers to a neural network. In general, if the time between two consecutive queries does not exceed a set interval, e.g. 30 minutes, they are considered to belong to the same query session.
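The 30-minute session boundary described above can be sketched as follows. The patent gives no code, so this is a hedged illustration; the layout of the input as time-ordered (timestamp, query) pairs is an assumption:

```python
from datetime import datetime, timedelta

def split_sessions(queries, gap=timedelta(minutes=30)):
    """Group a user's time-ordered (timestamp, query) pairs into sessions:
    a gap longer than `gap` between consecutive queries starts a new session."""
    sessions = []
    for ts, q in queries:
        # Continue the current session if this query follows the previous
        # one within the allowed gap; otherwise open a new session.
        if sessions and ts - sessions[-1][-1][0] <= gap:
            sessions[-1].append((ts, q))
        else:
            sessions.append([(ts, q)])
    return sessions
```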
Referring to Fig. 3, which is a schematic diagram of an NQS (Neural Query Suggestion, a query recommendation based on a neural network) model according to an embodiment of the present invention: the session-layer neural network comprises, from bottom to top, an input layer (Input layer), a hidden layer (Hidden layer) and an output layer (Output layer). Suppose a query session contains n queries; each query in the session is encoded in 1-of-N form to generate the state vector that is input to the RNN. The state vector of the hidden layer can be calculated by formula (1):

$$h_n = (1 - z_n)\,h_{n-1} + z_n\,\hat{h}_n \qquad (1)$$

where:

$$z_n = \sigma(W_z x_n + U_z h_{n-1}) \qquad (2)$$
$$r_n = \sigma(W_r x_n + U_r h_{n-1}) \qquad (3)$$
$$\hat{h}_n = \tanh(W x_n + U(r_n \odot h_{n-1})) \qquad (4)$$

Here $h_n$ denotes the state vector of the hidden layer, $h_{n-1}$ the state vector of the previous hidden layer, $z_n$ the update weight, $\hat{h}_n$ the activated candidate state vector, $r_n$ the forget (reset) weight, $x_n$ the n-th input query, $W_z, W_r, W$ the weights applied to $x_n$, $U_z, U_r, U$ the weights applied to $h_{n-1}$, and $\sigma$ the Sigmoid activation function. The Sigmoid function, also called the S-shaped growth curve, is often used as a threshold function of neural networks in information science because both it and its inverse are monotonically increasing; it maps a variable to the interval between 0 and 1. $\tanh(\cdot)$ is the hyperbolic tangent function.
The GRU (Gated Recurrent Unit) functions can be denoted RNN_session and RNN_user respectively, and the state vector of the session-layer neural network is denoted $S$. The output of this neural network is the predicted score of the next query recommendation item:

$$\hat{y}_t = g(W_{out}\,h_{n,t}) \qquad (5)$$

where $\hat{y}_t$ denotes the value of the output layer of the session-layer neural network, i.e. the score of the next query recommendation item, $g(\cdot)$ denotes the tanh activation function, and $h_{n,t}$ denotes the state vector of the n-th hidden layer of the session-layer neural network at moment t.
A pairwise comparison loss function can be selected, because it makes recommendation items with relatively high scores more likely to be ranked accurately at the front of the list, improving user satisfaction. In recommender systems, the cross-entropy and TOP1 loss functions are mostly used, and TOP1 performs better, so the following loss function Loss can be used:

$$Loss = \frac{1}{N_S}\sum_{j=1}^{N_S}\left(\sigma(\hat{s}_j - \hat{s}_i) + \sigma(\hat{s}_j^{\,2})\right) \qquad (6)$$

where $N_S$ denotes the number of negative samples, and $\hat{s}_j$ and $\hat{s}_i$ denote the score of a negative sample and the score of the correct item, respectively.
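Formula (6) can be sketched in a few lines; this is a minimal pure-Python illustration of the TOP1 pairwise loss, assuming the scores are given as plain floats:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def top1_loss(pos_score, neg_scores):
    """TOP1 pairwise ranking loss: pushes the positive item's score above
    each sampled negative score, with a regularizing second term on the
    negative scores themselves."""
    n = len(neg_scores)
    return sum(sigmoid(s_neg - pos_score) + sigmoid(s_neg ** 2)
               for s_neg in neg_scores) / n
```

As expected of a ranking loss, the value drops as the positive score rises above the negatives.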
Compared with the prior-art query recommendation method based on RNN, the present invention proposes a user-session RNN query recommendation method.
In the present invention, the state vector of the last hidden layer of each session-layer neural network at the current moment can be used as the input of the user-layer neural network at the current moment; the state vector of the last hidden layer of the user-layer neural network at the current moment can also be used to initialize the state vector of the first hidden layer of the session-layer neural network at the next moment. For example, the state vector of the last hidden layer of the user-layer neural network at moment t is used to initialize the first hidden-layer state vector of the session-layer neural network at moment t+1.
The state vector of the hidden layer in the user-layer neural network is calculated as shown in formula (7); the state vector of the first hidden layer of the session-layer neural network is calculated as shown in formula (8).
The prior-art session-based RNN query recommendation method models only the user's short-term query records and does not model the user's long-term retrieval records; the present invention therefore proposes a hierarchical NQS method, i.e. HNQS (Hierarchical Neural Query Suggestion, a query recommendation model based on a hierarchical neural network). Referring to Fig. 4, which shows a schematic diagram of an HNQS model according to an embodiment of the present invention: the lower part is the session-layer neural network (Session-level RNN) and the upper part is the user-layer neural network (User-level RNN). Fig. 4 includes the two neural networks; the lower part of the figure contains the session-layer neural networks at moments t, t+1 and t+2, and the upper part contains the user-layer neural networks at moments t and t+1. The state vector of the session-layer neural network at moment t is used as the input of the user-layer neural network at moment t; the state vector of the user-layer neural network at moment t is then used as the input of the session-layer neural network at moment t+1; and the state vector of the session-layer neural network at moment t+1 is used as the input of the user-layer neural network at moment t+1. Here, the state vector of the user-layer neural network at moment t can be used to initialize the first hidden-layer state vector of the session-layer neural network at moment t+1, and the last hidden-layer state vector of the session-layer neural network at moment t+1 is used as the input of the user-layer neural network at moment t+1. Similarly, the state vector of the session-layer neural network at moment t+1 is used as the input of the user-layer neural network at moment t+1, the state vector of the user-layer neural network at moment t+1 as the input of the session-layer neural network at moment t+2, and so on.
For the session-layer neural network, the query words in the query session are encoded and then fed into the session-layer neural network as its input for calculation. For the user-layer neural network, the last hidden-layer vector of the session-layer neural network at moments t, t+1 and t+2, i.e. the session-layer state vector, is used in turn as input, and the output is the last hidden-layer vector of the user-layer neural network, i.e. the user-layer state vector. As for the connection between the two layers, the session-layer state vector at moment t can be used as the input of the user-layer neural network at moment t, and the user-layer state vector at moment t as the input of the session-layer neural network at moment t+1.
Referring to Fig. 4, for the query session at moment t+1, the modeling process of the two neural networks can be as follows. The state vector Ut of the user-layer neural network at moment t (i.e. the last hidden-layer vector of the user-layer neural network at moment t, ht,u = Ut) and the coding of the first query word q1,t+1 of the query session at moment t+1 are used as input to calculate the first hidden-layer vector of the session-layer neural network; then the codings of the queries in the query session are combined in turn to calculate the remaining hidden-layer vectors. When the last query word of the query session has been incorporated into the calculation, the last hidden-layer vector of the corresponding session-layer neural network is obtained; this vector can be called the session-layer state vector St+1, which is then input into the user-layer neural network at moment t+1. Combined with the state vector Ut of the user-layer neural network at moment t (i.e. ht,u = Ut), the user-layer state vector Ut+1 at moment t+1 is calculated and generated (i.e. the last hidden-layer vector of the user-layer neural network at moment t+1, ht+1,u = Ut+1). In this way, the input of one query session and the modeling process of the two neural networks are completed; subsequently received query sessions follow the same process.
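The session-to-user handoff walked through above can be sketched as follows. The `ToyRNN` cell is a stand-in (a plain tanh recurrence with scalar state, not the patent's GRU), used only to show the data flow: the user state initializes each new session's first hidden state, and each session's last hidden state updates the user state:

```python
import math

class ToyRNN:
    """Stand-in recurrent cell: h' = tanh(w_in*x + w_rec*h).
    Illustrative only; the patent's networks use GRU layers with vectors."""
    def __init__(self, w_in=0.5, w_rec=0.5):
        self.w_in, self.w_rec = w_in, w_rec

    def step(self, x, h):
        return math.tanh(self.w_in * x + self.w_rec * h)

    def init_from_user(self, u, w=0.5, b=0.0):
        # Session h0 derived from the user state, in the spirit of
        # h0 = tanh(W*u + b).
        return math.tanh(w * u + b)

def run_user(sessions, session_rnn, user_rnn, u0=0.0):
    """Process a user's sessions in order: each session starts from a
    state initialized by the user vector; its final hidden state then
    drives the user-level update. Returns the final user state."""
    u = u0
    for session in sessions:
        h = session_rnn.init_from_user(u)
        for query in session:          # queries as pre-encoded scalars here
            h = session_rnn.step(query, h)
        u = user_rnn.step(h, u)        # user-level update from session state
    return u
```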
The input of the user-layer neural network is the state vector of the session-layer neural network, and the user-layer neural network is updated as:

$$c_n^u = \mathrm{RNN}_{user}(s_n, c_{n-1}^u) \qquad (7)$$

where $c_n^u$ denotes the value of the state vector of the n-th hidden layer of the user-layer neural network, $s_n$ denotes the state vector value of the n-th session-layer neural network, $u$ denotes user u, and $\mathrm{RNN}_{user}(\cdot)$ denotes the user-layer neural network.
The present invention inputs the state vector of the user-layer neural network into the session-layer neural network, which can be used to introduce the user's long-term retrieval preference information:

$$h_{0,t+1} = \tanh(W\,U_t + b) \qquad (8)$$

where $h_{0,t+1}$ denotes the state vector value of the first hidden layer of the session-layer neural network at moment t+1, $U_t$ denotes the user-layer neural network state vector, $W$ denotes a weight vector, and $b$ denotes a bias vector.
In step 202, the values of the state vectors of all hidden layers in the session-layer neural network are combined with different weights into a new session-layer state vector.
The present invention further establishes an HNQS model based on the attention mechanism, namely AHNQS (Attention-based Hierarchical Neural Query Suggestion, a query recommendation model based on the attention mechanism and a hierarchical neural network), which combines the values of the state vectors of all hidden layers in the session-layer neural network with different weights into a new session-layer state vector, which is then used as the input of the user-layer neural network.
In the present scheme, it is considered that, within a query session, different queries express different user intents and also carry different weights when expressing user interests; for example, the queries a user clicks may better reveal the user's query intent. Through the attention mechanism, the present invention assigns different weights to different queries in the session layer neural network, so that the state vectors of all hidden layers in the session layer network can be combined, with different weights, into a new session layer state vector. Referring to Fig. 5, a schematic diagram of the AHNQS model according to an embodiment of the invention: the lower part is the session layer neural network (Session-level RNN) and the upper part is the user layer neural network (User-level RNN), where the session layer network introduces user attention (User Attention). As shown in Fig. 5:
The update of the user layer neural network with the user state information is as follows:

ht,u = RNNuser(St, ht-1,u)

where the session layer state vector St is the attention-weighted combination

St = Σj αt,j · hj,t,  with  αt,j = exp(et,j) / Σk exp(et,k)  and  et,j = ht-1,u · hj,t

Here St denotes the state vector value of the session layer neural network at time t, hj,t denotes the state value of the j-th hidden layer of the session layer network at time t, αt,j denotes the weight value of the j-th hidden layer, and et,j is an intermediate parameter. In this way, the parameters are continuously updated through the loss function. ht,u denotes the t-th hidden layer state vector of the user layer neural network for user u, ht-1,u denotes the hidden layer state vector of user u at time t-1, and exp(·) denotes the exponential function. Through the weight values αt,j, the present invention combines the state vectors of all hidden layers of the session layer network into the new session layer state vector.
In step 203, query recommendation content is output for the query session according to the session layer neural network with the updated session layer state vector and the user layer neural network.

After the values of the state vectors of all hidden layers in the session layer network have been assigned different weights and combined into a new session layer state vector, the result is again used as the input of the user layer network, so the state vectors of the session layer and the user layer neural networks change accordingly. Outputting query recommendation content according to these updated networks yields more accurate query recommendations.

The present scheme examines the HNQS and AHNQS models through experiments to judge whether the two improve the effect of query recommendation, that is, whether the layering and the attention mechanism are helpful to the recommendation effect. In addition, the recommendation effect may also change as the length of the query session varies.
In combination with the foregoing, the query recommendation models examined in the invention include: (1) the ADJ model (the original co-occurrence based ranking, i.e. the original query recommendation based on co-occurrence degree); (2) the NQS model; (3) the HNQS model; (4) the AHNQS model.

The data set and the specific parameter settings are shown in Tables 1 and 2.
Table 1
Model | Batch size | Dropout rate | Learning rate | Momentum |
---|---|---|---|---|
NQS | 50 | 0.5 | 0.01 | 0.0 |
HNQS | 50 | 0.1 | 0.1 | 0.0 |
AHNQS | 50 | 0.1 | 0.1 | 0.0 |
Table 2
The present invention measures the effect of all models using the MRR and Recall indices. The following conclusions are drawn from the experimental results:

1) Overall performance

The effects of all models are shown in Table 3, in which a statistical significance test has been carried out on the best model effect (marked ▲):
Model | Recall@10 | MRR@10 |
---|---|---|
ADJ | .7072 | .6922 |
NQS | .6444 | .6148 |
HNQS | .8138▲ | .7874▲ |
AHNQS | .8618▲ | .8514▲ |
Table 3
According to the above experimental results, AHNQS attains the largest values and therefore the best results, followed by HNQS and ADJ, with NQS worst. On the Recall and MRR indices respectively, AHNQS is 21.86% and 22.99% higher than ADJ, while it is only 5.9% and 8.13% higher than HNQS; this further shows that the attention mechanism of the invention is helpful for improving the query recommendation effect.
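The Recall@10 and MRR@10 indices used above can be computed as in the following sketch; the list-of-suggestions inputs are illustrative.

```python
def recall_at_k(ranked_lists, targets, k=10):
    """Fraction of test cases whose target next query appears in the top-k suggestions."""
    hits = sum(1 for ranked, t in zip(ranked_lists, targets) if t in ranked[:k])
    return hits / len(targets)

def mrr_at_k(ranked_lists, targets, k=10):
    """Mean reciprocal rank of the target next query within the top-k (0 when absent)."""
    total = 0.0
    for ranked, t in zip(ranked_lists, targets):
        top = ranked[:k]
        if t in top:
            total += 1.0 / (top.index(t) + 1)  # rank is 1-based
    return total / len(targets)
```

Recall rewards any top-k hit equally, while MRR additionally rewards placing the correct suggestion near the top, which is why the two indices can rank models differently.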
To further study the influence of session length on the method of the present invention, query sessions are divided into short, medium and long, and the different model methods are tested on the different session lengths; the results are shown in Fig. 6 and Fig. 7. Fig. 6 shows the test effect of each model for different session lengths under the Recall index according to an embodiment of the invention; Fig. 7 shows the same under the MRR index. In Fig. 6 and Fig. 7, for each session length (short, medium and long), the four bars correspond to ADJ, NQS, HNQS and AHNQS respectively.

It can be seen that AHNQS attains the highest values and performs best at every length, followed by HNQS. On the MRR index, however, the improvement of AHNQS over HNQS is larger than on the Recall index, which further demonstrates that the attention mechanism proposed by the method of the present invention clearly helps the accuracy of query recommendation.
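Bucketing sessions into short, medium and long can be sketched as follows; the cut-off values used here are illustrative, since the exact boundaries are not stated above.

```python
def bucket_by_length(sessions, short_max=2, medium_max=4):
    """Split query sessions into short/medium/long buckets by the number of
    queries they contain; per-bucket metrics can then be computed separately."""
    buckets = {"short": [], "medium": [], "long": []}
    for s in sessions:
        if len(s) <= short_max:
            buckets["short"].append(s)
        elif len(s) <= medium_max:
            buckets["medium"].append(s)
        else:
            buckets["long"].append(s)
    return buckets
```

Evaluating each bucket separately is what produces the per-length bar groups of Figs. 6 and 7.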
The hierarchical neural network query recommendation method of the invention based on the attention mechanism has been described in detail above; the corresponding query recommendation apparatus and device of the invention are introduced below.
Fig. 8 is a schematic block diagram of a hierarchical neural network query recommendation apparatus according to an embodiment of the invention.
Referring to Fig. 8, a hierarchical neural network query recommendation apparatus 80 comprises: a hierarchical neural network establishing module 81, a recommendation output module 82 and an attention mechanism processing module 83.

The hierarchical neural network establishing module 81 is used for establishing two neural networks, namely a session layer neural network for modeling the user's short-term query records and a user layer neural network for modeling the user's long-term query records, wherein the state vector of the session layer neural network at the current moment is used as the input of the user layer neural network at the current moment, and the state vector of the user layer neural network at the current moment is used as the input of the session layer neural network at the next moment.

The recommendation output module 82 is used for outputting query recommendation content for the query session according to the session layer neural network and the user layer neural network established by the hierarchical neural network establishing module 81.

The attention mechanism processing module 83 is used for combining, with different weights, the values of the state vectors in the session layer neural network established by the hierarchical neural network establishing module 81 into a new session layer neural network state vector; the recommendation output module 82 then outputs query recommendation content for the query session according to the session layer neural network with the updated state vector and the user layer neural network.
The hierarchical neural network establishing module 81 may use the state vector of the session layer neural network at time t as the input of the user layer neural network at time t, use the state vector of the user layer neural network at time t as the input of the session layer neural network at time t+1, and use the state vector of the session layer neural network at time t+1 as the input of the user layer neural network at time t+1, where t is greater than or equal to 0.

The state vector of the user layer neural network at time t may be used to initialize the first hidden layer state vector of the session layer neural network at time t+1, and the last hidden layer state vector of the session layer neural network at time t+1 may be used as the input of the user layer neural network at time t+1.

Specifically, the state vector of the last hidden layer of the user layer neural network at time t may be used to initialize the first hidden layer state vector of the session layer neural network at time t+1.

The attention mechanism processing module 83 may combine the values of the state vectors of all hidden layers in the session layer neural network, with different weights, into the new session layer neural network state vector.
Fig. 9 is a schematic block diagram of a hierarchical neural network query recommendation apparatus according to an embodiment of the invention.
Referring to Fig. 9, a hierarchical neural network query recommendation apparatus 90 comprises a processor 91 and a memory 92.

The processor 91 establishes two neural networks, namely a session layer neural network for modeling the user's short-term query records and a user layer neural network for modeling the user's long-term query records, wherein the state vector of the session layer neural network at the current moment is used as the input of the user layer neural network at the current moment, and the state vector of the user layer neural network at the current moment is used as the input of the session layer neural network at the next moment; it then outputs query recommendation content for the query session according to the session layer neural network and the user layer neural network.

The memory 92 stores the query recommendation content output by the processor 91.
An embodiment of the present invention also provides a non-transitory machine-readable storage medium on which executable code is stored; when the executable code is executed by the processor of an electronic device, the processor is made to perform the following method:

establishing two neural networks, namely a session layer neural network for modeling the user's short-term query records and a user layer neural network for modeling the user's long-term query records, wherein the state vector of the session layer neural network at the current moment is used as the input of the user layer neural network at the current moment, and the state vector of the user layer neural network at the current moment is used as the input of the session layer neural network at the next moment;

outputting query recommendation content for the query session according to the session layer neural network and the user layer neural network.
In conclusion the present invention provides a kind of, the hierarchical neural network based on attention mechanism inquires recommended method,
The mechanism of middle layering includes the neural network of a session layer neural network and the neural network of a client layer neural network, meeting
Session layer neural network can be used for the record modeling of the short-term inquiry to user, and client layer neural network can be used for the length to user
Phase inquiry is modeled.Attention mechanism proposed by the present invention can preferably capture the information of the preference of user.The present invention exists
It is tested on AOL data set, the experimental results showed that method of the invention is than existing inquiry recommendation side neural network based
Method will be got well, and 21.86% and 22.99% have been respectively increased in MRR index and Recall index, especially to short inquiry session, effect
Fruit is improved more obvious.
The technical scheme of the present invention has been described above in detail with reference to the accompanying drawings.

In addition, the method according to the present invention may also be implemented as a computer program or computer program product comprising computer program code instructions for executing the above steps defined in the above method of the invention.

Alternatively, the present invention may also be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) on which executable code (or a computer program, or computer instruction code) is stored; when the executable code (or computer program, or computer instruction code) is executed by a processor of an electronic device (or computing device, server, etc.), the processor is made to execute each step of the above method according to the invention.
Those skilled in the art will also understand that the various illustrative logical blocks, modules, circuits and algorithm steps described in conjunction with the disclosure herein may be implemented as electronic hardware, computer software, or a combination of both.

It should be understood by those of ordinary skill in the art that the above are only specific embodiments of the present invention and are not intended to limit it; any modification, equivalent substitution, improvement, etc. made within the spirit and principles of the present invention shall be included within its protection scope.
Claims (10)
1. A hierarchical neural network query recommendation method, characterized by comprising:
establishing two neural networks, namely a session layer neural network for modeling a user's short-term query records and a user layer neural network for modeling the user's long-term query records, wherein the state vector of the session layer neural network at the current moment is used as the input of the user layer neural network at the current moment, and the state vector of the user layer neural network at the current moment is used as the input of the session layer neural network at the next moment;
outputting query recommendation content for a query session according to the session layer neural network and the user layer neural network.
2. The method according to claim 1, characterized in that the method further comprises:
combining the values of the state vectors in the session layer neural network, with different weights, into a new session layer neural network state vector;
and that said outputting query recommendation content for a query session according to the session layer neural network and the user layer neural network comprises:
outputting query recommendation content for the query session according to the session layer neural network with the updated session layer state vector and the user layer neural network.
3. The method according to claim 1, characterized in that using the state vector of the session layer neural network at the current moment as the input of the user layer neural network at the current moment, and the state vector of the user layer neural network at the current moment as the input of the session layer neural network at the next moment, comprises:
using the state vector of the session layer neural network at time t as the input of the user layer neural network at time t, the state vector of the user layer neural network at time t as the input of the session layer neural network at time t+1, and the state vector of the session layer neural network at time t+1 as the input of the user layer neural network at time t+1, wherein t is greater than or equal to 0.
4. The method according to claim 3, characterized in that using the state vector of the user layer neural network at time t as the input of the session layer neural network at time t+1, and the state vector of the session layer neural network at time t+1 as the input of the user layer neural network at time t+1, comprises:
using the state vector of the user layer neural network at time t to initialize the first hidden layer state vector of the session layer neural network at time t+1; and using the last hidden layer state vector of the session layer neural network at time t+1 as the input of the user layer neural network at time t+1.
5. The method according to claim 4, characterized in that using the state vector of the user layer neural network at time t to initialize the first hidden layer state vector of the session layer neural network at time t+1 comprises:
using the state vector of the last hidden layer of the user layer neural network at time t to initialize the first hidden layer state vector of the session layer neural network at time t+1.
6. The method according to claim 2, characterized in that combining the values of the state vectors in the session layer neural network, with different weights, into a new session layer neural network state vector comprises:
combining the values of the state vectors of all hidden layers in the session layer neural network, with different weights, into the new session layer neural network state vector.
7. A hierarchical neural network query recommendation apparatus, characterized by comprising:
a hierarchical neural network establishing module, for establishing two neural networks, namely a session layer neural network for modeling a user's short-term query records and a user layer neural network for modeling the user's long-term query records, wherein the state vector of the session layer neural network at the current moment is used as the input of the user layer neural network at the current moment, and the state vector of the user layer neural network at the current moment is used as the input of the session layer neural network at the next moment;
a recommendation output module, for outputting query recommendation content for a query session according to the session layer neural network and the user layer neural network established by the hierarchical neural network establishing module.
8. The apparatus according to claim 7, characterized in that the apparatus further comprises:
an attention mechanism processing module, for combining the values of the state vectors in the session layer neural network established by the hierarchical neural network establishing module, with different weights, into a new session layer neural network state vector;
the recommendation output module outputting query recommendation content for the query session according to the session layer neural network with the updated session layer state vector and the user layer neural network.
9. The apparatus according to claim 7, characterized in that:
the hierarchical neural network establishing module uses the state vector of the session layer neural network at time t as the input of the user layer neural network at time t, the state vector of the user layer neural network at time t as the input of the session layer neural network at time t+1, and the state vector of the session layer neural network at time t+1 as the input of the user layer neural network at time t+1, wherein t is greater than or equal to 0.
10. The apparatus according to claim 8, characterized in that:
the attention mechanism processing module combines the values of the state vectors of all hidden layers in the session layer neural network, with different weights, into the new session layer neural network state vector.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2019102150519 | 2019-03-21 | ||
CN201910215051 | 2019-03-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109740743A true CN109740743A (en) | 2019-05-10 |
Family
ID=66371402
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910255010.2A Pending CN109740743A (en) | 2019-03-21 | 2019-04-01 | Hierarchical neural network query recommendation method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109740743A (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108320786A (en) * | 2018-02-06 | 2018-07-24 | 华南理工大学 | A kind of Chinese meal vegetable recommendation method based on deep neural network |
CN108959429A (en) * | 2018-06-11 | 2018-12-07 | 苏州大学 | A kind of method and system that the film merging the end-to-end training of visual signature is recommended |
CN109145213A (en) * | 2018-08-22 | 2019-01-04 | 清华大学 | Inquiry recommended method and device based on historical information |
Non-Patent Citations (1)
Title |
---|
WANYU CHEN ET AL.: "Attention-based Hierarchical Neural Query Suggestion", 《ARXIV》 * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110188283A (en) * | 2019-06-05 | 2019-08-30 | 中国人民解放军国防科技大学 | Information recommendation method and system based on joint neural network collaborative filtering |
CN110188283B (en) * | 2019-06-05 | 2021-11-23 | 中国人民解放军国防科技大学 | Information recommendation method and system based on joint neural network collaborative filtering |
CN112000873A (en) * | 2020-06-19 | 2020-11-27 | 南京理工大学 | Session-based recommendation system, method, device and storage medium |
CN112000873B (en) * | 2020-06-19 | 2022-10-18 | 南京理工大学 | Session-based recommendation system, method, device and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20190510 |