CN109002519A - Answer selection method, apparatus and electronic device based on a convolutional recurrent neural network - Google Patents

Answer selection method, apparatus and electronic device based on a convolutional recurrent neural network

Info

Publication number
CN109002519A
Authority
CN
China
Prior art keywords
data
answer
title
hidden state sequence
question
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810742597.5A
Other languages
Chinese (zh)
Inventor
杨鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Wisdom Technology Development Co Ltd
Original Assignee
Beijing Wisdom Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Wisdom Technology Development Co Ltd filed Critical Beijing Wisdom Technology Development Co Ltd
Priority to CN201810742597.5A priority Critical patent/CN109002519A/en
Publication of CN109002519A publication Critical patent/CN109002519A/en
Pending legal-status Critical Current


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/049 - Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

This application relates to an answer selection method, apparatus and electronic device based on a convolutional recurrent neural network. The method includes: obtaining question data and answer data that includes title data and abstract data; processing them with a bidirectional long short-term memory (LSTM) layer to obtain a question hidden-state sequence, a title hidden-state sequence and an abstract hidden-state sequence; obtaining a question feature representation, a title feature representation and an abstract feature representation with a convolutional neural network; performing a pooling operation on the question feature representation to obtain a final question representation, and performing a concatenation-and-pooling operation on the title and abstract feature representations to obtain a final answer representation; and computing the similarity between the final question representation and the final answer representation to obtain a relevance score. In this way, deep semantic information can be mined by the bidirectional LSTM network, and specific features can be extracted from the vectors by the convolutional neural network to determine the correlation between the question data and the answer data, improving the accuracy of answer selection.

Description

Answer selection method, apparatus and electronic device based on a convolutional recurrent neural network
Technical field
The present invention relates generally to the field of data processing and, more specifically, to an answer selection method, apparatus and electronic device based on a convolutional recurrent neural network.
Background
With the rapid development of key artificial intelligence technologies, artificial intelligence assistant applications have gradually matured, and major technology companies have successively released their own artificial intelligence assistant products. Benefiting from new techniques in speech recognition and natural language processing, users can interact with such products more naturally by voice.
In such interaction scenarios, a very important part is for the artificial intelligence product to answer questions posed by human users in natural language. Under this real-world, open-domain question-answering scenario, it is therefore desirable to be able to automatically search the vast knowledge of the real world to obtain answers corresponding to a question.
In addition, with the development of Internet technology, more and more people search for questions online and obtain answers related to those questions. Typically, users retrieve the questions they want answered through community question-answering systems (Community-based Question Answering) such as Baidu Knows, Zhihu and Stack Overflow, and obtain satisfactory answers related to those questions.
However, a search based on a question usually returns a large number of corresponding candidate answers; an improved answer selection scheme is therefore needed.
Summary of the invention
To solve the above technical problem, the present application is proposed. Embodiments of the application provide an answer selection method, apparatus and electronic device based on a convolutional recurrent neural network, which mine deep semantic information with a bidirectional long short-term memory network and extract specific features from the resulting vectors with a convolutional neural network to determine the correlation between question data and answer data, thereby improving the accuracy of answer selection.
According to one aspect of the application, an answer selection method based on a convolutional recurrent neural network is provided, comprising: obtaining question data and answer data corresponding to the question data, the answer data including title data and abstract data; processing word-vector representations of the question data, the title data and the abstract data with a bidirectional long short-term memory layer to obtain a question hidden-state sequence corresponding to the question data, a title hidden-state sequence corresponding to the title data and an abstract hidden-state sequence corresponding to the abstract data; obtaining a question feature representation, a title feature representation and an abstract feature representation from the question, title and abstract hidden-state sequences, respectively, with a convolutional neural network; performing a pooling operation on the question feature representation to obtain a final question representation, and performing a concatenation-and-pooling operation on the title feature representation and the abstract feature representation to obtain a final answer representation; and computing the similarity between the final question representation and the final answer representation to obtain a relevance score of the answer data relative to the question data.
In the above answer selection method based on a convolutional recurrent neural network, the method further comprises: ranking a plurality of answer data items corresponding to the question data based on a plurality of relevance scores of the plurality of answer data items.
In the above answer selection method based on a convolutional recurrent neural network, the plurality of answer data items are a plurality of candidate answer data items for the question data obtained from multiple search engines.
In the above answer selection method based on a convolutional recurrent neural network, obtaining question data and answer data corresponding to the question data comprises: performing text segmentation and stop-word removal on each candidate answer data item in the plurality of candidate answer data items.
In the above answer selection method based on a convolutional recurrent neural network, obtaining a question feature representation, a title feature representation and an abstract feature representation from the question, title and abstract hidden-state sequences with a convolutional neural network comprises: obtaining the question feature representation directly from the question hidden-state sequence with the convolutional neural network; averaging the question feature representation over the sequence length to obtain an update vector; performing a word-level attention update on the title hidden-state sequence and the abstract hidden-state sequence with the update vector to obtain an updated title hidden-state sequence and an updated abstract hidden-state sequence; and obtaining the title feature representation and the abstract feature representation from the updated title and abstract hidden-state sequences, respectively, with the convolutional neural network.
In the above answer selection method based on a convolutional recurrent neural network, performing a pooling operation on the question feature representation to obtain a final question representation, and performing a concatenation-and-pooling operation on the title feature representation and the abstract feature representation to obtain a final answer representation, comprises: performing max pooling on the question feature representation to obtain the final question representation; concatenating the title feature representation and the abstract feature representation; and performing max pooling on the concatenated title and abstract feature representations to obtain the final answer representation.
In the above answer selection method based on a convolutional recurrent neural network, computing the similarity between the final question representation and the final answer representation to obtain the relevance score of the answer data relative to the question data comprises: computing the vector cosine similarity between the final question representation and the final answer representation to obtain the relevance score of the answer data relative to the question data.
According to another aspect of the application, an answer selection apparatus based on a convolutional recurrent neural network is provided, comprising: a data acquisition unit for obtaining question data and answer data corresponding to the question data, the answer data including title data and abstract data; a recurrent network unit for processing word-vector representations of the question data, the title data and the abstract data with a bidirectional long short-term memory layer to obtain a question hidden-state sequence corresponding to the question data, a title hidden-state sequence corresponding to the title data and an abstract hidden-state sequence corresponding to the abstract data; a convolutional network unit for obtaining a question feature representation, a title feature representation and an abstract feature representation from the question, title and abstract hidden-state sequences, respectively, with a convolutional neural network; a pooling operation unit for performing a pooling operation on the question feature representation to obtain a final question representation and performing a concatenation-and-pooling operation on the title feature representation and the abstract feature representation to obtain a final answer representation; and a score computation unit for computing the similarity between the final question representation and the final answer representation to obtain a relevance score of the answer data relative to the question data.
In the above answer selection apparatus based on a convolutional recurrent neural network, the apparatus further comprises: a ranking unit for ranking a plurality of answer data items corresponding to the question data based on a plurality of relevance scores of the plurality of answer data items.
In the above answer selection apparatus based on a convolutional recurrent neural network, the plurality of answer data items are a plurality of candidate answer data items for the question data obtained from multiple search engines.
In the above answer selection apparatus based on a convolutional recurrent neural network, the data acquisition unit is configured to perform text segmentation and stop-word removal on each candidate answer data item in the plurality of candidate answer data items.
In the above answer selection apparatus based on a convolutional recurrent neural network, the apparatus further comprises an attention mechanism unit configured to: average the question feature representation over the sequence length to obtain an update vector; and perform a word-level attention update on the title hidden-state sequence and the abstract hidden-state sequence with the update vector to obtain an updated title hidden-state sequence and an updated abstract hidden-state sequence; and the convolutional network unit is configured to obtain the title feature representation and the abstract feature representation from the updated title and abstract hidden-state sequences, respectively, with the convolutional neural network.
In the above answer selection apparatus based on a convolutional recurrent neural network, the pooling operation unit is configured to: perform max pooling on the question feature representation to obtain the final question representation; concatenate the title feature representation and the abstract feature representation; and perform max pooling on the concatenated title and abstract feature representations to obtain the final answer representation.
In the above answer selection apparatus based on a convolutional recurrent neural network, the score computation unit is configured to compute the vector cosine similarity between the final question representation and the final answer representation to obtain the relevance score of the answer data relative to the question data.
According to yet another aspect of the application, an electronic device is provided, comprising: a processor; and a memory in which computer program instructions are stored, the computer program instructions, when run by the processor, causing the processor to perform the answer selection method based on a convolutional recurrent neural network as described above.
According to still another aspect of the application, a computer-readable storage medium is provided, on which computer program instructions are stored, the computer program instructions being operable, when executed by a computing device, to perform the answer selection method based on a convolutional recurrent neural network as described above.
The answer selection method, apparatus and electronic device based on a convolutional recurrent neural network provided by the present application can mine deep semantic information with a bidirectional long short-term memory network and extract specific features from the resulting vectors with a convolutional neural network to determine the correlation between question data and answer data, thereby improving the accuracy of answer selection.
Brief description of the drawings
These and/or other aspects and advantages of the present invention will become clearer and easier to understand from the following detailed description of embodiments of the present invention with reference to the accompanying drawings, in which:
Fig. 1 illustrates a flow chart of the answer selection method based on a convolutional recurrent neural network according to an embodiment of the present application.
Fig. 2 illustrates a flow chart of the working process of the convolutional neural network with the attention mechanism in the answer selection method based on a convolutional recurrent neural network according to an embodiment of the present application.
Fig. 3 illustrates a schematic flow chart of the overall process of the answer selection method based on a convolutional recurrent neural network according to an embodiment of the present application.
Fig. 4 illustrates a schematic diagram of an example of the convolutional recurrent neural network architecture according to an embodiment of the present application.
Fig. 5 illustrates a block diagram of the answer selection apparatus based on a convolutional recurrent neural network according to an embodiment of the present application.
Fig. 6 illustrates a block diagram of the electronic device according to an embodiment of the present application.
Detailed description of embodiments
Hereinafter, example embodiments according to the application will be described in detail with reference to the drawings. Obviously, the described embodiments are only some of the embodiments of the application rather than all of them, and it should be understood that the application is not limited by the example embodiments described herein.
Application overview
As described above, both artificial intelligence assistants and online question-answering systems need to select, for a particular question, the answer corresponding to it. Currently, obtaining an answer corresponding to a question is mainly realized through search engines: a series of relevant documents is first filtered out by a search engine, and the correct answer is then found among the search results the engine returns.
However, directly using search engines has two limitations. First, the task search engines are best at is the traditional information-retrieval query, which differs from question-answering search. Second, the ranking algorithms of commercial search engines are black boxes whose internals are unknown, so it is difficult to integrate and rank the results of multiple search engines with simple methods.
Therefore, to solve the above technical problem, embodiments of the present application provide an answer selection method, apparatus and electronic device based on a convolutional recurrent neural network. They first obtain question data and answer data including title data and abstract data, then process them through a bidirectional long short-term memory (LSTM) layer to obtain a question hidden-state sequence, a title hidden-state sequence and an abstract hidden-state sequence, then obtain a question feature representation, a title feature representation and an abstract feature representation with a convolutional neural network, perform a pooling operation on the question feature representation to obtain a final question representation and a concatenation-and-pooling operation on the title and abstract feature representations to obtain a final answer representation, and finally compute the similarity between the final question representation and the final answer representation to obtain a relevance score of the answer data relative to the question data. In this way, deep semantic information can be mined by the bidirectional LSTM network, and specific features can be extracted from the vectors by the convolutional neural network to determine the correlation between the question data and the answer data, improving the accuracy of answer selection.
Therefore, the answer selection method, apparatus and electronic device based on a convolutional recurrent neural network provided by the embodiments of the present application can make better use of the information from search engines; that is, they can effectively integrate the results of multiple search engines and score the search results according to the characteristics of question-answering tasks, so that higher-ranked search results better reflect the answer the user wants.
Here, those skilled in the art will understand that the answer selection method, apparatus and electronic device based on a convolutional recurrent neural network according to the embodiments of the present application can select answers by computing the relevance scores of multiple candidate answers corresponding to a question, and are not limited to using the candidate-answer ranking information provided by search engines. In addition, they can also be applied to scenarios other than artificial intelligence assistants and online question-answering systems in which answers relevant to a question need to be selected.
Having described the basic principle of the application, various non-limiting embodiments of the application will now be introduced with reference to the accompanying drawings.
Illustrative methods
Fig. 1 illustrates a flow chart of the answer selection method based on a convolutional recurrent neural network according to an embodiment of the present application.
As shown in Fig. 1, the answer selection method based on a convolutional recurrent neural network according to the embodiment of the present application includes: S110, obtaining question data and answer data corresponding to the question data, the answer data including title data and abstract data; S120, processing word-vector representations of the question data, the title data and the abstract data with a bidirectional long short-term memory layer to obtain a question hidden-state sequence corresponding to the question data, a title hidden-state sequence corresponding to the title data and an abstract hidden-state sequence corresponding to the abstract data; S130, obtaining a question feature representation, a title feature representation and an abstract feature representation from the question, title and abstract hidden-state sequences, respectively, with a convolutional neural network; S140, performing a pooling operation on the question feature representation to obtain a final question representation, and performing a concatenation-and-pooling operation on the title feature representation and the abstract feature representation to obtain a final answer representation; and S150, computing the similarity between the final question representation and the final answer representation to obtain a relevance score of the answer data relative to the question data.
In step S110, question data and answer data corresponding to the question data are obtained, the answer data including title data and abstract data. In the answer selection method based on a convolutional recurrent neural network according to the embodiment of the present application, the deep neural network model is extended to handle three text segments, namely the question, the title and the abstract, to improve the accuracy of answer selection.
Here, as described above, the answer data may be a set of candidate answers corresponding to the question, that is, multiple answers corresponding to a single question. Moreover, these answers may be obtained by searching for the question with different search engines.
Therefore, in the answer selection method based on a convolutional recurrent neural network according to the embodiment of the present application, the plurality of answer data items are a plurality of candidate answer data items for the question data obtained from multiple search engines.
In addition, after the candidate answer set is obtained, the answer data may be preprocessed, for example in two steps: text segmentation and stop-word removal.
That is, in the answer selection method based on a convolutional recurrent neural network according to the embodiment of the present application, obtaining question data and answer data corresponding to the question data includes: performing text segmentation and stop-word removal on each candidate answer data item in the plurality of candidate answer data items.
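As a concrete illustration of this preprocessing step, the short sketch below segments a candidate answer and removes stop words. It is only an assumed example: the patent publishes no code, the THULAC call follows that library's commonly documented usage, and the stop-word set is a placeholder.

```python
# Hypothetical preprocessing sketch (not from the patent): Chinese word
# segmentation with THULAC followed by stop-word removal.
import thulac

segmenter = thulac.thulac(seg_only=True)  # segmentation only, no POS tagging

def preprocess(text, stopwords):
    # cut(..., text=True) is assumed to return a space-separated token string
    tokens = segmenter.cut(text, text=True).split()
    return [tok for tok in tokens if tok not in stopwords]

stopwords = {"的", "了", "是"}  # toy stop-word set for illustration
print(preprocess("卷积循环神经网络可以用于答案选择", stopwords))
```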
In step S120, word-vector representations of the question data, the title data and the abstract data are processed by a bidirectional long short-term memory layer to obtain a question hidden-state sequence corresponding to the question data, a title hidden-state sequence corresponding to the title data and an abstract hidden-state sequence corresponding to the abstract data.
Here, those skilled in the art will understand that, to complete a natural language processing task, natural language must first be converted into a form a computing device can recognize, that is, mathematical symbols, which are usually vectors. In other words, after the question data, title data and abstract data are obtained, these data need to be converted into a representation the machine can recognize.
In particular, in the embodiment of the present application, the data may be converted into word vectors by a word embedding method to obtain word-vector representations of the data, where a word vector is a continuous, dense, low-dimensional distributed representation of a word. A word embedding method mathematizes natural language by mapping each word into a high-dimensional space and expressing it as a multi-dimensional vector. Through training, this representation maps each word to a K-dimensional real vector (K is typically 50, 100, etc.); each word is a point in the K-dimensional vector space, and the distance between words (e.g. Euclidean distance or cosine distance) represents their semantic similarity. For example, in the embodiment of the present application, pre-trained 100-dimensional (K=100) GloVe word vectors may be used as the word-vector representations of the question data, the title data and the abstract data, and during model training the word vectors are automatically updated according to the training task.
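For illustration, the sketch below shows how pre-trained 100-dimensional vectors could be loaded into a trainable embedding layer; the vocabulary size and the randomly generated weight matrix are stand-ins, not values from the patent.

```python
# Sketch: a trainable embedding layer initialized from pre-trained 100-dim
# word vectors (GloVe-style); the weight matrix here is a random placeholder.
import torch
import torch.nn as nn

vocab_size, embed_dim = 5000, 100
pretrained = torch.randn(vocab_size, embed_dim)  # stand-in for loaded GloVe weights
embedding = nn.Embedding.from_pretrained(pretrained, freeze=False)  # updated during training

token_ids = torch.tensor([[4, 12, 7, 0]])  # one tokenized, padded question
vectors = embedding(token_ids)
print(vectors.shape)  # torch.Size([1, 4, 100])
```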
After the question data, title data and abstract data are converted into word vectors, the corresponding hidden-state sequences are generated with a bidirectional LSTM layer. A bidirectional long short-term memory layer (bidirectional LSTM layer) is developed on the basis of the unidirectional LSTM network and contains LSTM layers in two opposite directions, a forward LSTM layer and a backward LSTM layer. The forward LSTM layer can exploit the complete future context of each word in the text sequence, and the backward LSTM layer can exploit the complete past context of each word; that is, the forward LSTM layer processes the sequence from front to back, and the backward LSTM layer processes it from back to front. At each time step, the outputs of the two LSTM models are concatenated as the overall output for that time step.
Those skilled in the art will understand that an LSTM is a special kind of RNN that handles longer sentences better than a plain RNN while performing no worse on short sentences. An RNN is a "self-looping" neural network commonly used in various natural language processing tasks; its basic unit, like a traditional neural network unit, produces an output for a given input, and the "self-loop" guarantees that sequence information is passed step by step to the next unit. The "self-loop" can also be viewed as replicating the same unit n times, each unit passing its information on to the next during iteration.
For an input text converted into word vectors {x_1, x_2, ..., x_n}, taking the input x_t at time t as an example, the LSTM computes its output as follows:
f_t = σ(W_f · [h_{t-1}; x_t] + b_f)
i_t = σ(W_i · [h_{t-1}; x_t] + b_i)
o_t = σ(W_o · [h_{t-1}; x_t] + b_o)
c̃_t = tanh(W_c · [h_{t-1}; x_t] + b_c)
c_t = f_t ⊙ c_{t-1} + i_t ⊙ c̃_t
h_t = o_t ⊙ tanh(c_t)
The output at the last time step is used as the representation of the entire sequence. That is, the hidden states output by the forward LSTM layer are [fh_1, fh_2, ..., fh_t], and similarly the hidden states output by the backward LSTM layer are [bh_1, bh_2, ..., bh_t]. Concatenating the hidden states of the two directions at each time step gives the output of the bidirectional LSTM: [h_1, h_2, ..., h_t] = [(fh_1, bh_1), (fh_2, bh_2), ..., (fh_t, bh_t)]. For example, taking the question data, let fh_t^q and bh_t^q denote the outputs of the forward and backward LSTM models for question q at time t; the output of the bidirectional LSTM model at time t is then h_t^q = [fh_t^q; bh_t^q].
It should be noted that, in the embodiment of the present application, there is no information flow between the hidden layers of the forward LSTM layer and the backward LSTM layer, that is, no data flows between the forward and backward LSTM layers, which guarantees that the unfolded graph of the bidirectional LSTM layer is acyclic. Meanwhile, those skilled in the art will understand that a bidirectional LSTM layer can effectively avoid vanishing and exploding gradients, so that the long-range dependencies in the text data can be better handled by the bidirectional LSTM layer, further improving the effectiveness of the model.
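A minimal PyTorch sketch of this bidirectional encoding step is shown below; the hidden size, batch size and sequence length are assumed for illustration and are not taken from the patent.

```python
# Sketch: encoding a word-vector sequence with a bidirectional LSTM layer.
# The forward and backward hidden states are concatenated at every time step.
import torch
import torch.nn as nn

embed_dim, hidden_dim = 100, 128
bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)

question_vecs = torch.randn(1, 20, embed_dim)  # (batch, seq_len, embed_dim)
hidden_states, _ = bilstm(question_vecs)       # fh_t and bh_t concatenated per step
print(hidden_states.shape)                     # torch.Size([1, 20, 256]) = 2 * hidden_dim
```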
In step S130, a question feature representation, a title feature representation and an abstract feature representation are obtained from the question hidden-state sequence, the title hidden-state sequence and the abstract hidden-state sequence, respectively, with a convolutional neural network.
In a convolutional neural network, the parameters of a convolutional layer consist of a set of small convolution kernels. The same kernel is moved over the input, and the dot product of the kernel and the corresponding positions is computed as the output; mathematically, this operation is called discrete convolution. Specifically, for a one-dimensional input {h_1, h_2, ..., h_n}, the output of the convolutional layer at position i is:
o_i = Σ_{j=-L}^{L} W_j · h_{i+j}
where W = {W_j} is the convolution kernel, whose size is 2L+1. Intuitively, these kernels learn to respond to specific signals during training and can therefore extract the specific feature signals contained in different parts of the input.
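The sketch below illustrates this discrete convolution over a hidden-state sequence with a 1-D convolutional layer; the number of kernels N_F, the kernel width and the tensor shapes are assumed values, not ones specified in the patent.

```python
# Sketch: extracting local features from a hidden-state sequence with N_F
# one-dimensional convolution kernels; padding preserves the sequence length.
import torch
import torch.nn as nn

seq_len, state_dim, num_filters, kernel_width = 20, 256, 300, 3
conv = nn.Conv1d(state_dim, num_filters, kernel_width, padding=kernel_width // 2)

hidden_states = torch.randn(1, seq_len, state_dim)  # output of the bidirectional LSTM
features = conv(hidden_states.transpose(1, 2))      # Conv1d expects (batch, channels, seq)
print(features.shape)                               # torch.Size([1, 300, 20])
```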
In the answer selection method based on a convolutional recurrent neural network according to the embodiment of the present application, to improve the accuracy of answer selection, the answer data may further be updated based on the question data using an attention mechanism.
That is, the output of the bidirectional LSTM model for the question data can be fed directly into the convolutional layer, while the title and abstract parts of the answer data are first updated with a word-level attention mechanism and only then fed into the convolutional layer.
For the question data, assume that the question feature representation output by the convolutional layer is {o_1^q, o_2^q, ..., o_L^q}, and that its average over the sequence length L is:
c_q = (1/L) Σ_{i=1}^{L} o_i^q
where each o_i^q is an N_F-dimensional vector and N_F denotes the number of convolution kernels in the convolutional layer. Next, the vector c_q is used to perform the word-level attention update on the title and abstract hidden-state sequences simultaneously. Taking the title hidden-state sequence h_{h,t} as an example, at time t the updated vector h̃_{h,t} is obtained by the following formulas:
m_{h,t} = tanh(W_1 h_{h,t} + W_2 c_q)
s_{h,t} ∝ exp(w^T m_{h,t})
h̃_{h,t} = s_{h,t} h_{h,t}
where W_1, W_2 and w are network parameters to be trained.
Therefore, in the answer selection method based on a convolutional recurrent neural network according to the embodiment of the present application, obtaining the question feature representation, the title feature representation and the abstract feature representation from the question, title and abstract hidden-state sequences with a convolutional neural network includes: obtaining the question feature representation directly from the question hidden-state sequence with the convolutional neural network; averaging the question feature representation over the sequence length to obtain an update vector; performing a word-level attention update on the title hidden-state sequence and the abstract hidden-state sequence with the update vector to obtain an updated title hidden-state sequence and an updated abstract hidden-state sequence; and obtaining the title feature representation and the abstract feature representation from the updated title and abstract hidden-state sequences, respectively, with the convolutional neural network.
Fig. 2 illustrates a flow chart of the working process of the convolutional neural network with the attention mechanism in the answer selection method based on a convolutional recurrent neural network according to the embodiment of the present application. As shown in Fig. 2, the working process of the convolutional neural network with the attention mechanism includes: S210, obtaining the question feature representation directly from the question hidden-state sequence with the convolutional neural network; S220, averaging the question feature representation over the sequence length to obtain an update vector; S230, performing a word-level attention update on the title hidden-state sequence and the abstract hidden-state sequence with the update vector to obtain an updated title hidden-state sequence and an updated abstract hidden-state sequence; and S240, obtaining the title feature representation and the abstract feature representation from the updated title and abstract hidden-state sequences, respectively, with the convolutional neural network.
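A sketch of this word-level attention update is given below. The projection sizes and the exact parameterization of W_1, W_2 and w are assumptions chosen to match the formulas above, not code from the patent.

```python
# Sketch: word-level attention update of the title/abstract hidden states
# with the averaged question feature vector c_q.
import torch
import torch.nn as nn
import torch.nn.functional as F

state_dim, feat_dim, attn_dim = 256, 300, 128
W1 = nn.Linear(state_dim, attn_dim, bias=False)
W2 = nn.Linear(feat_dim, attn_dim, bias=False)
w = nn.Linear(attn_dim, 1, bias=False)

def attention_update(states, c_q):
    # states: (batch, seq_len, state_dim), c_q: (batch, feat_dim)
    m = torch.tanh(W1(states) + W2(c_q).unsqueeze(1))  # m_{h,t}
    s = F.softmax(w(m).squeeze(-1), dim=-1)            # s_{h,t}, normalized weights
    return states * s.unsqueeze(-1)                    # re-weighted hidden states

title_states = torch.randn(2, 30, state_dim)
c_q = torch.randn(2, feat_dim)
print(attention_update(title_states, c_q).shape)       # torch.Size([2, 30, 256])
```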
In step S140, a pooling operation is performed on the question feature representation to obtain a final question representation, and a concatenation-and-pooling operation is performed on the title feature representation and the abstract feature representation to obtain a final answer representation.
The question feature representation corresponding to the question data is a group of vectors output by the convolutional neural network; for example, taking the maximum along the length dimension, that is, max pooling, yields a vector of length N_F as the final question representation o_q. For the title and abstract parts, two groups of vectors are obtained after the convolutional neural network, containing l_h and l_c vectors of length N_F respectively, where l_h and l_c are the sequence lengths of the title and the abstract. For example, these two parts can first be concatenated into one group of l_h + l_c vectors, and max pooling is then performed to obtain a vector of length N_F as the overall final answer representation o_a for the title and the abstract. Here, those skilled in the art will understand that, for the output of the convolutional neural network, other pooling methods such as average pooling may also be used in addition to max pooling.
Therefore, in the answer selection method based on a convolutional recurrent neural network according to the embodiment of the present application, performing a pooling operation on the question feature representation to obtain a final question representation, and performing a concatenation-and-pooling operation on the title feature representation and the abstract feature representation to obtain a final answer representation, includes: performing max pooling on the question feature representation to obtain the final question representation; concatenating the title feature representation and the abstract feature representation; and performing max pooling on the concatenated title and abstract feature representations to obtain the final answer representation.
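The following sketch shows the pooling and concatenation step on assumed feature shapes (N_F = 300 and illustrative sequence lengths).

```python
# Sketch: max-pool the question features over length; concatenate title and
# abstract features along the length axis, then max-pool to get the answer vector.
import torch

num_filters = 300
q_feat = torch.randn(1, num_filters, 20)  # question features  (batch, N_F, l_q)
h_feat = torch.randn(1, num_filters, 15)  # title features     (batch, N_F, l_h)
c_feat = torch.randn(1, num_filters, 60)  # abstract features  (batch, N_F, l_c)

o_q = q_feat.max(dim=2).values                               # final question representation
o_a = torch.cat([h_feat, c_feat], dim=2).max(dim=2).values   # final answer representation
print(o_q.shape, o_a.shape)                                  # (1, 300) (1, 300)
```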
In step S150, the similarity between the final question representation and the final answer representation is computed to obtain a relevance score of the answer data relative to the question data.
For example, the cosine similarity between the final question representation and the overall final answer representation of the title and abstract can be computed:
s = cos(o_q, o_a) = (o_q · o_a) / (||o_q|| ||o_a||)
and the resulting similarity value is used as the relevance score of the answer data relative to the question data, for example as the score of a search result obtained by searching for an answer based on the question.
Here, those skilled in the art will understand that the similarity between the final question vector and the final answer vector may also be expressed with other measures, and the embodiment of the present application is not intended to impose any limitation in this respect.
Therefore, in the answer selection method based on a convolutional recurrent neural network according to the embodiment of the present application, computing the similarity between the final question representation and the final answer representation to obtain the relevance score of the answer data relative to the question data includes: computing the vector cosine similarity between the final question representation and the final answer representation to obtain the relevance score of the answer data relative to the question data.
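A one-line scoring sketch follows, with random vectors standing in for the final representations.

```python
# Sketch: cosine similarity between the final question and answer vectors
# serves directly as the relevance score.
import torch
import torch.nn.functional as F

o_q = torch.randn(1, 300)
o_a = torch.randn(1, 300)
score = F.cosine_similarity(o_q, o_a, dim=1)
print(score.item())  # value in [-1, 1]; higher means more relevant
```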
After the relevance scores of the question data and the answer data are obtained, the answers can be ranked based on the relevance scores of the multiple answers corresponding to the same question, so that the top-ranked answer is the answer the user wants.
That is, the answer selection method based on a convolutional recurrent neural network according to the embodiment of the present application further includes: ranking the plurality of answer data items corresponding to the question data based on their relevance scores.
Fig. 3 illustrates a schematic flow chart of the overall process of the answer selection method based on a convolutional recurrent neural network according to the embodiment of the present application. As shown in Fig. 3, for an input question q, in step S310 the question is retrieved in multiple search engines to obtain a candidate answer set {a_i}. In step S320, the obtained candidate answer set {a_i} is preprocessed, including the two steps of Chinese text segmentation and stop-word removal, to obtain a pair (q, a_i) for each candidate answer a_i (including title h_h and abstract h_c). In step S330, the relevance score s_i between each candidate answer and the question is computed with the convolutional recurrent neural network model described above, so that any two candidate answers can be ordered by score. Finally, in step S340, the result set of candidate answers is sorted according to the relevance scores s_i computed in step S330 and output.
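A compact sketch of this overall ranking loop is given below; the `score_fn` stand-in replaces the trained convolutional recurrent model, whose output is assumed here to be the cosine-similarity relevance score.

```python
# Sketch of the overall pipeline of Fig. 3: score every (question, candidate)
# pair with a relevance function and return the candidates sorted by score.
def rank_candidates(score_fn, question, candidates):
    scored = [(score_fn(question, cand), cand) for cand in candidates]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return scored

# Usage with a toy word-overlap scorer standing in for the trained model:
def toy_score(question, candidate):
    return float(len(set(question.split()) & set(candidate["title"].split())))

candidates = [
    {"title": "an unrelated result", "abstract": "..."},
    {"title": "answer selection with neural networks", "abstract": "..."},
]
print(rank_candidates(toy_score, "neural answer selection", candidates))
```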
Fig. 4 illustrates a schematic diagram of an example of the convolutional recurrent neural network architecture according to the embodiment of the present application. As shown in Fig. 4, the core of the convolutional recurrent neural network is to model sentences with the long short-term memory model and the convolutional neural network from deep learning: the question and the answer are first modeled with a bidirectional long short-term memory model, the vector representations of the question and the answer are then obtained with a convolutional layer and a pooling layer on top of the LSTM outputs, and finally the similarity between the two vectors is computed, for example with cosine similarity. In addition, the convolutional recurrent neural network can further introduce a word-level attention mechanism that uses the question vector to update the answer vectors before they are fed into the convolutional neural network, thereby improving the accuracy of answer prediction.
Here, those skilled in the art will understand that the above convolutional recurrent neural network architecture can be implemented, for example, in the Python 3.6.3 programming language with third-party libraries such as PyTorch 0.3.1, NumPy 1.13.3, BeautifulSoup 4.6.0 and THULAC, and can run on systems such as Linux, Windows and Mac. By loading a pre-trained model, the full series of tasks of crawling search results relevant to a question, preprocessing them and outputting a ranking can be completed.
Exemplary apparatus
Fig. 5 illustrates a schematic block diagram of the answer selection apparatus based on a convolutional recurrent neural network according to the embodiment of the present application.
As shown in Fig. 5, the answer selection apparatus 400 based on a convolutional recurrent neural network according to the embodiment of the present application includes: a data acquisition unit 410 for obtaining question data and answer data corresponding to the question data, the answer data including title data and abstract data; a recurrent network unit 420 for processing word-vector representations of the question data, the title data and the abstract data with a bidirectional long short-term memory layer to obtain a question hidden-state sequence corresponding to the question data, a title hidden-state sequence corresponding to the title data and an abstract hidden-state sequence corresponding to the abstract data; a convolutional network unit 430 for obtaining a question feature representation, a title feature representation and an abstract feature representation from the question, title and abstract hidden-state sequences, respectively, with a convolutional neural network; a pooling operation unit 440 for performing a pooling operation on the question feature representation to obtain a final question representation and performing a concatenation-and-pooling operation on the title feature representation and the abstract feature representation to obtain a final answer representation; and a score computation unit 450 for computing the similarity between the final question representation and the final answer representation to obtain a relevance score of the answer data relative to the question data.
In one example, the above answer selection apparatus 400 based on a convolutional recurrent neural network further includes: a ranking unit for ranking a plurality of answer data items corresponding to the question data based on a plurality of relevance scores of the plurality of answer data items.
In one example, in the above answer selection apparatus 400 based on a convolutional recurrent neural network, the plurality of answer data items are a plurality of candidate answer data items for the question data obtained from multiple search engines.
In one example, in the above answer selection apparatus 400 based on a convolutional recurrent neural network, the data acquisition unit 410 is configured to perform text segmentation and stop-word removal on each candidate answer data item in the plurality of candidate answer data items.
In one example, the above answer selection apparatus 400 based on a convolutional recurrent neural network further includes an attention mechanism unit configured to: average the question feature representation over the sequence length to obtain an update vector; and perform a word-level attention update on the title hidden-state sequence and the abstract hidden-state sequence with the update vector to obtain an updated title hidden-state sequence and an updated abstract hidden-state sequence; and the convolutional network unit 430 is configured to obtain the title feature representation and the abstract feature representation from the updated title and abstract hidden-state sequences, respectively, with the convolutional neural network.
In one example, in the above answer selection apparatus 400 based on a convolutional recurrent neural network, the pooling operation unit 440 is configured to: perform max pooling on the question feature representation to obtain the final question representation; concatenate the title feature representation and the abstract feature representation; and perform max pooling on the concatenated title and abstract feature representations to obtain the final answer representation.
In one example, in the above answer selection apparatus 400 based on a convolutional recurrent neural network, the score computation unit 450 is configured to compute the vector cosine similarity between the final question representation and the final answer representation to obtain the relevance score of the answer data relative to the question data.
Here, those skilled in the art will understand that the specific functions and operations of the units and modules in the above answer selection apparatus 400 based on a convolutional recurrent neural network have already been described in detail in the description of the answer selection method based on a convolutional recurrent neural network given above with reference to Figs. 1 to 4, and repeated description is therefore omitted.
As described above, the answer selection apparatus 400 based on a convolutional recurrent neural network according to the embodiment of the present application can be implemented in various terminal devices, for example a server running a question-answering system or a hardware device on which an artificial intelligence assistant application is installed. In one example, the apparatus 400 according to the embodiment of the present application can be integrated into the terminal device as a software module and/or a hardware module; for example, the apparatus 400 can be a software module in the operating system of the terminal device, or an application developed for the terminal device; of course, the apparatus 400 can equally be one of the many hardware modules of the terminal device.
Alternatively, in another example, the answer selection apparatus 400 based on a convolutional recurrent neural network and the terminal device may also be separate devices, and the apparatus 400 may be connected to the terminal device through a wired and/or wireless network and transmit interaction information according to an agreed data format.
Example electronic device
Hereinafter, the electronic device according to the embodiment of the present application is described with reference to Fig. 6.
Fig. 6 illustrates a block diagram of the electronic device according to the embodiment of the present application.
As shown in Fig. 6, the electronic device 10 includes one or more processors 11 and a memory 12.
The processor 11 may be a central processing unit (CPU) or another form of processing unit having data processing capability and/or instruction execution capability, and may control other components in the electronic device 10 to perform desired functions.
The memory 12 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random access memory (RAM) and/or cache memory. The non-volatile memory may include, for example, read-only memory (ROM), hard disks, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium, and the processor 11 may run the program instructions to implement the functions of the answer selection method based on a convolutional recurrent neural network of the embodiments of the present application described above. Various contents such as question data and candidate answer data may also be stored in the computer-readable storage medium.
In one example, the electronic device 10 may further include an input device 13 and an output device 14, these components being interconnected by a bus system and/or other forms of connection mechanisms (not shown).
For example, the input device 13 may be a keyboard, a mouse, or the like.
The output device 14 may output various information to the outside, including the ranking results of the candidate answer data. The output device 14 may include, for example, a display, a loudspeaker, a printer, as well as a communication network and the remote output devices connected to it.
Of course, for simplicity, only some of the components of the electronic device 10 related to the present application are shown in Fig. 6; components such as buses and input/output interfaces are omitted. In addition, the electronic device 10 may include any other appropriate components according to the specific application.
Illustrative computer program product and computer readable storage medium
In addition to the above methods and devices, an embodiment of the present application may also be a computer program product comprising computer program instructions which, when run by a processor, cause the processor to perform the steps of the answer selection method based on a convolutional recurrent neural network according to the various embodiments of the present application described in the "Illustrative methods" section of this specification. For example, the computer program instructions may be implemented in Python on the PyTorch platform on a Linux operating system.
The computer program product may be written in any combination of one or more programming languages to provide the program code for performing the operations of the embodiments of the present application, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
In addition, an embodiment of the present application may also be a computer-readable storage medium on which computer program instructions are stored, the computer program instructions, when run by a processor, causing the processor to perform the steps of the answer selection method based on a convolutional recurrent neural network according to the various embodiments of the present application described in the "Illustrative methods" section of this specification.
The computer-readable storage medium may use any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may include, but is not limited to, electric, magnetic, optical, electromagnetic, infrared or semiconductor systems, apparatuses or devices, or any combination of the above. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection with one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any appropriate combination of the above.
The basic principle of the present application has been described above in connection with specific embodiments. However, it should be noted that the merits, advantages, effects and the like mentioned in the present application are merely examples and not limitations, and these merits, advantages and effects must not be regarded as necessary for every embodiment of the application. In addition, the specific details disclosed above are only for the purposes of illustration and ease of understanding, not limitation; the above details do not limit the application to being implemented with those specific details.
The block diagrams of the devices, apparatuses, equipment and systems involved in the present application are only illustrative examples and are not intended to require or imply that they must be connected, arranged or configured in the manner shown in the block diagrams. As those skilled in the art will recognize, these devices, apparatuses, equipment and systems may be connected, arranged and configured in any manner. Words such as "include", "comprise" and "have" are open-ended and mean "including but not limited to", and may be used interchangeably therewith. The words "or" and "and" as used herein refer to "and/or" and may be used interchangeably therewith, unless the context clearly indicates otherwise. The word "such as" used herein refers to the phrase "such as, but not limited to", and may be used interchangeably therewith.
It should also be noted that, in the devices, apparatuses and methods of the present application, the components or steps may be decomposed and/or recombined. Such decompositions and/or recombinations should be regarded as equivalent schemes of the present application.
The above description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects are readily apparent to those skilled in the art, and the general principles defined herein can be applied to other aspects without departing from the scope of the present application. Therefore, the present application is not intended to be limited to the aspects shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The above description has been presented for purposes of illustration and description. Furthermore, this description is not intended to restrict the embodiments of the application to the forms disclosed herein. Although a number of exemplary aspects and embodiments have been discussed above, those skilled in the art will recognize certain variations, modifications, changes, additions and sub-combinations thereof.

Claims (11)

1. An answer selection method based on a convolutional recurrent neural network, comprising:
obtaining question data and answer data corresponding to the question data, the answer data including title data and abstract data;
processing word-vector representations of the question data, the title data, and the abstract data through a bidirectional long short-term memory (LSTM) layer to obtain a question hidden-state sequence corresponding to the question data, a title hidden-state sequence corresponding to the title data, and an abstract hidden-state sequence corresponding to the abstract data;
obtaining, with a convolutional neural network, a question feature representation, a title feature representation, and an abstract feature representation from the question, title, and abstract hidden-state sequences, respectively;
performing a pooling operation on the question feature representation to obtain a final question representation, and performing a concatenation-and-pooling operation on the title feature representation and the abstract feature representation to obtain a final answer representation; and
computing the similarity between the final question representation and the final answer representation to obtain a relevance score of the answer data with respect to the question data.
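For illustration only, a minimal sketch of the encoding path recited in claim 1, written in PyTorch; the framework choice, the layer sizes, and the use of a single shared encoder for the question, title, and abstract are assumptions of this sketch, not features stated in the claim:

```python
import torch
import torch.nn as nn

class TextEncoder(nn.Module):
    """Word vectors -> BiLSTM hidden-state sequence -> convolutional feature map."""

    def __init__(self, vocab_size=50000, emb_dim=300, hidden_dim=128,
                 num_filters=256, kernel_size=3):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # bidirectional long short-term memory layer
        self.bilstm = nn.LSTM(emb_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # one-dimensional convolution over the hidden-state sequence
        self.conv = nn.Conv1d(2 * hidden_dim, num_filters,
                              kernel_size, padding=kernel_size // 2)

    def forward(self, token_ids):
        emb = self.embedding(token_ids)                         # (batch, seq_len, emb_dim)
        hidden, _ = self.bilstm(emb)                            # (batch, seq_len, 2 * hidden_dim)
        feats = torch.relu(self.conv(hidden.transpose(1, 2)))   # (batch, filters, seq_len)
        return hidden, feats

# The same encoder can be applied to the question, title, and abstract token ids:
#   q_hidden, q_feats = encoder(question_ids)
#   t_hidden, t_feats = encoder(title_ids)
#   a_hidden, a_feats = encoder(abstract_ids)
```

The hidden-state sequence returned here corresponds to the BiLSTM output of the claim, and the convolutional feature map is the input to the attention and pooling steps detailed in claims 5 to 7.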
2. The answer selection method based on a convolutional recurrent neural network of claim 1, further comprising:
ranking a plurality of answer data corresponding to the question data based on the plurality of relevance scores of the plurality of answer data.
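A brief illustration of the ranking step in claim 2, under the assumption that the candidate answers and their scores are held in parallel Python lists (the data layout is not specified by the claim):

```python
def rank_candidates(candidates, scores):
    """Sort candidate answers by descending relevance score (stable on ties)."""
    ranked = sorted(zip(scores, candidates), key=lambda pair: pair[0], reverse=True)
    return [candidate for _, candidate in ranked]

# Example: rank_candidates(["ans A", "ans B", "ans C"], [0.41, 0.87, 0.63])
# returns ["ans B", "ans C", "ans A"].
```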
3. The answer selection method based on a convolutional recurrent neural network of claim 2, wherein the plurality of answer data are a plurality of candidate answer data for the question data obtained through a plurality of search engines.
4. The answer selection method based on a convolutional recurrent neural network of claim 3, wherein obtaining the question data and the answer data corresponding to the question data comprises:
performing word segmentation and stop-word removal on each candidate answer data of the plurality of candidate answer data.
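A small sketch of the preprocessing in claim 4; the jieba segmenter and the stop-word file used here are assumptions for illustration, since the application does not name a particular tokenizer or stop-word list:

```python
import jieba  # assumed Chinese word segmenter; any segmenter could be substituted

def load_stopwords(path="stopwords.txt"):
    """Load one stop word per line from a plain-text file (path is hypothetical)."""
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

def preprocess(text, stopwords):
    """Segment a candidate answer into words and drop stop words."""
    return [tok for tok in jieba.lcut(text) if tok.strip() and tok not in stopwords]

# stopwords = load_stopwords()
# title_tokens = preprocess(candidate_title, stopwords)
# abstract_tokens = preprocess(candidate_abstract, stopwords)
```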
5. The answer selection method based on a convolutional recurrent neural network of claim 1, wherein obtaining, with a convolutional neural network, the question feature representation, the title feature representation, and the abstract feature representation from the question, title, and abstract hidden-state sequences, respectively, comprises:
obtaining the question feature representation directly from the question hidden-state sequence with the convolutional neural network;
averaging the question feature representation over the sequence length to obtain an update vector;
performing a word-level attention update on the title hidden-state sequence and the abstract hidden-state sequence, respectively, with the update vector to obtain an updated title hidden-state sequence and an updated abstract hidden-state sequence; and
obtaining, with the convolutional neural network, the title feature representation and the abstract feature representation from the updated title hidden-state sequence and the updated abstract hidden-state sequence, respectively.
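A minimal sketch of the word-level attention update in claim 5, assuming PyTorch, a dot-product scoring function, and matching dimensions between the update vector and the hidden states; the claim states only that an update vector averaged from the question feature representation drives the update, not the exact attention form:

```python
import torch
import torch.nn.functional as F

def question_update_vector(q_feats):
    """Average the question feature representation over the sequence length."""
    # q_feats: (batch, filters, q_len) -> update vector of shape (batch, filters)
    return q_feats.mean(dim=2)

def word_level_attention_update(update_vec, hidden_seq):
    """Re-weight each title/abstract hidden state by its attention to the update vector."""
    # hidden_seq: (batch, seq_len, dim); update_vec: (batch, dim), assuming dim == filters
    # (otherwise a linear projection would be needed before the dot product)
    scores = torch.bmm(hidden_seq, update_vec.unsqueeze(2)).squeeze(2)   # (batch, seq_len)
    weights = F.softmax(scores, dim=1)                                    # (batch, seq_len)
    return hidden_seq * weights.unsqueeze(2)                              # updated sequence

# u = question_update_vector(q_feats)
# updated_title = word_level_attention_update(u, title_hidden)
# updated_abstract = word_level_attention_update(u, abstract_hidden)
# The convolutional network is then applied to the updated sequences.
```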
6. The answer selection method based on a convolutional recurrent neural network of claim 1, wherein performing a pooling operation on the question feature representation to obtain the final question representation, and performing a concatenation-and-pooling operation on the title feature representation and the abstract feature representation to obtain the final answer representation, comprises:
performing a max-pooling operation on the question feature representation to obtain the final question representation;
concatenating the title feature representation and the abstract feature representation as vectors; and
performing a max-pooling operation on the concatenated title and abstract feature representations to obtain the final answer representation.
7. The answer selection method based on a convolutional recurrent neural network of claim 1, wherein computing the similarity between the final question representation and the final answer representation to obtain the relevance score of the answer data with respect to the question data comprises:
computing the vector cosine similarity between the final question representation and the final answer representation to obtain the relevance score of the answer data with respect to the question data.
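The pooling, concatenation, and scoring steps of claims 6 and 7 can be sketched as follows, again assuming PyTorch feature maps shaped (batch, filters, seq_len) as in the encoder sketch after claim 1:

```python
import torch
import torch.nn.functional as F

def relevance_score(q_feats, title_feats, abstract_feats):
    """Final question/answer representations and their cosine-similarity score."""
    # max pooling over time yields the final question representation
    q_final = q_feats.max(dim=2).values                        # (batch, filters)
    # concatenate ("splice") title and abstract feature maps along the time axis,
    # then max-pool to obtain the final answer representation
    spliced = torch.cat([title_feats, abstract_feats], dim=2)  # (batch, filters, t_len + a_len)
    ans_final = spliced.max(dim=2).values                      # (batch, filters)
    # vector cosine similarity as the relevance score
    return F.cosine_similarity(q_final, ans_final, dim=1)      # (batch,)
```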
8. An answer selection device based on a convolutional recurrent neural network, comprising:
a data acquisition unit for obtaining question data and answer data corresponding to the question data, the answer data including title data and abstract data;
a recurrent network unit for processing word-vector representations of the question data, the title data, and the abstract data through a bidirectional long short-term memory (LSTM) layer to obtain a question hidden-state sequence corresponding to the question data, a title hidden-state sequence corresponding to the title data, and an abstract hidden-state sequence corresponding to the abstract data;
a convolutional network unit for obtaining, with a convolutional neural network, a question feature representation, a title feature representation, and an abstract feature representation from the question, title, and abstract hidden-state sequences, respectively;
a pooling operation unit for performing a pooling operation on the question feature representation to obtain a final question representation, and performing a concatenation-and-pooling operation on the title feature representation and the abstract feature representation to obtain a final answer representation; and
a score calculation unit for computing the similarity between the final question representation and the final answer representation to obtain a relevance score of the answer data with respect to the question data.
9. The answer selection device based on a convolutional recurrent neural network of claim 8, further comprising:
an attention mechanism unit configured to:
average the question feature representation over the sequence length to obtain an update vector; and
perform a word-level attention update on the title hidden-state sequence and the abstract hidden-state sequence, respectively, with the update vector to obtain an updated title hidden-state sequence and an updated abstract hidden-state sequence;
wherein the convolutional network unit is configured to obtain, with the convolutional neural network, the title feature representation and the abstract feature representation from the updated title hidden-state sequence and the updated abstract hidden-state sequence, respectively.
10. An electronic device, comprising:
a processor; and
a memory in which computer program instructions are stored, the computer program instructions, when run by the processor, causing the processor to execute the answer selection method based on a convolutional recurrent neural network of any one of claims 1 to 7.
11. A computer-readable storage medium having computer program instructions stored thereon, the computer program instructions, when executed by a computing device, being operable to execute the answer selection method based on a convolutional recurrent neural network of any one of claims 1 to 7.
CN201810742597.5A 2018-07-09 2018-07-09 Answer selection method, device and electronic equipment based on convolution loop neural network Pending CN109002519A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810742597.5A CN109002519A (en) 2018-07-09 2018-07-09 Answer selection method, device and electronic equipment based on convolution loop neural network


Publications (1)

Publication Number Publication Date
CN109002519A true CN109002519A (en) 2018-12-14

Family

ID=64599891

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810742597.5A Pending CN109002519A (en) 2018-07-09 2018-07-09 Answer selection method, device and electronic equipment based on convolution loop neural network

Country Status (1)

Country Link
CN (1) CN109002519A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160342895A1 (en) * 2015-05-21 2016-11-24 Baidu Usa Llc Multilingual image question answering
CN107256228A (en) * 2017-05-02 2017-10-17 清华大学 Answer selection system and method based on structuring notice mechanism
CN107229684A (en) * 2017-05-11 2017-10-03 合肥美的智能科技有限公司 Statement classification method, system, electronic equipment, refrigerator and storage medium
CN107562792A (en) * 2017-07-31 2018-01-09 同济大学 A kind of question and answer matching process based on deep learning
CN107766447A (en) * 2017-09-25 2018-03-06 浙江大学 It is a kind of to solve the method for video question and answer using multilayer notice network mechanism

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109658271A (en) * 2018-12-19 2019-04-19 前海企保科技(深圳)有限公司 A kind of intelligent customer service system and method based on the professional scene of insurance
CN109740077A (en) * 2018-12-29 2019-05-10 北京百度网讯科技有限公司 Answer searching method, device and its relevant device based on semantic indexing
CN109740077B (en) * 2018-12-29 2021-02-12 北京百度网讯科技有限公司 Answer searching method and device based on semantic index and related equipment thereof
CN110110063A (en) * 2019-04-30 2019-08-09 南京大学 A kind of question answering system construction method based on Hash study
CN110110063B (en) * 2019-04-30 2023-07-18 南京大学 Question-answering system construction method based on hash learning
CN111125335A (en) * 2019-12-27 2020-05-08 北京百度网讯科技有限公司 Question and answer processing method and device, electronic equipment and storage medium
US11461556B2 (en) 2019-12-27 2022-10-04 Beijing Baidu Netcom Science Technology Co., Ltd. Method and apparatus for processing questions and answers, electronic device and storage medium
CN112329439A (en) * 2020-11-18 2021-02-05 北京工商大学 Food safety event detection method and system based on graph convolution neural network model
CN113792550A (en) * 2021-04-08 2021-12-14 北京金山数字娱乐科技有限公司 Method and device for determining predicted answer and method and device for reading and understanding

Similar Documents

Publication Publication Date Title
CN108875074A (en) Based on answer selection method, device and the electronic equipment for intersecting attention neural network
CN112487182B (en) Training method of text processing model, text processing method and device
CN111368996B (en) Retraining projection network capable of transmitting natural language representation
WO2022007823A1 (en) Text data processing method and device
AU2018214675B2 (en) Systems and methods for automatic semantic token tagging
CN109002519A (en) Answer selection method, device and electronic equipment based on convolution loop neural network
CN110019701B (en) Method for question answering service, question answering service system and storage medium
CN108846077B (en) Semantic matching method, device, medium and electronic equipment for question and answer text
CN110301117B (en) Method and apparatus for providing response in session
CN109344404B (en) Context-aware dual-attention natural language reasoning method
CN108845990A (en) Answer selection method, device and electronic equipment based on two-way attention mechanism
CN109271493A (en) A kind of language text processing method, device and storage medium
CN111898374B (en) Text recognition method, device, storage medium and electronic equipment
CN109992773A (en) Term vector training method, system, equipment and medium based on multi-task learning
CN108959482A (en) Single-wheel dialogue data classification method, device and electronic equipment based on deep learning
CN110096567A (en) Selection method, system are replied in more wheels dialogue based on QA Analysis of Knowledge Bases Reasoning
CN112580369B (en) Sentence repeating method, method and device for training sentence repeating model
CN114676234A (en) Model training method and related equipment
CN112052668A (en) Training method of address text recognition model, and address prediction method and device
CN113901191A (en) Question-answer model training method and device
CN111898636A (en) Data processing method and device
CN112307048B (en) Semantic matching model training method, matching method, device, equipment and storage medium
RU2712101C2 (en) Prediction of probability of occurrence of line using sequence of vectors
US20220383119A1 (en) Granular neural network architecture search over low-level primitives
Thomas et al. Chatbot using gated end-to-end memory networks

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
WD01: Invention patent application deemed withdrawn after publication (application publication date: 20181214)