CN109670029A - Method, apparatus, computer device and storage medium for determining question answers - Google Patents

Method, apparatus, computer device and storage medium for determining question answers

Info

Publication number
CN109670029A
Authority
CN
China
Prior art keywords: word, target, vector, coding, coding vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811626414.XA
Other languages
Chinese (zh)
Other versions
CN109670029B (en)
Inventor
李弘宇
刘璟
吕雅娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baidu Online Network Technology Beijing Co Ltd
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201811626414.XA priority Critical patent/CN109670029B/en
Publication of CN109670029A publication Critical patent/CN109670029A/en
Application granted granted Critical
Publication of CN109670029B publication Critical patent/CN109670029B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Machine Translation (AREA)

Abstract

This application proposes a method, apparatus, electronic device and readable storage medium for determining question answers. The method includes: semantically encoding the word vector of each target word in a target text to obtain a first coding vector for each target word; semantically encoding the word vector of each question word in a question to obtain a coding vector for each question word; for each target word, computing a weighted sum of the question-word coding vectors according to the matching degree between that target word and each question word, to obtain a second coding vector for the target word; and inputting the first and second coding vectors of each target word into a prediction model to determine the probability that each target word belongs to the answer to the question. By determining the matching degree between target words and question words, the method enriches the features fed to the prediction model, improves the accuracy of identifying key words in the question, and thereby improves the accuracy of the model's predicted answer.

Description

Method, apparatus, computer device and storage medium for determining question answers
Technical field
This application relates to the field of Internet technology, and in particular to a method, apparatus, computer device and readable storage medium for determining question answers.
Background
With the rapid development of Internet technology and natural language processing, natural language processing has become an important research direction in computer science and artificial intelligence, making effective natural-language communication between people and computers possible. In reading comprehension technology, given one or more passages and a question, an answer can be predicted from them by machine learning methods.
When a model predicts an answer, it must refer to the words in the question, and different question words differ in how important they are to answering it. In the prior art, only an attention mechanism is used to measure the importance of each question word, so the determined importance of each question word can be biased, which lowers the accuracy of the predicted answer.
Summary of the invention
This application proposes a method, apparatus, computer device and readable storage medium for determining question answers. By adding to the word vector of each question word an element indicating the probability that the question word is a topic word of the target text, the matching degree between target words and question words is captured, enriching the features fed to the prediction model and solving the prior-art technical problem that key words in the question cannot be accurately identified, which leads to low answer-prediction accuracy.
An embodiment of a first aspect of this application proposes a method for determining question answers, comprising:
semantically encoding the word vector of each target word in a target text to obtain a first coding vector for each target word;
semantically encoding the word vector of each question word in a question to obtain a coding vector for each question word; for each target word, computing a weighted sum of the coding vectors of the question words according to the matching degree between that target word and each question word, to obtain a second coding vector for the target word;
inputting the first coding vector and the second coding vector of each target word into a prediction model to determine the probability that each target word belongs to the answer to the question; wherein the prediction model has learned the mapping between the values of the first coding vector and the second coding vector and the probability value.
In the method for determining question answers of this embodiment, the word vector of each target word in the target text is semantically encoded to obtain a first coding vector for each target word; the word vector of each question word in the question is semantically encoded to obtain a coding vector for each question word; for each target word, a weighted sum of the question-word coding vectors is computed according to the matching degree between that target word and each question word, yielding a second coding vector for the target word; and the first and second coding vectors of each target word are input into a prediction model to determine the probability that the target word belongs to the answer, the prediction model having learned the mapping between the values of the first and second coding vectors and the probability value. By adding to each question word's word vector an element indicating the probability that the word is a topic word of the target text, the method captures the matching degree between target words and question words, enriching the features fed to the prediction model, improving the accuracy of identifying key words in the question, and thereby improving the accuracy of the model's predicted answer.
An embodiment of a second aspect of this application proposes an apparatus for determining question answers, comprising:
a first coding module, configured to semantically encode the word vector of each target word in a target text to obtain a first coding vector for each target word;
a second coding module, configured to semantically encode the word vector of each question word in a question to obtain a coding vector for each question word; a computing module, configured to, for each target word, compute a weighted sum of the coding vectors of the question words according to the degree of attention the target word pays to each question word, to obtain a second coding vector for the target word;
a determining module, configured to input the first and second coding vectors of each target word into a prediction model to determine the probability that each target word belongs to the answer to the question; wherein the prediction model has learned the mapping between the values of the first coding vector and the second coding vector and the probability value.
In the apparatus for determining question answers of this embodiment, the word vector of each target word in the target text is semantically encoded to obtain a first coding vector for each target word; the word vector of each question word in the question is semantically encoded to obtain a coding vector for each question word; for each target word, a weighted sum of the question-word coding vectors is computed according to the matching degree between that target word and each question word, yielding a second coding vector for the target word; and the first and second coding vectors of each target word are input into a prediction model to determine the probability that the target word belongs to the answer, the prediction model having learned the mapping between the values of the first and second coding vectors and the probability value. By adding to each question word's word vector an element indicating the probability that the word is a topic word of the target text, the apparatus captures the matching degree between target words and question words, enriching the features fed to the prediction model, improving the accuracy of identifying key words in the question, and thereby improving the accuracy of the model's predicted answer.
An embodiment of a third aspect of this application proposes a computer device, comprising a memory, a processor, and a computer program stored in the memory and executable by the processor; when the processor executes the program, the method for determining question answers described in the above embodiments is implemented.
An embodiment of a fourth aspect of this application proposes a non-transitory computer-readable storage medium having a computer program stored thereon, characterized in that when the program is executed by a processor, the method for determining question answers described in the above embodiments is implemented.
Additional aspects and advantages of this application will be set forth in part in the following description, and in part will become apparent from the description or be learned by practice of this application.
Brief description of the drawings
The above and/or additional aspects and advantages of this application will become apparent and readily understood from the following description of embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic flowchart of a method for determining question answers provided by an embodiment of this application;
Fig. 2 is a schematic flowchart of another method for determining question answers provided by an embodiment of this application;
Fig. 3 is a schematic structural diagram of an apparatus for determining question answers provided by an embodiment of this application;
Fig. 4 is a schematic structural diagram of another apparatus for determining question answers provided by an embodiment of this application;
Fig. 5 is a schematic structural diagram of a model for determining question answers provided by an embodiment of this application;
Fig. 6 is a block diagram of an exemplary computer device suitable for implementing embodiments of this application.
Detailed description of the embodiments
Embodiments of this application are described in detail below, with examples shown in the accompanying drawings, where identical or similar reference numbers denote identical or similar elements, or elements with identical or similar functions, throughout. The embodiments described below with reference to the drawings are exemplary and intended to explain this application; they should not be construed as limiting it.
Reading comprehension technology presents a model with one or more given passages and a question, and lets the model predict an answer by machine learning. When predicting the answer, the model must refer to the words in the question, and different question words differ in their importance to answering it; the model therefore needs to determine which words to attend to, and to what degree, according to the importance of each question word.
In the prior art, only an attention mechanism is used to measure the importance of each question word. With this approach it is difficult to control the learning process and the accuracy of the model's keyword identification, so the keywords and their criticality finally learned may be affected by data bias and model structure and may differ considerably from human judgment. Meanwhile, the model may mistakenly ignore certain keywords, or over-attend to certain words, reducing the accuracy of answer prediction.
To address the above problems in the prior art, an embodiment of this application proposes a method for determining question answers: the word vector of each target word in the target text is semantically encoded to obtain a first coding vector for each target word; the word vector of each question word in the question is semantically encoded to obtain a coding vector for each question word; for each target word, a weighted sum of the question-word coding vectors is computed according to the matching degree between that target word and each question word, yielding a second coding vector for the target word; the first and second coding vectors of each target word are input into a prediction model to determine the probability that each target word belongs to the answer; answer words are then extracted from the target words of the target text according to these probabilities, and the answer to the question is generated from the answer words.
The method, apparatus, computer device and storage medium for determining question answers of the embodiments of this application are described below with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of a method for determining question answers provided by an embodiment of this application.
This embodiment is illustrated with the method configured in an apparatus for determining question answers; the apparatus can be applied to any computer device, so that the computer device can perform the function of determining question answers.
The computer device may be a personal computer (PC), a cloud device, a mobile device, and so on; the mobile device may be, for example, a mobile phone, a tablet computer, a personal digital assistant, a wearable device or an in-vehicle device, i.e., a hardware device with an operating system, a touch screen and/or a display screen.
As shown in Fig. 1, the method for determining question answers comprises the following steps:
Step 101: semantically encode the word vector of each target word in the target text to obtain a first coding vector for each target word.
Here, the target text is one or more documents in text form to be read, and the target words are all the words in the target text. Semantic coding processes the information word by word: the material is organized and summarized by meaning and systematic classification in one's own linguistic form, its main arguments, evidence and logical structure are identified, and it is encoded by semantic features.
In natural language processing, a natural language text is a variable-length sequence formed by concatenating symbols; it is hard to convert directly into numeric data a computer can understand, so it cannot be directly processed further. Word vectors, which carry rich information, enable deep learning to handle most natural language processing applications. Therefore, in this embodiment, each target word in the target text is first encoded at the character level to obtain its word vector; then, combining the context of the target text, the word vectors of the target words are semantically encoded to obtain the first coding vector of each target word.
It should be noted that there are many methods for generating word vectors for the target words, but they all follow the same idea: the meaning of any word can be represented by its surrounding words. Current approaches can be divided into statistics-based methods and language-model-based methods. In the latter, a neural network language model (NNLM) is trained, and the word vectors are obtained as a by-product of the language model. For example, each target word in the target text can be character-encoded via a bag-of-words model to obtain its word vector. The word-vector generation methods used in the embodiments of this application belong to the prior art and are not described in detail here.
As one possible implementation, the word vectors of the target words in the target text are passed through one or more layers of a recurrent neural network (RNN) for semantic coding to obtain the first coding vectors. When an RNN encodes the word vectors, the first coding vector output at each time step depends not only on the current input but also on the model "state" at the previous time step. Through this dependence on historical states, an RNN can effectively capture the contextual dependencies in text data.
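As a non-limiting illustration, this encoding step might look like the following minimal sketch, assuming PyTorch and a single bidirectional GRU layer; the embodiment only requires one or more RNN layers, so the framework, depth and dimensions are assumptions.

```python
# A minimal sketch of the semantic-coding step (illustrative, not the
# patent's mandated architecture).
import torch
import torch.nn as nn

class SemanticEncoder(nn.Module):
    def __init__(self, embed_dim=128, hidden_dim=64):
        super().__init__()
        # Bidirectional so each position's output reflects both left and
        # right context, i.e. the "state" carried across time steps.
        self.rnn = nn.GRU(embed_dim, hidden_dim,
                          batch_first=True, bidirectional=True)

    def forward(self, word_vectors):            # (batch, seq_len, embed_dim)
        encodings, _ = self.rnn(word_vectors)   # (batch, seq_len, 2*hidden_dim)
        return encodings

encoder = SemanticEncoder()
target_word_vectors = torch.randn(1, 20, 128)   # 20 target words, toy input
first_coding_vectors = encoder(target_word_vectors)
```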
Here, the first coding vector is the vector, obtained by semantically encoding a target word of the target text, that carries the semantic features of that target word.
As another possible implementation, a convolutional neural network (CNN) model can also be used to semantically encode the word vectors of the target words in the target text.
Step 102: semantically encode the word vector of each question word in the question to obtain a coding vector for each question word.
In this embodiment, the word vectors of the question words can be semantically encoded following the method used for the target-word vectors in step 101, to obtain the coding vector of each question word. For the specific coding method, see step 101 above; details are not repeated here.
As one possible case, the word vector of each question word may include an element indicating the probability that the question word is a topic word of the target text. In other words, the word vector of each question word includes an importance feature of that word: the probability that it is a topic word of the target text.
Step 103: for each target word, compute a weighted sum of the coding vectors of the question words according to the matching degree between that target word and each question word, to obtain a second coding vector for the target word.
In this embodiment, for each target word in the target text, the matching degree between the target word and each question word can be obtained by model computation; then the target word's matching degree with each question word is multiplied by that question word's probability of being a topic word of the target text, giving the weight of each question word, and the coding vectors of the question words are summed with these weights to obtain the second coding vector of the target word.
Here, the second coding vector carries the target word's matching-degree features with respect to the question words, the semantic features of the question words, and the topic features of the target text.
Step 104: input the first coding vector and the second coding vector of each target word into a prediction model to determine the probability that each target word belongs to the answer to the question; the prediction model has learned the mapping between the values of the first and second coding vectors and the probability value.
In this embodiment, the elements of the first coding vector obtained in step 101 and the elements of the second coding vector obtained in step 103 are merged into an input vector and fed into a pre-trained prediction model, from which the probability that each target word belongs to the answer can be determined. The number of elements in the input vector is the sum of the number of elements in the first coding vector and the number of elements in the second coding vector.
It should be noted that the prediction model determines the corresponding probability value from the element values of the input vector. That is, the prediction model has learned the mapping between the values of the first and second coding vectors and the probability value, so the probability that each target word belongs to the answer can be determined from the values of its first and second coding vectors.
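For illustration only, the prediction model could be sketched as a small feed-forward head over the merged vector; the embodiment requires only a learned mapping from the two coding vectors to a probability, so this particular architecture and the vector sizes are assumptions.

```python
# A sketch of the prediction model as a feed-forward probability head
# (architecture assumed; only the input/output contract comes from the text).
import torch
import torch.nn as nn

first_dim, second_dim = 128, 128        # illustrative sizes
answer_head = nn.Sequential(
    nn.Linear(first_dim + second_dim, 64),   # input length = sum of elements
    nn.ReLU(),
    nn.Linear(64, 1),
    nn.Sigmoid(),                            # probability per target word
)

u = torch.randn(20, first_dim)    # first coding vectors of 20 target words
v = torch.randn(20, second_dim)   # second coding vectors of the same words
probs = answer_head(torch.cat([u, v], dim=-1)).squeeze(-1)  # shape (20,)
```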
Further, answer words are extracted from the target words of the target text according to each target word's probability of belonging to the answer; then, based on the extracted answer words, the starting position of the answer within the target text is determined, so as to generate the answer to the question.
In the method for determining question answers of this embodiment, the word vector of each target word in the target text is semantically encoded to obtain a first coding vector for each target word; the word vector of each question word in the question is semantically encoded to obtain a coding vector for each question word; for each target word, a weighted sum of the question-word coding vectors is computed according to the matching degree between that target word and each question word, yielding a second coding vector for the target word; and the first and second coding vectors of each target word are input into a prediction model to determine the probability that each target word belongs to the answer, the prediction model having learned the mapping between the values of the first and second coding vectors and the probability value. By adding to each question word's word vector an element indicating the probability that the word is a topic word of the target text, the method captures the matching degree between target words and question words, enriching the features fed to the prediction model, improving the accuracy of identifying key words in the question, and thereby improving the accuracy of the model's predicted answer.
To explain the previous embodiment clearly, this embodiment provides another method for determining question answers, in which the importance of the question words is determined and this importance feature is fused into the reading-comprehension model. Fig. 2 is a schematic flowchart of another method for determining question answers provided by an embodiment of this application.
As shown in Fig. 2, the method for determining question answers may comprise the following steps:
Step 201: semantically encode the word vector of each target word in the target text to obtain a first coding vector for each target word.
The implementation of step 201 is as described for step 101 in the above embodiment and is not repeated here.
Step 202: determine, for each question word in the question, the probability that it is a topic word of the target text.
In this embodiment, when a model is made to predict an answer by machine learning, the question words differ in their probability of being topic words of the target text, i.e., they differ in how important they are to answering the question. For example, function words such as '是' ('is') contribute little to answering, whereas subject words in the question such as 'ship' or 'train', and interrogatives such as 'how many' or 'how fast', are the keys for the model to answer the question.
Therefore, the importance of each question word in the question must be determined first, i.e., the probability that each question word is a topic word of the target text. As one possible implementation, this probability can be computed with TF-IDF (term frequency-inverse document frequency) features.
TF-IDF is a statistical method and a common weighting technique in information retrieval and data mining. It evaluates how important a word is to one document in a document set or corpus. The importance of a word increases in proportion to the number of times it appears in the document, but decreases in inverse proportion to the frequency with which it appears in the corpus. TF stands for term frequency and IDF for inverse document frequency.
In this embodiment, the importance of each question word with respect to the target text is determined by TF-IDF. Specifically, the frequency with which each question word appears in the target text is determined first; for distinction, it is referred to here as the first frequency.
As one possible implementation, the first frequency is the number of occurrences of a question word in the target text divided by the total number of words in the target text. The frequency with which each question word appears in the target text is computed by formula (1):

tf(t, d) = f_{t,d} / Σ_k f_{k,d}    (1)

where t is a question word, d is the target text, f_{t,d} is the number of times question word t occurs in target text d, Σ_k f_{k,d} is the total number of occurrences of all target words in the target text, and tf(t, d) is the frequency with which the question word appears in the target text.
For example, if the target text contains 100 words in total and the word "apple" appears 3 times, the first frequency of "apple" in the target text is 3/100 = 0.03.
Further, the frequency with which each question word appears in a given corpus can be determined by TF-IDF; for distinction, it is referred to here as the second frequency. A corpus is a large-scale electronic text library built by scientific sampling and processing; it stores language material that has actually occurred in real language use and is a basic resource carrying linguistic knowledge.
As one possible implementation, the frequency with which each question word appears in the given corpus is computed by formula (2):

idf(t, D) = log( N / |{d ∈ D : t ∈ d}| )    (2)

where t is a question word, D is the set of all documents in the corpus, N is the number of documents in D, |{d ∈ D : t ∈ d}| is the number of documents in D that contain the word t, and idf(t, D) is the frequency with which the question word appears over the given corpus.
Further, multiplying the first frequency and the second frequency of each question word computed by formulas (1) and (2) gives the importance feature of each question word. As an example, the importance feature of each question word can be computed by formula (3):

tfidf(t, d, D) = tf(t, d) · idf(t, D)    (3)

where t is a question word, d is the target text, D is the set of all documents in the corpus, and tfidf(t, d, D) is the importance feature of the question word.
It should be noted that the larger the computed tfidf(t, d, D) value of a question word, the higher its importance relative to the target text; conversely, the smaller the value, the lower its importance.
As one possible implementation, after the importance feature of each question word has been computed, the probability that each question word is a topic word of the target text can be computed by formula (4):

w_i = softmax( tfidf(q_i, d, D) )    (4)

where w_i is the probability that question word q_i is a topic word of the target text, q_i is the i-th question word, d is the target text, D is the set of all documents in the corpus, and softmax(·) is the normalization function, applied over all question words of the question.
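A minimal sketch of step 202, following formulas (1) through (4); the toy corpus, the logarithm base and the zero-idf fallback are assumptions, since the embodiment fixes only the tf·idf product and the softmax normalization over the question words.

```python
# Topic-word probabilities for question words via TF-IDF + softmax.
import math

def tf(term, doc_words):                       # formula (1)
    return doc_words.count(term) / len(doc_words)

def idf(term, corpus_docs):                    # formula (2)
    containing = sum(term in doc for doc in corpus_docs)
    return math.log(len(corpus_docs) / containing) if containing else 0.0

def topic_word_probs(question_words, doc_words, corpus_docs):
    scores = [tf(q, doc_words) * idf(q, corpus_docs)       # formula (3)
              for q in question_words]
    exp = [math.exp(s) for s in scores]
    return [e / sum(exp) for e in exp]                     # formula (4)

doc = "the ship left the port at dawn".split()
corpus = [set(doc), {"train", "station"}, {"ship", "sea"}]
print(topic_word_probs(["how", "fast", "ship"], doc, corpus))
```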
Step 203: add the topic-word probability of each question word, as one element, to the word vector of the corresponding question word.
In this embodiment, the word vector of each question word carries rich information, including the semantic features of the word; the topic-word probability of each question word determined in step 202 can be added, as one element, to the word vector of the corresponding question word.
The word vector of each question word is then encoded to obtain the coding vector of each question word. For the specific encoding process, see the semantic coding of the target-word vectors in step 101 of the above embodiment; details are not repeated here.
Step 204: for each target word, determine the matching degree between that target word and each question word using an attention model.
In this embodiment, before determining whether each target word in the target text belongs to the answer, the matching degree between each target word and each question word, i.e., the importance of each question word, must be determined.
As one possible implementation, for each target word, an attention model can be used to determine the matching degree between that target word and each question word.
An attention model, i.e., an attention mechanism, can be divided into spatial attention and temporal attention, and also into soft attention and hard attention. Soft attention attends to all the data: an attention weight is computed for every item, and no screening condition is set. Hard attention screens out part of the items after the attention weights are generated, setting their attention weights to zero, which can be understood as no longer attending to those ineligible parts.
As an example, the matching degree between a target word and each question word can be computed by formula (5):

α_i = atten(u, v_i)    (5)

where atten(·) is the function that computes the matching degree, α_i is the matching degree between the target word and the i-th question word, and u and v_i are, respectively, the coding vector of the target word in the target text and the coding vector of the i-th question word.
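A minimal sketch of formula (5), assuming atten(·) is a scaled dot product followed by a softmax over the question words; the embodiment names the attention model generically, so this scoring function is one possible choice, not the mandated one.

```python
# Matching degrees alpha_i of one target word against all question words.
import numpy as np

def attention_match(target_vec, question_vecs):
    # target_vec: (dim,); question_vecs: (num_question_words, dim)
    scores = question_vecs @ target_vec / np.sqrt(len(target_vec))
    exp = np.exp(scores - scores.max())        # stable softmax
    return exp / exp.sum()                     # alpha_i per question word

alpha = attention_match(np.random.rand(8), np.random.rand(5, 8))
```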
Step 205: compute the weight of each question word from the target word's matching degree with that question word and the question word's probability of being a topic word of the target text.
In this embodiment, once the target word's matching degree with each question word and each question word's topic-word probability have been computed as above, the weight of each question word can further be obtained.
As one possible implementation, the matching degree of each target word in the target text with each question word is multiplied by that question word's topic-word probability, giving the weight of each question word. Specifically, the topic-word probability computed in step 202 is multiplied by the matching degree between the target word and the question word computed in step 204 to obtain the weight of each question word.
As one possible implementation, the weight of each question word can be computed by formula (6):
β_i = fuse(α_i, w_i)    (6)

where α_i is the matching degree between the target word and the i-th question word, w_i is the probability that the question word is a topic word of the target text, and β_i is the weight of the question word. The fuse(·) function can be chosen in many ways; for example, a dot product followed by normalization can be used.
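A minimal sketch of formula (6), using the product-then-normalize fusion suggested above:

```python
# Weights beta_i: each matching degree alpha_i is scaled by the same
# question word's topic-word probability w_i, then renormalized.
import numpy as np

def fuse(alpha, w):
    raw = alpha * w             # elementwise product per question word
    return raw / raw.sum()      # normalized weights beta_i

beta = fuse(np.array([0.5, 0.3, 0.2]), np.array([0.1, 0.6, 0.3]))
```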
Step 206: compute the weighted sum of the coding vectors of the question words with the weights of the question words, to obtain the second coding vector of the target word.
In this embodiment, the word vectors of the question words in the question are semantically encoded, following the method of step 101 for the target-word vectors in the target text, to obtain the coding vector of each question word; the specific coding method is not repeated here.
Further, the coding vectors of the question words are summed, weighted by the question-word weights computed in step 205, to obtain the second coding vector of the target word.
Here, the second coding vector carries the target word's matching-degree features with respect to the question words, the semantic features of the question words, and the topic features of the target text.
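Step 206 then reduces to a single weighted sum; a sketch with illustrative shapes:

```python
# Second coding vector of a target word: the beta-weighted sum of the
# question-word coding vectors.
import numpy as np

def second_coding_vector(beta, question_encodings):
    # beta: (num_question_words,); question_encodings: (num_question_words, dim)
    return beta @ question_encodings            # (dim,)

v2 = second_coding_vector(np.array([0.2, 0.5, 0.3]), np.random.rand(3, 8))
```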
Step 207: input the first coding vector and the second coding vector of each target word into the prediction model to determine the probability that each target word belongs to the answer.
In this embodiment, the first and second coding vectors of each target word obtained in steps 201 and 206 are input into a pre-trained prediction model, from which the probability that each target word belongs to the answer can be determined.
It should be noted that the prediction model has learned the mapping between the values of the first and second coding vectors and the probability value, so the probability that each target word belongs to the answer can be determined from the values of its first and second coding vectors.
As one possible implementation, the elements of each target word's first coding vector are merged with the elements of its second coding vector to obtain an input vector, and the input vector is fed into the pre-trained prediction model, from which the probability that the target word belongs to the answer can be determined.
Here, the number of elements in the input vector is the sum of the number of elements in the first coding vector and the number of elements in the second coding vector.
As an example, if a target word's first coding vector is u and its second coding vector is v, the elements of u are merged with the elements of v to give the merged vector [u; v], which is input into the pre-trained prediction model to determine the probability that the target word belongs to the answer.
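The merge can be sketched as plain concatenation, matching the element-count rule above; the three- and two-element vectors are purely illustrative:

```python
# Merging the two coding vectors into the prediction model's input vector.
import numpy as np

u = np.array([1.0, 2.0, 3.0])          # first coding vector
v = np.array([0.5, 0.1])               # second coding vector
model_input = np.concatenate([u, v])   # length 5 = 3 + 2
```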
Step 208: extract answer words from the target words of the target text according to each target word's probability of belonging to the answer.
In this embodiment, the target words can be ranked according to the probability values determined in step 207, with high-probability target words ranked first; further, at least two consecutively occurring target words are extracted from the target text as answer words.
For example, answer words can be extracted from the target words of the target text by a sequence model, a boundary model, and so on.
Step 209: generate the answer to the question from the answer words.
In this embodiment, the starting position of the answer within the target text is determined from the answer words extracted from the target words of the target text, so as to generate the answer to the question.
As one possible implementation, answer words are extracted from the target words of the target text by a sequence model. A sequence model does not consider the continuity of words: the answer words are extracted one by one and then spliced together.
As another possible implementation, answer words can be extracted from the target words of the target text by a boundary model. A boundary model only selects the start and end positions of the answer and takes everything in between as the answer.
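A minimal sketch of boundary-model extraction; treating the span probability as the product of independent start and end probabilities is our assumption, not something the embodiment specifies.

```python
# Pick the best (start, end) span from per-word boundary probabilities.
import numpy as np

def extract_span(start_probs, end_probs, max_len=20):
    best_p, best_span = -1.0, (0, 0)
    for s in range(len(start_probs)):
        for e in range(s, min(s + max_len, len(end_probs))):
            p = start_probs[s] * end_probs[e]
            if p > best_p:
                best_p, best_span = p, (s, e)
    return best_span                # inclusive (start, end) word indices

span = extract_span(np.array([0.1, 0.7, 0.2]), np.array([0.1, 0.2, 0.7]))
```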
As another possible case, for some harder questions, part of the answer must be inferred and generated rather than taken from the given target text, and extra work is needed when extracting the answer words. For example, after the answer words have been extracted from the target text, the answer to the question can be generated by an extraction-then-synthesis framework.
In the method for determining question answers of this embodiment, the word vector of each target word in the target text is semantically encoded to obtain a first coding vector for each target word; the probability that each question word in the question is a topic word of the target text is determined, and each question word's topic-word probability is added, as one element, to the word vector of the corresponding question word; for each target word, an attention model determines the matching degree between that target word and each question word; the weight of each question word is computed from the target word's matching degree with that question word and the question word's topic-word probability; the coding vectors of the question words are summed with these weights to obtain the second coding vector of the target word; the first and second coding vectors of each target word are input into the prediction model to determine the probability that each target word belongs to the answer; answer words are extracted from the target words of the target text according to these probabilities; and the answer to the question is generated from the answer words. This method helps the model discriminate the question words more accurately, improving the accuracy of the model's predictions.
To implement the above embodiments, this application also proposes an apparatus for determining question answers.
Fig. 3 is a schematic structural diagram of an apparatus for determining question answers provided by an embodiment of this application.
As shown in Fig. 3, the apparatus 100 for determining question answers comprises: a first coding module 110, a second coding module 120, a computing module 130 and a determining module 140.
The first coding module 110 is configured to semantically encode the word vector of each target word in the target text to obtain a first coding vector for each target word.
The second coding module 120 is configured to semantically encode the word vector of each question word in the question to obtain a coding vector for each question word.
The computing module 130 is configured to, for each target word, compute a weighted sum of the coding vectors of the question words according to the degree of attention the target word pays to each question word, to obtain a second coding vector for the target word.
The determining module 140 is configured to input the first and second coding vectors of each target word into a prediction model to determine the probability that each target word belongs to the answer to the question; the prediction model has learned the mapping between the values of the first and second coding vectors and the probability value.
As one possible implementation, the word vector of each question word includes an element indicating the probability that the question word is a topic word of the target text, and the determining module 140 is specifically configured to: merge the elements of each target word's first coding vector with the elements of its second coding vector, and input the result into the prediction model as an input vector;
wherein the number of elements in the input vector is the sum of the number of elements in the first coding vector and the number of elements in the second coding vector;
and the prediction model is configured to determine the corresponding probability value from the element values of the input vector.
As another possible implementation, referring to Fig. 4, the apparatus 100 for determining question answers may further comprise:
a first determining module 150, configured to determine, for each question word in the question, the probability that it is a topic word of the target text;
an adding module 160, configured to add the topic-word probability of each question word, as one element, to the word vector of the corresponding question word.
As another possible implementation, the first determining module 150 is specifically configured to determine the first frequency with which each question word appears in the target text, determine the second frequency with which each question word appears in the given corpus, and finally determine, from each question word's first frequency and second frequency, the probability that the question word is a topic word of the target text.
As another possible implementation, the computing module 130 is specifically configured to, for each target word, determine the matching degree between that target word and each question word using an attention model, compute the weight of each question word from the target word's matching degree with that question word and the question word's topic-word probability, and compute the weighted sum of the coding vectors of the question words with these weights to obtain the second coding vector of the target word.
As another possible implementation, the computing module 130 can also be configured to multiply the target word's matching degree with each question word by that question word's topic-word probability, to obtain the weight of each question word.
As another possible implementation, referring to Fig. 4, the apparatus 100 for determining question answers may further comprise:
an extracting module 170, configured to extract answer words from the target words of the target text according to each target word's probability of belonging to the answer;
a generating module 180, configured to generate the answer to the question from the answer words.
As another possible implementation, the extracting module 170 is specifically configured to extract, according to each target word's probability of belonging to the answer, at least two consecutively occurring target words from the target text as answer words.
It should be noted that the foregoing explanation of the method embodiments for determining question answers also applies to the apparatus of this embodiment; details are not repeated here.
In the apparatus for determining question answers of this embodiment, the word vector of each target word in the target text is semantically encoded to obtain a first coding vector for each target word; the word vector of each question word in the question is semantically encoded to obtain a coding vector for each question word; for each target word, a weighted sum of the question-word coding vectors is computed according to the matching degree between that target word and each question word, yielding a second coding vector for the target word; and the first and second coding vectors of each target word are input into a prediction model to determine the probability that each target word belongs to the answer, the prediction model having learned the mapping between the values of the first and second coding vectors and the probability value. By adding to each question word's word vector an element indicating the probability that the word is a topic word of the target text, the apparatus captures the matching degree between target words and question words, enriching the features fed to the prediction model, improving the accuracy of identifying key words in the question, and thereby improving the accuracy of the model's predicted answer.
To implement the above embodiments, an embodiment of this application also proposes a model for determining question answers. Fig. 5 is a schematic structural diagram of a model for determining question answers provided by an embodiment of this application.
As shown in Fig. 5, the model 200 for determining question answers comprises: an embedding layer 210, a coding layer 220, a matching layer 230 and a query layer 240.
The embedding layer 210 is configured to obtain the word vector of each target word from the target text, and to obtain the word vector of each question word from the question words of the question.
The coding layer 220 is configured to semantically encode the word vectors of the target words to obtain the coding vectors of the target words, and to semantically encode the word vectors of the question words in the question to obtain the coding vectors of the question words.
The matching layer 230 is configured to compute the matching degree between each target word in the target text and each question word, and to compute the weighted sum of the coding vectors of the question words to obtain the second coding vector of each target word.
The query layer 240 is configured to input the first and second coding vectors of each target word into the prediction model to determine the probability that each target word belongs to the answer to the question, extract answer words from the target words of the target text according to these probabilities, and generate the answer to the question from the answer words.
To implement the above embodiments, this application also proposes a computer device, comprising a memory, a processor, and a computer program stored in the memory and executable by the processor; when the processor executes the program, the method for determining question answers described in the above embodiments is implemented.
To implement the above embodiments, this application also proposes a non-transitory computer-readable storage medium; when the instructions in the storage medium are executed by a processor, the method for determining question answers described in the above embodiments is implemented.
Fig. 6 shows a block diagram of an exemplary computer device suitable for implementing embodiments of this application. The computer device 12 shown in Fig. 6 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of this application.
As shown in Fig. 6, the computer device 12 takes the form of a general-purpose computing device. Its components may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 connecting the different system components (including the system memory 28 and the processing unit 16).
The bus 18 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus and the Peripheral Component Interconnect (PCI) bus.
The computer device 12 typically comprises a variety of computer-system-readable media. These media can be any available media accessible by the computer device 12, including volatile and non-volatile media, and removable and non-removable media.
The memory 28 may include computer-system-readable media in the form of volatile memory, such as a random access memory (RAM) 30 and/or a cache memory 32. The computer device 12 may further include other removable/non-removable, volatile/non-volatile computer-system storage media. By way of example only, the storage system 34 may be used to read from and write to non-removable, non-volatile magnetic media (not shown in Fig. 6, commonly referred to as a "hard drive"). Although not shown in Fig. 6, a disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from and writing to a removable non-volatile optical disk (e.g., a compact disc read-only memory (CD-ROM), a digital versatile disc read-only memory (DVD-ROM), or other optical media), may be provided. In these cases, each drive may be connected to the bus 18 through one or more data-media interfaces. The memory 28 may include at least one program product having a set of (e.g., at least one) program modules configured to perform the functions of the embodiments of this application.
A program/utility 40 having a set of (at least one) program modules 42 may be stored, for example, in the memory 28. Such program modules 42 include, but are not limited to, an operating system, one or more application programs, other program modules and program data; each or some combination of these examples may include an implementation of a network environment. The program modules 42 generally perform the functions and/or methods of the embodiments described in this application.
The computer device 12 may also communicate with one or more external devices 14 (e.g., a keyboard, a pointing device, a display 24, etc.), with one or more devices that enable a user to interact with the computer device 12, and/or with any device (e.g., a network card, a modem, etc.) that enables the computer device 12 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 22. Moreover, the computer device 12 may also communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN) and/or a public network such as the Internet) through a network adapter 20. As shown, the network adapter 20 communicates with the other modules of the computer device 12 through the bus 18. It should be understood that, although not shown in the figure, other hardware and/or software modules may be used in conjunction with the computer device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processing unit 16 executes various functional applications and data processing, such as implementing the methods mentioned in the foregoing embodiments, by running the programs stored in the system memory 28.
In the description of this specification, descriptions referring to the terms "one embodiment", "some embodiments", "example", "specific example" or "some examples" mean that a specific feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of this application. In this specification, schematic uses of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, where there is no mutual contradiction, those skilled in the art may combine and join different embodiments or examples, and the features of different embodiments or examples, described in this specification.
In addition, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of this application, "plurality" means at least two, for example two or three, unless otherwise specifically defined.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment or portion of code comprising one or more executable instructions for implementing custom logic functions or steps of the process, and the scope of the preferred embodiments of this application includes additional implementations in which functions may be executed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order, depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of this application belong.
The logic and/or steps represented in the flowcharts, or otherwise described herein, may be considered, for example, an ordered list of executable instructions for implementing logic functions, and may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus or device (such as a computer-based system, a system including a processor, or another system that can fetch instructions from an instruction execution system, apparatus or device and execute them). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate or transport a program for use by, or in connection with, an instruction execution system, apparatus or device. More specific examples (a non-exhaustive list) of computer-readable media include: an electrical connection (electronic device) with one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber-optic device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer-readable medium could even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.
It should be appreciated that each part of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented in software or firmware that is stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques well known in the art may be used: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), and so on.
Those of ordinary skill in the art will understand that all or part of the steps carried out by the above embodiment methods can be completed by instructing the relevant hardware through a program. The program may be stored in a computer-readable storage medium and, when executed, performs one of, or a combination of, the steps of the method embodiments.
In addition, the functional units in the embodiments of the present application may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The above integrated module may be implemented in the form of hardware, or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like. Although the embodiments of the present application have been shown and described above, it should be understood that the above embodiments are exemplary and shall not be construed as limiting the present application; those of ordinary skill in the art may change, modify, replace, and vary the above embodiments within the scope of the present application.

Claims (11)

1. A method for determining answers to questions, characterized in that the method comprises the following steps:
performing semantic encoding on the word vector of each target word in target text information, to obtain a first coding vector of each target word;
performing semantic encoding on the word vector of each question word in an asked question, to obtain a coding vector of each question word;
for each target word, performing a weighted summation over the coding vectors of the question words according to the matching degree between the respective target word and each question word, to obtain a second coding vector of the respective target word; and
inputting the first coding vector and the second coding vector of each target word into a prediction model, to determine for each target word a probability value that the target word is the answer to the asked question; wherein the prediction model has learned the mapping relationship between the values of the first coding vector and the second coding vector, and the probability value.
2. The method according to claim 1, characterized in that the word vector of each question word includes an element indicating the probability that the corresponding question word is a topic word of the target text information; and inputting the first coding vector and the second coding vector of each target word into the prediction model comprises:
merging the elements in the first coding vector of each target word with the elements in the second coding vector, and inputting the merged result into the prediction model as an input vector;
wherein the number of elements contained in the input vector is the sum of the number of elements in the first coding vector and the number of elements in the second coding vector; and
the prediction model is configured to determine the corresponding probability value according to the value of each element in the input vector.
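By way of illustration only, and not as part of the claims: the following Python (PyTorch) sketch is one minimal reading of the pipeline in claims 1 and 2. The claims fix neither the semantic encoder, nor the matching function, nor the prediction model, so the GRU encoders, the dot-product matching degree, the softmax weighting, and every dimension below are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

d = 128                                     # hypothetical hidden size
encoder_p = nn.GRU(d, d, batch_first=True)  # assumed semantic encoder for the target text
encoder_q = nn.GRU(d, d, batch_first=True)  # assumed semantic encoder for the question
predictor = nn.Sequential(nn.Linear(2 * d, d), nn.ReLU(), nn.Linear(d, 1))

def answer_probabilities(target_vecs, question_vecs):
    """target_vecs: (1, T, d) word vectors of the target text;
    question_vecs: (1, Q, d) word vectors of the asked question."""
    h_t, _ = encoder_p(target_vecs)              # first coding vectors, (1, T, d)
    h_q, _ = encoder_q(question_vecs)            # question coding vectors, (1, Q, d)
    match = torch.bmm(h_t, h_q.transpose(1, 2))  # matching degrees, (1, T, Q)
    weights = F.softmax(match, dim=-1)           # assumed normalization of the matching degrees
    h_t2 = torch.bmm(weights, h_q)               # second coding vectors: weighted sums, (1, T, d)
    inputs = torch.cat([h_t, h_t2], dim=-1)      # claim 2: merged input vector with 2*d elements
    return torch.sigmoid(predictor(inputs).squeeze(-1))  # per-word probability values, (1, T)

For example, answer_probabilities(torch.randn(1, 50, d), torch.randn(1, 8, d)) returns a (1, 50) tensor holding, for each of the 50 target words, the probability value that it is the asked question's answer.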
3. The method according to claim 2, characterized in that, before performing semantic encoding on the word vector of each question word in the asked question, the method comprises:
determining, for each question word in the asked question, the probability that the question word is a topic word of the target text information; and
adding the topic-word probability of each question word, as one element, into the word vector of the corresponding question word.
4. The method according to claim 3, characterized in that determining, for each question word in the asked question, the probability that the question word is a topic word of the target text information comprises:
determining a first frequency at which each question word occurs in the target text information;
determining a second frequency at which each question word occurs in a given corpus; and
determining, according to the first frequency and the second frequency of each question word, the probability that the question word is a topic word of the target text information.
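Claims 3 and 4 leave the probability formula open. The sketch below assumes the topic-word probability is a normalized, TF-IDF-like ratio of the first frequency (occurrences in the target text information) to the second frequency (occurrences in the corpus); the function names and the smoothing constant are hypothetical.

from collections import Counter
import numpy as np

def topic_word_probabilities(question_words, target_words, corpus_words):
    first_freq = Counter(target_words)    # first frequency: in the target text information
    second_freq = Counter(corpus_words)   # second frequency: in the reference corpus
    # frequent in the target text but rare in the corpus => likely a topic word
    scores = {w: first_freq[w] / (1.0 + second_freq[w]) for w in question_words}
    total = sum(scores.values()) or 1.0
    return {w: s / total for w, s in scores.items()}  # normalize into probabilities

def augment_word_vector(word, word_vector, topic_probs):
    # claim 3: append the topic-word probability as one extra element
    return np.concatenate([word_vector, [topic_probs.get(word, 0.0)]])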
5. The method according to claim 1, characterized in that, for each target word, performing a weighted summation over the coding vectors of the question words according to the degree of attention the respective target word pays to each question word, to obtain the second coding vector of the respective target word, comprises:
for each target word, determining the matching degree between the respective target word and each question word using an attention model;
calculating the weight of each question word according to the matching degree between the respective target word and each question word, and the probability that each question word is a topic word of the target text information; and
performing a weighted summation over the coding vectors of the question words according to their weights, to obtain the second coding vector of the respective target word.
6. The method according to claim 5, characterized in that the calculation according to the matching degree between the respective target word and each question word, and the probability that each question word is a topic word of the target text information, comprises:
multiplying the matching degree between the respective target word and each question word by the probability that the same question word is a topic word of the target text information, to obtain the weight of each question word.
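A minimal NumPy sketch of the weighting in claims 5 and 6, assuming a dot-product attention model for the matching degree and a softmax normalization, neither of which is specified by the claims; shapes and names are illustrative.

import numpy as np

def second_coding_vector(h_target, H_question, topic_probs):
    """h_target: (d,) coding vector of one target word;
    H_question: (Q, d) coding vectors of the question words;
    topic_probs: (Q,) topic-word probability of each question word."""
    match = H_question @ h_target               # matching degree per question word
    attn = np.exp(match - match.max())
    attn = attn / attn.sum()                    # normalized attention weights
    weights = attn * topic_probs                # claim 6: product of the two factors
    weights = weights / (weights.sum() + 1e-8)  # renormalize the weights
    return weights @ H_question                 # claim 5: weighted summation

The product weighting emphasizes question words that both match the target word and characterize the target text, which fits the application's stated aim of better identifying the key words in the question.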
7. The method according to any one of claims 1-6, characterized in that, after inputting the first coding vector and the second coding vector of each target word into the prediction model to determine for each target word the probability value that the target word is the answer to the asked question, the method further comprises:
extracting answer words from the target words contained in the target text information, according to the probability value that each target word is the answer to the asked question; and
generating the answer to the asked question according to the answer words.
8. The method according to claim 7, characterized in that extracting answer words from the target words contained in the target text information, according to the probability value that each target word is the answer to the asked question, comprises:
extracting, according to the probability value that each target word is the answer to the asked question, at least two consecutively occurring target words from the target text information as the answer words.
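Claims 7 and 8 leave the extraction rule open. One plausible sketch, assuming a probability threshold (a hypothetical parameter) and taking the longest run of consecutive above-threshold words, joined without separators as is natural for Chinese text:

def extract_answer(target_words, probabilities, threshold=0.5):
    best_run, current_run = [], []
    for word, prob in zip(target_words, probabilities):
        if prob >= threshold:
            current_run.append(word)   # extend the current consecutive run
        else:
            if len(current_run) > len(best_run):
                best_run = current_run
            current_run = []
    if len(current_run) > len(best_run):
        best_run = current_run
    # claim 8 requires at least two consecutively occurring target words
    return "".join(best_run) if len(best_run) >= 2 else None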
9. An apparatus for determining answers to questions, characterized in that the apparatus comprises:
a first coding module, configured to perform semantic encoding on the word vector of each target word in target text information, to obtain a first coding vector of each target word;
a second coding module, configured to perform semantic encoding on the word vector of each question word in an asked question, to obtain a coding vector of each question word;
a computing module, configured to, for each target word, perform a weighted summation over the coding vectors of the question words according to the degree of attention the respective target word pays to each question word, to obtain a second coding vector of the respective target word; and
a determining module, configured to input the first coding vector and the second coding vector of each target word into a prediction model, to determine for each target word a probability value that the target word is the answer to the asked question; wherein the prediction model has learned the mapping relationship between the values of the first coding vector and the second coding vector, and the probability value.
10. A computer device, characterized in that it comprises a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the method for determining answers to questions according to any one of claims 1-8.
11. A non-transitory computer-readable storage medium having a computer program stored thereon, characterized in that the program, when executed by a processor, implements the method for determining answers to questions according to any one of claims 1-8.
CN201811626414.XA 2018-12-28 2018-12-28 Method, apparatus, computer device and storage medium for determining answers to questions Active CN109670029B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811626414.XA CN109670029B (en) 2018-12-28 2018-12-28 Method, apparatus, computer device and storage medium for determining answers to questions

Publications (2)

Publication Number Publication Date
CN109670029A true CN109670029A (en) 2019-04-23
CN109670029B CN109670029B (en) 2021-09-07

Family

ID=66146568

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811626414.XA Active CN109670029B (en) 2018-12-28 2018-12-28 Method, apparatus, computer device and storage medium for determining answers to questions

Country Status (1)

Country Link
CN (1) CN109670029B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107305578A (en) * 2016-04-25 2017-10-31 北京京东尚科信息技术有限公司 Human-machine intelligence's answering method and device
WO2018147543A1 (en) * 2017-02-08 2018-08-16 한국과학기술원 Concept graph based query-response system and context search method using same
CN108345672A (en) * 2018-02-09 2018-07-31 平安科技(深圳)有限公司 Intelligent response method, electronic device and storage medium
CN108763284A (en) * 2018-04-13 2018-11-06 华南理工大学 A kind of question answering system implementation method based on deep learning and topic model
CN108959246A (en) * 2018-06-12 2018-12-07 北京慧闻科技发展有限公司 Answer selection method, device and electronic equipment based on improved attention mechanism
CN109033068A (en) * 2018-06-14 2018-12-18 北京慧闻科技发展有限公司 It is used to read the method, apparatus understood and electronic equipment based on attention mechanism
CN109063174A (en) * 2018-08-21 2018-12-21 腾讯科技(深圳)有限公司 Inquire the generation method and device, computer storage medium, electronic equipment of answer

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHANG Wei et al.: "Design and implementation of influenza Question Answering System based on multi-strategies", 2012 IEEE International Conference on Computer Science and Automation Engineering (CSAE) *
XIANG Yang: "Research on Answer Optimization Methods for Question Answering Systems", China Doctoral Dissertations Full-text Database, Information Science and Technology Series *

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110263330A (en) * 2019-05-22 2019-09-20 腾讯科技(深圳)有限公司 Question sentence improvement method, apparatus, device and storage medium
CN111414464B (en) * 2019-05-27 2023-04-07 腾讯科技(深圳)有限公司 Question generation method, device, equipment and storage medium
CN111414464A (en) * 2019-05-27 2020-07-14 腾讯科技(深圳)有限公司 Question generation method, device, equipment and storage medium
CN110390001A (en) * 2019-06-04 2019-10-29 深思考人工智能机器人科技(北京)有限公司 Implementation method and apparatus for opinion-type machine reading comprehension
CN110222345A (en) * 2019-06-18 2019-09-10 卓尔智联(武汉)研究院有限公司 Cloze Test answer method, apparatus, electronic equipment and storage medium
CN110674280A (en) * 2019-06-21 2020-01-10 四川大学 Answer selection algorithm based on enhanced question importance expression
CN110674280B (en) * 2019-06-21 2023-12-15 北京中科微末生物科技有限公司 Answer selection algorithm based on enhanced question importance representation
WO2021000675A1 (en) * 2019-07-04 2021-01-07 平安科技(深圳)有限公司 Method and apparatus for machine reading comprehension of chinese text, and computer device
CN110688470A (en) * 2019-09-27 2020-01-14 北京百度网讯科技有限公司 Method and apparatus for transmitting information
CN110688470B (en) * 2019-09-27 2022-04-26 北京百度网讯科技有限公司 Method and apparatus for transmitting information
CN111027327A (en) * 2019-10-29 2020-04-17 平安科技(深圳)有限公司 Machine reading understanding method, device, storage medium and device
CN111027327B (en) * 2019-10-29 2022-09-06 平安科技(深圳)有限公司 Machine reading understanding method, device, storage medium and device
CN110852071A (en) * 2019-11-08 2020-02-28 科大讯飞股份有限公司 Knowledge point detection method, device, equipment and readable storage medium
CN110852071B (en) * 2019-11-08 2023-10-24 科大讯飞股份有限公司 Knowledge point detection method, device, equipment and readable storage medium
CN111078851B (en) * 2019-12-09 2024-04-12 科大讯飞(苏州)科技有限公司 Information processing method, device, equipment and readable storage medium
CN111078851A (en) * 2019-12-09 2020-04-28 科大讯飞(苏州)科技有限公司 Information processing method, device, equipment and readable storage medium
CN111309878B (en) * 2020-01-19 2023-08-22 支付宝(杭州)信息技术有限公司 Search type question-answering method, model training method, server and storage medium
CN111309878A (en) * 2020-01-19 2020-06-19 支付宝(杭州)信息技术有限公司 Retrieval type question-answering method, model training method, server and storage medium
CN112749251B (en) * 2020-03-09 2023-10-31 腾讯科技(深圳)有限公司 Text processing method, device, computer equipment and storage medium
CN112749251A (en) * 2020-03-09 2021-05-04 腾讯科技(深圳)有限公司 Text processing method and device, computer equipment and storage medium
CN111737954B (en) * 2020-06-12 2023-07-28 百度在线网络技术(北京)有限公司 Text similarity determination method, device, equipment and medium
CN111737954A (en) * 2020-06-12 2020-10-02 百度在线网络技术(北京)有限公司 Text similarity determination method, device, equipment and medium
CN111881257A (en) * 2020-07-24 2020-11-03 广州大学 Automatic matching method, system and storage medium based on subject word and sentence subject matter
CN111949791A (en) * 2020-07-28 2020-11-17 中国工商银行股份有限公司 Text classification method, device and equipment
CN111949791B (en) * 2020-07-28 2024-01-30 中国工商银行股份有限公司 Text classification method, device and equipment
CN111866608B (en) * 2020-08-05 2022-08-16 北京华盛互联科技有限公司 Video playing method, device and system for teaching
CN111866608A (en) * 2020-08-05 2020-10-30 北京育宝科技有限公司 Video playing method, device and system for teaching
CN112163405A (en) * 2020-09-08 2021-01-01 北京百度网讯科技有限公司 Question generation method and device
CN112131364A (en) * 2020-09-22 2020-12-25 沈阳东软智能医疗科技研究院有限公司 Question answering method, device, electronic equipment and storage medium
CN112131364B (en) * 2020-09-22 2024-03-26 沈阳东软智能医疗科技研究院有限公司 Question answering method and device, electronic equipment and storage medium
CN112347231A (en) * 2020-11-17 2021-02-09 广联达科技股份有限公司 Building list matching model construction method, matching method and device
CN112685548B (en) * 2020-12-31 2023-09-08 科大讯飞(北京)有限公司 Question answering method, electronic device and storage device
CN112685548A (en) * 2020-12-31 2021-04-20 中科讯飞互联(北京)信息科技有限公司 Question answering method, electronic device and storage device
CN113761845A (en) * 2021-01-28 2021-12-07 北京沃东天骏信息技术有限公司 Text generation method and device, storage medium and electronic equipment
CN113779968A (en) * 2021-08-11 2021-12-10 深圳追一科技有限公司 Information extraction method and device, computer equipment and storage medium
CN114218940B (en) * 2021-12-23 2023-08-04 北京百度网讯科技有限公司 Text information processing and model training method, device, equipment and storage medium
CN114218940A (en) * 2021-12-23 2022-03-22 北京百度网讯科技有限公司 Text information processing method, text information processing device, text information model training method, text information model training device, text information model training equipment and storage medium

Also Published As

Publication number Publication date
CN109670029B (en) 2021-09-07

Similar Documents

Publication Publication Date Title
CN109670029A (en) For determining the method, apparatus, computer equipment and storage medium of problem answers
Da The computational case against computational literary studies
CN108780445B (en) Parallel hierarchical model for machine understanding of small data
Mandera et al. Explaining human performance in psycholinguistic tasks with models of semantic similarity based on prediction and counting: A review and empirical validation
Moens Argumentation mining: How can a machine acquire common sense and world knowledge?
Ling et al. Context-controlled topic-aware neural response generation for open-domain dialog systems
Kumar et al. BERT based semi-supervised hybrid approach for aspect and sentiment classification
Tran et al. Capturing contextual factors in sentiment classification: An ensemble approach
Diao et al. Multi-granularity bidirectional attention stream machine comprehension method for emotion cause extraction
Alqahtani et al. Emotion analysis of Arabic tweets: Language models and available resources
Dadas et al. Evaluation of sentence representations in polish
Kondurkar et al. Modern applications with a focus on training chatgpt and gpt models: Exploring generative ai and nlp
Tüselmann et al. Recognition-free question answering on handwritten document collections
Shafiq et al. Enhancing Arabic Aspect-Based Sentiment Analysis Using End-to-End Model
Akdemir et al. A review on deep learning applications with semantics
Gasimova Automated enriched medical concept generation for chest X-ray images
CN111914084A (en) Deep learning-based emotion label text generation and evaluation system
Ba Alawi et al. Performance Analysis of Embedding Methods for Deep Learning-Based Turkish Sentiment Analysis Models
Zhou et al. Multimodal embedding for lifelog retrieval
CN109933788A Type determination method, apparatus, device and medium
Kusal et al. Transfer learning for emotion detection in conversational text: a hybrid deep learning approach with pre-trained embeddings
Sun et al. Informed graph convolution networks for multilingual short text understanding
Dunn et al. Designing and Evaluating Context-Sensitive Visualization Models for Deep Learning Text Classifiers
Bashir et al. A comprehensive survey of Sentiment Analysis: Word Embeddings approach, research challenges and opportunities
Maitra et al. Understanding the insights of privacy policies using bert

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant