CN107679224B - Intelligent question and answer method and system for unstructured text - Google Patents


Info

Publication number
CN107679224B
CN107679224B
Authority
CN
China
Prior art keywords
text
vector
question
phrase
answer
Prior art date
Legal status
Active
Application number
CN201710985745.1A
Other languages
Chinese (zh)
Other versions
CN107679224A (en)
Inventor
简仁贤
王海波
Current Assignee
Emotibot Technologies Ltd
Original Assignee
Emotibot Technologies Ltd
Priority date
Filing date
Publication date
Application filed by Emotibot Technologies Ltd filed Critical Emotibot Technologies Ltd
Priority to CN201710985745.1A priority Critical patent/CN107679224B/en
Publication of CN107679224A publication Critical patent/CN107679224A/en
Application granted granted Critical
Publication of CN107679224B publication Critical patent/CN107679224B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33Querying
    • G06F16/3331Query processing
    • G06F16/334Query execution
    • G06F16/3344Query execution using natural language analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33Querying
    • G06F16/332Query formulation
    • G06F16/3329Natural language query formulation or dialogue systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/20Natural language analysis
    • G06F40/279Recognition of textual entities
    • G06F40/289Phrasal analysis, e.g. finite state techniques or chunking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Abstract

The invention belongs to the technical field of computer intelligent dialogue and provides a method and a system for intelligent question answering for unstructured text, comprising the following steps: S1, an encoding layer encodes the acquired text and question respectively to obtain text hidden vectors and question hidden vectors; S2, an information fusion layer fuses the text hidden vectors and the question hidden vectors to obtain a fused association vector group; and S3, a decoding layer decodes the text according to the association vector group to obtain the answer to the question and outputs the answer. The invention can directly provide answers to questions about unstructured text without building a question-answer library in advance, places no restriction on the type of question, returns accurate answers, and is data-driven, making effective use of big data.

Description

Intelligent question and answer method and system for unstructured text
Technical Field
The invention belongs to the technical field of computer intelligent dialogue, and particularly relates to a method and a system for intelligent question answering for unstructured text.
Background
Intelligent question answering over unstructured text means that, given any piece of unstructured text and any question about that text whose answer appears in the given text, the intelligent question-answering system should be able to find the corresponding answer to the question.
Current intelligent question-answering techniques for unstructured text fall mainly into four types, each with its own defects:
Methods based on a question-answer library: the library is difficult to construct, especially when the unstructured text cannot be known in advance. Moreover, given the openness of user questions, it is difficult to list all questions and answers for an unstructured text in advance.
Retrieval-based methods have inherent defects. Answers are chosen only by the similarity between a segmented sentence and the user's question, so the returned sentence may fail to address the question. Moreover, a whole sentence is returned as the answer; its granularity is too coarse, so the most accurate answer is not found.
Methods based on named entity recognition first need to judge the intention of the question and are suitable only when the question asks about a named entity. The questions such a method can answer are therefore limited, and questions about non-named entities cannot be answered. Moreover, when several named entities of the same type appear in the unstructured text, the selection among them and the judgment of question intention may be inaccurate, which affects the effectiveness of the method.
Methods based on structured-graph analysis first need to parse the whole unstructured text and extract its key elements to build a graph. There is as yet no mature method for analyzing the graph to find the answer; answers are obtained through assorted rules, so accuracy is not high. The same elements appearing in different places in a long article also increase the difficulty of graph analysis.
In summary, current intelligent question-answering techniques for unstructured text have the following main defects: a question-answer library must be constructed in advance; the granularity of the returned answer may be too coarse or too fine, so it is not particularly accurate; the types of questions that can be answered accurately are relatively limited; and big data cannot be used effectively.
Disclosure of Invention
To address the above defects, the invention provides a method and a system for intelligent question answering for unstructured text that can directly give answers to questions about unstructured text without building a question-answer library in advance, places no restriction on the type of question, returns accurate answers, and is data-driven, making effective use of big data.
To achieve the above object, the method for intelligent question answering for unstructured text provided by the invention comprises the following steps:
S1, an encoding layer encodes the acquired text and question respectively to obtain text hidden vectors and question hidden vectors;
S2, an information fusion layer fuses the text hidden vectors and the question hidden vectors to obtain a fused association vector group;
S3, a decoding layer decodes the text according to the association vector group to obtain the answer to the question and outputs the answer.
Preferably, the specific method of S1 is:
S11: acquiring the input text and question;
S12: performing word segmentation on the text and the question to obtain a text phrase and a question phrase;
S13: mapping the text phrase and the question phrase respectively to corresponding word vectors to obtain a text phrase vector and a question phrase vector;
S14: encoding the text phrase vector and the question phrase vector with a bidirectional recurrent neural network to obtain a text hidden vector and a question hidden vector.
Preferably, the specific method of S12 is:
Segment the text C and the question Q respectively to obtain the text phrase $C_1 = \{w_1^C, w_2^C, \dots, w_n^C\}$ and the question phrase $Q_1 = \{w_1^Q, w_2^Q, \dots, w_m^Q\}$, where $w_i^C$ is the i-th word in the text phrase, $w_j^Q$ is the j-th word in the question phrase, n is the total number of words in the text phrase, and m is the total number of words in the question phrase.
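A minimal sketch of step S12, assuming whitespace segmentation of an English example purely for illustration; a real system for Chinese text would use a word segmenter such as jieba, and the sentences below are toy stand-ins:

```python
# Step S12 sketch: segment the text C and the question Q into word groups.
# Whitespace splitting stands in for a real Chinese word segmenter here.
def segment(sentence):
    return sentence.split()

C1 = segment("tender buds grow on the tree in spring")   # text phrase
Q1 = segment("what grows on the tree in spring")         # question phrase
n, m = len(C1), len(Q1)   # n words in the text phrase, m in the question phrase
```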
Preferably, the specific method of S13 is:
Map the text phrase $C_1 = \{w_1^C, \dots, w_n^C\}$ and the question phrase $Q_1 = \{w_1^Q, \dots, w_m^Q\}$ respectively to their corresponding word vectors to obtain the text phrase vector $C_2 = \{e_1^C, \dots, e_n^C\}$ and the question phrase vector $Q_2 = \{e_1^Q, \dots, e_m^Q\}$, where $e_i^C$ is the word vector corresponding to $w_i^C$ and $e_j^Q$ is the word vector corresponding to $w_j^Q$.
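Step S13 can be sketched as an embedding-table lookup. The vocabulary, dimension d = 4, and random vectors below are toy assumptions; a real system would load pretrained word vectors:

```python
import numpy as np

# Step S13 sketch: map each word to its word vector via an embedding table.
rng = np.random.default_rng(0)
vocab = {w: i for i, w in enumerate(["tender", "buds", "grow", "spring", "what"])}
E = rng.normal(size=(len(vocab), 4))      # embedding matrix, one row per word

def embed(words):
    # unknown words share row 0 in this toy sketch
    return np.stack([E[vocab.get(w, 0)] for w in words])

C2 = embed(["tender", "buds", "grow"])    # text phrase vectors
Q2 = embed(["what", "grow", "spring"])    # question phrase vectors
```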
Preferably, the S14 is specifically:
Encode the text phrase vector $C_2 = \{e_1^C, \dots, e_n^C\}$ and the question phrase vector $Q_2 = \{e_1^Q, \dots, e_m^Q\}$ respectively with a bidirectional recurrent neural network to obtain the text hidden vector $C_3 = \{h_1^C, \dots, h_n^C\}$ and the question hidden vector $Q_3 = \{h_1^Q, \dots, h_m^Q\}$, where $h_i^C$ is the hidden vector corresponding to $e_i^C$ and $h_j^Q$ is the hidden vector corresponding to $e_j^Q$.
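A minimal NumPy sketch of step S14's bidirectional encoding. The embodiment uses a BiLSTM; a plain tanh RNN with random, untrained weights is substituted here only to keep the sketch short, and all shapes (d = 4 input, h = 3 hidden) are toy values:

```python
import numpy as np

# Step S14 sketch: a bidirectional recurrent encoder. Each position's hidden
# vector concatenates a forward and a backward pass over the sequence.
rng = np.random.default_rng(1)
d, h = 4, 3
Wx, Wh = rng.normal(size=(h, d)), rng.normal(size=(h, h))

def rnn(xs):
    state, outs = np.zeros(h), []
    for x in xs:
        state = np.tanh(Wx @ x + Wh @ state)
        outs.append(state)
    return outs

def birnn(xs):
    fwd = rnn(xs)
    bwd = rnn(xs[::-1])[::-1]
    # each hidden vector is the concatenation of both directions (length 2h)
    return [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]

C2 = rng.normal(size=(5, d))      # five text word vectors
C3 = birnn(list(C2))              # text hidden vectors h_1^C ... h_5^C
```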
Preferably, the S2 is specifically:
Weight the last hidden vector $h_m^Q$ of the question hidden vectors against each hidden vector $h_i^C$ of the text hidden vectors to compute the similarity value $a_i$:

$a_i = (h_i^C)^T w_s h_m^Q$

where $(h_i^C)^T$ is the transpose of the hidden vector $h_i^C$ and $w_s$ is a parameter matrix. Multiply the similarity value $a_i$ by each hidden vector $h_i^C$ of the text hidden vectors to compute the fused association vector group $H_i$:

$H_i = a_i h_i^C$
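A NumPy sketch of the fusion step S2, with toy shapes and random, untrained weights standing in for trained parameters:

```python
import numpy as np

# Step S2 sketch: score each text hidden vector h_i^C against the last
# question hidden vector h_m^Q through the parameter matrix w_s, giving
# a_i = (h_i^C)^T w_s h_m^Q, then scale h_i^C by a_i so H_i = a_i * h_i^C.
rng = np.random.default_rng(2)
k = 6                                  # hidden vector size (2h of the BiRNN)
C3 = rng.normal(size=(5, k))           # text hidden vectors h_1^C ... h_5^C
hQ = rng.normal(size=k)                # last question hidden vector h_m^Q
ws = rng.normal(size=(k, k))           # parameter matrix w_s

a = C3 @ ws @ hQ                       # similarity values a_i
H = a[:, None] * C3                    # fused association vector group H_i
```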
Preferably, the S3 is specifically:
Concatenate the fused association vectors, $g = \mathrm{concat}(H_i)$, and use the merged vector g as the input of two different fully-connected networks: the output of the first fully-connected network is the probability distribution $p_1$ of the predicted answer start position, and the output of the second fully-connected network is the probability distribution $p_2$ of the predicted answer end position:

$p_1 = \mathrm{softmax}(w_1 g)$
$p_2 = \mathrm{softmax}(w_2 g)$

where $w_1$ and $w_2$ are parameters. The start position $p_s$ and end position $p_e$ of the answer are then computed:

$p_s = \arg\max(p_1)$
$p_e = \arg\max(p_2)$

The text content between the start position $p_s$ and the end position $p_e$ is extracted from the text as the answer to the question, and the answer is output.
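A NumPy sketch of the decoding step S3, assuming five text positions and using random, untrained weight matrices $w_1$ and $w_2$ as placeholders for trained parameters:

```python
import numpy as np

# Step S3 sketch: merge the fused vectors into g = concat(H_i), feed g to
# two fully-connected layers, softmax each output into position
# distributions p1, p2, and take the argmax of each for the answer span.
rng = np.random.default_rng(3)
H = rng.normal(size=(5, 6))            # fused association vectors H_i
g = H.reshape(-1)                      # g = concat(H_i)

w1 = rng.normal(size=(5, g.size))      # one output row per text position
w2 = rng.normal(size=(5, g.size))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

p1, p2 = softmax(w1 @ g), softmax(w2 @ g)
ps, pe = int(np.argmax(p1)), int(np.argmax(p2))   # answer start / end
```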
A system for intelligent question answering for unstructured text, comprising:
an encoding layer module, used to encode the acquired text and question respectively to obtain text hidden vectors and question hidden vectors;
an information fusion layer module, used to fuse the text hidden vectors and the question hidden vectors to obtain a fused association vector group;
and a decoding layer module, used to decode the text according to the association vector group to obtain the answer to the question and output the answer.
According to the above scheme, the method and system for intelligent question answering for unstructured text can directly give answers to questions about unstructured text without building a question-answer library in advance, place no restriction on the type of question, return accurate answers, and are data-driven, making effective use of big data.
Drawings
In order to more clearly illustrate the detailed description of the invention or the technical solutions in the prior art, the drawings that are needed in the detailed description of the invention or the prior art will be briefly described below. Throughout the drawings, like elements or portions are generally identified by like reference numerals. In the drawings, elements or portions are not necessarily drawn to scale.
FIG. 1 is a flowchart of the intelligent question-answering method for unstructured text in this embodiment;
FIG. 2 is a structural block diagram of the intelligent question-answering system for unstructured text in this embodiment.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings. The following examples are given solely for the purpose of illustrating the products of the invention more clearly and are therefore to be considered as examples only and are not intended to limit the scope of the invention.
Embodiment:
the embodiment of the invention provides a method for intelligent question answering of unstructured text, which is shown in figure 1 and comprises the following steps:
s1, the coding layer respectively codes the acquired text and the problem to obtain a text hidden vector and a problem hidden vector;
s2, the information fusion layer fuses the text hidden vector and the problem hidden vector to obtain a fused association vector group;
and S3, decoding the text by the decoding layer according to the association vector group to obtain an answer to the question, and outputting the answer.
Preferably, the specific method of S1 is:
S11: acquiring the input text and question;
S12: performing word segmentation on the text and the question to obtain a text phrase and a question phrase;
S13: mapping each word in the text phrase and the question phrase respectively to its corresponding word vector to obtain a text phrase vector and a question phrase vector;
S14: encoding the text phrase vector and the question phrase vector with a bidirectional recurrent neural network to obtain a text hidden vector and a question hidden vector.
The specific method of S12 is as follows:
Segment the text C and the question Q respectively to obtain the text phrase $C_1 = \{w_1^C, w_2^C, \dots, w_n^C\}$ and the question phrase $Q_1 = \{w_1^Q, w_2^Q, \dots, w_m^Q\}$, where $w_i^C$ is the i-th word in the text phrase, $w_j^Q$ is the j-th word in the question phrase, n is the total number of words in the text phrase, and m is the total number of words in the question phrase.
The specific method of S13 is as follows:
Map the text phrase $C_1 = \{w_1^C, \dots, w_n^C\}$ and the question phrase $Q_1 = \{w_1^Q, \dots, w_m^Q\}$ respectively to their corresponding word vectors to obtain the text phrase vector $C_2 = \{e_1^C, \dots, e_n^C\}$ and the question phrase vector $Q_2 = \{e_1^Q, \dots, e_m^Q\}$, where $e_i^C$ is the word vector corresponding to $w_i^C$ and $e_j^Q$ is the word vector corresponding to $w_j^Q$.
The S14 specifically includes:
Encode the text phrase vector $C_2 = \{e_1^C, \dots, e_n^C\}$ and the question phrase vector $Q_2 = \{e_1^Q, \dots, e_m^Q\}$ respectively with a bidirectional recurrent neural network to obtain the text hidden vector $C_3 = \{h_1^C, \dots, h_n^C\}$ and the question hidden vector $Q_3 = \{h_1^Q, \dots, h_m^Q\}$, where $h_i^C$ is the hidden vector corresponding to $e_i^C$ and $h_j^Q$ is the hidden vector corresponding to $e_j^Q$.
The S2 specifically includes:
Weight the last hidden vector $h_m^Q$ of the question hidden vectors against each hidden vector $h_i^C$ of the text hidden vectors to compute the similarity value $a_i$:

$a_i = (h_i^C)^T w_s h_m^Q$

where $(h_i^C)^T$ is the transpose of the hidden vector $h_i^C$ and $w_s$ is a parameter matrix. Multiply the similarity value $a_i$ by each hidden vector $h_i^C$ of the text hidden vectors to compute the fused association vector group $H_i$:

$H_i = a_i h_i^C$
The S3 specifically includes:
Concatenate the fused association vectors, $g = \mathrm{concat}(H_i)$, and use the merged vector g as the input of two different fully-connected networks: the output of the first fully-connected network is the probability distribution $p_1$ of the predicted answer start position, and the output of the second fully-connected network is the probability distribution $p_2$ of the predicted answer end position:

$p_1 = \mathrm{softmax}(w_1 g)$
$p_2 = \mathrm{softmax}(w_2 g)$

where $w_1$ and $w_2$ are parameters. The start position $p_s$ and end position $p_e$ of the answer are then computed:

$p_s = \arg\max(p_1)$
$p_e = \arg\max(p_2)$

The text content between the start position $p_s$ and the end position $p_e$ is extracted from the text as the answer to the question, and the answer is output.
A system for intelligent question answering for unstructured text, as shown in FIG. 2, comprises:
an encoding layer module, used to encode the acquired text and question respectively to obtain text hidden vectors and question hidden vectors;
an information fusion layer module, used to fuse the text hidden vectors and the question hidden vectors to obtain a fused association vector group;
and a decoding layer module, used to decode the text according to the association vector group to obtain the answer to the question and output the answer.
In this embodiment, BiLSTM and softmax are existing machine learning algorithms, the concat() operation joins two or more arrays without changing the existing arrays, and argmax returns the argument with the largest score. The embodiment can be applied to many aspects of daily life. For example, in a chat system it can answer a user's questions about the conversation history to improve understanding of the dialogue. It can also be applied to systems that interact with the user, such as story interaction: the system tells the user a story, the user asks questions about the story, and the system answers them. The technology has further applications: it can be used to understand lengthy manuals and answer users' questions; to understand legal provisions and answer legal questions; or even to turn an organization's legacy documents into a question-answering system.
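The whole S1-S3 pipeline can be tied together in a toy end-to-end run. Every component is a stand-in: whitespace segmentation instead of a Chinese segmenter, random embeddings, a plain tanh RNN instead of the BiLSTM, and untrained random parameter matrices, so the extracted span only demonstrates the data flow, not a trained model's accuracy:

```python
import numpy as np

# Toy end-to-end run of S1 (segment, embed, encode), S2 (fuse) and
# S3 (decode a start/end span) with untrained random parameters.
rng = np.random.default_rng(0)
d, h = 8, 4

text = "tender bud petals grow on the tree in spring".split()
question = "what grows on the tree in spring".split()
vocab = {w: i for i, w in enumerate(sorted(set(text + question)))}
E = rng.normal(size=(len(vocab), d))           # random word embeddings

def birnn(xs):
    # bidirectional tanh RNN; fresh random weights each call (untrained)
    Wx, Wh = rng.normal(size=(h, d)), rng.normal(size=(h, h))
    def run(seq):
        s, outs = np.zeros(h), []
        for x in seq:
            s = np.tanh(Wx @ x + Wh @ s)
            outs.append(s)
        return outs
    fwd, bwd = run(xs), run(xs[::-1])[::-1]
    return np.stack([np.concatenate([f, b]) for f, b in zip(fwd, bwd)])

C3 = birnn([E[vocab[w]] for w in text])        # text hidden vectors
Q3 = birnn([E[vocab[w]] for w in question])    # question hidden vectors

ws = rng.normal(size=(2 * h, 2 * h))           # parameter matrix w_s
a = C3 @ ws @ Q3[-1]                           # similarity values a_i
H = a[:, None] * C3                            # fused vectors H_i
g = H.reshape(-1)                              # g = concat(H_i)

n = len(text)
w1, w2 = rng.normal(size=(n, g.size)), rng.normal(size=(n, g.size))
ps, pe = int(np.argmax(w1 @ g)), int(np.argmax(w2 @ g))
answer = text[min(ps, pe): max(ps, pe) + 1]    # extracted answer span
```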
Example one: story understanding
The text is: in spring, tender bud petals grow on the tree; in summer, the tree is full of plump leaves; in autumn, the tree is coated with fresh red and golden leaves; in winter, the leaves fall to the ground and turn into soil. Fallen leaves are the stamps of nature, sent to you, to me, and to everywhere, all the year round.
The question: what is growing on the tree in spring?
In this embodiment, after the text and the question are analyzed, the start and end positions of the answer are calculated, and the text content between them is extracted as the answer to the question. The answer is: tender bud petals grow out
Example two: understanding of legal provisions
The text is: chapter iv museum social services
The twenty-eighth museum should be open to the public within 6 months of the day the certificate of registration was obtained.
The twenty-ninth museum should announce specific open times to the public. In the national legal holidays and the school summer and chill holidays, the museum should be opened.
The thirty-th museum, which holds the exhibition, should comply with the following regulations:
subject matter and content should conform to the basic principles determined by constitution and maintain national security and ethnic group, propagate the patriots … …
The problems are as follows: how long should a museum be open to the public?
In this embodiment, after analyzing the text and the question, a start position and an end position of the answer are calculated, and the text content between the start position and the end position is extracted as the answer to the question, where the answer is: the twenty-eighth museum should have 6 months since the date the certificate of registration was obtained
Example three: news understanding
The text is: the Rockets hosted the Thunder. The Rockets opened the game with a 9-0 run, after which Westbrook led the Thunder to chase the score closely, with Williams active after the rotation. In the second quarter the Rockets' offense flowed smoothly and the gap gradually approached 20 points, the Rockets leading 79-59 at halftime. The Thunder came back strongly in the second half and cut the deficit to 12 points, but the Rockets' three-pointers fell like rain and they still led by 25 points at the end of the third quarter. In the final quarter Westbrook again led the team to cut the deficit to 12 points, but the Rockets held on to win.
The Thunder's data: Russell Westbrook, 39 points, 11 rebounds, 13 assists; Victor Oladipo, 15 points, 4 assists; Steven Adams, 11 points, 4 rebounds; Enes Kanter, 23 points, 4 rebounds; Taj Gibson, 12 points, 4 rebounds.
The question: how many points did ari score?
In this embodiment, after the text and the question are analyzed, the start and end positions of the answer are calculated, and the text content between them is extracted as the answer to the question. The answer is: 24 points, 5 rebounds
Finally, it should be noted that the above examples are intended only to illustrate the technical solution of the invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the invention and should be construed as falling within the scope of the claims and description.

Claims (6)

1. An intelligent question-answering method for unstructured text, characterized by comprising the following steps:
S1, an encoding layer encodes the acquired text and question respectively to obtain text hidden vectors and question hidden vectors;
S2, an information fusion layer fuses the text hidden vectors and the question hidden vectors to obtain a fused association vector group;
S3, a decoding layer decodes the text according to the association vector group to obtain the answer to the question and outputs the answer;
the specific method of S1 is:
S11: acquiring the input text and question;
S12: performing word segmentation on the text and the question to obtain a text phrase and a question phrase;
S13: mapping the text phrase and the question phrase respectively to corresponding word vectors to obtain a text phrase vector and a question phrase vector;
S14: encoding the text phrase vector and the question phrase vector with a bidirectional recurrent neural network to obtain a text hidden vector and a question hidden vector;
the S2 specifically includes:
hiding the last hidden vector of the problem hidden vector
Figure FDA0002512037130000011
And text hidden vector
Figure FDA0002512037130000012
Each hidden vector in (1) is weighted to calculate a similarity value
Figure FDA0002512037130000013
Will be similar to the value aiAnd text hidden vector
Figure FDA0002512037130000014
Multiplying each hidden vector in the table to calculate a fused association vector group
Figure FDA0002512037130000015
2. The method for intelligent question answering for unstructured text according to claim 1, wherein the specific method of S12 is: segmenting the text C and the question Q respectively to obtain the text phrase $C_1 = \{w_1^C, \dots, w_n^C\}$ and the question phrase $Q_1 = \{w_1^Q, \dots, w_m^Q\}$, where $w_i^C$ is the i-th word in the text phrase, $w_j^Q$ is the j-th word in the question phrase, n is the total number of words in the text phrase, and m is the total number of words in the question phrase.
3. The method for intelligent question answering for unstructured text according to claim 2, wherein the specific method of S13 is: mapping the text phrase $C_1 = \{w_1^C, \dots, w_n^C\}$ and the question phrase $Q_1 = \{w_1^Q, \dots, w_m^Q\}$ respectively to their corresponding word vectors to obtain the text phrase vector $C_2 = \{e_1^C, \dots, e_n^C\}$ and the question phrase vector $Q_2 = \{e_1^Q, \dots, e_m^Q\}$, where $e_i^C$ is the word vector corresponding to $w_i^C$ and $e_j^Q$ is the word vector corresponding to $w_j^Q$.
4. The method for intelligent question answering for unstructured text according to claim 3, wherein S14 is specifically: encoding the text phrase vector $C_2 = \{e_1^C, \dots, e_n^C\}$ and the question phrase vector $Q_2 = \{e_1^Q, \dots, e_m^Q\}$ respectively with a bidirectional recurrent neural network to obtain the text hidden vector $C_3 = \{h_1^C, \dots, h_n^C\}$ and the question hidden vector $Q_3 = \{h_1^Q, \dots, h_m^Q\}$, where $h_i^C$ is the hidden vector corresponding to $e_i^C$ and $h_j^Q$ is the hidden vector corresponding to $e_j^Q$.
5. The method for intelligent question answering for unstructured text according to claim 4, wherein

$a_i = (h_i^C)^T w_s h_m^Q$

where $(h_i^C)^T$ is the transpose of the hidden vector $h_i^C$ and $w_s$ is a parameter matrix; and

$H_i = a_i h_i^C$
6. The method for intelligent question answering for unstructured text according to claim 5, wherein S3 is specifically: concatenating the fused association vectors, $g = \mathrm{concat}(H_i)$, and using the merged vector g as the input of two different fully-connected networks, the two networks comprising a first fully-connected network whose output is the probability distribution $p_1$ of the predicted answer start position and a second fully-connected network whose output is the probability distribution $p_2$ of the predicted answer end position:

$p_1 = \mathrm{softmax}(w_1 g)$
$p_2 = \mathrm{softmax}(w_2 g)$

where $w_1$ and $w_2$ are parameters; computing the answer start position $p_s$ and end position $p_e$:

$p_s = \arg\max(p_1)$
$p_e = \arg\max(p_2)$

and extracting from the text the content between the start position $p_s$ and the end position $p_e$ as the answer to the question, and outputting the answer.
CN201710985745.1A 2017-10-20 2017-10-20 Intelligent question and answer method and system for unstructured text Active CN107679224B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710985745.1A CN107679224B (en) 2017-10-20 2017-10-20 Intelligent question and answer method and system for unstructured text

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710985745.1A CN107679224B (en) 2017-10-20 2017-10-20 Intelligent question and answer method and system for unstructured text

Publications (2)

Publication Number Publication Date
CN107679224A CN107679224A (en) 2018-02-09
CN107679224B (en) 2020-09-08

Family

ID=61141785

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710985745.1A Active CN107679224B (en) 2017-10-20 2017-10-20 Intelligent question and answer method and system for unstructured text

Country Status (1)

Country Link
CN (1) CN107679224B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108959388B (en) * 2018-05-31 2020-09-11 科大讯飞股份有限公司 Information generation method and device
CN108959387B (en) * 2018-05-31 2020-09-11 科大讯飞股份有限公司 Information acquisition method and device
CN108959396B (en) * 2018-06-04 2021-08-17 众安信息技术服务有限公司 Machine reading model training method and device and question and answer method and device
CN108959467B (en) * 2018-06-20 2021-10-15 华东师范大学 Method for calculating correlation degree of question sentences and answer sentences based on reinforcement learning
CN108846138B (en) * 2018-07-10 2022-06-07 苏州大学 Question classification model construction method, device and medium fusing answer information
CN109408624B (en) * 2018-11-06 2020-11-27 江西师范大学 Visual chat robot session generation method
CN109492227A (en) * 2018-11-16 2019-03-19 大连理工大学 It is a kind of that understanding method is read based on the machine of bull attention mechanism and Dynamic iterations
CN111428520B (en) * 2018-11-30 2021-11-23 腾讯科技(深圳)有限公司 Text translation method and device
CN111309875B (en) * 2018-12-10 2023-08-04 百度在线网络技术(北京)有限公司 Method, device, equipment and storage medium for answering questions
CN110348462B (en) * 2019-07-09 2022-03-04 北京金山数字娱乐科技有限公司 Image feature determination and visual question and answer method, device, equipment and medium
CN110928987B (en) * 2019-10-18 2023-07-25 平安科技(深圳)有限公司 Legal provision retrieval method and related equipment based on neural network hybrid model
CN110765254A (en) * 2019-10-21 2020-02-07 北京理工大学 Multi-document question-answering system model integrating multi-view answer reordering
CN110990547B (en) * 2019-11-29 2023-03-14 支付宝(杭州)信息技术有限公司 Phone operation generation method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1924995A (en) * 2005-08-31 2007-03-07 中国科学院声学研究所 Short message question-and-answer system based on content analysis and implementation method thereof
CN101770512A (en) * 2010-01-20 2010-07-07 何吴迪 Cloud-stored hypertext file storage and WEB window presentation framework method
CN104715168A (en) * 2015-02-13 2015-06-17 陈佳阳 File security control and trace method and system based on digital fingerprints
CN106569998A (en) * 2016-10-27 2017-04-19 浙江大学 Text named entity recognition method based on Bi-LSTM, CNN and CRF

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US6083173A (en) * 1998-03-06 2000-07-04 Research Foundation Of State University Of New York Artificial neural network for predicting respiratory disturbances and method for developing the same

Also Published As

Publication number Publication date
CN107679224A (en) 2018-02-09

Similar Documents

Publication Publication Date Title
CN107679224B (en) Intelligent question and answer method and system for unstructured text
CN110263324B (en) Text processing method, model training method and device
CN107239446B (en) Intelligent relation extraction method based on neural networks and attention mechanism
CN105589844B (en) Method for supplementing missing semantics in multi-turn question answering systems
CN102262634B (en) Automatic questioning and answering method and system
CN107818164A (en) Intelligent question answering method and system
CN106897559B (en) Symptom and sign entity recognition method and device for multiple data sources
CN108763510A (en) Intention recognition method, device, equipment and storage medium
DE102008040739A1 (en) Method and system for calculating or determining confidence or confidence scores for syntax trees at all levels
CN107343223A (en) Video segment recognition method and device
CN110263854B (en) Live broadcast label determining method, device and storage medium
CN109213856A (en) Semantic recognition method and system
EP3798922A1 (en) Device and method for machine learning and controlling a machine
WO2023108991A1 (en) Model training method and apparatus, knowledge classification method and apparatus, and device and medium
CN108829823A (en) Text classification method
CN112905868A (en) Event extraction method, device, equipment and storage medium
CN112182249A (en) Automatic classification method and device for aviation safety report
CN110309509A (en) Semantic knowledge base construction method
CN112466316A (en) Zero-sample voice conversion system based on generation countermeasure network
CN115600605A (en) Method, system, equipment and storage medium for jointly extracting Chinese entity relationship
CN116737922A (en) Tourist online comment fine granularity emotion analysis method and system
Lhasiw et al. A bidirectional LSTM model for classifying Chatbot messages
Wiza et al. Classification Analysis Using C4.5 Algorithm To Predict The Level of Graduation of Nurul Falah Pekanbaru High School Students
CN111401069A (en) Intention recognition method and intention recognition device for conversation text and terminal
CN114443632A (en) Intelligent conversion method and system for credit of credit bank and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant