CN111858859A - Automatic question-answering processing method, device, computer equipment and storage medium - Google Patents

Automatic question-answering processing method, device, computer equipment and storage medium

Info

Publication number
CN111858859A
CN111858859A (application CN201910256997.XA)
Authority
CN
China
Prior art keywords
question, target, similarity, vector, semantic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910256997.XA
Other languages
Chinese (zh)
Inventor
黄强
卜建辉
陈林
吴伟佳
谢炜坚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201910256997.XA
Publication of CN111858859A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 Information retrieval of unstructured textual data
    • G06F16/33 Querying
    • G06F16/332 Query formulation
    • G06F16/3329 Natural language query formulation or dialogue systems
    • G06F16/3331 Query processing
    • G06F16/334 Query execution
    • G06F16/3343 Query execution using phonetics
    • G06F16/3344 Query execution using natural language analysis

Abstract

The application provides an automatic question-answering processing method, an automatic question-answering processing device, computer equipment and a storage medium. The method comprises the following steps: acquiring a target question input by a user; performing semantic vectorization on the target question by using a pre-trained sentence semantic coding model to obtain a sentence vector of the target question; calculating the similarity between the sentence vector and the semantic index vector of each question in the question-answer library; and determining a standard question matched with the target question from the question-answer library according to the similarity, and feeding the answer corresponding to the standard question back to the user. By this method, question matching based on the semantic relevance of questions can be realized and the accuracy of semantic understanding of the target question improved, which solves the technical problems in the prior art that, because question matching is based on literal relevance, the returned answers cannot meet users' needs and accuracy is low.

Description

Automatic question-answering processing method, device, computer equipment and storage medium
Technical Field
The present application relates to the field of automatic question answering technologies, and in particular, to an automatic question answering processing method and apparatus, a computer device, and a storage medium.
Background
The automatic question-answering system combines knowledge representation, information retrieval, natural language processing, and related technologies, and can return concise and accurate answers to questions that a user inputs in natural language form. Compared with a traditional search engine, an automatic question-answering system is more convenient and more accurate, and it is a current research hotspot in the fields of natural language processing and artificial intelligence.
Conventional automatic question-answering systems typically employ word-based inverted indexing techniques to retrieve the question entered by a user and return a matching answer. However, this approach matches questions by literal relevance and ignores the user's true intent, so the returned answers may not meet the user's needs, which affects the accuracy of the automatic question-answering system.
Disclosure of Invention
The present application is directed to solving, at least to some extent, one of the technical problems in the related art.
Therefore, the application provides an automatic question-answering processing method, an automatic question-answering processing device, a computer device and a storage medium, which are used for solving the technical problems in the prior art that, because question matching is based on literal relevance, the returned answers cannot meet users' needs and accuracy is low.
In order to achieve the above object, an embodiment of a first aspect of the present application provides an automatic question-answering processing method, including:
acquiring a target question input by a user;
performing semantic vectorization on the target question by using a pre-trained sentence semantic coding model to obtain a sentence vector of the target question;
calculating the similarity between the sentence vector and the semantic index vector of each question in the question-answer library;
and determining a standard question matched with the target question from the question-answer library according to the similarity, and feeding the answer corresponding to the standard question back to the user.
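The four steps above can be sketched end to end. The following is a minimal illustration, not the patent's implementation: `encode` stands in for the trained sentence semantic coding model (here a deterministic hash-based projection), and `qa_library` is a hypothetical question-answer library with precomputed semantic index vectors.

```python
import hashlib
import math

def encode(sentence: str) -> list[float]:
    """Stand-in for the pre-trained sentence semantic coding model:
    a deterministic hash-based unit vector, NOT a real trained encoder."""
    digest = hashlib.sha256(sentence.encode("utf-8")).digest()
    vec = [b - 127.5 for b in digest[:8]]
    norm = math.sqrt(sum(x * x for x in vec))
    return [x / norm for x in vec]

# Hypothetical question-answer library: (question, answer, semantic index vector).
qa_library = [(q, a, encode(q)) for q, a in [
    ("How do I reset my password?", "Open Settings > Account > Reset password."),
    ("What are the shipping fees?", "Shipping is free for orders over $50."),
]]

def answer(target_question: str) -> str:
    sent_vec = encode(target_question)                      # sentence vector of the target question
    sims = [sum(a * b for a, b in zip(sent_vec, idx_vec))   # cosine similarity (vectors are unit length)
            for _, _, idx_vec in qa_library]
    best = max(range(len(sims)), key=sims.__getitem__)      # standard question = highest similarity
    return qa_library[best][1]                              # feed back its answer
```

With a real encoder, paraphrases of a library question would also land near its semantic index vector; the hash stand-in only matches exact strings.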
According to the automatic question-answering processing method, the target question input by the user is obtained, semantic vectorization is performed on the target question by using the pre-trained sentence semantic coding model to obtain the sentence vector of the target question, the similarity between the sentence vector and the semantic index vector of each question in the question-answer library is calculated, the standard question matched with the target question is determined from the question-answer library according to the similarity, and the answer corresponding to the standard question is fed back to the user. In this way, the semantics of the question input by the user can be quickly and automatically understood, question matching based on the semantic relevance of questions is realized, and the accuracy of semantic understanding of the target question, the accuracy of the automatic question-answering system, and the user experience are all improved.
To achieve the above object, an embodiment of a second aspect of the present application provides an automatic question-answering processing device, including:
a first acquisition module, used for acquiring a target question input by a user;
a second acquisition module, used for performing semantic vectorization on the target question by using a pre-trained sentence semantic coding model to obtain a sentence vector of the target question;
a calculation module, used for calculating the similarity between the sentence vector and the semantic index vector of each question in the question-answer library;
a determining module, used for determining a standard question matched with the target question from the question-answer library according to the similarity;
and a feedback module, used for feeding the answer corresponding to the standard question back to the user.
According to the automatic question-answering processing device, the target question input by the user is obtained, semantic vectorization is performed on the target question by using the pre-trained sentence semantic coding model to obtain the sentence vector of the target question, the similarity between the sentence vector and the semantic index vector of each question in the question-answer library is calculated, the standard question matched with the target question is determined from the question-answer library according to the similarity, and the answer corresponding to the standard question is fed back to the user. In this way, the semantics of the question input by the user can be quickly and automatically understood, question matching based on the semantic relevance of questions is realized, and the accuracy of semantic understanding of the target question, the accuracy of the automatic question-answering system, and the user experience are all improved.
To achieve the above object, an embodiment of a third aspect of the present application provides a computer device, including a processor and a memory, wherein the processor reads executable program code stored in the memory and executes a program corresponding to the executable program code, so as to implement the automatic question-answering processing method according to the embodiment of the first aspect.
To achieve the above object, an embodiment of a fourth aspect of the present application provides a non-transitory computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the automatic question-answering processing method according to the embodiment of the first aspect.
To achieve the above object, an embodiment of a fifth aspect of the present application provides a computer program product, where instructions of the computer program product, when executed by a processor, implement the automatic question-answering processing method according to the embodiment of the first aspect.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
Fig. 1 is a schematic flow chart of an automatic question-answering processing method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of an automatic question-answering processing method according to another embodiment of the present application;
fig. 3 is a schematic flow chart of an automatic question-answering processing method according to yet another embodiment of the present application;
fig. 4 is a schematic structural diagram of an automatic question-answering processing device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an automatic question-answering processing device according to another embodiment of the present application;
fig. 6 is a schematic structural diagram of an automatic question-answering processing device according to yet another embodiment of the present application;
fig. 7 is a schematic structural diagram of an automatic question-answering processing device according to still another embodiment of the present application; and
fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application.
The automatic question-answering processing method, device, computer device and storage medium according to the embodiments of the present application are described below with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of an automatic question-answering processing method according to an embodiment of the present application. The method may be executed by the automatic question-answering processing apparatus provided by the embodiments of the present application, and the apparatus may be applied to an automatic question-answering system, for example an intelligent customer service robot; the automatic question-answering system may exist as a standalone device or may be installed in a terminal device.
As shown in fig. 1, the automatic question-answering processing method may include the following steps.
Step 101, obtaining a target question input by a user.
When a user wants the answer to a question, the user can input the question to be consulted by text or by voice. For example, the user may type a question through a touch panel or an external input device, or speak a question through a microphone built into the automatic question-answering system or provided by the terminal device on which the system runs. When the user inputs a question by voice, the automatic question-answering processing apparatus takes as the target question the text obtained by the automatic question-answering system's speech recognition of the received voice question; when the user inputs a question as text, the apparatus takes the text question received by the automatic question-answering system as the target question. Of course, a speech recognition function may also be integrated into the automatic question-answering processing apparatus itself, which recognizes the voice question input by the user to obtain the target question in text form.
Step 102, performing semantic vectorization on the target question by using a pre-trained sentence semantic coding model to obtain a sentence vector of the target question.
In this embodiment, after the target question is obtained, it may be input into the pre-trained sentence semantic coding model, which performs semantic vectorization on the target question to obtain its sentence vector.
Before semantic vectorization can be performed on the target question, the sentence semantic coding model needs to be obtained through training.
In a possible implementation of the embodiment of the present application, when the sentence semantic coding model is trained, a training sample set may first be obtained, where the training sample set includes a plurality of questions and a vector corresponding to each question. The questions in the training sample set may be collected from massive internet data. Specifically, for each question, a word segmentation technique is first applied to obtain the question's segmented words; word2vec is then used to obtain a word vector for each segmented word; and the sum of these word vectors is computed as the vector corresponding to the question. Each question and its corresponding vector are then taken as a training sample pair to construct the training sample set, and the neural network model is trained with this set to generate the sentence semantic coding model. The neural network model may be, for example, a convolutional neural network model or a recurrent neural network model. During model training, each question is used as the input of the model and the vector corresponding to the question as the expected output; the parameters of the neural network model are continuously adjusted according to the output results, and training ends when the accuracy of the output reaches expectations or a preset loss function converges, yielding the sentence semantic coding model.
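The target-vector construction just described (segment, look up word vectors, sum) can be sketched as follows. The word vectors, their two-dimensional size, and whitespace tokenization are made up for illustration; a real system would use word2vec vectors and a proper word segmenter.

```python
# Hypothetical pre-trained word vectors; in practice these would come from
# word2vec trained on a large corpus.
word_vectors = {
    "how": [0.1, 0.3],
    "to": [0.0, 0.2],
    "reset": [0.5, -0.1],
    "password": [0.4, 0.6],
}

def question_vector(question: str) -> list[float]:
    """Target vector of a training pair: the sum of the word vectors of the
    question's segmented words, as described above."""
    tokens = question.lower().split()   # stand-in for a real word-segmentation step
    vec = [0.0, 0.0]
    for t in tokens:
        vec = [a + b for a, b in zip(vec, word_vectors[t])]
    return vec

# Each (question, vector) pair is one training sample for the encoder.
training_pair = ("how to reset password", question_vector("how to reset password"))
```

The encoder is then trained to map each question string to its summed vector, giving a model that can embed unseen questions at query time.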
Further, in a possible implementation of the embodiment of the present application, the sentence semantic coding model may be obtained by training a neural network model combined with an attention mechanism, for example a recurrent neural network model with attention. Adding an attention mechanism enables parallel computation, which improves the training speed and computational efficiency of the model.
With the trained sentence semantic coding model, semantic vectorization of the target question can then be performed to obtain the sentence vector corresponding to the target question.
Step 103, calculating the similarity between the sentence vector and the semantic index vector of each question in the question-answer library.
Each question in the question-answer library corresponds to exactly one semantic index vector, and that vector expresses the semantics of the corresponding question.
In this embodiment, after the sentence vector of the target question is determined, the similarity between the sentence vector and the semantic index vector of each question in the question-answer library may be calculated in turn.
For example, the similarity between the sentence vector of the target question and a semantic index vector can be derived from the Euclidean distance between them; alternatively, it may be determined from the Manhattan distance between the two vectors; or the cosine similarity between the sentence vector and the semantic index vector can be calculated with the cosine formula; and so on.
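The three similarity options named above can be sketched directly. Note that the text only names the distances; the `1 / (1 + d)` mapping used here to turn a distance into a similarity is one common choice, not something the patent specifies.

```python
import math

def euclidean_similarity(u, v):
    """Similarity derived from the Euclidean distance (larger = more similar)."""
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    return 1.0 / (1.0 + d)

def manhattan_similarity(u, v):
    """Similarity derived from the Manhattan (L1) distance."""
    d = sum(abs(a - b) for a, b in zip(u, v))
    return 1.0 / (1.0 + d)

def cosine_similarity(u, v):
    """Cosine of the angle between the two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)
```

Cosine similarity ignores vector magnitude, which is often the preferred property when sentence vectors are not normalized.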
Step 104, determining a standard question matched with the target question from the question-answer library according to the similarity, and feeding the answer corresponding to the standard question back to the user.
In this embodiment, once the similarity between the sentence vector of the target question and the semantic index vector of each question in the question-answer library is calculated, the matching degree between the target question and each question in the library is also determined: the higher the similarity between the sentence vector and a semantic index vector, the higher the matching degree between the target question and the question corresponding to that semantic index vector. The standard question matching the target question can therefore be determined from the question-answer library according to the calculated similarities. For example, the question corresponding to the semantic index vector with the highest similarity can be taken as the standard question, and the answer corresponding to that standard question is obtained and fed back to the user, completing the automatic question answering.
In the automatic question-answering processing method of this embodiment, the target question input by the user is obtained, semantic vectorization is performed on the target question by using the pre-trained sentence semantic coding model to obtain the sentence vector of the target question, the similarity between the sentence vector and the semantic index vector of each question in the question-answer library is calculated, the standard question matched with the target question is determined from the question-answer library according to the similarity, and the answer corresponding to the standard question is fed back to the user. In this way, the semantics of the question input by the user can be quickly and automatically understood, question matching based on the semantic relevance of questions is realized, and the accuracy of semantic understanding of the target question, the accuracy of the automatic question-answering system, and the user experience are all improved.
In the embodiment of the present application, an index relationship is established between the question-answer pairs in the question-answer library and their corresponding semantic index vectors. The specific process of determining the semantic index vectors is described in detail below with reference to fig. 2, which is a schematic flow chart of an automatic question-answering processing method provided by another embodiment of the present application.
As shown in fig. 2, on the basis of the embodiment shown in fig. 1, before step 103, the following steps are further included:
step 201, performing semantic vectorization on each question in the question-answer library by using a sentence semantic coding model, and acquiring a question vector corresponding to each question in the question-answer library.
In this embodiment, for each question in the question-and-answer library, semantic vectorization may be performed on each question in the question-and-answer library by using a pre-trained sentence semantic coding model, and each question in the question-and-answer library is sequentially input into the sentence semantic coding model to obtain a question vector corresponding to each question.
It should be noted that, in this embodiment, the sentence semantic coding model used for performing semantic vectorization on each question in the question-and-answer library is the same as the sentence semantic coding model used for performing semantic vectorization on the target question, so as to ensure the accuracy of question matching.
Step 202, determining the semantic index vector corresponding to each question in the question-answer library based on approximate nearest neighbor retrieval and the question vectors.
In this embodiment, after the question vector of each question in the question-answer library is obtained, the semantic index vector corresponding to each question may be determined based on approximate nearest neighbor retrieval and the question vectors.
When the semantic index vectors are determined by approximate nearest neighbor retrieval, a tree is first built from all the question vectors, for example an annoy tree. After the tree is built, retrieval can be performed: for a given question vector, the top k neighbors most similar to it are found in the annoy tree, and the semantic index vector is constructed from the neighbors thus determined, where k is a preset value.
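The build-then-query flow above can be sketched with a brute-force stand-in for the annoy tree. The `add_item` / `build` / `get_nns_by_vector` interface below is modeled on annoy's API, but the search here is exact rather than approximate, so it only illustrates the flow, not annoy's tree-based speedup.

```python
import math

class ExactIndex:
    """Brute-force stand-in for an annoy tree (exact, not approximate)."""

    def __init__(self):
        self._items = []

    def add_item(self, i, vector):
        assert i == len(self._items)   # items are added in index order, as in annoy
        self._items.append(list(vector))

    def build(self, n_trees):
        pass                           # a real annoy index builds its trees here

    def get_nns_by_vector(self, vector, k):
        """Indices of the top-k question vectors nearest to `vector`."""
        def dist(item):
            return math.sqrt(sum((a - b) ** 2 for a, b in zip(vector, item)))
        order = sorted(range(len(self._items)), key=lambda i: dist(self._items[i]))
        return order[:k]

# Index every question vector in the question-answer library, then query it.
index = ExactIndex()
for i, vec in enumerate([[0.0, 0.0], [1.0, 1.0], [0.1, 0.0]]):
    index.add_item(i, vec)
index.build(10)
```

With the real annoy library, the same calls trade a small amount of recall for sublinear query time over a large question-answer library.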
Furthermore, after the semantic index vector corresponding to each question is determined, the index relationship between the question-answer pairs and the semantic index vectors in the question-answer library can be established.
In the automatic question-answering processing method of this embodiment, semantic vectorization is performed on each question in the question-answer library by using the sentence semantic coding model to obtain the question vector corresponding to each question, and the semantic index vector corresponding to each question is determined based on approximate nearest neighbor retrieval and the question vectors. This provides the conditions for determining matched questions based on the semantic relevance of questions, which facilitates semantically understanding the questions input by the user and recognizing the user's intent.
Fig. 3 is a schematic flow chart of an automatic question answering method according to another embodiment of the present application. As shown in fig. 3, step 104 may include the following steps based on the embodiment shown in fig. 1:
step 301, sorting each question in the question-and-answer library according to the sequence of similarity from high to low, and acquiring a preset number of questions ranked at the top as candidate questions.
In this embodiment, after obtaining the similarity between the sentence vector of the target question and the semantic index vector of each question in the question-and-answer library, each question in the question-and-answer library may be sorted according to the sequence of the similarity from high to low, that is, each question-and-answer pair in the question-and-answer library is sorted, and a preset number of questions ranked at the top is obtained as candidate questions, where the number of the obtained candidate questions may be preset, for example, the preset number is set to 10, and then 10 questions ranked at the top of the similarity are obtained as candidate questions.
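The candidate-selection step is a plain sort-and-truncate over the similarity scores; a minimal sketch (the function name and signature are illustrative, not from the patent):

```python
def top_candidates(questions, similarities, preset_number=10):
    """Sort the library's questions by similarity, high to low, and keep
    the top preset_number as candidate questions."""
    ranked = sorted(zip(questions, similarities), key=lambda p: p[1], reverse=True)
    return [q for q, _ in ranked[:preset_number]]
```

This coarse filter keeps the expensive question-answer ranking model of step 302 from having to score the entire library.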
Step 302, using a pre-trained question-answer ranking model to obtain the question similarity between the target question and each candidate question.
In this embodiment, after the candidate questions similar to the target question are determined, the pre-trained question-answer ranking model may be used to obtain the question similarity between the target question and each candidate question.
In a possible implementation of the embodiment of the present application, the question-answer ranking model includes a recurrent neural network layer, an attention layer, and an output layer, and obtaining the question similarity between the target question and the candidate questions with the pre-trained question-answer ranking model includes: inputting the target question and the candidate questions into the recurrent neural network layer to obtain a target question feature matrix corresponding to the target question and a candidate question feature matrix corresponding to each candidate question; inputting the target question feature matrix and the candidate question feature matrices into the attention layer to obtain a target sentence vector of the target question feature matrix relative to the candidate question feature matrices and a candidate sentence vector of each candidate question feature matrix relative to the target question feature matrix; and obtaining, from the output layer, the question similarity calculated between the target sentence vector and each candidate sentence vector.
Specifically, when the target question feature matrix is obtained, word segmentation may be performed on the target question, a word vector obtained for each segmented word, and the target question feature matrix constructed from these word vectors. For example, if the target question contains m segmented words and each corresponds to a k-dimensional word vector, the target question feature matrix is an m × k matrix. The candidate question feature matrices may be obtained in the same manner. When the candidate sentence vector of a candidate question feature matrix relative to the target question feature matrix is obtained, for each word vector in the candidate question feature matrix, the cosine similarity between that word vector and each word vector in the target question feature matrix is calculated as the weight of that word vector relative to each segmented word of the target question; the weights are then normalized, and a weighted sum of the normalized weights and the corresponding word vectors in the target question feature matrix gives the expression of that word vector based on the target question feature matrix.
After the expression of each word vector in a candidate question feature matrix based on the target question feature matrix is obtained, the candidate sentence vector of that candidate question feature matrix relative to the target question feature matrix is obtained by summing these word vector expressions. When the target sentence vector of the target question feature matrix relative to the candidate question feature matrices is determined, a single target sentence vector of the target question feature matrix relative to each candidate question feature matrix may first be calculated in a similar manner, and the target sentence vector then determined as the mean of all the single target sentence vectors.
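The per-word attention alignment described above (cosine weights, normalization, weighted sum, then a sum over word expressions) can be sketched as follows. Dividing by the weight total assumes the cosine weights are non-negative; a softmax would be an alternative normalization, and the patent does not specify which is used.

```python
def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv)

def attended_sentence_vector(candidate_matrix, target_matrix):
    """Candidate sentence vector of a candidate feature matrix relative to
    the target feature matrix, following the scheme described above."""
    dim = len(target_matrix[0])
    sentence = [0.0] * dim
    for w in candidate_matrix:                      # each word vector in the candidate
        weights = [cosine(w, t) for t in target_matrix]
        total = sum(weights)
        weights = [x / total for x in weights]      # normalize the weights
        expression = [                              # weighted sum over target word vectors
            sum(wt * t[j] for wt, t in zip(weights, target_matrix))
            for j in range(dim)
        ]
        sentence = [s + e for s, e in zip(sentence, expression)]
    return sentence                                 # sum of the word vector expressions
```

The target sentence vector is then the mean of such vectors computed against each candidate, per the text above.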
Further, the question similarity between the target sentence vector and each candidate sentence vector is calculated at the output layer, and the question similarity results are obtained from the output layer.
Using a question-answer ranking model constructed from an attention mechanism and a recurrent neural network to obtain the question similarity between the target question and the candidate questions improves both computational efficiency and question-matching accuracy.
Step 303, determining the candidate question with the highest question similarity as the standard question.
In this embodiment, after the question similarity between the target question and each candidate question is obtained, the candidate question with the highest question similarity may be determined as the standard question, and the answer corresponding to the standard question fed back to the user as the answer to the target question.
According to the automatic question-answering processing method of this embodiment, the questions in the question-answer library are sorted in order of similarity from high to low, a preset number of top-ranked questions are taken as candidate questions, the question similarity between the target question and the candidate questions is obtained with the pre-trained question-answer ranking model, and the candidate question with the highest question similarity is determined as the standard question.
In order to implement the above embodiments, the present application further provides an automatic question answering processing device.
Fig. 4 is a schematic structural diagram of an automatic question answering processing device according to an embodiment of the present application.
As shown in fig. 4, the automatic question answering processing apparatus 40 includes: a first acquisition module 410, a second acquisition module 420, a calculation module 430, a determination module 440, and a feedback module 450.
The first obtaining module 410 is configured to obtain a target question input by a user.
The second obtaining module 420 is configured to perform semantic vectorization on the target question by using a pre-trained sentence semantic coding model to obtain a sentence vector of the target question.
And the calculating module 430 is used for calculating the similarity between the sentence vector and the semantic index vector of each question in the question-answer library.
And the determining module 440 is configured to determine a standard question matched with the target question from the question-answering library according to the similarity.
And a feedback module 450, configured to feed back the answer corresponding to the standard question to the user.
In a possible implementation manner of the embodiment of the present application, as shown in fig. 5, on the basis of the embodiment shown in fig. 4, the automatic question answering processing apparatus 40 further includes:
a training module 400 configured to obtain a training sample set, where the training sample set includes a plurality of questions and a vector corresponding to each question; and training the neural network model by utilizing the training sample set to generate a sentence semantic coding model.
In a possible implementation manner of the embodiment of the present application, as shown in fig. 6, on the basis of the embodiment shown in fig. 4, the automatic question answering processing apparatus 40 further includes:
The preprocessing module 401 is configured to perform semantic vectorization on each question in the question-answer library by using a sentence semantic coding model, and obtain a question vector corresponding to each question in the question-answer library.
And an index building module 402, configured to determine a semantic index vector corresponding to each question in the question-answer library based on approximate nearest neighbor retrieval and the question vectors.
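The build-then-query flow of the semantic index can be pictured with a minimal sketch. A production system would use a true approximate nearest neighbor structure (hashing- or graph-based indexes, for example); the brute-force cosine search below is only a stand-in, and every name in it is hypothetical:

```python
import math

def normalize(vec):
    n = math.sqrt(sum(x * x for x in vec))
    return [x / n for x in vec] if n else list(vec)

def build_semantic_index(question_vectors):
    # Offline step: pre-normalize every question vector so that cosine
    # similarity reduces to a plain dot product at query time.
    return {q: normalize(v) for q, v in question_vectors.items()}

def search(index, query_vec, top_k=3):
    # Online step: score every indexed question against the query and
    # return the top_k most similar questions (exact, not approximate).
    q = normalize(query_vec)
    scored = [(sum(a * b for a, b in zip(q, v)), question)
              for question, v in index.items()]
    scored.sort(reverse=True)
    return [question for _, question in scored[:top_k]]
```

An approximate index trades a small amount of recall for avoiding this full scan, which is what makes retrieval over a large question-answer library fast.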
In a possible implementation manner of the embodiment of the present application, as shown in fig. 7, on the basis of the embodiment shown in fig. 4, the determining module 440 includes:
the eliminating unit 441 is configured to rank the questions in the question-answer library in descending order of similarity, and obtain a preset number of top-ranked questions as candidate questions.
An obtaining unit 442, configured to obtain question similarities between the target question and the candidate questions by using a pre-trained question-answer ranking model.
In a possible implementation manner of the embodiment of the present application, the question-answer ranking model includes a recurrent neural network layer, an attention layer, and an output layer, and the obtaining unit 442 is specifically configured to: input the target question and the candidate question into the recurrent neural network layer to obtain a target question feature matrix corresponding to the target question and a candidate question feature matrix corresponding to the candidate question; input the target question feature matrix and the candidate question feature matrix into the attention layer to obtain a target sentence vector of the target question feature matrix relative to the candidate question feature matrix and a candidate sentence vector of the candidate question feature matrix relative to the target question feature matrix; and obtain the question similarity between the target sentence vector and the candidate sentence vector calculated by the output layer.
Obtaining the question similarity between the target question and the candidate questions by using the question-answer ranking model constructed based on the attention mechanism and the recurrent neural network can improve both the calculation efficiency and the accuracy of question matching.
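The recurrent neural network layer is not specified further in this text. As one hedged possibility only, a minimal Elman-style recurrence producing one hidden state per word (the stacked states forming the question feature matrix handed to the attention layer) might look like:

```python
import math

def rnn_features(word_vectors, W, U, b):
    """Elman-style recurrence: h_t = tanh(W x_t + U h_{t-1} + b).
    Returns one hidden state per input word; the list of states is the
    question's feature matrix. W, U, b are assumed trained parameters."""
    hidden = [0.0] * len(b)
    features = []
    for x in word_vectors:
        pre = []
        for i in range(len(b)):
            s = sum(W[i][j] * x[j] for j in range(len(x)))
            s += sum(U[i][j] * hidden[j] for j in range(len(hidden)))
            pre.append(s + b[i])
        hidden = [math.tanh(p) for p in pre]
        features.append(hidden)
    return features
```

A practical model would more likely use an LSTM or GRU cell (and a deep-learning framework rather than hand-rolled loops); the sketch only shows how a feature matrix with one row per word arises from a recurrence.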
The determining unit 443 is configured to determine the candidate question with the highest question similarity as the standard question.
The questions in the question-answer library are ranked in descending order of similarity, a preset number of top-ranked questions are obtained as candidate questions, the question similarity between the target question and each candidate question is obtained by using the pre-trained question-answer ranking model, and the candidate question with the highest question similarity is determined as the standard question.
It should be noted that the foregoing explanation of the embodiment of the automatic question-answering processing method is also applicable to the automatic question-answering processing apparatus of the embodiment, and the implementation principle thereof is similar, and is not repeated here.
According to the automatic question-answering processing device, a target question input by a user is obtained; semantic vectorization is performed on the target question by using a pre-trained sentence semantic coding model to obtain a sentence vector of the target question; the similarity between the sentence vector and the semantic index vector of each question in the question-answer library is calculated; a standard question matched with the target question is then determined from the question-answer library according to the similarity, and the answer corresponding to the standard question is fed back to the user. By calculating the similarity between the sentence vector of the target question and the semantic index vector of each question in the question-answer library, determining the standard question according to the similarity, and feeding back the corresponding answer, the semantics of the question input by the user can be understood quickly and automatically, question matching based on the semantic relevance of questions is realized, the accuracy of semantic understanding of the target question and of the automatic question-answering system is improved, and the user experience is improved.
In order to implement the foregoing embodiments, the present application further provides a computer device, including a processor and a memory. The processor implements the automatic question-answering processing method of the foregoing embodiments by reading executable program code stored in the memory and running a program corresponding to the executable program code.
FIG. 8 is a block diagram of a computer device provided in an embodiment of the present application, illustrating an exemplary computer device 90 suitable for implementing embodiments of the present application. The computer device 90 shown in FIG. 8 is only an example and should not limit the functionality or scope of use of the embodiments of the present application in any way.
As shown in fig. 8, the computer device 90 is in the form of a general purpose computer device. The components of computer device 90 may include, but are not limited to: one or more processors or processing units 906, a system memory 910, and a bus 908 that couples the various system components (including the system memory 910 and the processing unit 906).
Bus 908 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. These architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Computer device 90 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer device 90 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 910 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 911 and/or cache memory 912. The computer device 90 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, the storage system 913 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 8, commonly referred to as a "hard disk drive"). Although not shown in FIG. 8, a disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a compact disc read-only memory (CD-ROM), a digital versatile disc read-only memory (DVD-ROM), or other optical media) may be provided. In these cases, each drive may be connected to the bus 908 by one or more data media interfaces. The system memory 910 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of embodiments of the application.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the C programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
Program/utility 914 having a set (at least one) of program modules 9140 may be stored, for example, in system memory 910, such program modules 9140 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which or some combination of these examples may comprise an implementation of a network environment. Program modules 9140 generally perform the functions and/or methods of embodiments described herein.
The computer device 90 may also communicate with one or more external devices 10 (e.g., a keyboard, a pointing device, a display 100, etc.), with one or more devices that enable a user to interact with the computer device 90, and/or with any device (e.g., a network card, a modem, etc.) that enables the computer device 90 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 902. Moreover, the computer device 90 may also communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 900. As shown in FIG. 8, the network adapter 900 communicates with the other modules of the computer device 90 via the bus 908. It should be appreciated that although not shown in FIG. 8, other hardware and/or software modules may be used in conjunction with the computer device 90, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
The processing unit 906 executes various functional applications and data processing by running a program stored in the system memory 910, for example, implementing the automatic question and answer processing method mentioned in the foregoing embodiments.
In order to implement the above embodiments, the present application also proposes a non-transitory computer-readable storage medium on which a computer program is stored, which when executed by a processor, implements the automatic question-answering processing method according to the foregoing embodiments.
In order to implement the foregoing embodiments, the present application also proposes a computer program product, wherein when the instructions in the computer program product are executed by a processor, the automatic question answering processing method according to the foregoing embodiments is implemented.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.

Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application; variations, modifications, substitutions, and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (10)

1. An automatic question-answering processing method is characterized by comprising the following steps:
acquiring a target question input by a user;
performing semantic vectorization on the target question by using a pre-trained sentence semantic coding model to obtain a sentence vector of the target question;
calculating a similarity between the sentence vector and a semantic index vector of each question in a question-answer library;
and determining a standard question matched with the target question from the question-answer library according to the similarity, and feeding an answer corresponding to the standard question back to the user.
2. The method of claim 1, wherein before the performing semantic vectorization on the target question by using the pre-trained sentence semantic coding model to obtain the sentence vector of the target question, the method further comprises:
obtaining a training sample set, wherein the training sample set comprises a plurality of questions and a vector corresponding to each question;
and training a neural network model by using the training sample set to generate the sentence semantic coding model.
3. The method of claim 1, wherein before the calculating of the similarity between the sentence vector and the semantic index vector of each question in the question-answer library, the method further comprises:
performing semantic vectorization on each question in the question-answer library by using the sentence semantic coding model to obtain a question vector corresponding to each question in the question-answer library;
and determining the semantic index vector corresponding to each question in the question-answer library based on approximate nearest neighbor retrieval and the question vectors.
4. The method of claim 1, wherein the determining of the standard question matched with the target question from the question-answer library according to the similarity comprises:
ranking the questions in the question-answer library in descending order of similarity, and obtaining a preset number of top-ranked questions as candidate questions;
obtaining a question similarity between the target question and each candidate question by using a pre-trained question-answer ranking model;
and determining the candidate question with the highest question similarity as the standard question.
5. The method of claim 4, wherein the question-answer ranking model comprises a recurrent neural network layer, an attention layer, and an output layer;
and the obtaining of the question similarity between the target question and the candidate question by using the pre-trained question-answer ranking model comprises:
inputting the target question and the candidate question into the recurrent neural network layer to obtain a target question feature matrix corresponding to the target question and a candidate question feature matrix corresponding to the candidate question;
inputting the target question feature matrix and the candidate question feature matrix into the attention layer, acquiring a target sentence vector of the target question feature matrix relative to the candidate question feature matrix, and acquiring a candidate sentence vector of the candidate question feature matrix relative to the target question feature matrix;
and acquiring the question similarity between the target sentence vector and the candidate sentence vector calculated by the output layer.
6. An automatic question answering processing apparatus, comprising:
the first acquisition module is used for acquiring a target question input by a user;
the second acquisition module is used for performing semantic vectorization on the target question by using a pre-trained sentence semantic coding model to obtain a sentence vector of the target question;
the calculation module is used for calculating the similarity between the sentence vector and the semantic index vector of each question in the question-answer library;
the determining module is used for determining a standard question matched with the target question from the question-answering library according to the similarity;
and the feedback module is used for feeding back the answer corresponding to the standard question to the user.
7. The apparatus of claim 6, further comprising:
the preprocessing module is used for performing semantic vectorization on each question in the question-answer library by using the sentence semantic coding model to obtain a question vector corresponding to each question in the question-answer library;
and the index construction module is used for determining a semantic index vector corresponding to each question in the question-answer library based on approximate nearest neighbor retrieval and the question vectors.
8. The apparatus of claim 6, wherein the determining module comprises:
the eliminating unit is used for ranking the questions in the question-answer library in descending order of similarity, and acquiring a preset number of top-ranked questions as candidate questions;
the acquisition unit is used for acquiring a question similarity between the target question and each candidate question by using a pre-trained question-answer ranking model;
and the determining unit is used for determining the candidate question with the highest question similarity as the standard question.
9. A computer device comprising a processor and a memory;
wherein the processor executes a program corresponding to the executable program code by reading the executable program code stored in the memory for implementing the automatic question answering processing method according to any one of claims 1 to 5.
10. A non-transitory computer-readable storage medium on which a computer program is stored, the program realizing the automatic question-answering processing method according to any one of claims 1 to 5 when executed by a processor.
CN201910256997.XA 2019-04-01 2019-04-01 Automatic question-answering processing method, device, computer equipment and storage medium Pending CN111858859A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910256997.XA CN111858859A (en) 2019-04-01 2019-04-01 Automatic question-answering processing method, device, computer equipment and storage medium


Publications (1)

Publication Number Publication Date
CN111858859A true CN111858859A (en) 2020-10-30

Family

ID=72951109

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910256997.XA Pending CN111858859A (en) 2019-04-01 2019-04-01 Automatic question-answering processing method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111858859A (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112287085A (en) * 2020-11-06 2021-01-29 中国平安财产保险股份有限公司 Semantic matching method, system, device and storage medium
CN112487165A (en) * 2020-12-02 2021-03-12 税友软件集团股份有限公司 Question and answer method, device and medium based on keywords
CN112507078A (en) * 2020-12-15 2021-03-16 浙江诺诺网络科技有限公司 Semantic question and answer method and device, electronic equipment and storage medium
CN112507091A (en) * 2020-12-01 2021-03-16 百度健康(北京)科技有限公司 Method, device, equipment and storage medium for retrieving information
CN112597208A (en) * 2020-12-29 2021-04-02 深圳价值在线信息科技股份有限公司 Enterprise name retrieval method, enterprise name retrieval device and terminal equipment
CN112632248A (en) * 2020-12-22 2021-04-09 深圳追一科技有限公司 Question answering method, device, computer equipment and storage medium
CN112668632A (en) * 2020-12-25 2021-04-16 浙江大华技术股份有限公司 Data processing method and device, computer equipment and storage medium
CN112749266A (en) * 2021-01-19 2021-05-04 海尔数字科技(青岛)有限公司 Industrial question and answer method, device, system, equipment and storage medium
CN112765306A (en) * 2020-12-30 2021-05-07 金蝶软件(中国)有限公司 Intelligent question answering method and device, computer equipment and storage medium
CN112784600A (en) * 2021-01-29 2021-05-11 北京百度网讯科技有限公司 Information sorting method and device, electronic equipment and storage medium
CN112989001A (en) * 2021-03-31 2021-06-18 建信金融科技有限责任公司 Question and answer processing method, device, medium and electronic equipment
CN113158682A (en) * 2021-04-09 2021-07-23 泰康保险集团股份有限公司 Product name identification method and device, electronic equipment and medium
CN113282733A (en) * 2021-06-11 2021-08-20 上海寻梦信息技术有限公司 Customer service problem matching method, system, device and storage medium
CN113342958A (en) * 2021-07-02 2021-09-03 马上消费金融股份有限公司 Question-answer matching method, text matching model training method and related equipment
CN113553412A (en) * 2021-06-30 2021-10-26 北京百度网讯科技有限公司 Question and answer processing method and device, electronic equipment and storage medium
CN113590790A (en) * 2021-07-30 2021-11-02 北京壹心壹翼科技有限公司 Question retrieval method, device, equipment and medium applied to multiple rounds of question answering
CN113746899A (en) * 2021-07-29 2021-12-03 济南浪潮数据技术有限公司 Cloud platform access method and device
CN113761107A (en) * 2021-09-18 2021-12-07 杭州网易智企科技有限公司 Information processing method, medium, device and computing equipment based on question-answering system
CN114090747A (en) * 2021-10-14 2022-02-25 特斯联科技集团有限公司 Automatic question answering method, device, equipment and medium based on multiple semantic matching
CN114416953A (en) * 2022-01-20 2022-04-29 北京百度网讯科技有限公司 Question-answer processing method, question-answer model training method and device
CN114490965A (en) * 2021-12-23 2022-05-13 北京百度网讯科技有限公司 Question processing method and device, electronic equipment and storage medium
CN114817505A (en) * 2022-05-10 2022-07-29 国网江苏省电力有限公司南通供电分公司 Rapid power supply work order reply method based on historical work order matching system
CN114969291A (en) * 2022-05-31 2022-08-30 湖南工商大学 Automatic question answering method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107220380A (en) * 2017-06-27 2017-09-29 北京百度网讯科技有限公司 Question and answer based on artificial intelligence recommend method, device and computer equipment
CN108536708A (en) * 2017-03-03 2018-09-14 腾讯科技(深圳)有限公司 A kind of automatic question answering processing method and automatically request-answering system
CN109522394A (en) * 2018-10-12 2019-03-26 北京奔影网络科技有限公司 Knowledge base question and answer system and method for building up


CN113746899A (en) * 2021-07-29 2021-12-03 济南浪潮数据技术有限公司 Cloud platform access method and device
CN113746899B (en) * 2021-07-29 2023-04-07 济南浪潮数据技术有限公司 Cloud platform access method and device
CN113590790A (en) * 2021-07-30 2021-11-02 北京壹心壹翼科技有限公司 Question retrieval method, device, equipment and medium applied to multiple rounds of question answering
CN113590790B (en) * 2021-07-30 2023-11-28 北京壹心壹翼科技有限公司 Question retrieval method, device, equipment and medium applied to multi-round question and answer
CN113761107A (en) * 2021-09-18 2021-12-07 杭州网易智企科技有限公司 Information processing method, medium, device and computing equipment based on question-answering system
CN114090747A (en) * 2021-10-14 2022-02-25 特斯联科技集团有限公司 Automatic question answering method, device, equipment and medium based on multiple semantic matching
CN114490965A (en) * 2021-12-23 2022-05-13 北京百度网讯科技有限公司 Question processing method and device, electronic equipment and storage medium
CN114490965B (en) * 2021-12-23 2022-11-08 北京百度网讯科技有限公司 Question processing method and device, electronic equipment and storage medium
CN114416953A (en) * 2022-01-20 2022-04-29 北京百度网讯科技有限公司 Question-answer processing method, question-answer model training method and device
CN114416953B (en) * 2022-01-20 2023-10-31 北京百度网讯科技有限公司 Question-answering processing method, question-answering model training method and device
CN114817505A (en) * 2022-05-10 2022-07-29 国网江苏省电力有限公司南通供电分公司 Rapid power supply work order reply method based on historical work order matching system
CN114969291B (en) * 2022-05-31 2023-08-08 湖南工商大学 Automatic question and answer method and device
CN114969291A (en) * 2022-05-31 2022-08-30 湖南工商大学 Automatic question answering method and device

Similar Documents

Publication Publication Date Title
CN111858859A (en) Automatic question-answering processing method, device, computer equipment and storage medium
CN108280061B (en) Text processing method and device based on ambiguous entity words
CN110427463B (en) Search statement response method and device, server and storage medium
CN108846126B (en) Associated question aggregation model generation and question-answer aggregation method, device and equipment
CN107330023B (en) Text content recommendation method and device based on attention points
CN111767366B (en) Question and answer resource mining method and device, computer equipment and storage medium
CN109815487B (en) Text quality inspection method, electronic device, computer equipment and storage medium
CN110377916B (en) Word prediction method, word prediction device, computer equipment and storage medium
CN112819023B (en) Sample set acquisition method, device, computer equipment and storage medium
US11755668B1 (en) Apparatus and method of performance matching
JP2020512651A (en) Search method, device, and non-transitory computer-readable storage medium
CN111563158B (en) Text ranking method, ranking apparatus, server and computer-readable storage medium
CN107844531B (en) Answer output method and device and computer equipment
CN112380421A (en) Resume searching method and device, electronic equipment and computer storage medium
Zhao et al. An end-to-end deep framework for answer triggering with a novel group-level objective
CN114925174A (en) Document retrieval method and device and electronic equipment
CN112307048B (en) Semantic matching model training method, matching method, device, equipment and storage medium
CN111881264B (en) Method and electronic equipment for searching long text in question-answering task in open field
WO2023177723A1 (en) Apparatuses and methods for querying and transcribing video resumes
CN115062769A (en) Knowledge distillation-based model training method, device, equipment and storage medium
US11501071B2 (en) Word and image relationships in combined vector space
CN112749554B (en) Method, device, equipment and storage medium for determining text matching degree
CN114398482A (en) Dictionary construction method and device, electronic equipment and storage medium
CN112749268A (en) FAQ system sequencing method, device and system based on hybrid strategy
CN113569018A (en) Question and answer pair mining method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination