CN110765250A - Retrieval method, retrieval device, readable storage medium and electronic equipment - Google Patents

Retrieval method, retrieval device, readable storage medium and electronic equipment

Info

Publication number
CN110765250A
CN110765250A CN201911000788.5A
Authority
CN
China
Prior art keywords
vector
sequence
hidden
information
statement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911000788.5A
Other languages
Chinese (zh)
Inventor
姜梦晓
赵扬
李佩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rajax Network Technology Co Ltd
Original Assignee
Rajax Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rajax Network Technology Co Ltd filed Critical Rajax Network Technology Co Ltd
Priority to CN201911000788.5A priority Critical patent/CN110765250A/en
Publication of CN110765250A publication Critical patent/CN110765250A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F 16/33 Querying
    • G06F 16/332 Query formulation
    • G06F 16/3329 Natural language query formulation or dialogue systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F 16/33 Querying
    • G06F 16/3331 Query processing
    • G06F 16/334 Query execution
    • G06F 16/3344 Query execution using natural language analysis

Abstract

The embodiment of the invention discloses a retrieval method, a retrieval device, a readable storage medium and electronic equipment. The method comprises the steps of sequentially inputting the retrieval text and question information into a bidirectional recurrent neural network layer and a self-attention mechanism layer to determine corresponding first statement vectors and second statement vectors, calculating the similarity of the first statement vectors and the second statement vectors to determine the matching degree between the retrieval text and the question information, and outputting answer information corresponding to the retrieval text. By inputting the retrieval text and the question information to be retrieved into the bidirectional recurrent neural network layer and the self-attention mechanism layer, first and second statement vectors that take the weight of each word into account are determined; the matching degree between the retrieval text and the question information is then determined by computing on the first and second statement vectors, thereby improving the accuracy of the retrieval process.

Description

Retrieval method, retrieval device, readable storage medium and electronic equipment
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a retrieval method, a retrieval apparatus, a readable storage medium, and an electronic device.
Background
In the process of using software, users often encounter problems, so many users consult customer service, by dialing a customer service telephone or contacting online customer service, in voice or text form. As the number of users increases, ever more user questions need to be answered and served. To save costs, more and more software products have begun to respond to user questions with machine voice or machine online customer service. In this process, the user's question is obtained through speech recognition or text input by the user, and a retrieved answer is fed back to the client. However, when machine customer service is used to solve problems in this way, the accuracy of the returned answers is not high.
Disclosure of Invention
In view of this, embodiments of the present invention provide a retrieval method, an apparatus, a readable storage medium, and an electronic device, which aim to improve the accuracy of a retrieval process.
In a first aspect, an embodiment of the present invention discloses a retrieval method, where the method includes:
determining a retrieval text and a data information set, wherein the data information set comprises an information pair, and the information pair comprises question information and corresponding answer information;
determining a first word vector sequence and a second word vector sequence corresponding to the retrieval text and each question message;
inputting the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer respectively to determine a corresponding first hidden vector sequence and a corresponding second hidden vector sequence;
inputting the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer respectively to determine corresponding first weight vectors and second weight vectors;
calculating an inner product of the first hidden vector sequence and the first weight vector to determine a first statement vector, and calculating an inner product of each second hidden vector sequence and the corresponding second weight vector to determine the corresponding second statement vector;
calculating the similarity of the first statement vector and the second statement vector to determine the matching degree of the corresponding retrieval text and the problem information;
determining candidate question information according to the matching degree of the retrieval text and the question information;
and determining and outputting answer information corresponding to the retrieval text according to the candidate question information.
Further, the respectively inputting the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer to determine a corresponding first hidden vector sequence and second hidden vector sequence comprises:
inputting the first word vector sequence into a bidirectional recurrent neural network layer to determine a first forward hidden state sequence and a first backward hidden state sequence;
inputting each second word vector sequence into a bidirectional recurrent neural network layer to determine a corresponding second forward hidden state sequence and a second backward hidden state sequence;
concatenating the first forward hidden state sequence and the first backward hidden state sequence to determine the corresponding first hidden vector sequence, and concatenating each second forward hidden state sequence and second backward hidden state sequence to determine the corresponding second hidden vector sequence.
Further, the respectively inputting the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer to determine the corresponding first weight vector and second weight vector specifically includes:
inputting the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer respectively as parameters of a weight calculation function to determine the corresponding first weight vector and second weight vector, wherein the weight calculation function is a = softmax(w_s2 · tanh(w_s1 · H^T)), where H is the first hidden vector sequence or the second hidden vector sequence, a is the corresponding first weight vector or second weight vector, w_s1 is a preset matrix, and w_s2 is a preset parameter.
Further, the calculating the similarity between the first sentence vector and the second sentence vector to determine the matching degree between the corresponding search text and the question information includes:
determining similar features and distance features of the first statement vector and each second statement vector, wherein the similar features are preliminary similarity scores of the first statement vector and each second statement vector, and the distance features are used for representing the distance between the first statement vector and each second statement vector;
determining a joint statement vector according to the first statement vector, the second statement vector, the corresponding similar features and the corresponding distance features;
inputting the joint statement vector into a hidden layer to determine a weighted statement vector;
and taking the weighted statement vector as the input of a normalized index function, and outputting the similarity of the first statement vector and the second statement vector so as to determine the matching degree of the corresponding retrieval text and the problem information.
Further, the determining of candidate question information according to the matching degree of the search text and the question information specifically includes:
and determining the problem information with the matching degree larger than the threshold value as candidate problem information.
Further, the determining of candidate question information according to the matching degree of the search text and the question information specifically includes:
and determining N pieces of problem information with the maximum matching degree as candidate problem information, wherein N is a preset constant.
In a second aspect, an embodiment of the present invention discloses a retrieval apparatus, where the apparatus includes:
the information determining module is used for determining a retrieval text and a data information set, wherein the data information set comprises an information pair, and the information pair comprises question information and corresponding answer information;
the word vector determining module is used for determining a first word vector sequence and a second word vector sequence corresponding to the retrieval text and each question information;
a hidden vector determining module, configured to input the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer, respectively, so as to determine a corresponding first hidden vector sequence and a corresponding second hidden vector sequence;
a statement vector determination module, configured to input the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer, respectively, to determine corresponding first weight vectors and second weight vectors;
a calculation module, configured to calculate an inner product of the first hidden vector sequence and the first weight vector to determine a first sentence vector, and calculate an inner product of each second hidden vector sequence and a corresponding second weight vector to determine a corresponding second sentence vector;
the matching module is used for calculating the similarity of the first statement vector and the second statement vector so as to determine the matching degree of the corresponding retrieval text and the corresponding question information;
the candidate question determining module is used for determining candidate question information according to the matching degree of the retrieval text and the question information;
and the answer output module is used for determining and outputting answer information corresponding to the retrieval text according to the candidate question information.
In a third aspect, an embodiment of the present invention discloses a computer-readable storage medium for storing computer program instructions, which when executed by a processor implement the method according to any one of the first aspect.
In a fourth aspect, an embodiment of the present invention discloses an electronic device, including a memory and a processor, where the memory is used to store one or more computer program instructions, where the one or more computer program instructions are executed by the processor to implement the following steps:
determining a retrieval text and a data information set, wherein the data information set comprises an information pair, and the information pair comprises question information and corresponding answer information;
determining a first word vector sequence and a second word vector sequence corresponding to the retrieval text and each question message;
inputting the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer respectively to determine a corresponding first hidden vector sequence and a corresponding second hidden vector sequence;
inputting the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer respectively to determine corresponding first weight vectors and second weight vectors;
calculating an inner product of the first hidden vector sequence and the first weight vector to determine a first statement vector, and calculating an inner product of each second hidden vector sequence and the corresponding second weight vector to determine the corresponding second statement vector;
calculating the similarity of the first statement vector and the second statement vector to determine the matching degree of the corresponding retrieval text and the problem information;
determining candidate question information according to the matching degree of the retrieval text and the question information;
and determining and outputting answer information corresponding to the retrieval text according to the candidate question information.
Further, the respectively inputting the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer to determine a corresponding first hidden vector sequence and second hidden vector sequence comprises:
inputting the first word vector sequence into a bidirectional recurrent neural network layer to determine a first forward hidden state sequence and a first backward hidden state sequence;
inputting each second word vector sequence into a bidirectional recurrent neural network layer to determine a corresponding second forward hidden state sequence and a second backward hidden state sequence;
concatenating the first forward hidden state sequence and the first backward hidden state sequence to determine the corresponding first hidden vector sequence, and concatenating each second forward hidden state sequence and second backward hidden state sequence to determine the corresponding second hidden vector sequence.
Further, the respectively inputting the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer to determine the corresponding first weight vector and second weight vector specifically includes:
inputting the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer respectively as parameters of a weight calculation function to determine the corresponding first weight vector and second weight vector, wherein the weight calculation function is a = softmax(w_s2 · tanh(w_s1 · H^T)), where H is the first hidden vector sequence or the second hidden vector sequence, a is the corresponding first weight vector or second weight vector, w_s1 is a preset matrix, and w_s2 is a preset parameter.
Further, the calculating the similarity between the first sentence vector and the second sentence vector to determine the matching degree between the corresponding search text and the question information includes:
determining similar features and distance features of the first statement vector and each second statement vector, wherein the similar features are preliminary similarity scores of the first statement vector and each second statement vector, and the distance features are used for representing the distance between the first statement vector and each second statement vector;
determining a joint statement vector according to the first statement vector, the second statement vector, the corresponding similar features and the corresponding distance features;
inputting the joint statement vector into a hidden layer to determine a weighted statement vector;
and taking the weighted statement vector as the input of a normalized index function, and outputting the similarity of the first statement vector and the second statement vector so as to determine the matching degree of the corresponding retrieval text and the problem information.
Further, the determining of candidate question information according to the matching degree of the search text and the question information specifically includes:
and determining the problem information with the matching degree larger than the threshold value as candidate problem information.
Further, the determining of candidate question information according to the matching degree of the search text and the question information specifically includes:
and determining N pieces of problem information with the maximum matching degree as candidate problem information, wherein N is a preset constant.
The method of the embodiment of the invention determines first and second statement vectors that take the weight of each word into account by inputting the retrieval text and the question information to be retrieved into a bidirectional recurrent neural network layer and a self-attention mechanism layer, and then determines the matching degree between the retrieval text and the question information by computing on the first and second statement vectors, thereby improving the accuracy of the retrieval process.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent from the following description of the embodiments of the present invention with reference to the accompanying drawings, in which:
FIG. 1 is a flow chart of a retrieval method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a retrieval process according to an alternative implementation of the present invention;
FIG. 3 is a diagram illustrating candidate problem information according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating answer information according to an embodiment of the present invention;
FIG. 5 is a diagram of a retrieving apparatus according to an embodiment of the present invention;
fig. 6 is a schematic diagram of an electronic device according to an embodiment of the invention.
Detailed Description
The present invention will be described below based on examples, but the present invention is not limited to these examples. In the following detailed description of the present invention, certain specific details are set forth. It will be apparent to one skilled in the art that the present invention may be practiced without these specific details. Well-known methods, procedures, and processes have not been described in detail so as not to obscure the present invention.
Further, those of ordinary skill in the art will appreciate that the drawings provided herein are for illustrative purposes and are not necessarily drawn to scale.
Unless the context clearly requires otherwise, throughout this specification, the words "comprise", "comprising", and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is, what is meant is "including, but not limited to".
In the description of the present invention, it is to be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In addition, in the description of the present invention, "a plurality" means two or more unless otherwise specified.
Fig. 1 is a flowchart of a retrieval method according to an embodiment of the present invention, and as shown in fig. 1, the method includes:
and step S100, determining a retrieval text and data information set.
Specifically, the retrieval text is a question input by the user that contains the retrieval content; it may be text information entered by the user or text information converted from input voice information, for example, "how to modify the shipping address". The data information set is the set of all data in the database, or a data set consisting of partial data obtained through preliminary screening; it comprises at least one information pair, and each information pair comprises question information and corresponding answer information. For example: "question information: how to modify the address; answer information: click on 'shipping address' in 'personal information' to edit."
And S200, determining a first word vector sequence and a second word vector sequence corresponding to the retrieval text and each question message.
Specifically, the retrieval text and each question information may be input into a word segmentation model, which outputs a word sequence composed of a plurality of words. For example, when the input retrieval text is "what is wrong with address after ordering", the word sequence obtained after word segmentation is { "order placing", "after", "address", "wrong", "what" }. Word vectors corresponding to the words in the word sequence are then determined, and the first word vector sequence or second word vector sequence is formed from these word vectors. The word segmentation model may be, for example, a hidden Markov model (HMM) or an N-gram model. To determine the word vectors and form the first or second word vector sequence, each word in the segmented word sequence may be input in turn into a word vector determination model, and the corresponding word vectors are output in sequence to form the first word vector sequence or the second word vector sequence.
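As an illustrative sketch of the word-vector step above (the embedding table, its dimension, and the zero-vector fallback are assumptions for illustration, not the patent's trained word-vector model), mapping a segmented word sequence to a word vector sequence might look like:

```python
import numpy as np

def to_word_vector_sequence(words, embedding_table, dim=4):
    """Map each word of a segmented word sequence to its word vector,
    stacking the vectors row by row into a word vector sequence.
    Unknown words fall back to a zero vector in this toy sketch."""
    return np.stack([embedding_table.get(w, np.zeros(dim)) for w in words])

# Hypothetical toy embedding table covering two of the query words.
table = {
    "address": np.array([1.0, 0.0, 0.0, 0.0]),
    "wrong": np.array([0.0, 1.0, 0.0, 0.0]),
}
seq = to_word_vector_sequence(["address", "wrong", "what"], table)  # shape (3, 4)
```

In a real system the table would be replaced by a trained word vector determination model, but the output shape (one row per word) is the same.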
Step S300, determining a first hidden vector sequence and a second hidden vector sequence corresponding to the first word vector sequence and each second word vector sequence, respectively.
Specifically, the first word vector sequence and each second word vector sequence are respectively input into a bidirectional recurrent neural network layer to determine a corresponding first hidden vector sequence and a corresponding second hidden vector sequence, and the process includes:
step S310, inputting the first word vector sequence into a bidirectional recurrent neural network layer to determine a first forward hidden state sequence and a first backward hidden state sequence.
Specifically, in this embodiment the bidirectional recurrent neural network is a Bi-LSTM. The first word vector sequence is input into the bidirectional recurrent neural network layer from front to back, and a first forward hidden state sequence is output; the first word vector sequence is input into the bidirectional recurrent neural network layer from back to front, and a first backward hidden state sequence is output.
Step S320, inputting the second word vector sequences into the bidirectional recurrent neural network layer to determine corresponding second forward hidden state sequences and second backward hidden state sequences.
Specifically, the bidirectional recurrent neural network layer applied in this step is the same as the one applied in step S310. Each second word vector sequence is input into the bidirectional recurrent neural network layer from front to back, and a second forward hidden state sequence is output; each second word vector sequence is input into the bidirectional recurrent neural network layer from back to front, and a second backward hidden state sequence is output.
Step S330, determining a first hidden vector sequence and a second hidden vector sequence.
Specifically, a first hidden vector sequence corresponding to the first word vector sequence is determined by splicing the first forward hidden state sequence and the first backward hidden state sequence, and a corresponding second hidden vector sequence is determined by splicing the second forward hidden state sequence and the second backward hidden state sequence corresponding to each second word vector.
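The splicing of forward and backward hidden state sequences in steps S310 through S330 can be sketched as a per-time-step concatenation (the shapes and random values here are toy placeholders; real values would come from the Bi-LSTM):

```python
import numpy as np

rng = np.random.default_rng(0)
h_forward = rng.standard_normal((3, 2))   # 3 time steps, hidden size 2
h_backward = rng.standard_normal((3, 2))  # backward pass over the same input

# Each hidden vector of the spliced sequence is [forward ; backward],
# so the hidden vector sequence has doubled feature dimension.
H = np.concatenate([h_forward, h_backward], axis=1)  # shape (3, 4)
```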
Step S400, determining a first weight vector and a second weight vector corresponding to the first hidden vector sequence and the second hidden vector sequence, respectively.
Specifically, the first weight vector and the second weight vector are determined by inputting the first hidden vector sequence and the second hidden vector sequence into the self-attention mechanism layer respectively; after being input, each hidden vector sequence passes through a weight calculation function. The weight calculation function is a = softmax(w_s2 · tanh(w_s1 · H^T)), where H is the first hidden vector sequence or the second hidden vector sequence, a is the corresponding first weight vector or second weight vector, w_s1 is a predetermined matrix matched to the dimension of H, and w_s2 is a preset parameter. The function outputs a one-dimensional weight vector containing the same number of elements as the number of vectors contained in H, yielding the corresponding first weight vector and second weight vector.
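A minimal numpy sketch of the weight calculation function a = softmax(w_s2 · tanh(w_s1 · H^T)); the sizes chosen for w_s1 and w_s2 are assumptions for illustration, with the inner dimension 3 picked arbitrarily:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_weights(H, w_s1, w_s2):
    """a = softmax(w_s2 . tanh(w_s1 . H^T)): one weight per hidden vector."""
    return softmax(w_s2 @ np.tanh(w_s1 @ H.T))

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 4))     # 5 hidden vectors of dimension 4
w_s1 = rng.standard_normal((3, 4))  # preset matrix (inner size 3 is assumed)
w_s2 = rng.standard_normal(3)       # preset parameter vector
a = attention_weights(H, w_s1, w_s2)
```

The output a has one non-negative entry per hidden vector and sums to 1, matching its role as a weight vector over the sequence.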
Step S500, determining a first sentence vector and a second sentence vector corresponding to the first hidden vector sequence and the second hidden vector sequence, respectively.
Specifically, an inner product of the first hidden vector sequence and a corresponding first weight vector is calculated, and then the obtained result is tiled to determine a first statement vector; and calculating the inner product of each second hidden vector sequence and the corresponding second weight vector, and tiling the obtained result to determine a second statement vector.
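With toy numbers, the inner-product step above reduces to a weighted sum of the hidden vectors over the time dimension (the values below are illustrative only):

```python
import numpy as np

H = np.arange(8.0).reshape(4, 2)    # 4 hidden vectors of dimension 2
a = np.array([0.1, 0.2, 0.3, 0.4])  # attention weights summing to 1

# Inner product of the weight vector with the hidden vector sequence
# collapses the time dimension, giving the statement vector.
s = a @ H
```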
And S600, calculating the similarity of the first statement vector and the second statement vector to determine the matching degree of the corresponding retrieval text and the corresponding question information.
Specifically, the step of determining the matching degree between the search text and the question information further includes:
and step S610, determining similar features and distance features of the first statement vector and each second statement vector.
Specifically, the similar feature is a vector representation of the preliminary similarity score between the first statement vector and each second statement vector. It may be determined by inputting the first statement vector and each second statement vector into a trained neural network model, that is, by the formula sim = qWa, where sim is the similar feature, q is the first statement vector, a is the second statement vector, and W is a parameter of the neural network model; the parameter W is optimized each time the similar feature sim is determined. The distance feature is a vector representation of the distance between the first statement vector and each second statement vector, and may be determined in advance by calculating several distances between them, such as the Euclidean distance, cosine distance, and edit distance.
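A sketch of the bilinear similar feature sim = qWa together with two of the distance computations mentioned above; W is set to the identity purely for illustration, whereas in the patent it is a trained model parameter:

```python
import numpy as np

q = np.array([1.0, 0.0, 1.0])  # first statement vector (toy values)
a = np.array([0.0, 1.0, 1.0])  # second statement vector (toy values)
W = np.eye(3)                  # trained parameter; identity for illustration

sim = q @ W @ a                                            # similar feature
cosine = q @ a / (np.linalg.norm(q) * np.linalg.norm(a))   # cosine similarity
euclidean = np.linalg.norm(q - a)                          # Euclidean distance
```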
And S620, determining a joint statement vector according to the first statement vector, the second statement vector, the corresponding similar features and the corresponding distance features.
Specifically, the joint statement vector is obtained by sequentially splicing the first statement vector, the corresponding similar feature, the second statement vector, and the distance feature.
Step S630, inputting the joint statement vector into a hidden layer to determine a weighted statement vector.
Specifically, after the joint statement vector is input into the hidden layer, a weighted statement vector is determined by the following formula:
x′ = f(bn(zx + b))
where x′ is the weighted statement vector, f is the ReLU activation function, bn is the batch normalization algorithm, x is the joint statement vector, and z and b are preset parameters: z is a weight vector and b is a bias vector.
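The hidden-layer step x′ = f(bn(zx + b)) can be sketched as follows; the shapes and the toy batch are assumptions, and a real batch-normalization layer would additionally use learned scale/shift parameters and running statistics at inference:

```python
import numpy as np

def batch_norm(v, eps=1e-5):
    """Per-feature normalization over the batch (no learned scale/shift)."""
    return (v - v.mean(axis=0)) / np.sqrt(v.var(axis=0) + eps)

def relu(v):
    return np.maximum(v, 0.0)

rng = np.random.default_rng(1)
x = rng.standard_normal((8, 6))  # batch of 8 joint statement vectors, dim 6
z = rng.standard_normal((6, 4))  # preset weight parameter (shape assumed)
b = rng.standard_normal(4)       # preset bias parameter

x_prime = relu(batch_norm(x @ z + b))  # x' = f(bn(zx + b))
```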
And step S640, determining the matching degree of the retrieval text and the question information according to the weighted statement vector.
Specifically, the weighted statement vector is used as an input of a normalized exponential function (softmax), the similarity of the first statement vector and the second statement vector is output, and then the matching degree of the corresponding retrieval text and the corresponding question information is determined.
And S700, determining candidate question information according to the matching degree of the retrieval text and the question information.
Specifically, the candidate question information may be determined according to a preset threshold value: when the matching degree corresponding to a piece of question information is greater than the threshold, that question information is determined as candidate question information. For example, when the question information includes {a1, a2, a3, a4, a5}, the matching degrees are respectively {0.23, 0.75, 0.19, 0.52, 0.91}, and the preset threshold is 0.5, the candidate question information is determined to be {a2, a4, a5}.
As another optional implementation manner of the embodiment of the application, the candidate question information may be determined according to a preset number: the N pieces of question information with the largest matching degree are determined as candidate question information, where N is a preset constant. For example, when the question information includes {a1, a2, a3, a4, a5} with the matching degrees above and the preset constant N is 3, the candidate question information is determined to be {a2, a4, a5}.
As a further optional implementation manner of the embodiment of the present application, the candidate question information may be determined by combining the two rules: first determine the question information in the database whose matching degree is greater than the threshold, then determine the N pieces with the largest matching degree among them as candidate question information. For example, when the question information includes {a1, a2, a3, a4, a5}, the matching degrees are {0.23, 0.75, 0.19, 0.52, 0.91}, the preset threshold is 0.5, and the preset constant N is 2, the candidate question information is determined to be {a2, a5}.
Optionally, N may be set as a constant range. When the number of question information whose matching degree is greater than the threshold is smaller than the minimum value N_min of N, the N_min pieces of question information with the largest matching degree are selected from the data information set as candidate question information; when the number of question information whose matching degree is greater than the threshold is between the minimum value N_min and the maximum value N_max of N, all question information with matching degree greater than the threshold is determined as candidate question information; when the number of question information whose matching degree is greater than the threshold is greater than the maximum value N_max of N, the N_max pieces with the largest matching degree among them are determined as candidate question information.
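The selection rules above (threshold filtering plus a top-N range) can be sketched together in a few lines; the function name and default values are illustrative, not from the patent:

```python
def candidate_questions(matches, threshold=0.5, n_min=1, n_max=3):
    """Pick candidate question ids: keep those whose matching degree
    exceeds the threshold, clamped between n_min and n_max by rank."""
    ranked = sorted(matches, key=matches.get, reverse=True)
    above = [q for q in ranked if matches[q] > threshold]
    if len(above) < n_min:
        return ranked[:n_min]   # fall back to the best n_min overall
    return above[:n_max]        # cap at the n_max best above the threshold

scores = {"a1": 0.23, "a2": 0.75, "a3": 0.19, "a4": 0.52, "a5": 0.91}
print(candidate_questions(scores))  # ['a5', 'a2', 'a4']
```

With n_max=2 the same scores would yield ['a5', 'a2'], matching the combined-rule example in the text.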
Fig. 3 is a schematic diagram of displaying candidate question information according to an embodiment of the present invention. As shown in fig. 3, the candidate question information 30 may be displayed through a display interface of a client. Optionally, the candidate question information 30 is displayed through a selection control on the client display interface, that is, the user may select at least one piece of candidate question information by triggering the selection control.
And S800, determining and outputting answer information corresponding to the retrieval text according to the candidate question information.
Specifically, the answer information corresponding to the retrieval text is the answer information corresponding to one piece of candidate question information. After the candidate question information is determined in step S700, when one piece of candidate question information is selected, the answer information corresponding to that candidate question information is determined to be the answer information corresponding to the retrieval text, and the answer information is output. The candidate question information may be selected by, for example, sending to the server a selection instruction containing the identifier of the selected candidate question information; the selection instruction may be sent by triggering a selection control at the client.
Fig. 4 is a schematic diagram illustrating answer information display according to an embodiment of the present invention. As shown in fig. 4, when a piece of candidate question information is selected, the answer information 40 corresponding to the retrieval text is output, and the answer information 40 may be displayed through a display interface of the client.
The method inputs the retrieval text and the question information to be retrieved into a bidirectional recurrent neural network layer and a self-attention mechanism layer to determine a first sentence vector and a second sentence vector that take the weight of each word into consideration, and then determines the matching degree of the retrieval text and the question information by calculating with the first sentence vector and the second sentence vector, thereby improving the accuracy of the retrieval process.
Fig. 2 is a schematic diagram of a retrieval process in an optional implementation manner according to an embodiment of the present invention. The retrieval process is used to determine the matching degree between the retrieval text and the question information to be retrieved, and is implemented according to the retrieval method shown in fig. 1.
Specifically, the retrieval text and the question information to be retrieved are first determined and input into a word embedding layer 20, obtaining a first word vector sequence S1 corresponding to the retrieval text and a second word vector sequence S2 corresponding to the question information. The first word vector sequence S1 and the second word vector sequence S2 are input into the bidirectional recurrent neural network layer 21, which outputs a corresponding first hidden vector sequence H1 and second hidden vector sequence H2. The first hidden vector sequence H1 and the second hidden vector sequence H2 are then input into the self-attention mechanism layer 22, which outputs a corresponding first weight vector A1 and second weight vector A2. The first hidden vector sequence H1 with its first weight vector A1, and the second hidden vector sequence H2 with its second weight vector A2, are input into the inner product layer 23: a first inner product vector M1 is obtained by calculating the inner product of the first hidden vector sequence and the first weight vector, and a second inner product vector M2 is obtained by calculating the inner product of the second hidden vector sequence and the second weight vector. The first inner product vector M1 and the second inner product vector M2 are input into a flattening layer 24 to determine a first sentence vector q and a second sentence vector a. The first sentence vector q and the second sentence vector a are input into the similar feature calculation layer 25 to determine their similar feature w; at the same time, the distance feature y is determined by inputting the first sentence vector q and the second sentence vector a into the distance feature calculation layer 26.
The first sentence vector q, the similar feature w, the second sentence vector a, and the distance feature y are then spliced in sequence by a splicing layer 27 to determine a joint sentence vector x. The joint sentence vector x is processed by a hidden layer 28 to obtain a weighted sentence vector x', and finally the weighted sentence vector x' is input into a similarity calculation layer 29 as the input of a normalized exponential function (softmax); the similarity of the first sentence vector and the second sentence vector, that is, the matching degree p of the retrieval text and the question information, is obtained through the calculation of the normalized exponential function.
Fig. 5 is a schematic diagram of a search apparatus according to an embodiment of the present invention, as shown in fig. 5, the search apparatus includes an information determining module 50, a word vector determining module 51, a hidden vector determining module 52, a sentence vector determining module 53, a calculating module 54, a matching module 55, a candidate question determining module 56, and an answer outputting module 57.
Specifically, the information determining module 50 is configured to determine a search text and a data information set, where the data information set includes an information pair, and the information pair includes question information and corresponding answer information. The word vector determining module 51 is configured to determine a first word vector sequence and a second word vector sequence corresponding to the search text and each question information. The hidden vector determining module 52 is configured to input the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer, so as to determine a corresponding first hidden vector sequence and a corresponding second hidden vector sequence. The statement vector determination module 53 is configured to input the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer to determine corresponding first weight vectors and second weight vectors. The calculation module 54 is configured to calculate an inner product of the first sequence of concealment vectors and the first weight vector to determine a first sentence vector, and calculate an inner product of each second sequence of concealment vectors and the corresponding second weight vector to determine a corresponding second sentence vector. The matching module 55 is configured to calculate similarity between the first sentence vector and the second sentence vector to determine matching degree between the corresponding search text and the question information. The candidate question determining module 56 is configured to determine candidate question information according to a matching degree of the search text and the question information. The answer output module 57 is configured to determine and output answer information corresponding to the search text according to the candidate question information.
The device determines a first sentence vector and a second sentence vector that take the weight of each word into consideration by inputting the retrieval text and the question information to be retrieved into a bidirectional recurrent neural network layer and a self-attention mechanism layer, and determines the matching degree of the retrieval text and the question information by calculating with the first sentence vector and the second sentence vector, thereby improving the accuracy of the retrieval process.
Fig. 6 is a schematic view of an electronic device according to an embodiment of the present invention, as shown in fig. 6, in this embodiment, the electronic device may be a server or a terminal, and the terminal may be, for example, an intelligent device such as a mobile phone, a computer, a tablet computer, and the like. As shown, the electronic device includes: at least one processor 62; a memory 61 communicatively coupled to the at least one processor; and a communication component 63 communicatively coupled to the storage medium, the communication component 63 receiving and transmitting data under control of the processor; wherein the memory 61 stores instructions executable by the at least one processor 62, the instructions being executable by the at least one processor 62 to implement the steps of:
determining a retrieval text and a data information set, wherein the data information set comprises an information pair, and the information pair comprises question information and corresponding answer information;
determining a first word vector sequence and a second word vector sequence corresponding to the retrieval text and each question message;
inputting the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer respectively to determine a corresponding first hidden vector sequence and a corresponding second hidden vector sequence;
inputting the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer respectively to determine corresponding first weight vectors and second weight vectors;
calculating an inner product of the first sequence of concealment vectors and first weight vectors to determine first statement vectors, and calculating an inner product of each second sequence of concealment vectors and corresponding second weight vectors to determine corresponding second statement vectors;
calculating the similarity of the first statement vector and the second statement vector to determine the matching degree of the corresponding retrieval text and the problem information;
determining candidate question information according to the matching degree of the retrieval text and the question information;
and determining and outputting answer information corresponding to the retrieval text according to the candidate question information.
Further, the respectively inputting the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer to determine a corresponding first hidden vector sequence and second hidden vector sequence comprises:
inputting the first word vector sequence into a bidirectional recurrent neural network layer to determine a first forward hidden state sequence and a first backward hidden state sequence;
inputting each second word vector sequence into a bidirectional recurrent neural network layer to determine a corresponding second forward hidden state sequence and a second backward hidden state sequence;
concatenating the first forward and first backward sequence of hidden states to determine a corresponding first sequence of hidden vectors, and concatenating the second forward and second backward sequence of hidden states to determine a second sequence of hidden vectors.
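The concatenation step above can be shown with a toy example: each time step's forward and backward hidden states are joined so that every word's hidden vector carries context from both directions. Shapes and values here are illustrative assumptions, not from the patent.

```python
import numpy as np

# Toy forward/backward hidden state sequences for a 4-word input,
# each hidden state u-dimensional (u = 3 here); values are arbitrary.
rng = np.random.default_rng(0)
h_forward = rng.standard_normal((4, 3))   # forward hidden state sequence
h_backward = rng.standard_normal((4, 3))  # backward hidden state sequence

# Concatenate the two states at each time step: every word is now
# represented by a 2u-dimensional hidden vector in the hidden vector sequence.
H = np.concatenate([h_forward, h_backward], axis=1)
```

The resulting `H` has shape (4, 6): one 2u-dimensional hidden vector per word, which is the form consumed by the self-attention mechanism layer.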
Further, the respectively inputting the first concealment vector sequence and the second concealment vector sequence into a self-attention mechanism layer to determine corresponding first weight vector and second weight vector specifically includes:
inputting the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer respectively as parameters of a weight calculation function to determine the corresponding first weight vector and second weight vector, wherein the weight calculation function is a = softmax(w_s2 · tanh(w_s1 · H^T)), where H is the first hidden vector sequence or the second hidden vector sequence, a is the corresponding first weight vector or second weight vector, w_s1 is a preset matrix, and w_s2 is a preset parameter.
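The weight calculation function a = softmax(w_s2 · tanh(w_s1 · H^T)) can be computed directly in NumPy. The shapes below (T words, 2u-dimensional hidden vectors, an internal dimension d for w_s1) are assumptions for illustration; in the described method w_s1 and w_s2 would be learned parameters rather than random values.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def attention_weights(H, w_s1, w_s2):
    """Compute a = softmax(w_s2 · tanh(w_s1 · H^T)).

    H    : (T, 2u) hidden vector sequence (one row per word)
    w_s1 : (d, 2u) preset matrix
    w_s2 : (d,)    preset parameter vector
    Returns a: (T,) weight vector assigning one weight per word.
    """
    return softmax(w_s2 @ np.tanh(w_s1 @ H.T))

rng = np.random.default_rng(0)
H = rng.standard_normal((6, 8))      # 6 words, 8-dimensional hidden vectors
w_s1 = rng.standard_normal((4, 8))
w_s2 = rng.standard_normal(4)
a = attention_weights(H, w_s1, w_s2)
```

Because of the softmax, the entries of `a` are positive and sum to 1, so the subsequent inner product a · H is a weighted average of the word-level hidden vectors.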
Further, the calculating the similarity between the first sentence vector and the second sentence vector to determine the matching degree between the corresponding search text and the question information includes:
determining similar features and distance features of the first statement vector and each second statement vector, wherein the similar features are preliminary similarity scores of the first statement vector and each second statement vector, and the distance features are used for representing the distance between the first statement vector and each second statement vector;
determining a joint statement vector according to the first statement vector, the second statement vector, the corresponding similar features and the corresponding distance features;
inputting the joint statement vector into a hidden layer to determine a weighted statement vector;
and taking the weighted statement vector as the input of a normalized index function, and outputting the similarity of the first statement vector and the second statement vector so as to determine the matching degree of the corresponding retrieval text and the problem information.
Further, the determining of candidate question information according to the matching degree of the search text and the question information specifically includes:
and determining the problem information with the matching degree larger than the threshold value as candidate problem information.
Further, the determining of candidate question information according to the matching degree of the search text and the question information specifically includes:
and determining N pieces of problem information with the maximum matching degree as candidate problem information, wherein N is a preset constant.
In particular, the memory 61, as a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules. The processor 62 executes various functional applications of the device and data processing by executing nonvolatile software programs, instructions, and modules stored in the memory, that is, implements the above-described retrieval method.
The memory 61 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store a list of options, etc. Further, the memory 61 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, the memory 61 may optionally include memory located remotely from the processor 62, which may be connected to an external device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
One or more modules are stored in the memory 61 and, when executed by the one or more processors 62, perform the retrieval method of any of the method embodiments described above.
This product can execute the method provided in the embodiments of the present application, and has the corresponding functional modules and beneficial effects for executing the method. For technical details not described in detail in this embodiment, reference may be made to the method provided in the embodiments of the present application.
The present invention also relates to a computer-readable storage medium for storing a computer-readable program for causing a computer to perform some or all of the above-described method embodiments.
That is, as can be understood by those skilled in the art, all or part of the steps in the methods of the embodiments described above may be implemented by a program instructing related hardware. The program is stored in a storage medium and includes several instructions for enabling a device (which may be a single chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
The embodiment of the invention discloses A1 and a retrieval method, wherein the method comprises the following steps:
determining a retrieval text and a data information set, wherein the data information set comprises an information pair, and the information pair comprises question information and corresponding answer information;
determining a first word vector sequence and a second word vector sequence corresponding to the retrieval text and each question message;
inputting the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer respectively to determine a corresponding first hidden vector sequence and a corresponding second hidden vector sequence;
inputting the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer respectively to determine corresponding first weight vectors and second weight vectors;
calculating an inner product of the first sequence of concealment vectors and first weight vectors to determine first statement vectors, and calculating an inner product of each second sequence of concealment vectors and corresponding second weight vectors to determine corresponding second statement vectors;
calculating the similarity of the first statement vector and the second statement vector to determine the matching degree of the corresponding retrieval text and the problem information;
determining candidate question information according to the matching degree of the retrieval text and the question information;
and determining and outputting answer information corresponding to the retrieval text according to the candidate question information.
A2, the inputting the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer to determine corresponding first and second concealment vector sequences according to the method of a1, respectively, comprising:
inputting the first word vector sequence into a bidirectional recurrent neural network layer to determine a first forward hidden state sequence and a first backward hidden state sequence;
inputting each second word vector sequence into a bidirectional recurrent neural network layer to determine a corresponding second forward hidden state sequence and a second backward hidden state sequence;
concatenating the first forward and first backward sequence of hidden states to determine a corresponding first sequence of hidden vectors, and concatenating the second forward and second backward sequence of hidden states to determine a second sequence of hidden vectors.
A3, inputting the first and second sequences of concealment vectors into a self-attention mechanism layer according to the method of a1, respectively, to determine corresponding first and second weight vectors, specifically:
inputting the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer respectively as parameters of a weight calculation function to determine the corresponding first weight vector and second weight vector, wherein the weight calculation function is a = softmax(w_s2 · tanh(w_s1 · H^T)), where H is the first hidden vector sequence or the second hidden vector sequence, a is the corresponding first weight vector or second weight vector, w_s1 is a preset matrix, and w_s2 is a preset parameter.
A4, according to the method of A1, the calculating the similarity of the first sentence vector and the second sentence vector to determine the matching degree of the corresponding search text and question information includes:
determining similar features and distance features of the first statement vector and each second statement vector, wherein the similar features are preliminary similarity scores of the first statement vector and each second statement vector, and the distance features are used for representing the distance between the first statement vector and each second statement vector;
determining a joint statement vector according to the first statement vector, the second statement vector, the corresponding similar features and the corresponding distance features;
inputting the joint statement vector into a hidden layer to determine a weighted statement vector;
and taking the weighted statement vector as the input of a normalized index function, and outputting the similarity of the first statement vector and the second statement vector so as to determine the matching degree of the corresponding retrieval text and the problem information.
A5, according to the method in A1, the determining candidate question information according to the matching degree of the search text and the question information specifically includes:
and determining the problem information with the matching degree larger than the threshold value as candidate problem information.
A6, according to the method in A1, the determining candidate question information according to the matching degree of the search text and the question information specifically includes:
and determining N pieces of problem information with the maximum matching degree as candidate problem information, wherein N is a preset constant.
The embodiment of the invention also discloses B1 and a retrieval device, wherein the retrieval device comprises:
the information determining module is used for determining a retrieval text and a data information set, wherein the data information set comprises an information pair, and the information pair comprises question information and corresponding answer information;
the word vector determining module is used for determining a first word vector sequence and a second word vector sequence corresponding to the retrieval text and each question information;
a hidden vector determining module, configured to input the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer, respectively, so as to determine a corresponding first hidden vector sequence and a corresponding second hidden vector sequence;
a statement vector determination module, configured to input the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer, respectively, to determine corresponding first weight vectors and second weight vectors;
a calculation module, configured to calculate an inner product of the first hidden vector sequence and the first weight vector to determine a first sentence vector, and calculate an inner product of each second hidden vector sequence and a corresponding second weight vector to determine a corresponding second sentence vector;
the matching module is used for calculating the similarity of the first statement vector and the second statement vector so as to determine the matching degree of the corresponding retrieval text and the corresponding question information;
the candidate question determining module is used for determining candidate question information according to the matching degree of the retrieval text and the question information;
and the answer output module is used for determining and outputting answer information corresponding to the retrieval text according to the candidate question information.
The embodiment of the invention also discloses C1 and a computer readable storage medium for storing computer program instructions, wherein the computer program instructions realize the method according to any one of A1-A6 when being executed by a processor.
The embodiment of the invention also discloses D1, an electronic device, comprising a memory and a processor, wherein the memory is used for storing one or more computer program instructions, and the one or more computer program instructions are executed by the processor to realize the following steps:
determining a retrieval text and a data information set, wherein the data information set comprises an information pair, and the information pair comprises question information and corresponding answer information;
determining a first word vector sequence and a second word vector sequence corresponding to the retrieval text and each question message;
inputting the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer respectively to determine a corresponding first hidden vector sequence and a corresponding second hidden vector sequence;
inputting the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer respectively to determine corresponding first weight vectors and second weight vectors;
calculating an inner product of the first sequence of concealment vectors and first weight vectors to determine first statement vectors, and calculating an inner product of each second sequence of concealment vectors and corresponding second weight vectors to determine corresponding second statement vectors;
calculating the similarity of the first statement vector and the second statement vector to determine the matching degree of the corresponding retrieval text and the problem information;
determining candidate question information according to the matching degree of the retrieval text and the question information;
and determining and outputting answer information corresponding to the retrieval text according to the candidate question information.
D2, the electronic device of D1, the entering the first and second sequences of word vectors into a bidirectional recurrent neural network layer to determine corresponding first and second sequences of concealment vectors, respectively, comprising:
inputting the first word vector sequence into a bidirectional recurrent neural network layer to determine a first forward hidden state sequence and a first backward hidden state sequence;
inputting each second word vector sequence into a bidirectional recurrent neural network layer to determine a corresponding second forward hidden state sequence and a second backward hidden state sequence;
concatenating the first forward and first backward sequence of hidden states to determine a corresponding first sequence of hidden vectors, and concatenating the second forward and second backward sequence of hidden states to determine a second sequence of hidden vectors.
D3, the electronic device according to D1, wherein the respectively inputting the first and second sequences of concealment vectors into a self-attention mechanism layer to determine corresponding first and second weight vectors is specifically:
inputting the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer respectively as parameters of a weight calculation function to determine the corresponding first weight vector and second weight vector, wherein the weight calculation function is a = softmax(w_s2 · tanh(w_s1 · H^T)), where H is the first hidden vector sequence or the second hidden vector sequence, a is the corresponding first weight vector or second weight vector, w_s1 is a preset matrix, and w_s2 is a preset parameter.
D4, the electronic device of D1, the calculating the similarity of the first sentence vector and the second sentence vector to determine the matching degree of the corresponding search text and question information includes:
determining similar features and distance features of the first statement vector and each second statement vector, wherein the similar features are preliminary similarity scores of the first statement vector and each second statement vector, and the distance features are used for representing the distance between the first statement vector and each second statement vector;
determining a joint statement vector according to the first statement vector, the second statement vector, the corresponding similar features and the corresponding distance features;
inputting the joint statement vector into a hidden layer to determine a weighted statement vector;
and taking the weighted statement vector as the input of a normalized index function, and outputting the similarity of the first statement vector and the second statement vector so as to determine the matching degree of the corresponding retrieval text and the problem information.
D5, according to the electronic device of D1, the determining candidate question information according to the matching degree of the search text and the question information specifically includes:
and determining the problem information with the matching degree larger than the threshold value as candidate problem information.
D6, according to the electronic device of D1, the determining candidate question information according to the matching degree of the search text and the question information specifically includes:
and determining N pieces of problem information with the maximum matching degree as candidate problem information, wherein N is a preset constant.

Claims (10)

1. A method of searching, the method comprising:
determining a retrieval text and a data information set, wherein the data information set comprises an information pair, and the information pair comprises question information and corresponding answer information;
determining a first word vector sequence and a second word vector sequence corresponding to the retrieval text and each question message;
inputting the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer respectively to determine a corresponding first hidden vector sequence and a corresponding second hidden vector sequence;
inputting the first hidden vector sequence and the second hidden vector sequence into a self-attention mechanism layer respectively to determine corresponding first weight vectors and second weight vectors;
calculating an inner product of the first sequence of concealment vectors and first weight vectors to determine first statement vectors, and calculating an inner product of each second sequence of concealment vectors and corresponding second weight vectors to determine corresponding second statement vectors;
calculating the similarity of the first statement vector and the second statement vector to determine the matching degree of the corresponding retrieval text and the problem information;
determining candidate question information according to the matching degree of the retrieval text and the question information;
and determining and outputting answer information corresponding to the retrieval text according to the candidate question information.
2. The method of claim 1, wherein the inputting the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer to determine a corresponding first hidden vector sequence and second hidden vector sequence comprises:
inputting the first word vector sequence into the bidirectional recurrent neural network layer to determine a first forward hidden state sequence and a first backward hidden state sequence;
inputting each second word vector sequence into the bidirectional recurrent neural network layer to determine a corresponding second forward hidden state sequence and second backward hidden state sequence;
concatenating the first forward hidden state sequence and the first backward hidden state sequence to determine the first hidden vector sequence, and concatenating each second forward hidden state sequence and the corresponding second backward hidden state sequence to determine the corresponding second hidden vector sequence.
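The forward pass, backward pass, and concatenation of claim 2 can be sketched as below. A plain tanh RNN cell stands in for the claimed bidirectional recurrent layer (which could equally be an LSTM or GRU); all weights and dimensions are illustrative assumptions.

```python
import numpy as np

def rnn_pass(X, Wx, Wh, reverse=False):
    """One directional pass of a simple tanh RNN over a word vector sequence X (n, d)."""
    n, _ = X.shape
    h = np.zeros(Wh.shape[0])
    states = []
    idx = range(n - 1, -1, -1) if reverse else range(n)
    for t in idx:
        h = np.tanh(Wx @ X[t] + Wh @ h)
        states.append(h)
    if reverse:
        states.reverse()           # align backward states with positions 0..n-1
    return np.stack(states)        # hidden state sequence, shape (n, hidden_size)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))    # word vector sequence: 5 words, size-8 vectors
Wx_f, Wh_f = rng.standard_normal((3, 8)), rng.standard_normal((3, 3))
Wx_b, Wh_b = rng.standard_normal((3, 8)), rng.standard_normal((3, 3))

fwd = rnn_pass(X, Wx_f, Wh_f)                 # forward hidden state sequence
bwd = rnn_pass(X, Wx_b, Wh_b, reverse=True)   # backward hidden state sequence
H = np.concatenate([fwd, bwd], axis=1)        # hidden vector sequence, (5, 6)
```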
3. The method according to claim 1, wherein the inputting the first hidden vector sequence and the second hidden vector sequence into the self-attention mechanism layer to determine the corresponding first weight vector and second weight vector specifically comprises:
inputting the first hidden vector sequence and the second hidden vector sequence into the self-attention mechanism layer respectively as parameters of a weight calculation function to determine the corresponding first weight vector and second weight vector, wherein the weight calculation function is a = softmax(w_s2 · tanh(w_s1 · H^T)), where H is the first hidden vector sequence or the second hidden vector sequence, a is the corresponding first weight vector or second weight vector, w_s1 is a preset matrix, and w_s2 is a preset parameter vector.
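The weight calculation function a = softmax(w_s2 · tanh(w_s1 · H^T)) of claim 3 can be sketched in numpy as follows; the matrix sizes are illustrative assumptions, and in the claimed method w_s1 and w_s2 would be learned (preset) parameters.

```python
import numpy as np

def softmax(x):
    # Numerically stable normalized exponential function.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_weights(H, ws1, ws2):
    """Compute a = softmax(ws2 · tanh(ws1 · H^T)).

    H   : (n, d)  sequence of n hidden vectors of size d
    ws1 : (da, d) preset matrix
    ws2 : (da,)   preset parameter vector
    Returns a length-n weight vector summing to 1.
    """
    return softmax(ws2 @ np.tanh(ws1 @ H.T))

rng = np.random.default_rng(0)
H = rng.standard_normal((6, 8))       # 6 hidden vectors of size 8
ws1 = rng.standard_normal((4, 8))
ws2 = rng.standard_normal(4)
a = attention_weights(H, ws1, ws2)
sentence_vec = a @ H                  # inner product of weights and hidden vectors
```

The final line is the inner product of claim 1: the sentence vector is the attention-weighted sum of the hidden vectors.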
4. The method of claim 1, wherein the calculating the similarity between the first sentence vector and the second sentence vector to determine the matching degree between the retrieval text and the corresponding question information comprises:
determining a similarity feature and a distance feature of the first sentence vector and each second sentence vector, wherein the similarity feature is a preliminary similarity score of the first sentence vector and the second sentence vector, and the distance feature represents the distance between the first sentence vector and the second sentence vector;
determining a joint sentence vector according to the first sentence vector, the second sentence vector, the corresponding similarity feature and the corresponding distance feature;
inputting the joint sentence vector into a hidden layer to determine a weighted sentence vector;
and taking the weighted sentence vector as the input of a normalized exponential (softmax) function, and outputting the similarity of the first sentence vector and the second sentence vector, so as to determine the matching degree between the retrieval text and the corresponding question information.
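The feature construction and scoring of claim 4 can be sketched as below. The choice of dot product for the preliminary similarity score, Euclidean norm for the distance feature, tanh for the hidden layer, and a two-class softmax output are all illustrative assumptions; the claim does not fix these particulars.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def match_score(q, p, W_hidden, W_out):
    """Score a query sentence vector q against a question sentence vector p."""
    sim = np.array([q @ p])                       # similarity feature (preliminary score)
    dist = np.array([np.linalg.norm(q - p)])      # distance feature
    joint = np.concatenate([q, p, sim, dist])     # joint sentence vector
    weighted = np.tanh(W_hidden @ joint)          # hidden layer -> weighted sentence vector
    probs = softmax(W_out @ weighted)             # softmax over {mismatch, match}
    return probs[1]                               # matching degree

rng = np.random.default_rng(1)
q, p = rng.standard_normal(4), rng.standard_normal(4)
W_hidden = rng.standard_normal((6, 10))   # joint vector size = 4 + 4 + 1 + 1 = 10
W_out = rng.standard_normal((2, 6))
score = match_score(q, p, W_hidden, W_out)
```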
5. The method according to claim 1, wherein the determining candidate question information according to the matching degree of the search text and the question information specifically comprises:
and determining question information whose matching degree is greater than a threshold value as the candidate question information.
6. The method according to claim 1, wherein the determining candidate question information according to the matching degree of the search text and the question information specifically comprises:
and determining the N pieces of question information with the highest matching degrees as the candidate question information, wherein N is a preset constant.
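The two candidate-selection strategies of claims 5 and 6 amount to thresholding and top-N selection over the scored question information, for example:

```python
def candidates_by_threshold(scored, threshold):
    """Claim 5: keep question information whose matching degree exceeds the threshold."""
    return [q for q, s in scored if s > threshold]

def candidates_top_n(scored, n):
    """Claim 6: keep the N pieces of question information with the highest matching degree."""
    return [q for q, _ in sorted(scored, key=lambda qs: qs[1], reverse=True)[:n]]

# (question information, matching degree) pairs -- values are illustrative.
scored = [("q1", 0.92), ("q2", 0.35), ("q3", 0.71)]
```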
7. A retrieval apparatus, characterized in that the apparatus comprises:
an information determining module, configured to determine a retrieval text and a data information set, wherein the data information set comprises information pairs, each information pair comprising question information and corresponding answer information;
a word vector determining module, configured to determine a first word vector sequence corresponding to the retrieval text and a second word vector sequence corresponding to each piece of question information;
a hidden vector determining module, configured to input the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer, respectively, to determine a corresponding first hidden vector sequence and a corresponding second hidden vector sequence;
a weight vector determining module, configured to input the first hidden vector sequence and each second hidden vector sequence into a self-attention mechanism layer, respectively, to determine a corresponding first weight vector and corresponding second weight vectors;
a calculating module, configured to calculate an inner product of the first hidden vector sequence and the first weight vector to determine a first sentence vector, and calculate an inner product of each second hidden vector sequence and the corresponding second weight vector to determine a corresponding second sentence vector;
a matching module, configured to calculate a similarity between the first sentence vector and each second sentence vector to determine a matching degree between the retrieval text and the corresponding question information;
a candidate question determining module, configured to determine candidate question information according to the matching degree between the retrieval text and the question information;
and an answer output module, configured to determine and output answer information corresponding to the retrieval text according to the candidate question information.
8. A computer readable storage medium storing computer program instructions, which when executed by a processor implement the method of any one of claims 1-6.
9. An electronic device comprising a memory and a processor, wherein the memory is configured to store one or more computer program instructions, wherein the one or more computer program instructions are executed by the processor to implement the steps of:
determining a retrieval text and a data information set, wherein the data information set comprises information pairs, each information pair comprising question information and corresponding answer information;
determining a first word vector sequence corresponding to the retrieval text and a second word vector sequence corresponding to each piece of question information;
inputting the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer respectively to determine a corresponding first hidden vector sequence and a corresponding second hidden vector sequence;
inputting the first hidden vector sequence and each second hidden vector sequence into a self-attention mechanism layer respectively to determine a corresponding first weight vector and corresponding second weight vectors;
calculating an inner product of the first hidden vector sequence and the first weight vector to determine a first sentence vector, and calculating an inner product of each second hidden vector sequence and the corresponding second weight vector to determine a corresponding second sentence vector;
calculating a similarity between the first sentence vector and each second sentence vector to determine a matching degree between the retrieval text and the corresponding question information;
determining candidate question information according to the matching degree between the retrieval text and the question information;
and determining and outputting answer information corresponding to the retrieval text according to the candidate question information.
10. The electronic device of claim 9, wherein the inputting the first word vector sequence and each second word vector sequence into a bidirectional recurrent neural network layer to determine a corresponding first hidden vector sequence and second hidden vector sequence comprises:
inputting the first word vector sequence into the bidirectional recurrent neural network layer to determine a first forward hidden state sequence and a first backward hidden state sequence;
inputting each second word vector sequence into the bidirectional recurrent neural network layer to determine a corresponding second forward hidden state sequence and second backward hidden state sequence;
concatenating the first forward hidden state sequence and the first backward hidden state sequence to determine the first hidden vector sequence, and concatenating each second forward hidden state sequence and the corresponding second backward hidden state sequence to determine the corresponding second hidden vector sequence.
CN201911000788.5A 2019-10-21 2019-10-21 Retrieval method, retrieval device, readable storage medium and electronic equipment Pending CN110765250A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911000788.5A CN110765250A (en) 2019-10-21 2019-10-21 Retrieval method, retrieval device, readable storage medium and electronic equipment


Publications (1)

Publication Number Publication Date
CN110765250A true CN110765250A (en) 2020-02-07

Family

ID=69331500


Country Status (1)

Country Link
CN (1) CN110765250A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108845990A (en) * 2018-06-12 2018-11-20 北京慧闻科技发展有限公司 Answer selection method, device and electronic equipment based on two-way attention mechanism
CN109815318A (en) * 2018-12-24 2019-05-28 平安科技(深圳)有限公司 The problems in question answering system answer querying method, system and computer equipment
CN110019741A (en) * 2018-06-01 2019-07-16 中国平安人寿保险股份有限公司 Request-answer system answer matching process, device, equipment and readable storage medium storing program for executing
CN110162636A (en) * 2019-05-30 2019-08-23 中森云链(成都)科技有限责任公司 Text mood reason recognition methods based on D-LSTM


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LUAN Kexin: "Answer auto-extraction method based on intra-sentence attention mechanism", 《智能计算机与应用》 (Intelligent Computer and Applications) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112040076A (en) * 2020-09-01 2020-12-04 中国平安财产保险股份有限公司 Method, device, computer equipment and storage medium for processing agent report text
CN112040076B (en) * 2020-09-01 2022-11-04 中国平安财产保险股份有限公司 Method, device, computer equipment and storage medium for processing agent report text
CN112445934A (en) * 2021-02-01 2021-03-05 北京远鉴信息技术有限公司 Voice retrieval method, device, equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200207