CN110532368B - Question answering method, electronic equipment and computer readable storage medium


Info

Publication number
CN110532368B
CN110532368B (application CN201910832947.1A)
Authority
CN
China
Prior art keywords
question
entity
map
relation
training
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910832947.1A
Other languages
Chinese (zh)
Other versions
CN110532368A (en)
Inventor
陈贝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cloudminds Robotics Co Ltd
Original Assignee
Cloudminds Shanghai Robotics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cloudminds Shanghai Robotics Co Ltd filed Critical Cloudminds Shanghai Robotics Co Ltd
Priority to CN201910832947.1A priority Critical patent/CN110532368B/en
Publication of CN110532368A publication Critical patent/CN110532368A/en
Application granted granted Critical
Publication of CN110532368B publication Critical patent/CN110532368B/en

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 Information retrieval of unstructured textual data
    • G06F16/33 Querying
    • G06F16/332 Query formulation
    • G06F16/3329 Natural language query formulation or dialogue systems
    • G06F16/3331 Query processing
    • G06F16/334 Query execution
    • G06F16/3344 Query execution using natural language analysis
    • G06F16/36 Creation of semantic tools, e.g. ontology or thesauri
    • G06F16/367 Ontology

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Human Computer Interaction (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention relates to the technical field of information search and discloses a question answering method, an electronic device and a computer-readable storage medium. The question answering method comprises the following steps: acquiring an entity in a question to be answered as the question entity; acquiring, from a knowledge graph, a plurality of graph entities associated with the question entity and a plurality of graph relations associated with the graph entities, and acquiring entity scores of the graph entities and relation scores of the graph relations; determining a preset weight according to training questions with known answers; performing a weighted calculation on the entity scores and the relation scores according to the preset weight, and acquiring a target entity and a target relation according to the calculation result; and obtaining the answer to the question to be answered according to the target entity and the target relation. Compared with the prior art, the question answering method, electronic device and computer-readable storage medium provided by the embodiments of the invention effectively improve the accuracy of question answers.

Description

Question answering method, electronic equipment and computer readable storage medium
Technical Field
The present invention relates to the field of information search technologies, and in particular, to a question answering method, an electronic device, and a computer-readable storage medium.
Background
A question-answering (QA) system is mainly used to answer natural language questions and currently attracts a great deal of research in the fields of information retrieval and natural language processing. In knowledge-graph question answering, a natural language question is given, semantically understood and parsed, and the established knowledge graph is then queried and reasoned over to obtain the answer; for example, "Who is the author of Journey to the West?".
However, the inventors of the present invention have found that prior-art question-answering methods are not sufficiently accurate in determining the answers to questions.
Disclosure of Invention
Embodiments of the invention aim to provide a question answering method, an electronic device and a computer-readable storage medium that effectively improve the accuracy of question answers.
In order to solve the above technical problem, an embodiment of the present invention provides a question answering method comprising the following steps: acquiring an entity in a question to be answered as the question entity; acquiring, from a knowledge graph, a plurality of graph entities associated with the question entity and a plurality of graph relations associated with the graph entities, and acquiring entity scores of the graph entities and relation scores of the graph relations; determining a preset weight according to training questions with known answers; performing a weighted calculation on the entity scores and the relation scores according to the preset weight, and acquiring a target entity and a target relation according to the calculation result; and obtaining the answer to the question to be answered according to the target entity and the target relation.
An embodiment of the present invention also provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the question-answering method as described above.
An embodiment of the present invention also provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs the aforementioned question answering method.
Compared with the prior art, in embodiments of the invention, after the question to be answered is obtained, graph entities and graph relations are obtained from the knowledge graph according to the question entity in the question to be answered, an entity score is obtained for each graph entity and a relation score for each graph relation, and the entity scores and relation scores are combined by weighted calculation according to a preset weight. Because the preset weight is determined from training questions with known answers, the target entity and target relation obtained from the result of the weighted calculation are more accurate, and the accuracy of the answer to the question to be answered is therefore effectively improved.
In addition, determining the preset weight specifically includes: obtaining a plurality of first training questions and standard answers of the first training questions, and setting a plurality of initial weights; acquiring a training entity set and a training relation set corresponding to each first training question from the knowledge graph; obtaining, according to the training entity sets and the training relation sets, answers to the first training questions under each initial weight as test answers; comparing the test answers with the standard answers to obtain the accuracy of each initial weight; and taking the initial weight with the highest accuracy as the preset weight. Using the initial weight with the highest accuracy as the preset weight effectively improves the accuracy of the question answers.
In addition, acquiring the plurality of graph entities associated with the question entity from the knowledge graph specifically includes: acquiring the name of the question entity; and acquiring, as graph entities, a plurality of entities in the knowledge graph whose entity names have a name similarity with the question entity greater than a preset threshold.
In addition, acquiring the entity score of each graph entity specifically includes: obtaining a question word vector of the question to be answered; obtaining a word vector of each graph entity to obtain a plurality of entity word vectors; obtaining the word-vector similarity between each entity word vector and the question word vector; and taking the word-vector similarity as the entity score of the corresponding graph entity.
In addition, before the relation score of each graph relation is acquired, the method further comprises: constructing a relation prediction model; and performing data training on the relation prediction model according to a plurality of second training questions, a plurality of training entities corresponding to the second training questions, and a plurality of training relations corresponding to the training entities. Acquiring the relation score of each graph relation then specifically includes: after the relation prediction model is trained, inputting the question to be answered, each graph entity and each graph relation into the relation prediction model; and taking the probability output by the relation prediction model for each graph relation as the score of that graph relation.
In addition, before the data training of the relation prediction model, the method further comprises: judging whether the number of training relations is greater than or equal to a preset threshold; and if the number of training relations is smaller than the preset threshold, randomly adding at least one relation to the training relations so that the number of training relations equals the preset threshold. When the number of training relations is smaller than the preset threshold, adding at least one relation ensures that a sufficient number of training relations are available for data training of the relation prediction model, which guarantees the effectiveness of the training, improves the accuracy of relation prediction by the trained model, and further improves the accuracy of the answers produced by the question answering method.
In addition, before the question to be answered, each graph entity and each graph relation are input into the relation prediction model, the method further comprises: converting all graph entities into a preset entity. Inputting the question to be answered, each graph entity and each graph relation into the relation prediction model then specifically includes: inputting the question to be answered, the preset entity and the graph relations into the relation prediction model.
In addition, before the weighted calculation is performed on the entity scores and the relation scores according to the preset weight, the method further comprises: obtaining the N graph relations with the highest scores, N being an integer greater than zero. Performing the weighted calculation on the entity scores and the relation scores according to the preset weight then specifically includes: performing the weighted calculation on the entity scores and the relation scores of the N graph relations according to the preset weight.
Drawings
FIG. 1 is a flowchart of a question answering method according to a first embodiment of the present invention;
FIG. 2 is a flowchart of a question answering method according to a second embodiment of the present invention;
FIG. 3 is a flowchart of a question answering method according to a third embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, embodiments of the present invention are described in detail below with reference to the accompanying drawings. Those of ordinary skill in the art will appreciate that numerous technical details are set forth in the embodiments to give the reader a better understanding of the present application; the technical solutions claimed in the present application can nevertheless be implemented without these technical details, and with various changes and modifications based on the following embodiments.
A first embodiment of the present invention relates to a question answering method, the specific flow of which is shown in FIG. 1 and which includes the following steps:
step S101: and acquiring an entity in the question to be answered as a question entity.
Specifically, in this step, the question to be answered is the question input by the user. After a question to be answered input by a user is obtained, firstly, normalization processing is carried out on the question to be answered, redundant punctuations (for example, punctuations at the tail of the question) are deleted, redundant meaningless word strings are deleted, and the question is segmented.
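By way of illustration only, the normalization described above might be sketched in Python as follows; the cleanup rules and the whitespace-based segmentation are assumptions made for readability, since the patent does not fix them, and a Chinese question would normally be segmented with a dedicated word segmenter:

    import re

    def normalize_question(question):
        """Minimal normalization sketch: strip trailing punctuation and
        redundant whitespace, then segment the question into tokens."""
        q = question.strip()
        q = re.sub(r"[?？!！。.]+$", "", q)   # delete punctuation at the tail of the question
        q = re.sub(r"\s+", " ", q)           # collapse redundant, meaningless whitespace
        return q.split()                     # naive segmentation; a word segmenter would go here

    # normalize_question("Who is the author of Journey to the West?")
    # -> ['Who', 'is', 'the', 'author', 'of', 'Journey', 'to', 'the', 'West']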
Step S102: acquiring a plurality of graph entities associated with the question entity.
Specifically, in this embodiment the entity name of the question entity is first obtained, and entities having the same entity name as the question entity are obtained from the knowledge graph as graph entities. For example, if the question entity is "apple", a plurality of entities named "apple" (including the fruit apple, the Apple phone, and the like) are obtained from the knowledge graph as graph entities.
It is to be understood that obtaining entities with the same entity name as the question entity from the knowledge graph is merely a specific example in this embodiment and is not a limitation; in other embodiments of the invention, entities whose names are similar to that of the question entity may be obtained from the knowledge graph as graph entities. Specifically, in another embodiment, entities whose name similarity with the question entity is greater than a preset value may be obtained from the knowledge graph as graph entities; for example, if the question entity is "worker ant", entities such as "ant" are obtained from the knowledge graph as graph entities.
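A minimal sketch of this candidate-entity step is given below; the difflib string-similarity ratio and the threshold value are stand-ins chosen for illustration, since the patent does not specify the name-similarity measure:

    from difflib import SequenceMatcher

    def candidate_graph_entities(question_entity, kg_entity_names, threshold=0.8):
        """Return knowledge-graph entity names whose name similarity to the
        question entity is greater than the preset threshold."""
        def name_similarity(a, b):
            return SequenceMatcher(None, a, b).ratio()
        return [name for name in kg_entity_names
                if name_similarity(question_entity, name) > threshold]

    # candidate_graph_entities("apple", ["apple (fruit)", "Apple Inc.", "pear"], threshold=0.5)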
In particular, in this embodiment the knowledge graph is essentially a database of triples composed of entities and the relations between them.
Further, in this embodiment, after the knowledge graph is constructed, word-vector representations of all entities and relations in the knowledge graph are obtained by an algorithm such as TransE. TransE is a distributed word-vector representation of entities and relations in which the relation in each triple instance (head entity, relation, tail entity) is regarded as a translation from the head entity to the tail entity: adding the relation word vector to the head-entity word vector should yield the tail-entity word vector, that is, h + r = t, where h is the word vector of the head entity, r the word vector of the relation and t the word vector of the tail entity, and h, r and t are continuously adjusted so that (h + r) is as close to t as possible. Continuously adjusting the entity and relation vectors of the knowledge graph with the TransE algorithm makes their word-vector representations more accurate.
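The following is a highly simplified TransE-style training sketch; the squared distance, the single negative sample per triple and the illustrative triple are all assumptions made for brevity (the original TransE formulation also normalizes the entity vectors each epoch):

    import numpy as np

    rng = np.random.default_rng(0)
    dim, margin, lr = 50, 1.0, 0.01

    triples = [("Journey to the West", "author", "Wu Cheng'en")]   # illustrative triple only
    entities = ["Journey to the West", "Wu Cheng'en", "apple"]
    relations = ["author"]

    E = {e: rng.normal(scale=0.1, size=dim) for e in entities}
    R = {r: rng.normal(scale=0.1, size=dim) for r in relations}

    def dist(h, r, t):
        # TransE treats the relation as a translation: h + r should be close to t.
        d = E[h] + R[r] - E[t]
        return float(d @ d)              # squared L2 distance, so the gradient stays simple

    for _ in range(200):
        for h, r, t in triples:
            t_neg = rng.choice([e for e in entities if e != t])   # corrupt the tail entity
            if margin + dist(h, r, t) - dist(h, r, t_neg) > 0:    # hinge loss is active
                g_pos = 2 * (E[h] + R[r] - E[t])
                g_neg = 2 * (E[h] + R[r] - E[t_neg])
                E[h] -= lr * (g_pos - g_neg)
                R[r] -= lr * (g_pos - g_neg)
                E[t] += lr * g_pos
                E[t_neg] -= lr * g_neg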
Step S103: obtaining the entity score of each graph entity.
Specifically, in this step the word-vector representation of the question to be answered is first obtained, the word-vector representation of each graph entity is obtained, and the word-vector similarity between each graph entity and the question to be answered is calculated as the score of that graph entity.
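For illustration, one way such a score could be computed, assuming the question and entity word vectors are already available as numpy arrays (cosine similarity is used here; the patent does not fix the similarity measure):

    import numpy as np

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    def entity_scores(question_vec, entity_vecs):
        """Score every candidate graph entity by the word-vector similarity
        between its vector and the question's word vector."""
        return {name: cosine(question_vec, vec) for name, vec in entity_vecs.items()}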
It should be understood that the word-vector similarity described above is only one specific example of obtaining the score of each graph entity in this embodiment and is not a limitation; in other embodiments of the invention the score of each graph entity may be obtained by other methods, such as semantic analysis, which are not enumerated here and may be set flexibly according to actual needs.
Step S104: acquiring a plurality of graph relations associated with the graph entities.
Specifically, in this embodiment all relations associated with each graph entity are obtained from the knowledge graph as graph relations.
Step S105: obtaining the relation score of each graph relation.
Specifically, in this embodiment the score of each graph relation is obtained with a trained relation prediction model: the question to be answered, each graph entity and each graph relation are input into the trained relation prediction model, and the probability that the model outputs for each graph relation is taken as the score of that graph relation.
Further, in this embodiment the relation prediction model may be a neural network model that, given an input question, entity and candidate relations, outputs the probability that the relation expressed in the question is each candidate relation. Before data training, an initial relation prediction model is constructed; a plurality of second training questions, a plurality of training entities corresponding to the second training questions and a plurality of training relations corresponding to the training entities are then obtained and input into the initial relation prediction model for data training.
Preferably, in this embodiment, before the initial relation prediction model is data-trained, it is first judged whether the number of training relations is greater than or equal to a preset threshold; if it is smaller than the preset threshold, at least one relation is randomly added to the training relations so that their number reaches the preset threshold. Adding relations in this way guarantees that a sufficient number of training relations are available for data training, which ensures the effectiveness of the training, improves the accuracy of relation prediction by the trained model, and thus further improves the accuracy of the answers produced by the question answering method.
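A minimal sketch of such a relation prediction model and of the relation-padding step is given below; the bag-of-embeddings architecture, the use of PyTorch and all dimensions are assumptions, since the patent only requires a neural model that outputs a probability for each candidate relation and that the training relations be padded up to the preset threshold:

    import random
    import torch.nn as nn

    def pad_training_relations(candidates, all_relations, k):
        """If fewer than k training relations are available, randomly add
        relations until the count equals the preset threshold k."""
        padded = list(candidates)
        pool = [r for r in all_relations if r not in padded]
        while len(padded) < k and pool:
            padded.append(pool.pop(random.randrange(len(pool))))
        return padded

    class RelationScorer(nn.Module):
        """Scores each candidate graph relation against the question."""
        def __init__(self, vocab_size, num_relations, dim=64):
            super().__init__()
            self.word_emb = nn.Embedding(vocab_size, dim)
            self.rel_emb = nn.Embedding(num_relations, dim)

        def forward(self, question_ids, relation_ids):
            q = self.word_emb(question_ids).mean(dim=1)   # (batch, dim) bag-of-words question encoding
            r = self.rel_emb(relation_ids)                # (num_candidates, dim)
            logits = q @ r.t()                            # (batch, num_candidates)
            return logits.softmax(dim=-1)                 # probability for each candidate relation

During training the logits would normally be fed to a cross-entropy loss against the correct relation; the softmax output is what the trained model returns at answer time.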
Step S106: determining a preset weight according to training questions with known answers.
Specifically, in this step a plurality of first training questions and their standard answers are first obtained and a plurality of initial weights are set; a training entity set and a training relation set corresponding to each first training question are acquired from the knowledge graph; answers to the first training questions are then obtained under each initial weight, according to the training entity sets and training relation sets, as test answers; the test answers are compared with the standard answers to obtain the accuracy of each initial weight; and the initial weight with the highest accuracy is taken as the preset weight.
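A minimal sketch of this weight-selection loop; the candidate weight grid and the answer_with_weight helper are hypothetical names introduced only for illustration:

    def choose_preset_weight(training_items, answer_with_weight, candidate_weights):
        """training_items: list of (question, entity_set, relation_set, standard_answer).
        Returns the initial weight whose test answers match the standard answers most often."""
        best_weight, best_accuracy = None, -1.0
        for w in candidate_weights:
            correct = sum(
                answer_with_weight(question, entities, relations, w) == standard
                for question, entities, relations, standard in training_items
            )
            accuracy = correct / len(training_items)
            if accuracy > best_accuracy:
                best_weight, best_accuracy = w, accuracy
        return best_weight

    # e.g. choose_preset_weight(train_set, answer_with_weight, [i / 10 for i in range(11)])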
Step S107: performing a weighted calculation on the entity scores and relation scores according to the preset weight, and acquiring a target entity and a target relation according to the calculation result.
Specifically, in this step the entity scores and relation scores are weighted and summed according to the preset weight, and the entity-relation pair of graph entity and graph relation with the largest weighted sum is taken as the target entity and target relation.
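A sketch of this selection step; weighting the entity score by w and the relation score by (1 - w) is one plausible reading, as the patent only states that the two scores are combined according to the preset weight:

    def pick_target(entity_scores, relation_scores, candidate_pairs, w):
        """candidate_pairs: iterable of (graph_entity, graph_relation) pairs.
        Returns the pair with the largest weighted sum of its two scores."""
        return max(
            candidate_pairs,
            key=lambda pair: w * entity_scores[pair[0]] + (1 - w) * relation_scores[pair[1]],
        )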
Step S108: obtaining the answer to the question to be answered according to the target entity and the target relation.
Specifically, in this step, after the target entity and the target relation are obtained, the entity in the knowledge graph that corresponds to the target entity and the target relation is obtained as the answer to the question to be answered.
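With the knowledge graph stored as (head entity, relation, tail entity) triples, this final lookup reduces to the following sketch (the example triple is illustrative only):

    def lookup_answer(triples, target_entity, target_relation):
        """Return the entities linked to the target entity by the target relation."""
        return [tail for head, rel, tail in triples
                if head == target_entity and rel == target_relation]

    # lookup_answer([("Journey to the West", "author", "Wu Cheng'en")],
    #               "Journey to the West", "author")  ->  ["Wu Cheng'en"]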
Compared with the prior art, the question answering method provided by the first embodiment of the invention acquires the question to be answered, acquires graph entities and graph relations according to the knowledge graph and the question entity in the question to be answered, acquires the entity score of each graph entity and the relation score of each graph relation, and performs a weighted calculation on the entity scores and relation scores according to the preset weight. Because the preset weight is determined from training questions with known answers, the target entity and target relation obtained from the result of the weighted calculation are more accurate, so the accuracy of the answer to the question to be answered is effectively improved.
A second embodiment of the present invention relates to a question answering method, which includes the following steps, as shown in FIG. 2:
step S201: and acquiring an entity in the question to be answered as a question entity.
Step S202: obtaining a plurality of map entities associated with the question entities.
Step S203: and obtaining the score of each map entity.
Step S204: a plurality of graph relationships associated with graph entities are obtained.
It is to be understood that steps S201 to S204 in this embodiment are substantially the same as steps S101 to S104 in the first embodiment; reference may be made to the description in the first embodiment, which is not repeated here.
Step S205: converting all graph entities into a preset entity.
Step S206: obtaining the relation score of each graph relation according to the preset entity.
Specifically, in this step the question to be answered, each graph relation and the preset entity are input into the trained relation prediction model, and the probability output by the model for each graph relation is taken as the score of that graph relation.
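A sketch of the conversion step; the placeholder token is an assumption, as any fixed symbol standing in for the concrete graph entity would serve as the preset entity:

    def mask_graph_entity(question, entity_name, preset_entity="<ENTITY>"):
        """Replace the concrete entity mention in the question with the preset
        entity placeholder before relation prediction."""
        return question.replace(entity_name, preset_entity)

    # mask_graph_entity("who is the author of Journey to the West", "Journey to the West")
    # -> "who is the author of <ENTITY>"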
Step S207: determining a preset weight according to training questions with known answers.
Step S208: performing a weighted calculation on the entity scores and relation scores according to the preset weight, and acquiring a target entity and a target relation according to the calculation result.
Step S209: obtaining the answer to the question to be answered according to the target entity and the target relation.
It is to be understood that steps S207 to S209 in this embodiment are substantially the same as steps S106 to S108 in the first embodiment; reference may be made to the description in the first embodiment, which is not repeated here.
Compared with the prior art, the second embodiment of the invention retains all the technical effects of the first embodiment and, before the score of each graph relation is obtained with the relation prediction model, converts the graph entities into the preset entity, thereby avoiding the influence of numerous complicated graph entities on the prediction result of the relation prediction model and improving the accuracy of the prediction result.
A third embodiment of the present invention relates to a question answering method, which includes the following steps, as shown in FIG. 3:
Step S301: acquiring an entity in the question to be answered as the question entity.
Step S302: acquiring a plurality of graph entities associated with the question entity.
Step S303: obtaining the entity score of each graph entity.
Step S304: acquiring a plurality of graph relations associated with the graph entities.
Step S305: obtaining the relation score of each graph relation.
It is to be understood that steps S301 to S305 in this embodiment are substantially the same as steps S101 to S105 in the first embodiment; reference may be made to the description in the first embodiment, which is not repeated here.
Step S306: obtaining the N graph relations with the highest scores among the graph relations.
Specifically, N is an integer greater than zero.
Further, in this step the graph relations are sorted by score in descending order and the top N graph relations are obtained.
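For illustration, this pruning step can be as small as the following sketch:

    import heapq

    def top_n_relations(relation_scores, n):
        """Keep only the n highest-scoring graph relations before the weighted calculation."""
        return heapq.nlargest(n, relation_scores.items(), key=lambda item: item[1])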
Step S307: determining a preset weight according to training questions with known answers.
It is to be understood that step S307 in this embodiment is substantially the same as step S106 in the first embodiment; reference may be made to the description in the first embodiment, which is not repeated here.
Step S308: performing a weighted calculation on the entity scores and the relation scores of the N graph relations according to the preset weight, and acquiring a target entity and a target relation according to the calculation result.
Step S309: obtaining the answer to the question to be answered according to the target entity and the target relation.
It is to be understood that step S309 in this embodiment is substantially the same as step S108 in the first embodiment; reference may be made to the description in the first embodiment, which is not repeated here.
Compared with the prior art, the third embodiment of the invention retains all the technical effects of the first embodiment and performs the weighted calculation only on the N graph relations with the highest scores, so that graph relations with overly low scores are excluded from the calculation, the amount of weighted calculation is effectively reduced, and the response efficiency of the question answering method provided by the third embodiment is improved.
The above methods are divided into steps only for clarity of description; in implementation, steps may be combined into a single step or a step may be split into several steps, and as long as the same logical relationship is preserved such variations fall within the protection scope of this patent. Adding insignificant modifications to an algorithm or process, or introducing insignificant design changes without changing its core design, also falls within the scope of the patent.
A fourth embodiment of the present invention relates to an electronic device which, as shown in FIG. 4, includes: at least one processor 401; and a memory 402 communicatively coupled to the at least one processor 401. The memory 402 stores instructions executable by the at least one processor 401, and the instructions are executed by the at least one processor 401 to enable the at least one processor 401 to perform the question answering method described above.
The memory 402 and the processor 401 are connected by a bus, which may include any number of interconnected buses and bridges linking one or more processors 401 and the memory 402 together. The bus may also connect various other circuits, such as peripherals, voltage regulators and power management circuits, which are well known in the art and are therefore not described further here. A bus interface provides an interface between the bus and a transceiver. The transceiver may be a single element or a plurality of elements, such as multiple receivers and transmitters, providing a means for communicating with various other apparatus over a transmission medium. Data processed by the processor 401 may be transmitted over a wireless medium via an antenna, and the antenna may also receive data and forward it to the processor 401.
The processor 401 is responsible for managing the bus and general processing, and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management and other control functions. The memory 402 may be used to store data used by the processor 401 in performing operations.
A fifth embodiment of the present invention relates to a computer program product for use with an electronic device; since its principle is the same as that of the question answering method above, reference may be made to the method embodiments, and repeated details are omitted. The computer program product comprises a computer-readable storage medium and a computer program mechanism embedded therein, the computer program mechanism comprising instructions for performing any of the foregoing method embodiments.
A sixth embodiment of the present invention relates to a computer-readable storage medium storing a computer program. The computer program, when executed by a processor, implements the method embodiments described above.
That is, as those skilled in the art can understand, all or part of the steps of the methods of the embodiments described above may be implemented by a program instructing the relevant hardware. The program is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples for carrying out the invention, and that various changes in form and details may be made therein without departing from the spirit and scope of the invention in practice.

Claims (8)

1. A question-answering method, comprising:
acquiring an entity in a question to be answered as a question entity;
acquiring, from a knowledge graph, a plurality of graph entities associated with the question entity and a plurality of graph relations associated with the graph entities, and acquiring entity scores of the graph entities and relation scores of the graph relations, wherein a graph relation represents the relation between two graph entities;
determining a preset weight according to a plurality of training questions with known answers;
performing a weighted calculation on the entity scores and the relation scores according to the preset weight, and acquiring a target entity and a target relation according to a calculation result;
obtaining an answer to the question to be answered according to the target entity and the target relation;
wherein, before acquiring the relation score of each graph relation, the method further comprises:
constructing a relation prediction model;
performing data training on the relation prediction model according to a plurality of second training questions, a plurality of training entities corresponding to the second training questions, and a plurality of training relations corresponding to the training entities;
wherein acquiring the relation score of each graph relation specifically comprises:
after the relation prediction model is trained, inputting the question to be answered, each graph entity and each graph relation into the relation prediction model;
taking the probability output by the relation prediction model for each graph relation as the score of that graph relation;
wherein, before the question to be answered, each graph entity and each graph relation are input into the relation prediction model, the method further comprises:
converting all the graph entities into a preset entity;
wherein inputting the question to be answered, each graph entity and each graph relation into the relation prediction model specifically comprises:
inputting the question to be answered, the preset entity and the graph relations into the relation prediction model.
2. The question-answering method according to claim 1, wherein determining the preset weight according to the plurality of training questions with known answers specifically comprises:
obtaining a plurality of first training questions and standard answers of the first training questions, and setting a plurality of initial weights;
acquiring a training entity set and a training relation set corresponding to each first training question from the knowledge graph;
obtaining, according to the training entity sets and the training relation sets, answers to the first training questions under each initial weight as test answers;
comparing the test answers with the standard answers to obtain the accuracy of each initial weight;
and taking the initial weight with the highest accuracy as the preset weight.
3. The question-answering method according to claim 1, wherein acquiring the plurality of graph entities associated with the question entity from the knowledge graph specifically comprises:
acquiring the name of the question entity;
and acquiring, as graph entities, a plurality of entities in the knowledge graph whose entity names have a name similarity with the question entity greater than a preset threshold.
4. The question-answering method according to claim 1, wherein acquiring the entity score of each graph entity specifically comprises:
obtaining a question word vector of the question to be answered;
obtaining a word vector of each graph entity to obtain a plurality of entity word vectors;
obtaining the word-vector similarity between each entity word vector and the question word vector;
and taking the word-vector similarity as the entity score of the corresponding graph entity.
5. The question-answering method according to claim 1, wherein, before the data training of the relation prediction model, the method further comprises:
judging whether the number of the training relations is greater than or equal to a preset threshold;
and if the number of the training relations is smaller than the preset threshold, randomly adding at least one relation to the training relations so that the number of the training relations equals the preset threshold.
6. The question-answering method according to any one of claims 1 to 5, wherein, before performing the weighted calculation on the entity scores and the relation scores according to the preset weight, the method further comprises:
obtaining the N graph relations with the highest scores among the graph relations, wherein N is an integer greater than zero;
wherein performing the weighted calculation on the entity scores and the relation scores according to the preset weight specifically comprises:
performing the weighted calculation on the entity scores and the relation scores of the N graph relations according to the preset weight.
7. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the question-answering method according to any one of claims 1 to 6.
8. A computer-readable storage medium storing a computer program, wherein the computer program is executed by a processor to implement the question-answering method according to any one of claims 1 to 6.
CN201910832947.1A 2019-09-04 2019-09-04 Question answering method, electronic equipment and computer readable storage medium Active CN110532368B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910832947.1A CN110532368B (en) 2019-09-04 2019-09-04 Question answering method, electronic equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910832947.1A CN110532368B (en) 2019-09-04 2019-09-04 Question answering method, electronic equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN110532368A CN110532368A (en) 2019-12-03
CN110532368B true CN110532368B (en) 2023-03-14

Family

ID=68666797

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910832947.1A Active CN110532368B (en) 2019-09-04 2019-09-04 Question answering method, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110532368B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111008272A (en) * 2019-12-04 2020-04-14 深圳市新国都金服技术有限公司 Knowledge graph-based question and answer method and device, computer equipment and storage medium
CN111522887B (en) * 2020-04-03 2023-09-12 北京百度网讯科技有限公司 Method and device for outputting information
CN111506719B (en) * 2020-04-20 2023-09-12 深圳追一科技有限公司 Associated question recommending method, device and equipment and readable storage medium
CN111552880B (en) * 2020-04-30 2023-06-30 杭州网易再顾科技有限公司 Knowledge graph-based data processing method and device, medium and electronic equipment
CN111625638B (en) * 2020-06-02 2023-06-06 深圳追一科技有限公司 Question processing method, device, equipment and readable storage medium
CN111985238A (en) * 2020-06-30 2020-11-24 联想(北京)有限公司 Answer generation method and equipment
CN112015868B (en) * 2020-09-07 2022-08-26 重庆邮电大学 Question-answering method based on knowledge graph completion
CN112685544A (en) * 2020-12-25 2021-04-20 中国联合网络通信集团有限公司 Telecommunication information query method, device, equipment and medium
CN112650845B (en) * 2020-12-30 2023-01-03 西安交通大学 Question-answering system and method based on BERT and knowledge representation learning
CN113204628A (en) * 2021-05-17 2021-08-03 上海明略人工智能(集团)有限公司 Method and device for obtaining answers to question sentences, electronic equipment and readable storage medium

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104572938B (en) * 2014-12-30 2018-02-16 江苏师范大学 It is a kind of with the web entity recognition methods of query driven and system
CN105868313B (en) * 2016-03-25 2019-02-12 浙江大学 A kind of knowledge mapping question answering system and method based on template matching technique
CN107748757B (en) * 2017-09-21 2021-05-07 北京航空航天大学 Question-answering method based on knowledge graph
CN110020010A (en) * 2017-10-10 2019-07-16 阿里巴巴集团控股有限公司 Data processing method, device and electronic equipment
CN108563653B (en) * 2017-12-21 2020-07-31 清华大学 Method and system for constructing knowledge acquisition model in knowledge graph
CN108228877B (en) * 2018-01-22 2020-08-04 北京师范大学 Knowledge base completion method and device based on learning sorting algorithm
CN108427707B (en) * 2018-01-23 2021-05-04 深圳市阿西莫夫科技有限公司 Man-machine question and answer method, device, computer equipment and storage medium
CN108509519B (en) * 2018-03-09 2021-03-09 北京邮电大学 General knowledge graph enhanced question-answer interaction system and method based on deep learning
CN108920622B (en) * 2018-06-29 2021-07-20 北京奇艺世纪科技有限公司 Training method, training device and recognition device for intention recognition
CN108984745B (en) * 2018-07-16 2021-11-02 福州大学 Neural network text classification method fusing multiple knowledge maps
CN109241294A (en) * 2018-08-29 2019-01-18 国信优易数据有限公司 A kind of entity link method and device
CN110110092B (en) * 2018-09-30 2021-03-09 北京国双科技有限公司 Knowledge graph construction method and related equipment
CN109522393A (en) * 2018-10-11 2019-03-26 平安科技(深圳)有限公司 Intelligent answer method, apparatus, computer equipment and storage medium
CN109992629B (en) * 2019-02-28 2021-08-06 中国科学院计算技术研究所 Neural network relation extraction method and system fusing entity type constraints

Also Published As

Publication number Publication date
CN110532368A (en) 2019-12-03

Similar Documents

Publication Publication Date Title
CN110532368B (en) Question answering method, electronic equipment and computer readable storage medium
CN111353310B (en) Named entity identification method and device based on artificial intelligence and electronic equipment
US20200134263A1 (en) Non-factoid question-answering device
KR20160026892A (en) Non-factoid question-and-answer system and method
WO2020244065A1 (en) Character vector definition method, apparatus and device based on artificial intelligence, and storage medium
CN111898374B (en) Text recognition method, device, storage medium and electronic equipment
CN111159359A (en) Document retrieval method, document retrieval device and computer-readable storage medium
US11461613B2 (en) Method and apparatus for multi-document question answering
CN110134777B (en) Question duplication eliminating method and device, electronic equipment and computer readable storage medium
CN113239169A (en) Artificial intelligence-based answer generation method, device, equipment and storage medium
CN113761868B (en) Text processing method, text processing device, electronic equipment and readable storage medium
CN110717021A (en) Input text and related device for obtaining artificial intelligence interview
CN116882372A (en) Text generation method, device, electronic equipment and storage medium
CN113886531A (en) Intelligent question and answer determining method and device, computer equipment and storage medium
CN114003682A (en) Text classification method, device, equipment and storage medium
CN111881264B (en) Method and electronic equipment for searching long text in question-answering task in open field
CN113569018A (en) Question and answer pair mining method and device
CN110377706B (en) Search sentence mining method and device based on deep learning
US11176327B2 (en) Information processing device, learning method, and storage medium
CN113468311B (en) Knowledge graph-based complex question and answer method, device and storage medium
CN115795018A (en) Multi-strategy intelligent searching question-answering method and system for power grid field
CN111666770B (en) Semantic matching method and device
CN116414940A (en) Standard problem determining method and device and related equipment
CN110413956B (en) Text similarity calculation method based on bootstrapping
CN110688472A (en) Method for automatically screening answers to questions, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210207

Address after: 200245 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai

Applicant after: Dalu Robot Co.,Ltd.

Address before: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)

Applicant before: CLOUDMINDS (SHENZHEN) ROBOTICS SYSTEMS Co.,Ltd.

CB02 Change of applicant information

Address after: 200245 Building 8, No. 207, Zhongqing Road, Minhang District, Shanghai

Applicant after: Dayu robot Co.,Ltd.

Address before: 200245 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai

Applicant before: Dalu Robot Co.,Ltd.

GR01 Patent grant