CN114116972A - Processing system of a transformer knowledge intelligent question-answering model based on BiLSTM - Google Patents

Processing system of a transformer knowledge intelligent question-answering model based on BiLSTM

Info

Publication number
CN114116972A
CN114116972A (application number CN202111373347.7A)
Authority
CN
China
Prior art keywords
question
query
entity
module
answer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111373347.7A
Other languages
Chinese (zh)
Inventor
王宇飞
张敏杰
杨宁
张乾坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanzhi Internet Beijing Network Technology Co ltd
Original Assignee
Shanzhi Internet Beijing Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanzhi Internet Beijing Network Technology Co., Ltd.
Priority claimed from CN202111373347.7A
Publication of CN114116972A
Legal status: Pending

Classifications

    • G: Physics
    • G06: Computing; calculating or counting
    • G06F: Electric digital data processing
    • G06F16/00: Information retrieval; database structures therefor; file system structures therefor
    • G06F16/30: Information retrieval of unstructured textual data
    • G06F16/33: Querying
    • G06F16/332: Query formulation
    • G06F16/3329: Natural language query formulation or dialogue systems
    • G06F16/3331: Query processing
    • G06F16/3332: Query translation
    • G06F16/3338: Query expansion
    • G06F40/00: Handling natural language data
    • G06F40/20: Natural language analysis
    • G06F40/279: Recognition of textual entities
    • G06F40/289: Phrasal analysis, e.g. finite state techniques or chunking

Abstract

An embodiment of the application relates to a processing system for a transformer knowledge intelligent question-answering model based on BiLSTM, including: a question input module for receiving a transformer-related question input by a user; an intent recognition and slot extraction module for segmenting the question into words, processing the segmented words with the BiLSTM, and obtaining the query type and the entity name or attribute name corresponding to each segmented word; a query retrieval module for generating a query statement from the query type and the entity name or attribute name corresponding to each segmented word, and retrieving all relevant answers on the network based on the query statement; and an answer output module for sorting all the retrieved relevant answers according to an answer ranking strategy and displaying them in order in a preset display format. The system solves the problem of linking a user's colloquial input sentences with the entities in the transformer knowledge graph.

Description

Processing system of a transformer knowledge intelligent question-answering model based on BiLSTM
Technical Field
The application belongs to the technical field of artificial intelligence recognition, and particularly relates to a processing system for a transformer knowledge intelligent question-answering model based on BiLSTM.
Background
With the deepening application of artificial intelligence technology in the power industry, power enterprises have constructed vertical knowledge graphs around transformer equipment. These knowledge graphs contain various kinds of transformer-related knowledge, including transformer ledgers, transformer faults, detection standards, and the like.
When front-line employees of a power company perform daily equipment operation, maintenance, and inspection work, they need to retrieve these knowledge points flexibly. The retrieval tools currently used are search engines and question-answering robots, and both run into the following problems: 1) power employees' input tends to be colloquial, so query forms are non-standard and highly arbitrary; 2) a given piece of power-transformer knowledge may have a large number of near-equivalent phrasings whose intent must be correctly understood; 3) key information must be extracted from the employees' input sentences; 4) because question-answer corpora about transformers are scarce, a retrieval basis is lacking, so transformer knowledge points cannot be identified accurately.
In summary, it is currently impossible to effectively link the colloquial input sentences of power company employees with the entities in the transformer knowledge graph.
Therefore, how to link sentences input colloquially by power company employees during equipment operation and maintenance inspection with the entities in the transformer knowledge graph has become a technical problem urgently needing to be solved.
Disclosure of Invention
Technical problem to be solved
In view of the above disadvantages and shortcomings of the prior art, the present application provides a processing system for a transformer knowledge intelligent question-answering model based on BiLSTM.
(II) technical scheme
To achieve this purpose, the adopted technical scheme is as follows:
In a first aspect, the present application provides a processing system for a transformer knowledge intelligent question-answering model based on BiLSTM, including:
a question input module, configured to receive a transformer-related question input by a user;
an intent recognition and slot extraction module, configured to segment the question into words, process the segmented words with the BiLSTM, and obtain the query type and the entity name or attribute name corresponding to each segmented word;
a query retrieval module, configured to generate a query statement from the query type and the entity name or attribute name corresponding to each segmented word, and retrieve all relevant answers on the network based on the query statement;
and an answer output module, configured to sort all the retrieved relevant answers according to an answer ranking strategy and display them in order in a preset display format.
Optionally, the intent recognition and slot extraction module specifically includes:
a word segmentation unit, configured to segment the question into words and obtain the keywords in the question;
an intent recognition unit, configured to input the keywords into a pre-trained intent recognition model and obtain the intent corresponding to the keywords; if the intent is a static question-answer-pair intent, take the static question-answer pair as the query type; if the intent is a knowledge-graph query intent, judge whether the keywords call for a set-type query or an entity attribute/relation query to obtain the final query type;
a slot information extraction unit, configured to determine, based on a pre-trained slot extraction model combined with the keywords, the slot types and the slot value corresponding to each slot type;
an entity linking unit, configured to search, based on a pre-trained entity linking model, the entity database for an entity name or attribute name matching each slot value;
wherein the entity database stores the entity names or attribute names of all entities related to the transformer and the graph/mapping relation to which each entity name or attribute name belongs.
Optionally, the query retrieval module is specifically configured to:
construct a query subgraph from the query type and the entity name or attribute name corresponding to each segmented word;
convert the query subgraph into a SPARQL query statement or an ES (Elasticsearch) query statement;
and retrieve on the current network based on the SPARQL query statement or the ES query statement to obtain a retrieval result of triples or Q/A answers.
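The conversion from a query subgraph to a SPARQL statement can be sketched as follows. The prefix, graph IRI, and predicate names are illustrative assumptions; the patent does not publish its knowledge-graph schema, and a real implementation would build the pattern from the subgraph structure rather than a fixed template.

```python
# Build a SPARQL query for an entity-attribute question such as
# "Which manufacturer produced the No. 1 main transformer of the
# Changchun 220 kV substation?". The tf: prefix and property names
# are hypothetical placeholders for the transformer KG schema.

def subgraph_to_sparql(entity, attribute):
    return (
        "PREFIX tf: <http://example.org/transformer#>\n"
        "SELECT ?value WHERE {\n"
        f"  ?e tf:name \"{entity}\" .\n"
        f"  ?e tf:{attribute} ?value .\n"
        "}"
    )

query = subgraph_to_sparql("No. 1 main transformer", "manufacturer")
```

The returned string would then be sent to the SPARQL endpoint of the knowledge graph; an ES query would be assembled analogously as a JSON body for full-text retrieval.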
Optionally, the answer output module is specifically configured to:
when the retrieval result includes KG triples and ES documents, display the KG triples as first display information and the ES documents as second display information;
and when the retrieval result includes static Q/A question-answer pairs and ES documents, display the static Q/A question-answer pairs as first display information and the ES documents as second display information.
Optionally, the processing system further includes:
a model training module, configured to generate training corpora and train the BiLSTM on the generated corpora, obtaining a trained BiLSTM comprising an intent recognition model, a slot extraction model, and an entity linking model.
Optionally, the model training module being configured to generate training corpora includes:
receiving transformer-related questions input by users, the questions including colloquial questions and questions with irregular word order;
receiving the question-answer intent/query type, keywords, and entity information that the user fills in for each question according to a preset training template;
and generating a training corpus from each question together with its corresponding question-answer intent/query type, keywords, and entity information.
Optionally, the processing system includes:
a plurality of clients and a server;
the question input module and the answer output module are located in each client;
the intent recognition and slot extraction module and the query retrieval module are located in the server;
after the question input module of a client receives a question, the client sends the question to the server so that the server's intent recognition and slot extraction module and query retrieval module can process it, and the obtained answers are transmitted back to the client;
and the answer output module of the client displays the transmitted answers in order.
Optionally, the client is a robot.
(III) Advantageous effects
First, the user's intent is obtained through a pre-trained intent recognition model, which handles the colloquial, abbreviated, and approximate phrasings users produce when inputting question sentences;
second, the slot types and the slot value corresponding to each slot type are determined through a pre-trained slot extraction model, thereby extracting the key information in the user's input sentence;
finally, since publicly available corpus data in the power-transformer field is scarce and constitutes a small-sample data space, the model training module can generate corpus data, augmenting the small-sample data and providing training corpora for the intent recognition model and the slot extraction model;
in summary, the processing system provided by the present application solves the technical problem of linking a user's colloquial sentence input with the entities in the transformer knowledge graph.
Drawings
The application is described with the aid of the following figures:
FIG. 1 is a schematic diagram showing the overall module composition of a processing system of a transformer knowledge intelligent question-answering model based on BiLSTM;
FIG. 2 is a schematic diagram of the unit configuration of the intent recognition and slot extraction module;
FIG. 3 is a schematic diagram of an interaction process of an intent recognition model, a slot extraction model, and an entity linking model;
FIG. 4 is a schematic diagram illustrating a process of generating corpus;
FIG. 5 is a schematic view of an information filling interface of a training template;
FIG. 6 is a schematic diagram of an interface display for generating corpus;
FIG. 7 is a diagram illustrating a data structure of a corpus;
FIG. 8 is a graph of the trend of the verification index values for the intent recognition model;
FIG. 9 is a graph of the trend of the verification index values for the slot extraction model;
fig. 10 is a schematic view of the overall configuration of the module.
Detailed Description
For the purpose of better explaining the present invention and to facilitate understanding, the present invention will be described in detail by way of specific embodiments with reference to the accompanying drawings. It is to be understood that the following specific examples are illustrative of the invention only and are not to be construed as limiting the invention. In addition, it should be noted that, in the case of no conflict, the embodiments and features in the embodiments in the present application may be combined with each other; for convenience of description, only portions related to the invention are shown in the drawings.
Embodiment one provides a processing system for a transformer knowledge intelligent question-answering model based on BiLSTM, which comprises a question input module, an intent recognition and slot extraction module, a query retrieval module, and an answer output module. Specifically,
the question input module is configured to receive a transformer-related question input by a user, form the received question into a question text, and forward the question text to the intent recognition and slot extraction module.
The processing system in this embodiment receives, through the question input module, a transformer-related question input by a user and displays it on the interface. In practice, technicians may input all kinds of transformer-related questions through the question input module, but the received questions typically exhibit shorthand, vagueness, colloquialism, near-duplicate phrasings of the same question, and multiple interactive inputs. For example, one technician may input "Which manufacturer produced the No. 1 main transformer of the Changchun 220 kV substation?" while another may input "Which plant made the Changchun station No. 1 main transformer?". These are essentially the same question, both intended to ask "Who is the manufacturer of the No. 1 main transformer of the Changchun station?", but different technicians' questioning habits produce different sentence forms.
The intent recognition and slot extraction module is configured to segment the question into words, process the segmented words with the BiLSTM, obtain the query type and the entity name or attribute name corresponding to each segmented word, and forward these to the query retrieval module.
In this embodiment, after segmenting the received question, the intent recognition and slot extraction module determines the user's intent type, which may be divided into KG-triple queries and static Q/A question-answer pairs. If the intent type is a KG-triple query, the module recognizes the user's intent and extracts the slot information of the question text through a pre-trained intent recognition model and a pre-trained slot extraction model. Through its bidirectional connections and attention weights, the BiLSTM outputs, from the target vector, the entity name or attribute name with the highest confidence for each segmented word.
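A minimal sketch of the BiLSTM tagger just described, with a bidirectional encoder, a pooled sentence-level head for intent recognition, and a per-token head for slot tags, is shown below in PyTorch. The layer sizes, label counts, and the omission of the attention weighting mentioned above are all simplifying assumptions, not the patent's actual configuration.

```python
import torch
import torch.nn as nn

class IntentSlotBiLSTM(nn.Module):
    """Joint intent-recognition / slot-extraction model sketch."""
    def __init__(self, vocab_size, n_intents, n_slot_tags,
                 emb_dim=64, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                              bidirectional=True)
        self.intent_head = nn.Linear(2 * hidden, n_intents)  # sentence level
        self.slot_head = nn.Linear(2 * hidden, n_slot_tags)  # token level

    def forward(self, token_ids):
        h, _ = self.bilstm(self.emb(token_ids))          # (B, T, 2H)
        intent_logits = self.intent_head(h.mean(dim=1))  # pool over tokens
        slot_logits = self.slot_head(h)                  # per-token slot tags
        return intent_logits, slot_logits

# Toy forward pass: batch of 2 sentences of 5 token ids each.
model = IntentSlotBiLSTM(vocab_size=1000, n_intents=3, n_slot_tags=7)
intent_logits, slot_logits = model(torch.randint(0, 1000, (2, 5)))
```

Sharing one encoder between the two heads is a common design for joint intent/slot models, since the sentence-level intent and the token-level slots depend on the same contextual features.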
The query retrieval module is configured to generate a query statement from the query type and the entity name or attribute name corresponding to each segmented word, retrieve all relevant answers on the network based on the query statement, and forward the retrieved answers to the answer output module.
In this embodiment, the query retrieval module converts the query subgraph into a SPARQL query statement or an ES query statement and obtains a retrieval result of triples or Q/A answers based on the generated statement.
The answer output module is configured to sort all the retrieved relevant answers according to the answer ranking strategy and display them in order in a preset display format.
In this embodiment, when the answers received by the answer output module include KG triples and ES documents, the KG triples are displayed first and the ES documents after; when the received answers include static Q/A question-answer pairs and ES documents, the static Q/A pairs are displayed first and the ES documents after.
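The display-ordering strategy just described, structured results (KG triples or static Q/A pairs) before full-text ES documents, can be sketched as a simple sort key. The answer-record fields and scores are illustrative assumptions:

```python
# Rank retrieved answers: structured results (KG triples, static Q/A
# pairs) come before full-text ES documents; ties break on score.
PRIORITY = {"kg_triple": 0, "static_qa": 0, "es_document": 1}

def rank_answers(answers):
    return sorted(answers,
                  key=lambda a: (PRIORITY[a["source"]], -a["score"]))

answers = [
    {"source": "es_document", "score": 0.9, "text": "fault report ..."},
    {"source": "kg_triple", "score": 0.7,
     "text": "(No. 1 main transformer, manufacturer, X)"},
]
ranked = rank_answers(answers)
# the KG triple ranks first despite its lower retrieval score
```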
The processing system of embodiment one receives a user's transformer-related question through the question input module; segments the question and processes the segmented words with the BiLSTM through the intent recognition and slot extraction module, obtaining the query type and the entity name or attribute name corresponding to each segmented word; generates a query statement through the query retrieval module from the query type and the entity names or attribute names, and retrieves all associated answers on the network based on the query statement; and sorts all the retrieved answers according to the answer ranking strategy through the answer output module, displaying them in order in a preset display format. The processing system in this embodiment thus solves the problem of linking a user's colloquial sentence input with the entities in the transformer knowledge graph.
Embodiment two provides a processing system for a transformer knowledge intelligent question-answering model based on BiLSTM, in which the intent recognition and slot extraction module segments the question into words, processes the segmented words with the BiLSTM, and obtains the query type and the entity names or attribute names corresponding to the segmented words. The process involves three pre-trained models: an intent recognition model, a slot extraction model, and an entity linking model.
As shown in fig. 2, in this embodiment the intent recognition and slot extraction module includes a word segmentation unit, an intent recognition unit, a slot information extraction unit, and an entity linking unit.
The word segmentation unit is configured to segment the question into words and obtain the keywords in the question;
the intent recognition unit is configured to input the keywords into a pre-trained intent recognition model and obtain the intent corresponding to the keywords; if the intent is a static question-answer-pair intent, the static question-answer pair is taken as the query type; if the intent is a knowledge-graph query intent, whether the keywords call for a set-type query or an entity attribute/relation query is judged to obtain the final query type;
the slot information extraction unit is configured to determine, based on a pre-trained slot extraction model combined with the keywords, the slot types and the slot value corresponding to each slot type;
the entity linking unit is configured to search, based on a pre-trained entity linking model, the entity database for an entity name or attribute name matching each slot value;
wherein the entity database stores the entity names or attribute names of all entities related to the transformer and the graph/mapping relation to which each entity name or attribute name belongs.
As shown in fig. 3, in this embodiment the word segmentation unit segments the user input "Which plant produced the Changchun station No. 1 main transformer?" into three keywords: "Changchun station", "No. 1 main transformer", and "production".
In this embodiment, the intent recognition unit inputs the keywords "Changchun station", "No. 1 main transformer", and "production" into the pre-trained intent recognition model, determines that the user's intent is a knowledge-graph query, and further determines that the specific query type for the current input sentence is an entity attribute/relation query.
In this embodiment, the slot information extraction unit runs in parallel with the intent recognition unit. Specifically, based on the pre-trained slot extraction model, it combines the keywords "Changchun station", "No. 1 main transformer", and "production" and determines the slot types of the current input sentence and the slot value corresponding to each: the slot type "substation" corresponds to the slot value "Changchun station"; the slot type "transformer" corresponds to the slot value "No. 1 main transformer"; and the slot type "attribute/relation" corresponds to the slot value "production".
In this embodiment, the entity linking unit searches the entity database, based on the pre-trained entity linking model, for entity names or attribute names matching the slot values "Changchun station", "No. 1 main transformer", and "production". Specifically: the entity name and attribute name corresponding to the slot value "Changchun station" are "Changchun 220 kV substation" and "substation" respectively; the entity name and attribute name corresponding to the slot value "No. 1 main transformer" are "No. 1 main transformer" and "transformer" respectively; and the attribute name corresponding to the slot value "production" is "manufacturer". The question corresponding to the user's input is thus resolved as: "Which manufacturer produced the No. 1 main transformer of the Changchun 220 kV substation?"
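The slot-value to entity-name lookup performed by the entity linking unit can be sketched as an alias table, mirroring the worked example above. The table entries are taken from that example, but the exact-match lookup is a simplifying stand-in: the patent's trained entity linking model would score fuzzy candidates rather than require exact strings.

```python
# Map colloquial slot values to (canonical name, kind) pairs from the
# entity database, e.g. "Changchun station" -> the registered entity
# "Changchun 220 kV substation", "production" -> "manufacturer".
ALIAS_TABLE = {
    "Changchun station": ("Changchun 220 kV substation", "substation"),
    "No. 1 main transformer": ("No. 1 main transformer", "transformer"),
    "production": ("manufacturer", "attribute"),
}

def link(slot_value):
    # A trained linker would rank candidate entities by similarity;
    # an exact dictionary lookup stands in for that here.
    return ALIAS_TABLE.get(slot_value)

name, kind = link("Changchun station")
```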
The processing system of embodiment two obtains the keywords in the question through the word segmentation unit; obtains the final query type of the question through the intent recognition unit; determines the slot types in the question and the slot value corresponding to each slot type through the slot information extraction unit; and searches for the entity name or attribute name matching each slot value through the entity linking unit. The processing system in this embodiment thus resolves the semantic ambiguity that may occur in users' input questions, enabling accurate recognition of the user's query or question-answering intent.
Embodiment three provides a processing system for a transformer knowledge intelligent question-answering model based on BiLSTM, in which the model training module generates training corpora and trains the BiLSTM on them, obtaining a trained BiLSTM comprising an intent recognition model, a slot extraction model, and an entity linking model.
In this embodiment, the intent recognition model, the slot extraction model, and the entity linking model are pre-trained by the model training module.
In this embodiment, the process by which the model training module generates the training corpora is shown in fig. 4. Specifically:
S1, receiving transformer-related questions input by users;
in this embodiment, the questions include colloquial questions and questions with irregular word order.
S2, receiving the question-answer intent/query type, keywords, and entity information that the user fills in for each question according to a preset training template;
in this embodiment, the information-filling interface of the training template is shown in fig. 5.
S3, generating a training corpus from each question together with its corresponding question-answer intent/query type, keywords, and entity information;
in this embodiment, the interface displaying the generated corpus is shown in fig. 6; after the confirmation button is clicked, the data structure of the generated corpus is as shown in fig. 7.
In this embodiment, the generated corpus data is divided into a training set and a verification set in a preset proportion, and the BiLSTM is trained on the training set to obtain a trained BiLSTM comprising an intent recognition model, a slot extraction model, and an entity linking model. The model training commands are as follows:
cd xxx/QAModelAPI_v2.0/ckpt    # enter the checkpoint directory
rm *                           # clear old checkpoints
python train.py                # start training
In this embodiment, the process of verifying the intent recognition model, the slot extraction model, and the entity linking model included in the trained BiLSTM is as follows:
for these models, the verification indexes are loss, acc, val_loss, and val_acc;
where loss is the loss value of the intent recognition model, slot extraction model, and entity linking model on the training set; acc is their accuracy on the training set; val_loss is their loss value on the verification set; and val_acc is their accuracy on the verification set.
In this embodiment, the intent recognition model, slot extraction model, and entity linking model produced over multiple rounds of training are screened automatically: if the val_loss of the current round's model is higher than that of the previous round, that model is discarded, so that the model with the lowest validation loss, and hence the highest accuracy, is retained.
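The round-by-round screening just described is ordinary best-checkpoint selection: keep only the round whose validation loss is the lowest seen so far. A sketch follows; the epoch metrics are invented for illustration.

```python
# Keep only the checkpoint with the lowest validation loss across rounds.
def select_best(rounds):
    best = None
    for r in rounds:
        if best is None or r["val_loss"] < best["val_loss"]:
            best = r  # improved: keep this round's model
        # otherwise the round's model is discarded
    return best

rounds = [
    {"epoch": 1, "val_loss": 0.80, "val_acc": 0.71},
    {"epoch": 2, "val_loss": 0.55, "val_acc": 0.83},
    {"epoch": 3, "val_loss": 0.60, "val_acc": 0.82},  # worse: discarded
]
best = select_best(rounds)
```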
In this embodiment, the training processes for the intent recognition model, the slot extraction model, and the entity linking model run in parallel.
The following illustrates the verification process for the intent recognition model and the slot extraction model included in the trained BiLSTM:
in this embodiment, an experiment is performed on transformer fault reports from a power company's transformer fault service system. Six concept types, such as substation and transformer, are selected, and corpus data for the experiment is collected; the correspondence between the collected corpus-data concepts and their quantities is shown in table 1. Corpus data for the six concepts is generated from the model training template, forming 198,100 question sentences in total, with an upper limit of 50,000 question sentences per concept. All the corpus data obtained is divided 8:2 into a training set and a verification set; the quantity distribution of the training and verification sets over the six concepts is shown in table 2.
Table 1 collected corpus data concepts and quantity correspondence table
TABLE 2 quantity distribution table of training set and verification set of six kinds of concepts
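Dividing the generated corpus 8:2 into training and verification sets, as described above, can be sketched as follows. A real pipeline might additionally stratify the split by concept type so each concept keeps the 8:2 ratio; the seed value here is arbitrary.

```python
import random

def split_corpus(samples, train_ratio=0.8, seed=42):
    rng = random.Random(seed)       # fixed seed for reproducibility
    shuffled = samples[:]           # keep the original list intact
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_ratio)
    return shuffled[:cut], shuffled[cut:]

corpus = [f"question-{i}" for i in range(1000)]
train, val = split_corpus(corpus)
```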
In this embodiment, the trend of the verification index values of the intent recognition model is shown in fig. 8, and the trend of the verification index values of the slot extraction model is shown in fig. 9.
The processing system of embodiment three generates training corpora through the model training module and trains the BiLSTM on them, obtaining a trained BiLSTM comprising an intent recognition model, a slot extraction model, and an entity linking model. On the one hand, this increases the number of training corpora, providing material for the deep-learning training of the intent recognition and slot extraction models; on the other hand, the BiLSTM-based intent recognition and slot extraction models enable accurate recognition of the user's query or question-answering intent.
Embodiment four provides a processing system for a transformer knowledge intelligent question-answering model based on BiLSTM, comprising a plurality of clients and a server.
In this embodiment, the question input module and the answer output module are located in each client;
in this embodiment, the intent recognition and slot extraction module and the query retrieval module are located in the server;
in this embodiment, after the question input module of a client receives a question, the client sends the question to the server so that the server's intent recognition and slot extraction module and query retrieval module can process it, and the obtained answers are transmitted back to the client;
in this embodiment, the answer output module of the client displays the transmitted answers in order.
In this embodiment, the client is a robot.
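The client-server split described above, question input and answer display on the client (e.g. an inspection robot), intent recognition and retrieval on the server, can be sketched as a minimal in-process exchange. The function names and payload fields are illustrative assumptions; a real deployment would use an HTTP or RPC call where noted.

```python
# Minimal stand-in for the client/server exchange: the client sends
# the raw question, the server returns ranked answers.

def server_handle(question):
    # Stands in for intent recognition, slot extraction, and retrieval
    # performed on the server side.
    return {"question": question,
            "answers": [{"source": "kg_triple",
                         "text": "(No. 1 main transformer, manufacturer, X)"}]}

class Client:
    def ask(self, question):
        reply = server_handle(question)  # would be an HTTP/RPC call
        return [a["text"] for a in reply["answers"]]

texts = Client().ask("Which plant produced the No. 1 main transformer?")
```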
The processing system of embodiment four can be applied to various devices that are convenient for a user to carry, enabling portable querying of transformer-related questions and making it very convenient to use.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. Furthermore, it should be noted that in the description of the present specification, the description of the term "one embodiment", "some embodiments", "examples", "specific examples" or "some examples", etc., means that a specific feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, the claims should be construed to include preferred embodiments and all changes and modifications that fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention should also include such modifications and variations.

Claims (8)

1. A BiLSTM-based processing system for a transformer-knowledge intelligent question-answering model, characterized by comprising:
a question input module for receiving a transformer-related question input by a user;
an intent recognition and slot extraction module for performing word segmentation on the question, processing the segmented words with the BiLSTM, and obtaining the query type and the entity name or attribute name corresponding to each word segment;
a query retrieval module for generating a query statement from the query type and the entity name or attribute name corresponding to each word segment, and retrieving all relevant answers on the network based on the query statement;
and an answer output module for sorting all the retrieved relevant answers according to an answer ranking strategy and displaying them in order in a preset display format.
2. The processing system of claim 1, wherein the intent recognition and slot extraction module comprises:
a word segmentation unit for performing word segmentation on the question to obtain the keywords in the question;
an intent recognition unit for inputting the keywords into a pre-trained intent recognition model and obtaining the intent corresponding to the keywords; if the intent is a static question-answer-pair intent, taking the static question-answer pair as the query type; if the intent is a knowledge-graph query intent, judging whether the keywords indicate a set-type query or an entity attribute/relation query to obtain the final query type;
a slot information extraction unit for determining, based on a pre-trained slot extraction model and in combination with the keywords, the slot types and the slot value corresponding to each slot type;
and an entity linking unit for searching, based on a pre-trained entity linking model, an entity database for the entity name or attribute name matching each slot value;
wherein the entity database stores the entity names or attribute names of all entities related to the transformer and the graph to which each entity name or attribute name belongs.
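The entity-linking step of claim 2 — matching an extracted slot value against the entity database — could be sketched as below. The database contents and the fuzzy-matching fallback are illustrative assumptions standing in for the pre-trained entity linking model.

```python
# Illustrative entity-linking sketch: a normalized lookup with a fuzzy-match
# fallback stands in for the pre-trained entity linking model of claim 2.
from difflib import get_close_matches

# Hypothetical entity database: entity/attribute name -> graph it belongs to.
ENTITY_DB = {
    "oil-immersed transformer": "equipment_graph",
    "rated capacity": "attribute_graph",
    "winding temperature": "attribute_graph",
}

def link_entity(slot_value):
    """Return (matched_name, graph) for a slot value, or None if unmatched."""
    key = slot_value.strip().lower()
    if key in ENTITY_DB:
        return key, ENTITY_DB[key]
    # Fuzzy fallback tolerates minor misspellings in the user's question.
    close = get_close_matches(key, list(ENTITY_DB), n=1, cutoff=0.6)
    if close:
        return close[0], ENTITY_DB[close[0]]
    return None
```

A trained linking model would replace the string similarity here with learned embeddings, but the interface — slot value in, (entity name, graph) out — stays the same.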
3. The processing system of claim 1, wherein the query retrieval module is specifically configured to:
construct a query subgraph from the query type and the entity name or attribute name corresponding to each word segment;
convert the query subgraph into a SPARQL query statement or an ES query statement;
and retrieve on the network based on the SPARQL query statement or the ES query statement to obtain a retrieval result of triples or Q/A answers.
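The subgraph-to-SPARQL conversion of claim 3 could look roughly like this; the triple patterns and predicate names are assumed for illustration and do not come from the patent.

```python
# Illustrative conversion from a (query type, entity, attribute) description
# to a SPARQL query string, as in claim 3. Predicate names are hypothetical.

def build_sparql(query_type, entity, attribute=None):
    if query_type == "entity_attribute_query" and attribute:
        # e.g. "what is the rated capacity of the main transformer"
        return ('SELECT ?value WHERE { '
                '?e rdfs:label "%s" . '
                '?e :%s ?value . }' % (entity, attribute))
    if query_type == "set_query":
        # e.g. "which transformers belong to substation A"
        return ('SELECT ?member WHERE { '
                '?member :memberOf "%s" . }' % entity)
    raise ValueError("unsupported query type: %s" % query_type)

query = build_sparql("entity_attribute_query", "main transformer", "ratedCapacity")
```

An ES query would be built analogously, emitting a JSON query body instead of a SPARQL string, with the same query type and linked names as inputs.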
4. The processing system of claim 1, wherein the answer output module is specifically configured to:
when the retrieval result includes KG triples and ES documents, display the KG triples as first display information and the ES documents as second display information;
and when the retrieval result includes static Q/A question-answer pairs and ES documents, display the static Q/A question-answer pairs as first display information and the ES documents as second display information.
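The display-ordering rule of claim 4 amounts to partitioning retrieval results by source, with structured results shown first. A sketch, with an assumed record format:

```python
# Sketch of claim 4's ordering: structured results (KG triples / static Q&A
# pairs) become first display information, ES documents second. The record
# shape {"source": ..., "text": ...} is an assumption.

def order_for_display(results):
    first = [r for r in results if r["source"] in ("kg_triple", "static_qa")]
    second = [r for r in results if r["source"] == "es_document"]
    return first, second

first, second = order_for_display([
    {"source": "es_document", "text": "maintenance manual excerpt"},
    {"source": "kg_triple", "text": "(T1, ratedCapacity, 50 MVA)"},
])
```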
5. The processing system of claim 1, further comprising:
a model training module for generating a training corpus and training the BiLSTM on the generated corpus to obtain a trained BiLSTM comprising an intent recognition model, a slot extraction model and an entity linking model.
6. The processing system of claim 5, wherein the model training module, when generating the training corpus, is configured to:
receive a transformer-related question input by a user, the question including spoken-style questions and questions with irregular word order;
receive the question-answer intent/query type, keywords and entity information that the user fills in for the question according to a preset training template;
and generate the training corpus from the question and its corresponding question-answer intent/query type, keywords and entity information.
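The corpus-generation step of claim 6 can be pictured as turning each user question plus its template-filled annotations into one training record; the field names below are illustrative assumptions.

```python
# Sketch of claim 6's corpus generation: a user question plus the
# template-filled intent/query type, keywords and entity information
# become one training record. Field names are illustrative assumptions.
import json

def make_corpus_record(question, intent, keywords, entities):
    return {"question": question, "intent": intent,
            "keywords": keywords, "entities": entities}

record = make_corpus_record(
    "how hot can the main transformer winding get",   # spoken-style question
    "entity_attribute_query",
    ["winding", "temperature"],
    [{"name": "main transformer", "type": "equipment"}],
)
line = json.dumps(record, ensure_ascii=False)  # one JSONL corpus line
```

Accumulating such lines yields a corpus on which the intent recognition, slot extraction and entity linking models can all be trained, since each record carries labels for all three tasks.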
7. The processing system of claim 1, comprising:
a plurality of clients and a server;
wherein the question input module and the answer output module are located in each client;
the intent recognition and slot extraction module and the query retrieval module are located in the server;
after the question input module of a client receives a question, the client sends the question to the server so that the intent recognition and slot extraction module and the query retrieval module of the server can process it, and the obtained answer is transmitted back to the client;
and the answer output module of the client displays the transmitted answers in order.
8. The processing system of claim 7, wherein the client is a robot.
CN202111373347.7A 2021-11-19 2021-11-19 Processing system of transformer knowledge intelligent question-answer model based on BilSTM Pending CN114116972A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111373347.7A CN114116972A (en) 2021-11-19 2021-11-19 Processing system of transformer knowledge intelligent question-answer model based on BilSTM

Publications (1)

Publication Number Publication Date
CN114116972A true CN114116972A (en) 2022-03-01

Family

ID=80396405

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111373347.7A Pending CN114116972A (en) 2021-11-19 2021-11-19 Processing system of transformer knowledge intelligent question-answer model based on BilSTM

Country Status (1)

Country Link
CN (1) CN114116972A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116628173A (en) * 2023-07-26 2023-08-22 成都信通信息技术有限公司 Intelligent customer service information generation system and method based on keyword extraction
CN116628173B (en) * 2023-07-26 2023-10-31 成都信通信息技术有限公司 Intelligent customer service information generation system and method based on keyword extraction


Legal Events

Date Code Title Description
PB01 Publication