CN117972058A - Implicit knowledge guided dialogue generation method and system based on dialogue thinking chain


Info

Publication number
CN117972058A
CN117972058A
Authority
CN
China
Prior art keywords
dialogue
knowledge
conversational
thinking
chain
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410144351.3A
Other languages
Chinese (zh)
Inventor
屈丹
彭思思
魏晗
张昊
张文林
李�真
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Information Engineering University of PLA Strategic Support Force
Original Assignee
Information Engineering University of PLA Strategic Support Force
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Information Engineering University of PLA Strategic Support Force filed Critical Information Engineering University of PLA Strategic Support Force
Priority to CN202410144351.3A
Publication of CN117972058A


Landscapes

  • Machine Translation (AREA)

Abstract

The invention relates to the technical field of artificial intelligence dialogue, and in particular to an implicit knowledge guided dialogue generation method and system based on a dialogue thinking chain. The method acquires dialogue background knowledge related to a dialogue source sentence, where the background knowledge describes the dialogue task requirements and the dialogue sentence rules; the dialogue source sentence and the dialogue background knowledge are then input into a preconfigured large language model, which uses the background knowledge as a prompt instruction to generate a dialogue reply to the source sentence under the dialogue background knowledge thinking chain. By prompting through a thinking chain, the invention induces the model to invoke its internal knowledge more accurately, so that the model can retrieve that knowledge and use it reasonably to generate targeted, high-quality dialogue content. Dialogue tasks are thereby completed better, deployment in intelligent dialogue systems such as language translation, text dialogue and text question answering is facilitated, and the method has good application prospects.

Description

Implicit knowledge guided dialogue generation method and system based on dialogue thinking chain
Technical Field
The invention relates to the technical field of artificial intelligence dialogue, and in particular to an implicit knowledge guided dialogue generation method and system based on a dialogue thinking chain.
Background
In recent years, advances in natural language processing have produced increasingly powerful language models, including very large scale language models such as GLM, ChatGPT and LLaMA. These models are pre-trained on huge amounts of data with deep network structures, learn a great deal of knowledge, and deliver remarkable performance on NLP tasks such as language translation, text dialogue and text question answering. However, even large models such as ChatGPT, which demonstrated enormous potential in many areas as soon as it was released, have limitations. A large number of evaluation experiments show that large models exhibit "knowledge defects" when completing knowledge-intensive tasks such as knowledge-driven dialogue, and these defects fall into two categories: first, knowledge deficiency, i.e. the relevant knowledge is missing; second, poor knowledge utilization, which covers both under-use (knowledge is not used or rarely used) and misuse (knowledge is used incorrectly).
In theory, large models pre-trained on massive data contain rich internal knowledge and should perform knowledge-intensive tasks well. Research shows, however, that large models do not achieve the expected results on such tasks, particularly dialogue tasks involving knowledge. Given the black-box nature of large models, researchers attribute this to under-utilization of the knowledge the models already hold. To alleviate this, some researchers have turned to studying knowledge utilization. Some studies found that constructing high-quality prompts consistent with the task and user intent can more fully invoke the knowledge contained in a large model. Others found that the performance of large language models (LLMs) can be significantly improved if targeted and diverse prompts are designed for each test sample. These studies make large models perform better in knowledge dialogue by adding human feedback or constructing high-quality prompts, but this is not easy: on the one hand, constructing human feedback and high-quality prompt words is tedious and time-consuming; on the other hand, artificially constructed prompts struggle to balance content diversity against data pertinence. Although follow-up work has proposed automatic prompting methods that enrich prompt content while reducing the workload of manual prompt design, these methods are still limited to restricted prompt formats, and the generated prompts cannot guide a large model to fully invoke its knowledge.
Disclosure of Invention
Accordingly, the invention provides an implicit knowledge guided dialogue generation method and system based on a dialogue thinking chain, which solve the problem of insufficient knowledge utilization in existing language models. By prompting the model through a thinking chain, the model is induced to invoke its internal knowledge more accurately, improving dialogue generation quality and making the method suitable for application in artificial intelligence dialogue systems.
According to the design scheme provided by the invention, on one hand, an implicit knowledge guided dialogue generating method based on a dialogue thinking chain is provided, which comprises the following steps:
Acquiring dialogue background knowledge related to dialogue source sentences, wherein the dialogue background knowledge is used for describing dialogue task requirements and dialogue sentence rules;
inputting the dialogue source sentence and the dialogue background knowledge into a preconfigured large language model, so that the large language model, using the dialogue background knowledge as a prompt instruction, generates a dialogue reply to the dialogue source sentence under the dialogue background knowledge thinking chain.
As the implicit knowledge guided dialogue generation method based on the dialogue thinking chain of the invention, acquiring dialogue background knowledge related to dialogue source sentences further comprises:
Firstly, setting a model dialogue task guide instruction and taking the guide instruction as the knowledge describing the dialogue task requirements;
Then, screening the dialogue pairs most relevant to the dialogue source sentence from a dialogue sample data set, generating corresponding thinking chains for the dialogue pairs using a large model, and taking the dialogue pairs and the thinking chains as the knowledge describing the dialogue sentence rules.
As the implicit knowledge guided dialogue generation method based on the dialogue thinking chain of the invention, screening the dialogue pairs most relevant to the dialogue source sentence from the dialogue sample data set further comprises:
collecting a dialogue sample data set from a preset knowledge base, wherein the dialogue sample data set comprises n dialogue pairs and each dialogue pair consists of a source sentence and a target sentence;
screening m representative dialogue pairs from the dialogue sample data set, where m < n.
As the implicit knowledge guided dialogue generation method based on the dialogue thinking chain of the invention, screening m representative dialogue pairs from the dialogue sample data set further comprises:
obtaining the vector representation of each source sentence in the dialogue pairs based on a pre-trained language model, performing cluster analysis on the source sentence vector representations using a clustering method, and obtaining the representative dialogue pairs from the sample closest to the center in each class.
As the implicit knowledge guided dialogue generation method based on the dialogue thinking chain of the invention, performing cluster analysis on the source sentence vector representations using a clustering method further comprises:
Firstly, clustering all source sentence vectors, calculating the intra-class similarity and inter-class similarity of the clustering result, and recording the source sentence closest to each cluster center;
Then, based on the intra-class similarity and the inter-class similarity, calculating the silhouette coefficient using the average silhouette method, taking the cluster number that maximizes the silhouette coefficient as the optimal cluster number and assigning it to m, so as to select the m representative dialogue pairs from the recorded cluster-center-nearest sentences.
As the implicit knowledge guided dialogue generation method based on the dialogue thinking chain of the invention, the silhouette coefficient V is calculated as V(i) = (b(i) - a(i)) / max{a(i), b(i)}, wherein a(i) is the intra-class similarity and b(i) is the inter-class similarity.
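The silhouette formula above can be checked with a short sketch. The function below is an illustrative implementation (the function name is my own, not from the patent), treating a(i) as the mean distance from sample i to its own class and b(i) as the mean distance to the nearest other class, as in the standard silhouette definition:

```python
def silhouette(a_i: float, b_i: float) -> float:
    """Per-sample silhouette coefficient V(i) = (b(i) - a(i)) / max(a(i), b(i)).

    a_i: mean distance from sample i to the other members of its own class.
    b_i: mean distance from sample i to the members of the nearest other class.
    The result lies in [-1, 1]; values near 1 indicate a well-separated sample.
    """
    return (b_i - a_i) / max(a_i, b_i)

# A tightly clustered sample far from the neighbouring class scores high:
print(silhouette(1.0, 4.0))  # 0.75
```

Averaging this value over all samples for each candidate cluster number, and keeping the number that maximizes the average, is the "average silhouette method" the text refers to.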
As the implicit knowledge guided dialogue generation method based on the dialogue thinking chain of the invention, using the dialogue background knowledge as a prompt instruction further comprises:
Generating a prompt instruction template based on the dialogue task requirements, the dialogue sentence rules and the dialogue source sentence, so that the large language model generates, according to the prompt instruction template, a dialogue reply to the dialogue source sentence under the dialogue background knowledge thinking chain.
In still another aspect, the present invention further provides a system for generating an implicit knowledge guided dialog based on a dialog thought chain, including: a knowledge acquisition module and a dialogue generation module, wherein,
The knowledge acquisition module is used for acquiring dialogue background knowledge related to dialogue source sentences, wherein the dialogue background knowledge is used for describing dialogue task requirements and dialogue sentence rules;
The dialogue generation module is used for inputting the dialogue source sentence and the dialogue background knowledge into a preconfigured large language model, so that the large language model, using the dialogue background knowledge as a prompt instruction, generates a dialogue reply to the dialogue source sentence under the dialogue background knowledge thinking chain.
The invention has the beneficial effects that:
The invention applies the thinking chain <dialogue content -> knowledge mining -> knowledge utilization -> reply generation> to dialogue tasks and uses implicit knowledge to guide the model to follow this dialogue thinking: the model executes a knowledge thinking chain path that first queries dialogue background knowledge and then generates a reply based on that knowledge, while guide instructions and dialogue rules constrain the model's knowledge generation process. This enables the model to retrieve its internal knowledge and use it reasonably to generate targeted, high-quality dialogue content, completing dialogue tasks better. The method is easy to deploy in intelligent dialogue systems such as language translation, text dialogue and text question answering, and has good application prospects.
Description of the drawings:
FIG. 1 is a schematic flow diagram of an implicit knowledge guided dialog generation based on a dialog thought chain in an embodiment;
FIG. 2 is an implicit knowledge guidance strategy in an embodiment;
FIG. 3 is a hint instruction template illustration in an embodiment.
Detailed description of embodiments:
To make the objects, technical solutions and advantages of the present invention clearer, the invention is described in further detail below with reference to the drawings and technical solutions.
Background knowledge is critical for an intelligent dialogue system to produce high-quality responses. Large language models are a recent research hotspot; based on extensive training data and deep neural network structures, they can learn the general knowledge contained in complex text semantics. However, large language models show limitations when undertaking knowledge-driven dialogue tasks, such as knowledge deficiency and knowledge misuse. To alleviate these problems, referring to FIG. 1, an embodiment of the present invention provides an implicit knowledge guided dialogue generation method based on a dialogue thinking chain, comprising:
S101, acquiring dialogue background knowledge related to dialogue source sentences, wherein the dialogue background knowledge is used for describing dialogue task requirements and dialogue sentence rules.
In particular, obtaining dialogue background knowledge related to dialogue source statements may be designed to include the following:
Firstly, a model dialogue task guide instruction is set and taken as the knowledge describing the dialogue task requirements;
Then, the dialogue pairs most relevant to the dialogue source sentence are screened from the dialogue sample data set, corresponding thinking chains are generated for them using large models (LLMs), and the dialogue pairs with their thinking chains are taken as the knowledge describing the dialogue sentence rules.
As a new form of discrete prompt learning, the thinking chain does not predict the answer directly but provides detailed thinking and analysis content. The generated thinking process guides the large model to better results when executing tasks, particularly reasoning tasks such as common-sense reasoning, mathematical operations and symbolic reasoning. Inspired by the thinking chain, strategies such as self-consistency, bootstrapping and Chain of Knowledge (CoK) have all significantly improved thinking chain prompting performance. Most of this work relies on manual annotation, which is time- and labor-consuming. In the embodiment of the present disclosure, referring to FIG. 2, through instruction construction and demonstration selection, implicit knowledge is used to guide the large model to follow the dialogue thinking of first retrieving knowledge and then generating a reply, constraining the large model's knowledge generation process and simplifying knowledge post-processing, so that the model can accurately and fully use its internal knowledge and effectively complete knowledge-driven dialogue tasks without relying on an external knowledge base.
The dialogue data set D is collected from a knowledge base in the related art and may contain n dialogue pairs, D = {(s1, t1), (s2, t2), ..., (sn, tn)}, where (si, ti) represents the content of the i-th dialogue, si the dialogue source sentence, and ti the dialogue target sentence. The prompt input to the large model comprises three parts overall: the instruction, the demonstrations, and the context (source sentence). The instruction is mainly used to guide the large model to better generate or exploit knowledge and to place constraints on the generated results. The demonstrations are a small number of task examples provided to the large model. The context is the source sentence of the dialogue currently to be completed.
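A minimal sketch of this three-part prompt layout follows. The field labels ("Source:", "Reply:") and the helper name are illustrative assumptions, not the patent's exact template (the actual template is given in FIG. 3, which is not reproduced here):

```python
def build_prompt(instruction: str, demonstrations: list[tuple[str, str]], context: str) -> str:
    """Assemble an instruction + demonstrations + context prompt.

    demonstrations: list of (source_sentence, target_sentence) pairs (si, ti).
    """
    demo_text = "\n\n".join(
        f"Source: {s}\nReply: {t}" for s, t in demonstrations
    )
    # Instruction first, then the few-shot examples, then the current source
    # sentence with an open "Reply:" slot for the model to complete.
    return f"{instruction}\n\n{demo_text}\n\nSource: {context}\nReply:"

prompt = build_prompt(
    "Answer using relevant background knowledge.",
    [("How do plants make food?", "Through photosynthesis in their leaves.")],
    "Why are leaves green?",
)
print(prompt)
```

The instruction and demonstration parts stay fixed across a test set; only the context changes per dialogue round.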
The core of the knowledge guidance strategy is to make fuller use of the large model to complete tasks through instruction guidance. The instruction therefore mainly describes the large model's task requirements and generation requirements, aiming to guide the model to generate knowledge, or a final reply, related to the dialogue content that meets those requirements. Examples of guide instructions are shown in Table 1.
Table 1 instructions for generating high quality knowledge of dialog source statements in a dataset
Since dialogue data usually revolves around dialogue topics, the demonstration examples should be representative. Screening the dialogue pairs most relevant to the dialogue source sentence from the dialogue sample data set can be designed to include:
collecting a dialogue sample data set from a preset knowledge base, wherein the dialogue sample data set comprises n dialogue pairs and each dialogue pair consists of a source sentence and a target sentence;
screening m representative dialogue pairs from the dialogue sample data set, where m < n.
Screening m representative dialogue pairs from the dialogue sample data set can be designed to include:
obtaining the vector representation of each source sentence in the dialogue pairs based on a pre-trained language model, performing cluster analysis on the source sentence vector representations using a clustering method, and obtaining the representative dialogue pairs from the sample closest to the center in each class.
Performing cluster analysis on the source sentence vector representations using a clustering method may include:
firstly, clustering all source sentence vectors, calculating the intra-class similarity and inter-class similarity of the clustering result, and recording the source sentence closest to each cluster center;
then, based on the intra-class similarity and the inter-class similarity, calculating the silhouette coefficient using the average silhouette method, taking the cluster number that maximizes the silhouette coefficient as the optimal cluster number and assigning it to m, so as to select the m representative dialogue pairs from the recorded cluster-center-nearest sentences.
For demonstration selection, the method shown in Algorithm 1 screens out demonstration examples that meet the criteria, so that the large model can better understand and learn the sentence rules of the test data.
Cluster analysis is performed on the source sentences of the test data. The vector representation of each source sentence is first computed using Sentence-BERT, after which the optimal number of clusters is confirmed using the average silhouette method: candidate cluster numbers from 2 to 10 are traversed, the silhouette coefficient V under each candidate is calculated from the intra-class similarity a(i) and inter-class similarity b(i) of the samples in each class, and the cluster number with the maximum V is taken as the optimal cluster number m. After the optimal cluster number is determined, the demonstration examples consist of the sample closest to the center in each class. In a subsequent step, the implicit knowledge guidance strategy based on the dialogue thinking chain generates knowledge and replies from these dialogue data.
S102, inputting the dialogue source sentence and the dialogue background knowledge into a preconfigured large language model to generate, through the large language model and using the dialogue background knowledge as a prompt instruction, a dialogue reply to the dialogue source sentence under the dialogue background knowledge thinking chain.
Using the dialogue background knowledge as a prompt instruction may further include:
generating a prompt instruction template based on the dialogue task requirements, the dialogue sentence rules and the dialogue source sentence, so that the large language model generates, according to the prompt instruction template, a dialogue reply to the dialogue source sentence under the dialogue background knowledge thinking chain.
Implicit knowledge guidance aims to improve performance by leveraging the knowledge inside the model, without relying on manually annotated external knowledge. This is accomplished by constructing the thinking path <dialogue content -> knowledge mining -> knowledge utilization -> reply generation>. Implicit knowledge guidance obtains results directly according to the prompt. As in traditional reasoning tasks, the reply to a dialogue task is composed in a "process + answer" structure, i.e. "background knowledge + answer".
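Post-processing a reply structured as "background knowledge + answer" can be sketched with a tiny helper; the separator label is an assumption for illustration, as the patent does not specify the exact delimiter between the two parts:

```python
def split_reply(generated: str, sep: str = "Answer:") -> tuple[str, str]:
    """Split a 'background knowledge + answer' reply into its two parts."""
    knowledge, found, answer = generated.partition(sep)
    if not found:  # no explicit answer marker: treat the whole text as the answer
        return "", generated.strip()
    return knowledge.strip(), answer.strip()

k, a = split_reply("Leaves contain chlorophyll. Answer: That pigment reflects green light.")
print(k)  # Leaves contain chlorophyll.
print(a)  # That pigment reflects green light.
```

The knowledge part records the chain the model queried, while the answer part is what the dialogue system actually returns to the user.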
In the specific algorithm design, the input to the large model under implicit knowledge guidance comprises an instruction description i (including the task description and constraints), demonstrations d, and the user's source dialogue sentence s. For each dialogue round, the specific instruction description may be as shown in Table 1, the prompt instruction is represented as P3 = (i; d), and a detailed prompt example is given in the prompt instruction template shown in FIG. 3.
According to the experimental setting, under the few-shot setting m dialogue pairs are screened from the data set using Algorithm 1, a thinking chain is generated for each of them using LLMs, and these m dialogue data with their thinking chains are added to the prompt instruction as the demonstration part, as shown in formula (1):
P3 = (i; (d1, r1), ..., (dm, rm))    (1)
Based on the prompt instruction P3 and the current source sentence s, the large language model generates the dialogue reply r, as shown in formula (2):
r = LLM(P3, s)    (2)
According to experimental observation, large language models such as GPT-3.5-turbo and GLM-130B can form high-quality, detailed knowledge thinking chains using this scheme, and thus make full use of model knowledge when generating dialogue content.
Further, based on the above method, the embodiment of the present invention further provides a system for generating an implicit knowledge guidance dialog based on a dialog thinking chain, including: a knowledge acquisition module and a dialogue generation module, wherein,
The knowledge acquisition module is used for acquiring dialogue background knowledge related to dialogue source sentences, wherein the dialogue background knowledge is used for describing dialogue task requirements and dialogue sentence rules;
The dialogue generation module is used for inputting the dialogue source sentence and the dialogue background knowledge into a preconfigured large language model, so that the large language model, using the dialogue background knowledge as a prompt instruction, generates a dialogue reply to the dialogue source sentence under the dialogue background knowledge thinking chain.
The relative arrangement of the components and steps, the numerical expressions and the numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
In the present specification, each embodiment is described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and identical or similar parts between the embodiments may be cross-referenced. Since the system disclosed in the embodiments corresponds to the method disclosed in the embodiments, its description is relatively brief, and relevant details can be found in the description of the method.
The elements and method steps of the examples described in connection with the embodiments disclosed herein may be embodied in electronic hardware, computer software, or a combination thereof, and the elements and steps of the examples have been generally described in terms of functionality in the foregoing description to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Those of ordinary skill in the art may implement the described functionality using different methods for each particular application, but such implementation is not considered to be beyond the scope of the present invention.
Those of ordinary skill in the art will appreciate that all or a portion of the steps in the above methods may be performed by a program that instructs associated hardware, and that the program may be stored on a computer readable storage medium, such as: read-only memory, magnetic or optical disk, etc. Alternatively, all or part of the steps of the above embodiments may be implemented using one or more integrated circuits, and accordingly, each module/unit in the above embodiments may be implemented in hardware or may be implemented in a software functional module. The present invention is not limited to any specific form of combination of hardware and software.
Finally, it should be noted that the above examples are only specific embodiments of the present invention, intended to illustrate rather than limit its technical solutions, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing examples, those of ordinary skill in the art should understand that any person skilled in the art may still modify the technical solutions described in the foregoing embodiments, or make equivalent substitutions of some technical features, within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention and shall be included in the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. An implicit knowledge guided dialog generation method based on a dialog thinking chain is characterized by comprising the following steps:
Acquiring dialogue background knowledge related to dialogue source sentences, wherein the dialogue background knowledge is used for describing dialogue task requirements and dialogue sentence rules;
inputting the dialogue source sentence and the dialogue background knowledge into a preconfigured large language model, so that the large language model, using the dialogue background knowledge as a prompt instruction, generates a dialogue reply to the dialogue source sentence under the dialogue background knowledge thinking chain.
2. The implicit knowledge guided dialogue generation method based on the dialogue thinking chain according to claim 1, wherein acquiring dialogue background knowledge related to dialogue source sentences comprises:
firstly, setting a model dialogue task guide instruction and taking the guide instruction as the knowledge describing the dialogue task requirements;
then, screening the dialogue pairs most relevant to the dialogue source sentence from a dialogue sample data set, generating corresponding thinking chains for the dialogue pairs using a large model, and taking the dialogue pairs and the thinking chains as the knowledge describing the dialogue sentence rules.
3. The implicit knowledge guided dialogue generation method based on the dialogue thinking chain according to claim 2, wherein screening the dialogue pairs most relevant to the dialogue source sentence from the dialogue sample data set comprises:
collecting a dialogue sample data set from a preset knowledge base, wherein the dialogue sample data set comprises n dialogue pairs and each dialogue pair consists of a source sentence and a target sentence;
screening m representative dialogue pairs from the dialogue sample data set, where m < n.
4. The implicit knowledge guided dialogue generation method based on the dialogue thinking chain according to claim 3, wherein screening m representative dialogue pairs from the dialogue sample data set comprises:
obtaining the vector representation of each source sentence in the dialogue pairs based on a pre-trained language model, performing cluster analysis on the source sentence vector representations using a clustering method, and obtaining the representative dialogue pairs from the sample closest to the center in each class.
5. The implicit knowledge guided dialogue generation method based on the dialogue thinking chain according to claim 4, wherein performing cluster analysis on the source sentence vector representations using a clustering method comprises:
firstly, clustering all source sentence vectors, calculating the intra-class similarity and inter-class similarity of the clustering result, and recording the source sentence closest to each cluster center;
then, based on the intra-class similarity and the inter-class similarity, calculating the silhouette coefficient using the average silhouette method, taking the cluster number that maximizes the silhouette coefficient as the optimal cluster number and assigning it to m, so as to select the m representative dialogue pairs from the recorded cluster-center-nearest sentences.
6. The implicit knowledge guided dialogue generation method based on the dialogue thinking chain according to claim 5, wherein the silhouette coefficient V is calculated as V(i) = (b(i) - a(i)) / max{a(i), b(i)}, wherein a(i) is the intra-class similarity and b(i) is the inter-class similarity.
7. The implicit knowledge guided dialogue generation method based on the dialogue thinking chain according to claim 1, wherein using the dialogue background knowledge as a prompt instruction further comprises:
generating a prompt instruction template based on the dialogue task requirements, the dialogue sentence rules and the dialogue source sentence, so that the large language model generates, according to the prompt instruction template, a dialogue reply to the dialogue source sentence under the dialogue background knowledge thinking chain.
8. An implicit knowledge guided dialog generation system based on a chain of dialog ideas, comprising: a knowledge acquisition module and a dialogue generation module, wherein,
The knowledge acquisition module is used for acquiring dialogue background knowledge related to dialogue source sentences, wherein the dialogue background knowledge is used for describing dialogue task requirements and dialogue sentence rules;
the dialogue generation module is used for inputting the dialogue source sentence and the dialogue background knowledge into a preconfigured large language model, so that the large language model, using the dialogue background knowledge as a prompt instruction, generates a dialogue reply to the dialogue source sentence under the dialogue background knowledge thinking chain.
9. An electronic device, comprising:
At least one processor, and a memory coupled to the at least one processor;
Wherein the memory stores a computer program executable by the at least one processor to implement the method of any one of claims 1-7.
10. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when executed, is capable of realizing the method according to any of claims 1-7.
CN202410144351.3A 2024-02-01 2024-02-01 Implicit knowledge guided dialogue generation method and system based on dialogue thinking chain Pending CN117972058A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410144351.3A CN117972058A (en) 2024-02-01 2024-02-01 Implicit knowledge guided dialogue generation method and system based on dialogue thinking chain


Publications (1)

Publication Number Publication Date
CN117972058A true CN117972058A (en) 2024-05-03

Family

ID=90855984

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410144351.3A Pending CN117972058A (en) 2024-02-01 2024-02-01 Implicit knowledge guided dialogue generation method and system based on dialogue thinking chain

Country Status (1)

Country Link
CN (1) CN117972058A (en)

Similar Documents

Publication Publication Date Title
US20200401899A1 (en) Computationally efficient neural network architecture search
CN110991195B (en) Machine translation model training method, device and storage medium
CN106897384A (en) One kind will bring out the theme automatic evaluation method and device
Pluzhnikova Technologies of artificial intelligence in educational management
Yiran Evaluation of students’ IELTS writing ability based on machine learning and neural network algorithm
CN113010655B (en) Answer and interference item generation method and device for reading and understanding of machine
Lin et al. Enhancing educational dialogue act classification with discourse context and sample informativeness
CN110287999B (en) Story generation method and device based on hidden variable model
CN111753554A (en) Method and device for generating intention knowledge base
KR20210067865A (en) Method and apparatus for generating qa model by using adversarial learning
CN117972058A (en) Implicit knowledge guided dialogue generation method and system based on dialogue thinking chain
CN115658921A (en) Open domain scientific knowledge discovery method and device based on pre-training language model
KR102456994B1 (en) Method and apparatus for generating question and answer dataset based on input paragraph
CN117540012B (en) Text generation method and system
CN117851576A (en) Explicit knowledge guided dialog generation method and system based on multi-stage prompt
Bernard Leveraging user simulation to develop and evaluate conversational information access agents
Press Toward balanced man-machine systems
CN115269844B (en) Model processing method, device, electronic equipment and storage medium
CN117993366B (en) Evaluation item dynamic generation method and system, electronic equipment and readable storage medium
Xiao Research on CAT-based Interactive Collaboration Mode of Undergraduate Translation Course
Hofmann et al. Teaching data science in school: Digital learning material on predictive text systems
Shi et al. Research on the Design and Implementation of Intelligent Tutoring System Based on AI Big Model
Sharma et al. 20 The ChatGPT
CN117874621A (en) Automatic story quality assessment method
Zhao Research on Precision Teaching Reform of Japanese Language Education in Colleges and Universities Based on the CRS Model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination