CN115905513B - Dialogue abstracting method based on denoising type question and answer - Google Patents


Info

Publication number: CN115905513B
Application number: CN202310151490.4A
Authority: CN (China)
Prior art keywords: abstract, user, customer service, digest, dialogue
Legal status: Active (the legal status is an assumption and is not a legal conclusion; no legal analysis has been performed and no representation is made as to its accuracy)
Other languages: Chinese (zh)
Other versions: CN115905513A
Inventors: 宋彦, 田元贺, 张勇东
Assignee (original and current): University of Science and Technology of China (USTC)
Filing/priority date: 2023-02-22
Grant publication (CN115905513B): 2023-07-14

Classifications

    • Y: General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC
    • Y02: Technologies or applications for mitigation or adaptation against climate change
    • Y02D: Climate change mitigation technologies in information and communication technologies (ICT), i.e. ICT aiming at the reduction of their own energy use
    • Y02D 10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Machine Translation (AREA)
  • Information Retrieval, DB Structures And FS Structures Therefor (AREA)

Abstract

The invention relates to the technical field of role-oriented dialogue summarization and discloses a dialogue summarization method based on denoising question answering. The customer-service summary is generated with a question-answering-based modeling method, so the relation between the user summary and the customer-service summary is taken into account and information from the user summary is integrated; the generated customer-service summary therefore matches the user summary better, and its quality is improved. Through a denoising mechanism, the invention uses the user summary generated by the user-summary module, rather than the manually annotated user summary used to train models in traditional approaches, as the question; this summary is concatenated with the dialogue and fed into the customer-service summary encoder. The data used during training thus better match the model's actual usage scenario, improving the model's performance in generating customer-service summaries.

Description

Dialogue abstracting method based on denoising type question and answer
Technical Field
The invention relates to the technical field of role-oriented dialogue summarization, and in particular to a dialogue summarization method based on denoising question answering.
Background
Role-oriented dialogue summarization refers to generating a separate summary for each role in a dialogue. The invention addresses the following two technical problems:
Existing summarization methods tend to generate the summaries for different roles independently, ignoring the potential relations between them. As a result, the summaries of different roles may fail to match in some scenarios; for example, a user question that appears in the user summary may have no corresponding answer in the customer-service summary. To address this, the invention proposes a question-answering-based summary generation method that generates a corresponding answer for each question in the user summary.
Existing question-answering models use manually annotated user questions as input during training. However, in the role-oriented dialogue summarization task, the trained model receives model-generated user questions as input when actually used. Because model-generated user questions often differ from manually annotated ones, the manually annotated questions used during training carry noise that can mislead the model. To address this, the invention proposes a question-answering architecture with a denoising mechanism that effectively resolves this noise.
Disclosure of Invention
To solve the above technical problems, the invention provides a dialogue summarization method based on denoising question answering.
To solve the above technical problems, the invention adopts the following technical scheme:
A dialogue summarization method based on denoising question answering inputs a given dialogue into a dialogue summarization model and outputs a customer-service summary. The dialogue summarization model comprises a user-summary module, a question integration module, a customer-service summary encoder, and a customer-service summary decoder.
The dialogue summarization model is trained by the following steps:
step one: for a given dialog d=d 1 …d n Predictive generation of user summaries using a training-completed user summary module
Figure SMS_1
The method comprises the steps of carrying out a first treatment on the surface of the Wherein d is 1 …d n Represents n sentences in dialog D, +.>
Figure SMS_2
Representing user abstract +.>
Figure SMS_3
N words of (a);
step two: applying a denoising mechanism in the problem integration module to abstract the user with manual annotation U=u 1 …u N User digest generated by replacing user digest module
Figure SMS_4
I.e. user abstract +.>
Figure SMS_5
The method comprises the steps of carrying out a first treatment on the surface of the Wherein u is 1 …u N Representing N words in the manually marked user abstract U;
step three: in the problem integration module, the denoised user abstract is extracted
Figure SMS_6
As a problem, with dialogue d=d 1 …d n Splicing to obtain spliced text Q=d 1 …d n [SEP]U, wherein [ SEP ]]Characters representing the boundary between the markup dialog and the question;
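Steps two and three amount to choosing which user summary to use and concatenating it to the dialogue with a separator character. A minimal sketch (the function name and the convention of joining sentences without spaces are illustrative assumptions, not specified in the patent):

```python
def build_spliced_text(dialogue_sentences, user_summary, sep="[SEP]"):
    """Q = d_1 ... d_n [SEP] U-hat: the dialogue followed by the (denoised)
    user summary, which plays the role of the question."""
    return "".join(dialogue_sentences) + sep + user_summary
```

During training, the denoising mechanism simply passes in the model-generated user summary instead of the manually annotated one, so the input seen in training matches the input seen at inference time.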
step four: sending the spliced text Q to a customer service abstract encoder to obtain a spliced text vector h of the spliced text Q output by the customer service abstract encoder q
Step five: denote the manually annotated customer-service summary as A = a_1…a_M, where a_1…a_M are the M words of A. For j = 0, 1, …, M−1, feed h_q and the first j words A_j = a_1…a_j of the manually annotated summary A into the customer-service summary decoder to obtain the (j+1)-th word â_{j+1} of the customer-service summary predicted by the dialogue summarization model, thereby obtaining all predicted words â_1…â_M. When j = 0, A_j = [CLS], where [CLS] is a character marking the beginning of the summary.
step six: each word of customer service abstract predicted by dialogue abstract model
Figure SMS_11
Each word a of customer service abstract marked by manual j+1 By contrast, by cross entropy loss function L 2 Calculating Loss of Loss 2
Figure SMS_12
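The loss in step six is the standard autoregressive cross-entropy. A toy sketch (the function name and the list-of-distributions interface are illustrative assumptions; the patent does not fix an implementation): each entry of `stepwise_probs` is the decoder's vocabulary distribution after seeing h_q and the gold prefix A_j, and each entry of `gold_ids` is the index of the annotated word a_{j+1}.

```python
import math

def sequence_cross_entropy(stepwise_probs, gold_ids):
    """Loss_2 = -sum_j log p(a_{j+1} | h_q, A_j), summed over target positions."""
    assert len(stepwise_probs) == len(gold_ids)
    return -sum(math.log(p[g]) for p, g in zip(stepwise_probs, gold_ids))
```

A perfectly confident correct prediction contributes 0 to the loss; low probability on the annotated word inflates it, which is what drives the parameter updates of step seven.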
Step seven: update the parameters of the customer-service summary encoder and the customer-service summary decoder via the back-propagation algorithm and Loss_2.
Further, the user-summary module comprises a user-summary encoder and a user-summary decoder. In step one, the user-summary module is trained as follows:
feed dialogue D into the user-summary encoder to obtain the dialogue vector h_d of dialogue D output by the encoder;
for i = 0, 1, …, N−1, feed the dialogue vector h_d and the first i words U_i = u_1…u_i of the manually annotated user summary U into the user-summary decoder to obtain the (i+1)-th word û_{i+1} of the user summary predicted by the module, thereby obtaining all predicted words û_1…û_N; when i = 0, U_i = [CLS];
compare each word û_{i+1} predicted by the user-summary module with the corresponding manually annotated word u_{i+1}, and compute the loss Loss_1 with the cross-entropy loss function L_1:
Loss_1 = −Σ_{i=0}^{N−1} log p(u_{i+1} | h_d, U_i)
update the parameters of the user-summary module via the back-propagation algorithm and Loss_1.
Further, in step one, the fully trained user-summary module generates the user summary as follows:
take dialogue D as the input of the user-summary module and feed it into the user-summary encoder to obtain the dialogue vector h_d of dialogue D output by the encoder;
for i = 0, 1, …, N−1, feed h_d and the first i words û_1…û_i already generated by the user-summary module into the user-summary decoder to obtain the (i+1)-th predicted word û_{i+1}, thereby obtaining all words û_1…û_N, i.e. the user summary Û.
Further, a given dialogue is input into the dialogue summarization model and the customer-service summary is output as follows:
generate the user summary Û of the given dialogue D = d_1…d_n using the fully trained user-summary module;
in the question integration module, take the user summary Û as the question and concatenate it with the dialogue D = d_1…d_n to obtain the new concatenated text Q = d_1…d_n [SEP] Û, where [SEP] is a character marking the boundary between the dialogue and the question;
feed the concatenated text Q into the customer-service summary encoder to obtain the vector h_q of the concatenated text Q output by the encoder;
feed h_q and the first j words â_1…â_j of the customer-service summary already generated by the dialogue summarization model into the customer-service summary decoder to obtain the (j+1)-th predicted word â_{j+1}, thereby obtaining all predicted words â_1…â_M, i.e. the customer-service summary Â.
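Both decoders generate words autoregressively from a [CLS] start marker, feeding the words produced so far back in as the prefix. A hedged sketch of that loop (the `next_word` callable and the [EOS] stop token are assumptions; the patent does not specify a decoding API):

```python
def autoregressive_generate(next_word, max_len, cls="[CLS]", eos="[EOS]"):
    """Repeatedly feed the prefix generated so far back into the decoder
    (via next_word) to predict the following word, starting from [CLS]."""
    prefix = [cls]
    for _ in range(max_len):
        w = next_word(prefix)  # stands in for the decoder conditioned on h_q or h_d
        if w == eos:
            break
        prefix.append(w)
    return prefix[1:]  # the generated summary, without the [CLS] marker
```

At inference the prefix contains the model's own earlier predictions, whereas during training the gold prefix A_j (or U_i) is used; the denoising mechanism narrows the remaining train/inference gap on the question side.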
Notably, in role-oriented dialogue summarization tasks, and in particular in user/customer-service dialogues, generating the customer-service summary is the harder part. The main focus and improvement of the invention is therefore the summarization of the customer-service side of the dialogue.
Compared with the prior art, the invention has the following beneficial technical effects:
The customer-service summary is generated with a question-answering-based modeling method, so the relation between the user summary and the customer-service summary is taken into account and information from the user summary is integrated; the generated customer-service summary therefore matches the user summary better, and its quality is improved.
Through the denoising mechanism, the invention uses the user summary generated by the user-summary module, rather than the manually annotated user summary used to train models in traditional approaches, as the question, concatenates it with the dialogue, and feeds the result into the customer-service summary encoder. The data used during training thus better match the model's actual usage scenario, improving the model's performance in generating customer-service summaries.
Drawings
Fig. 1 is an overall flow chart of the present invention.
Detailed Description
A preferred embodiment of the invention is described in detail below with reference to the accompanying drawings.
The invention applies to role-oriented dialogue summarization tasks in which, for example, a user summary and a customer-service summary must be generated separately for a dialogue between a user and a customer-service agent.
As shown in Fig. 1, the dialogue summarization model of the invention comprises a user-summary module, a question integration module, a customer-service summary encoder, and a customer-service summary decoder; the user-summary module comprises a user-summary encoder and a user-summary decoder. A given dialogue is input into the dialogue summarization model, and the customer-service summary is output.
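Viewed end to end, inference composes these components in sequence. A sketch of the wiring with placeholder callables (all names and signatures here are illustrative assumptions, not the patent's API):

```python
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class DialogueSummaryModel:
    user_summary_module: Callable[[str], str]     # dialogue -> user summary (the question)
    cs_encoder: Callable[[str], Sequence[float]]  # spliced text Q -> vector h_q
    cs_decoder: Callable[[Sequence[float]], str]  # h_q -> customer-service summary

    def summarize(self, dialogue: str, sep: str = "[SEP]") -> str:
        question = self.user_summary_module(dialogue)  # user-summary module
        q = dialogue + sep + question                  # question integration module
        h_q = self.cs_encoder(q)                       # customer-service summary encoder
        return self.cs_decoder(h_q)                    # customer-service summary decoder
```

With toy callables substituted for the trained networks, the pipeline runs end to end, which makes the data flow of Fig. 1 concrete without committing to any particular encoder/decoder architecture.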
The dialogue summarization model is trained by the following steps:
S1: for a given dialogue D = d_1…d_n, predict the user summary Û = û_1…û_N using the fully trained user-summary module, where d_1…d_n are the n sentences of dialogue D and û_1…û_N are the N words of the user summary Û.
S2: apply the denoising mechanism in the question integration module, replacing the manually annotated user summary U = u_1…u_N with the user summary Û generated by the user-summary module; that is, the denoised user summary is Û. Here u_1…u_N are the N words of the manually annotated user summary U.
S3: in the question integration module, take the denoised user summary Û as the question and concatenate it with the dialogue D = d_1…d_n to obtain the concatenated text Q = d_1…d_n [SEP] Û, where [SEP] is a character marking the boundary between the dialogue and the question.
S4: feed the concatenated text Q into the customer-service summary encoder to obtain the vector h_q of the concatenated text Q output by the encoder.
S5: denote the manually annotated customer-service summary as A = a_1…a_M, where a_1…a_M are the M words of A. For j = 0, 1, …, M−1, feed h_q and the first j words A_j = a_1…a_j of the manually annotated summary A into the customer-service summary decoder to obtain the (j+1)-th word â_{j+1} of the customer-service summary predicted by the dialogue summarization model, thereby obtaining all predicted words â_1…â_M. When j = 0, A_j = [CLS], where [CLS] is a character marking the beginning of the summary.
S6: compare each word â_{j+1} predicted by the dialogue summarization model with the corresponding manually annotated word a_{j+1}, and compute the loss Loss_2 with the cross-entropy loss function L_2:
Loss_2 = −Σ_{j=0}^{M−1} log p(a_{j+1} | h_q, A_j)
S7: update the parameters of the customer-service summary encoder and the customer-service summary decoder via the back-propagation algorithm and Loss_2.
In S1, the fully trained user-summary module generates the user summary as follows:
S11: take the dialogue D = d_1…d_n as the input of the user-summary module and feed it into the user-summary encoder to obtain the dialogue vector h_d of dialogue D output by the encoder.
S12: for i = 0, 1, …, N−1, feed h_d and the first i words û_1…û_i already generated by the user-summary module into the user-summary decoder to obtain the (i+1)-th predicted word û_{i+1}, thereby obtaining all words û_1…û_N, i.e. the user summary Û. In particular, when i = 0 the generated prefix is [CLS], where [CLS] is a character marking the beginning of the summary.
The user-summary module is trained as follows:
feed dialogue D into the user-summary encoder to obtain the dialogue vector h_d of dialogue D output by the encoder;
denote the manually annotated user summary as U = u_1…u_N, where u_1…u_N are the N words of U; for i = 0, 1, …, N−1, feed the dialogue vector h_d and the first i words U_i = u_1…u_i of the manually annotated user summary U into the user-summary decoder to obtain the (i+1)-th word û_{i+1} predicted by the module, thereby obtaining all predicted words û_1…û_N;
compare each word û_{i+1} predicted by the user-summary module with the corresponding manually annotated word u_{i+1}, and compute the loss Loss_1 with the cross-entropy loss function L_1:
Loss_1 = −Σ_{i=0}^{N−1} log p(u_{i+1} | h_d, U_i)
update the parameters of the user-summary module via the back-propagation algorithm and Loss_1.
The customer-service summary is generated with the dialogue summarization model as follows:
generate the user summary Û of the given dialogue D = d_1…d_n using the fully trained user-summary module;
in the question integration module, take the user summary Û as the question and concatenate it with the dialogue D = d_1…d_n to obtain the new concatenated text Q = d_1…d_n [SEP] Û, where [SEP] is a character marking the boundary between the dialogue and the question;
feed the concatenated text Q into the customer-service summary encoder to obtain the vector h_q of the concatenated text Q output by the encoder;
feed h_q and the first j words â_1…â_j of the customer-service summary already generated by the dialogue summarization model into the customer-service summary decoder to obtain the (j+1)-th predicted word â_{j+1}, thereby obtaining all predicted words â_1…â_M, i.e. the customer-service summary Â. In particular, when j = 0 the generated prefix is [CLS], where [CLS] is a character marking the beginning of the summary.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments and may be embodied in other specific forms without departing from its spirit or essential characteristics. The embodiments are therefore to be considered in all respects illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description; all changes that come within the meaning and range of equivalency of the claims are intended to be embraced therein.
Furthermore, although this specification is described in terms of embodiments, not every embodiment contains only a single independent technical solution. This manner of description is adopted for clarity only; the specification should be read as a whole, and the technical solutions of the embodiments may be suitably combined to form other embodiments understandable to those skilled in the art.

Claims (4)

1. A dialogue summarization method based on denoising question answering, characterized in that a given dialogue is input into a dialogue summarization model and a customer-service summary is output; the dialogue summarization model comprises a user-summary module, a question integration module, a customer-service summary encoder, and a customer-service summary decoder;
the dialogue summarization model is trained by the following steps:
step one: for a given dialogue D = d_1…d_n, predict the user summary Û = û_1…û_N using the fully trained user-summary module, where d_1…d_n are the n sentences of dialogue D and û_1…û_N are the N words of the user summary Û;
step two: apply the denoising mechanism in the question integration module, replacing the manually annotated user summary U = u_1…u_N with the user summary Û generated by the user-summary module, i.e. the denoised user summary is Û, where u_1…u_N are the N words of the manually annotated user summary U;
step three: in the question integration module, take the denoised user summary Û as the question and concatenate it with the dialogue D = d_1…d_n to obtain the concatenated text Q = d_1…d_n [SEP] Û, where [SEP] is a character marking the boundary between the dialogue and the question;
step four: feed the concatenated text Q into the customer-service summary encoder to obtain the vector h_q of the concatenated text Q output by the encoder;
step five: denote the manually annotated customer-service summary as A = a_1…a_M, where a_1…a_M are the M words of A; for j = 0, 1, …, M−1, feed h_q and the first j words A_j = a_1…a_j of the manually annotated summary A into the customer-service summary decoder to obtain the (j+1)-th word â_{j+1} of the customer-service summary predicted by the dialogue summarization model, thereby obtaining all predicted words â_1…â_M; when j = 0, A_j = [CLS], where [CLS] is a character marking the beginning of the summary;
step six: compare each word â_{j+1} predicted by the dialogue summarization model with the corresponding manually annotated word a_{j+1}, and compute the loss Loss_2 with the cross-entropy loss function L_2:
Loss_2 = −Σ_{j=0}^{M−1} log p(a_{j+1} | h_q, A_j);
step seven: update the parameters of the customer-service summary encoder and the customer-service summary decoder via the back-propagation algorithm and Loss_2.
2. The dialogue summarization method based on denoising question answering according to claim 1, characterized in that the user-summary module comprises a user-summary encoder and a user-summary decoder, and in step one the user-summary module is trained as follows:
feed dialogue D into the user-summary encoder to obtain the dialogue vector h_d of dialogue D output by the encoder;
for i = 0, 1, …, N−1, feed the dialogue vector h_d and the first i words U_i = u_1…u_i of the manually annotated user summary U into the user-summary decoder to obtain the (i+1)-th word û_{i+1} of the user summary predicted by the module, thereby obtaining all predicted words û_1…û_N; when i = 0, U_i = [CLS];
compare each word û_{i+1} predicted by the user-summary module with the corresponding manually annotated word u_{i+1}, and compute the loss Loss_1 with the cross-entropy loss function L_1:
Loss_1 = −Σ_{i=0}^{N−1} log p(u_{i+1} | h_d, U_i);
update the parameters of the user-summary module via the back-propagation algorithm and Loss_1.
3. The dialogue summarization method based on denoising question answering according to claim 1, characterized in that in step one the fully trained user-summary module generates the user summary as follows:
take the dialogue D as the input of the user-summary module and feed it into the user-summary encoder to obtain the dialogue vector h_d of dialogue D output by the encoder;
for i = 0, 1, …, N−1, feed h_d and the first i words û_1…û_i already generated by the user-summary module into the user-summary decoder to obtain the (i+1)-th predicted word û_{i+1}, thereby obtaining all words û_1…û_N, i.e. the user summary Û.
4. The dialogue summarization method based on denoising question answering according to claim 1, characterized in that the given dialogue is input into the dialogue summarization model and the customer-service summary is output as follows:
generate the user summary Û of the given dialogue D = d_1…d_n using the fully trained user-summary module;
in the question integration module, take the user summary Û as the question and concatenate it with the dialogue D = d_1…d_n to obtain the new concatenated text Q = d_1…d_n [SEP] Û, where [SEP] is a character marking the boundary between the dialogue and the question;
feed the concatenated text Q into the customer-service summary encoder to obtain the vector h_q of the concatenated text Q output by the encoder;
feed h_q and the first j words â_1…â_j of the customer-service summary already generated by the dialogue summarization model into the customer-service summary decoder to obtain the (j+1)-th predicted word â_{j+1}, thereby obtaining all predicted words â_1…â_M, i.e. the customer-service summary Â.
CN202310151490.4A 2023-02-22 2023-02-22 Dialogue abstracting method based on denoising type question and answer Active CN115905513B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310151490.4A CN115905513B (en) 2023-02-22 2023-02-22 Dialogue abstracting method based on denoising type question and answer


Publications (2)

Publication Number Publication Date
CN115905513A CN115905513A (en) 2023-04-04
CN115905513B (en) 2023-07-14

Family

ID=86481134

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310151490.4A Active CN115905513B (en) 2023-02-22 2023-02-22 Dialogue abstracting method based on denoising type question and answer

Country Status (1)

Country Link
CN (1) CN115905513B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117131187B (en) * 2023-10-26 2024-02-09 中国科学技术大学 Dialogue abstracting method based on noise binding diffusion model

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112148863A (en) * 2020-10-15 2020-12-29 哈尔滨工业大学 Generation type dialogue abstract method integrated with common knowledge
CN113158665A (en) * 2021-04-02 2021-07-23 西安交通大学 Method for generating text abstract and generating bidirectional corpus-based improved dialog text
CN113204627A (en) * 2021-05-13 2021-08-03 哈尔滨工业大学 Dialog summary generation system using DialoGPT as feature marker
CN114942990A (en) * 2022-05-23 2022-08-26 华东师范大学 Few-sample abstract dialogue abstract generation system based on prompt learning

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8516380B2 (en) * 2007-12-28 2013-08-20 International Business Machines Corporation Conversation abstractions based on trust levels in a virtual world
US10785185B2 (en) * 2018-06-13 2020-09-22 International Business Machines Corporation Automated summary of digital group conversations


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Detecting summary-worthy sentences: the effect of discourse features; Maximilian Droog-Hayes et al.; 2019 IEEE 13th International Conference on Semantic Computing (ICSC); pp. 381-384 *
Speaker information extraction method based on deep belief networks; 陈丽萍 et al.; Pattern Recognition and Artificial Intelligence (《模式识别与人工智能》); pp. 1089-1095 *

Also Published As

Publication number Publication date
CN115905513A (en) 2023-04-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant