CN110069613A - A kind of reply acquisition methods and device - Google Patents
- Publication number
- CN110069613A (Application No. CN201910351022.5A)
- Authority
- CN
- China
- Prior art keywords
- context
- corpus
- target
- group
- reply
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/33—Querying
- G06F16/332—Query formulation
- G06F16/3329—Natural language query formulation or dialogue systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/33—Querying
- G06F16/3331—Query processing
- G06F16/334—Query execution
- G06F16/3344—Query execution using natural language analysis
Abstract
This application discloses a reply acquisition method and device. After a target context is obtained, groups of corpus contexts that are semantically similar to the target context are first obtained, where the target context includes a target question posed by a questioner and the historical dialogue preceding the target question, and each corpus context includes a question corpus and the historical dialogue preceding that question corpus. Then, after the reply corpus corresponding to the question corpus in each group of corpus contexts is obtained, at least one reply corpus is selected from them as at least one candidate reply to the target question. Because the candidate replies to the target question are obtained on the basis of corpus contexts that are semantically similar to the target context, the obtained candidate replies can address the key content of the target question, thereby meeting the questioner's dialogue needs and improving the reasonableness of the reply acquisition result.
Description
Technical field
This application relates to the field of artificial intelligence, and in particular to a reply acquisition method and device.
Background art
With the development of artificial intelligence and natural language processing, machines have begun to acquire a preliminary ability to understand human language, which makes it possible for humans to communicate with machines through natural language. As a result, a wide variety of dialogue systems have emerged in recent years. Depending on whether they are task-oriented, these dialogue systems can be divided into two kinds. The first is task-oriented: it has a specific goal or task and aims to complete that task in the shortest possible interaction time or number of dialogue turns; examples include intelligent customer service and smartphone intelligent assistants. The second is the natural-interaction type, usually called a "chatbot": it has no specific goal and is intended for open-ended communication with humans, including emotional exchange and confiding.

For natural-interaction dialogue systems, a reply related to the dialogue content is retrieved or generated on the basis of the dialogue context. However, the obtained reply may be unrelated to the key content of the question, and therefore fails to meet the questioner's dialogue needs.
Summary of the invention
The main purpose of the embodiments of this application is to provide a reply acquisition method and device that can obtain reasonable replies related to the key content of a question, so as to meet the questioner's dialogue needs.

An embodiment of this application provides a reply acquisition method, comprising:

obtaining a target context, where the target context includes a target question posed by a questioner and the historical dialogue preceding the target question;

obtaining groups of corpus contexts that are semantically similar to the target context, where each corpus context includes a question corpus and the historical dialogue preceding the question corpus;

obtaining the reply corpus corresponding to the question corpus in each group of corpus contexts;

selecting at least one reply corpus as at least one candidate reply to the target question.
Optionally, obtaining groups of corpus contexts that are semantically similar to the target context comprises:

searching a pre-constructed dialogue corpus for groups of contexts related to the target context;

screening the searched groups of contexts for groups of contexts that are semantically similar to the target context, as the groups of corpus contexts.
Optionally, screening the searched groups of contexts for groups of contexts that are semantically similar to the target context comprises:

defining each searched group of contexts as a search context;

generating a context feature corresponding to each search context, where the context feature includes a co-occurrence feature and/or a semantic feature, the co-occurrence feature characterizes the importance, within the search context and the target context, of the words that co-occur in both, and the semantic feature characterizes the semantic similarity between the search context and the target context;

according to the context feature corresponding to each searched group of contexts, screening the searched groups of contexts for groups of contexts that are semantically similar to the target context.
Optionally, selecting at least one reply corpus comprises:

selecting at least one reply corpus by analyzing the correlation between the target context and each reply corpus.
Optionally, selecting at least one reply corpus comprises:

selecting at least one reply corpus using a pre-constructed correlation model.
Optionally, the correlation model is trained using model training data, where the model training data include sample contexts, and the true reply and a random reply to the sample question included in each sample context; each sample context includes a sample question and the historical dialogue preceding the sample question.
An embodiment of this application also provides a reply acquisition device, comprising:

a target context acquiring unit, configured to obtain a target context, where the target context includes a target question posed by a questioner and the historical dialogue preceding the target question;

a corpus context acquiring unit, configured to obtain groups of corpus contexts that are semantically similar to the target context, where each corpus context includes a question corpus and the historical dialogue preceding the question corpus;

a reply corpus acquiring unit, configured to obtain the reply corpus corresponding to the question corpus in each group of corpus contexts;

a reply corpus selecting unit, configured to select at least one reply corpus as at least one candidate reply to the target question.
Optionally, the corpus context acquiring unit includes:

a context searching subunit, configured to search a pre-constructed dialogue corpus for groups of contexts related to the target context;

a context screening subunit, configured to screen the searched groups of contexts for groups of contexts that are semantically similar to the target context, as the groups of corpus contexts.
Optionally, the context screening subunit includes:

a search context defining subunit, configured to define each searched group of contexts as a search context;

a context feature generating subunit, configured to generate a context feature corresponding to each search context, where the context feature includes a co-occurrence feature and/or a semantic feature, the co-occurrence feature characterizes the importance, within the search context and the target context, of the words that co-occur in both, and the semantic feature characterizes the semantic similarity between the search context and the target context;

a context group screening subunit, configured to screen the searched groups of contexts, according to the context feature corresponding to each, for groups of contexts that are semantically similar to the target context.
Optionally, the reply corpus selecting unit is specifically configured to:

select at least one reply corpus by analyzing the correlation between the target context and each reply corpus.

Optionally, the reply corpus selecting unit is specifically configured to:

select at least one reply corpus using a pre-constructed correlation model.
Optionally, the correlation model is trained using model training data, where the model training data include sample contexts, and the true reply and a random reply to the sample question included in each sample context; each sample context includes a sample question and the historical dialogue preceding the sample question.
An embodiment of this application also provides a reply acquisition apparatus, comprising a processor, a memory, and a system bus, where the processor and the memory are connected by the system bus. The memory is configured to store one or more programs, the one or more programs including instructions that, when executed by the processor, cause the processor to execute any one implementation of the above reply acquisition method.

An embodiment of this application also provides a computer-readable storage medium storing instructions that, when run on a terminal device, cause the terminal device to execute any one implementation of the above reply acquisition method.

An embodiment of this application also provides a computer program product that, when run on a terminal device, causes the terminal device to execute any one implementation of the above reply acquisition method.
With the reply acquisition method and device provided by the embodiments of this application, after a target context is obtained, groups of corpus contexts that are semantically similar to the target context are first obtained, where the target context includes a target question posed by a questioner and the historical dialogue preceding the target question, and each corpus context includes a question corpus and the historical dialogue preceding the question corpus. Then, after the reply corpus corresponding to the question corpus in each group of corpus contexts is obtained, at least one reply corpus can be selected from them as at least one candidate reply to the target question. As can be seen, the embodiments of this application obtain candidate replies to the target question on the basis of corpus contexts that are semantically similar to the target context, so the obtained candidate replies can address the key content of the target question; the final reply to the target question can then be screened from these candidate replies, thereby meeting the questioner's dialogue needs and improving the reasonableness of the reply acquisition result.
Brief description of the drawings

To explain the technical solutions in the embodiments of this application or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of this application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.

Fig. 1 is a schematic flowchart of a reply acquisition method provided by an embodiment of this application;

Fig. 2 is a schematic structural diagram of a correlation model provided by an embodiment of this application;

Fig. 3 is a schematic structural diagram of a reply acquisition method provided by an embodiment of this application;

Fig. 4 is a schematic diagram of the composition of a reply acquisition device provided by an embodiment of this application.
Detailed description

To make the purposes, technical solutions, and advantages of the embodiments of this application clearer, the technical solutions in the embodiments of this application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of this application, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments in this application without creative work shall fall within the protection scope of this application.
First embodiment
Referring to Fig. 1, which is a schematic flowchart of the reply acquisition method provided in this embodiment, the method includes the following steps:

S101: Obtain a target context, where the target context includes a target question posed by a questioner and the historical dialogue preceding the target question.

In this embodiment, the question that the questioner poses to the machine and for which a machine reply is needed is defined as the target question, and the target question together with the historical dialogue preceding it is defined as the target context.

It should be noted that the target context may include the target question and all of the historical dialogue preceding it, or, alternatively, the target question and only part of the historical dialogue preceding it, counted backwards from the target question. Suppose the target context includes m sentences; here, these m sentences are defined, in chronological order from front to back, as u_1, u_2, ..., u_m, where u_m is the text content corresponding to the target question.
It should also be noted that this embodiment does not limit the way in which the questioner poses the target question. For example, the questioner may pose the target question to the machine by voice input or by text input; that is, the target question may be in speech form or in text form. Nor does this embodiment limit the language of the target question, which may be, for example, Chinese or English. In addition, when the questioner poses the target question by text input, this embodiment does not limit the input method used by the questioner, such as the Sogou input method or the Baidu input method.
S102: Obtain groups of corpus contexts that are semantically similar to the target context, where each corpus context includes a question corpus and the historical dialogue preceding the question corpus.

In this embodiment, after the target context is obtained in step S101, in order to improve the reasonableness of the reply acquisition result so that the obtained reply content can meet the questioner's dialogue needs, it is first necessary to obtain, based on the semantic information of the target context, groups of corpus contexts that are semantically similar to it. Each obtained corpus context includes a question corpus and the historical dialogue preceding the question corpus. The corpus contexts may be collected in advance: a multi-turn human-machine dialogue may serve as one group of corpus context, or a multi-turn human-human dialogue may serve as one group of corpus context, and the question corpus in a corpus context is a user question.

It should be noted that a corpus context may include the question corpus and all of the historical dialogue preceding it, or, alternatively, the question corpus and only part of the historical dialogue preceding it, counted backwards from the question corpus. Suppose a corpus context includes n sentences; here, these n sentences are defined, in chronological order from front to back, as v_1, v_2, ..., v_n, where v_n is the text content corresponding to the question corpus.

Next, this embodiment introduces, through the following steps A and B, the specific process of "obtaining groups of corpus contexts that are semantically similar to the target context" in step S102.
Step A: Search a pre-constructed dialogue corpus for groups of contexts related to the target context.

In this embodiment, after the target context is obtained in step S101, a text search method can be used to search the pre-constructed dialogue corpus for groups of contexts related to the target context; for example, a distributed search engine (Elasticsearch) or a full-text search server (Solr) can be used for this search. Then, in the subsequent step B, groups of contexts that are semantically similar to the target context are further screened from the searched groups of contexts.

The dialogue corpus may prestore multiple groups of contexts and, for the question corpus in each group of contexts (i.e., the last question in each group), the corresponding reply corpus (i.e., the actual reply content corresponding to the last question in each group). These contexts and their corresponding reply corpora may be obtained by collecting people's dialogues in daily life and processing them to remove sensitive information. Specifically, when constructing the dialogue corpus, a large amount of dialogue data from people's daily lives can first be collected, for example, real dialogue data from social network platforms (such as Weibo or Tieba). Then, sensitive data in these dialogue texts (such as telephone numbers or ID numbers) are deleted or replaced. The dialogue data thus processed can be used directly as groups of contexts, or partial continuous dialogue data can be extracted from it as a group of contexts, where the last sentence in each group of contexts is a user question. Each group of contexts, together with the reply corpus of the user question in that group, is then stored in the dialogue corpus.
In addition, when constructing the dialogue corpus, besides the groups of contexts and their corresponding reply corpora obtained by processing existing real dialogue data as described above, other dialogue data can also be simulated from these real dialogue data, and groups of contexts and the reply corpus corresponding to each group can be obtained from the simulated data to construct the dialogue corpus.

Each group of contexts in the dialogue corpus may be a text of a first preset length or of a preset number of dialogue turns; similarly, the reply corpus corresponding to each group of contexts may be a text of a second preset length, where the first preset length is typically greater than the second preset length.
It should be noted that, when searching the dialogue corpus for groups of contexts related to the target context, for ease of description, each context in the dialogue corpus can be defined as a sample context, and a correlation coefficient value between each sample context and the target context can be calculated. The correlation coefficient value measures the correlation between the corresponding sample context and the target context; for example, a larger correlation coefficient value indicates a greater correlation between the two contexts. Then, these correlation coefficient values are sorted from large to small, and the groups of sample contexts corresponding to the first preset number of values in the ranking are selected as the groups of contexts semantically related to the target context; alternatively, these correlation coefficient values are sorted from small to large, and the groups of sample contexts corresponding to the last preset number of values in the ranking are selected as the groups of contexts semantically related to the target context.

Moreover, in order to retrieve as many groups of contexts semantically related to the target context as possible, the preset number can be set to a relatively large value; for example, where the system's computational capacity allows, it can be set to 1000. That is, 1000 groups of contexts semantically related to the target context can be selected, so that groups of contexts semantically similar to the target context can be further obtained from these 1000 groups in the subsequent step B.
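The retrieval step above can be sketched as follows. This is a toy stand-in for the Elasticsearch/Solr search the text describes: the Jaccard token-overlap score used here is our illustrative substitute for the patent's (unspecified) correlation coefficient, and all names are our own:

```python
def correlation_coefficient(sample_tokens, target_tokens):
    """Toy correlation coefficient: Jaccard overlap of the token sets.

    A larger value indicates a greater correlation between the sample
    context and the target context, matching the convention in the text.
    """
    a, b = set(sample_tokens), set(target_tokens)
    return len(a & b) / len(a | b) if a | b else 0.0

def search_related_contexts(corpus, target_tokens, preset_number=1000):
    """Rank every (context_id, tokens) pair in the corpus by correlation
    with the target context and keep the top preset_number groups."""
    scored = [(ctx, correlation_coefficient(toks, target_tokens))
              for ctx, toks in corpus]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:preset_number]
```

In a real deployment the corpus scan would be replaced by an inverted-index query, with `preset_number` playing the role of the 1000-group cutoff mentioned above.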
Step B: Screen the searched groups of contexts for groups of contexts that are semantically similar to the target context, as the groups of corpus contexts.

In this embodiment, after the groups of contexts related to the target context are searched out from the pre-constructed dialogue corpus in step A, the degree of semantic similarity between each searched group of contexts and the target context can be calculated, and according to the calculation results, the groups of contexts that are semantically more similar to the target context — that is, the groups of contexts with a higher degree of semantic similarity to the target context — are screened out as the groups of corpus contexts.

In this way, after the groups of contexts related to the target context are searched, the groups of contexts semantically similar to the target context can be screened from them and used as the basis for obtaining candidate replies to the target question, so that the content of the obtained candidate replies is semantically related to the content of the target context and thus better meets the questioner's dialogue needs.

Next, this embodiment introduces, through the following steps B1 to B3, the specific process of "screening the searched groups of contexts for groups of contexts that are semantically similar to the target context" in step B.
Step B1: Define each searched group of contexts as a search context.

In this embodiment, for ease of description, each group of contexts related to the target context that is searched out from the dialogue corpus is defined as a search context.

Step B2: Generate the context feature corresponding to each search context.

In this embodiment, for each group of search context, in order to judge whether that search context is semantically similar to the target context, the context feature corresponding to the search context can first be generated.

The context feature includes a co-occurrence feature and/or a semantic feature. The co-occurrence feature characterizes the importance, within the search context and the target context, of the words that co-occur in both, and the semantic feature characterizes the semantic similarity between the search context and the target context.

It should be noted that this embodiment takes one search context among the groups of search contexts as an example to introduce how the corresponding context feature is generated; the other groups of search contexts are processed similarly and are not described one by one.
The co-occurrence feature is introduced below.

One way of generating the co-occurrence feature is as follows: first, a word segmentation method is used to split the search context into the words it contains; then, the stop words among them that carry no specific meaning in themselves are deleted; then, the weight of each remaining word within the search context is calculated, where a larger weight value indicates that the corresponding word is more important within the search context.

Similarly, a word segmentation method can be used to split the target context into the words it contains; then the stop words among them that carry no specific meaning in themselves are deleted; then the weight of each remaining word within the target context is calculated, where a larger weight value indicates that the corresponding word is more important within the target context.

In this embodiment, a word that appears in both the search context and the target context is defined as a co-occurrence word, and a co-occurrence word may consist of at least one character.

The weights of the co-occurrence words can then be summed separately within the search context and within the target context, the harmonic mean of the two sums can be calculated, and the result taken as the co-occurrence feature corresponding to the search context, characterizing the importance, within the search context and the target context, of all the co-occurrence words the two share. The specific calculation formulas are as follows:

f_s = Σ_{i=1}^{N} w_i^s (1)

f_h = Σ_{i=1}^{N} w_i^h (2)

f_w = 2 f_s f_h / (f_s + f_h) (3)

where w_i^s denotes the weight of the i-th co-occurrence word in the search context, a larger value indicating that the i-th co-occurrence word is more important in the search context; N denotes the total number of co-occurrence words shared by the search context and the target context; f_s denotes the total weight, within the search context, of all the co-occurrence words, a larger value indicating that the co-occurrence words are more important in the search context; w_i^h denotes the weight of the i-th co-occurrence word in the target context, a larger value indicating that the i-th co-occurrence word is more important in the target context; f_h denotes the total weight, within the target context, of all the co-occurrence words, a larger value indicating that the co-occurrence words are more important in the target context; and f_w is the co-occurrence feature corresponding to the search context, i.e., the harmonic mean of f_s and f_h, a larger value indicating that the co-occurrence words are more important in the search context and the target context.
For example, suppose that after segmentation and other preprocessing, the search context yields the words "you", "now", "good", and one further word, whose weights within the search context are calculated to be 0.2, 0.3, 0.4, and 0.1 respectively; and that the target context yields the words "you", "really", "good", and one further word, whose weights within the target context are calculated to be 0.2, 0.3, 0.3, and 0.2 respectively.

The co-occurrence words of the search context and the target context are therefore "you" and "good". Using formula (1), the total weight of these two co-occurrence words in the search context is 0.2 + 0.4 = 0.6; using formula (2), their total weight in the target context is 0.2 + 0.3 = 0.5; and using formula (3), the co-occurrence feature corresponding to the search context is f_w = 2 × 0.6 × 0.5 / (0.6 + 0.5) ≈ 0.545. This value characterizes the importance, within the search context and the target context, of all the co-occurrence words the two share.
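The co-occurrence feature computation above — formulas (1) through (3) — can be sketched in a few lines (the function and argument names are our own, not the patent's):

```python
def cooccurrence_feature(search_weights, target_weights):
    """Harmonic mean of the total co-occurrence-word weights.

    search_weights / target_weights map each word to its weight within
    the search context and the target context, respectively.
    """
    cooccurring = set(search_weights) & set(target_weights)
    f_s = sum(search_weights[w] for w in cooccurring)  # formula (1)
    f_h = sum(target_weights[w] for w in cooccurring)  # formula (2)
    if f_s + f_h == 0:
        return 0.0
    return 2 * f_s * f_h / (f_s + f_h)                 # formula (3)
```

With the example weights above ("you" and "good" co-occurring, and placeholder fourth words), this returns 2 × 0.6 × 0.5 / 1.1 ≈ 0.545, matching the worked example.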
The co-occurrence feature corresponding to the search context has been described above; the semantic feature corresponding to the search context is introduced below. Specifically, the semantic feature may be generated through the following steps (1) to (3):

Step (1): Generate the semantic representation result of the target context.

In this embodiment, after the target context is segmented to obtain the words it contains, a vector generation method can be used to generate the word vector corresponding to each word in the target context; for example, the word vector corresponding to each word can be looked up in a semantic dictionary. Then, the semantic representation result of the target context can be generated from the word vector of each word and the weight of each word within the target context. The specific calculation formula is as follows:

S = Σ_{i=1}^{M} w_i E_i (4)

where S denotes the semantic representation result of the target context; M denotes the total number of words in the target context; E_i denotes the word vector corresponding to the i-th word in the target context; and w_i denotes the weight of the i-th word within the target context, a larger value indicating that the i-th word is more important in the target context.

Step (2): Generate the semantic representation result of the search context.

In this embodiment, after the search context is segmented to obtain the words it contains, a vector generation method can be used to generate the word vector corresponding to each word in the search context; for example, the word vector corresponding to each word can be looked up in a semantic dictionary. Then, the semantic representation result of the search context can be generated from the word vector of each word and the weight of each word within the search context. The specific calculation formula is as follows:

H = Σ_{j=1}^{M'} γ_j E'_j (5)

where H denotes the semantic representation result of the search context; M' denotes the total number of words in the search context; E'_j denotes the word vector corresponding to the j-th word in the search context; and γ_j denotes the weight of the j-th word within the search context, a larger value indicating that the j-th word is more important in the search context.
It should be noted that the embodiments of this application do not limit the execution order of steps (1) and (2).
Step (3): according to the semantic expressiveness of generation as a result, generating the corresponding semantic feature of search context.
In the present embodiment, the semantic expressiveness result S of target context is generated by step (1), and passes through step (2)
It, can be by calculating the semantic expressiveness result of target context and searching for after the semantic expressiveness result H for generating search context
The COS distance between semantic expressiveness result hereafter obtains the semantic similarity of search context and target context, to
As the corresponding semantic feature of search context, specific formula for calculation is as follows:
fm=cosine (S, H) (6)
Wherein, f_m is the semantic feature corresponding to the search context, characterizing the semantic similarity between the search context and the target context; meanwhile, f_m is also a semantic distance value used to measure the semantic distance between the search context and the target context; cosine denotes the cosine distance calculation formula; S denotes the semantic representation result of the target context; H denotes the semantic representation result of the search context.
It can be understood that the larger the value of f_m in the above formula (6), the smaller the semantic distance between the search context and the target context, that is, the higher the semantic similarity of the two.
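As a concrete illustration of steps (2) and (3), the weighted-sum semantic representation and the cosine similarity of formula (6) can be sketched in pure Python as follows; the word vectors, weights and function names here are illustrative assumptions, not part of the patent.

```python
import math

def semantic_representation(words, word_vectors, weights):
    # Weighted sum of term vectors: H = sum_j gamma_j * e'_j,
    # where gamma_j is the j-th word's weight in the search context.
    dim = len(next(iter(word_vectors.values())))
    h = [0.0] * dim
    for w in words:
        vec, gamma = word_vectors[w], weights[w]
        for i in range(dim):
            h[i] += gamma * vec[i]
    return h

def cosine(a, b):
    # Formula (6): f_m = cosine(S, H); a larger value means higher similarity.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0
```

With S computed the same way for the target context, f_m is simply `cosine(S, H)`.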
Step B3: according to the contextual feature corresponding to each searched group of contexts, filter out, from the searched groups of contexts, the groups of contexts that are semantically similar to the target context.
After the contextual feature corresponding to each group of search contexts is generated by step B2, that is, after the co-occurrence feature and/or semantic feature corresponding to each group of search contexts is generated, the two features can be summed according to their respective weights, and the calculation result is used to characterize the degree of semantic similarity between the corresponding search context and the target context. The specific calculation formula is as follows:
f = w_w · f_w + w_m · f_m    (7)
Wherein, f indicates the semantic similarity value between the corresponding search context and the target context; w_w indicates the weight of the co-occurrence feature f_w corresponding to the search context; the larger w_w is, the greater the importance of f_w, and w_w can be adjusted according to experimental results; w_m indicates the weight of the semantic feature f_m corresponding to the search context; the larger w_m is, the greater the importance of f_m, and w_m can likewise be adjusted according to experimental results.
Specifically, after the semantic similarity value between each group of search contexts and the target context is calculated using the above formula (7), these semantic similarity values can be sorted from large to small, and the groups of search contexts corresponding to the top preset number (or preset value range) of semantic similarity values are selected as the groups of contexts semantically similar to the target context; alternatively, these semantic similarity values can be sorted from small to large, and the groups of search contexts corresponding to the last preset number (or preset value range) of semantic similarity values are selected as the groups of contexts semantically similar to the target context; alternatively, the groups of search contexts corresponding to all semantic similarity values exceeding a preset threshold are selected as the groups of contexts semantically similar to the target context.
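The feature fusion of formula (7) and the subsequent top-k or threshold selection described above can be sketched as follows; the default weights and function names are illustrative assumptions only, since the patent leaves the weights to experimental tuning.

```python
def fuse_score(f_w, f_m, w_w=0.5, w_m=0.5):
    # Formula (7): f = w_w * f_w + w_m * f_m; the weights w_w and w_m
    # can be adjusted according to experimental results.
    return w_w * f_w + w_m * f_m

def select_similar(contexts, scores, top_k=None, threshold=None):
    # Keep either the top-k search contexts by fused score, or all search
    # contexts whose score exceeds a preset threshold.
    ranked = sorted(zip(contexts, scores), key=lambda p: p[1], reverse=True)
    if threshold is not None:
        return [c for c, s in ranked if s > threshold]
    return [c for c, _ in ranked[:top_k]]
```

Sorting ascending and taking the last entries, as the text also permits, is equivalent to the descending top-k shown here.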
Alternatively, the groups of contexts semantically similar to the target context can also be filtered out from the searched groups of contexts only according to the co-occurrence feature corresponding to each group of search contexts, or only according to the semantic feature corresponding to each group of search contexts.
Specifically, in one optional implementation, after the co-occurrence feature value f_w corresponding to each group of search contexts is calculated using the above formula (3), these co-occurrence feature values can be sorted from large to small, and the groups of search contexts corresponding to the top preset number (or preset value range) of co-occurrence feature values are selected as the groups of contexts semantically similar to the target context; alternatively, these co-occurrence feature values are sorted from small to large, and the groups of search contexts corresponding to the last preset number (or preset value range) of co-occurrence feature values are selected; alternatively, the groups of search contexts corresponding to all co-occurrence feature values exceeding a preset threshold are selected as the groups of contexts semantically similar to the target context.
In another optional implementation, after the semantic feature value f_m corresponding to each group of search contexts is calculated using the above formula (6), these semantic feature values can be sorted from large to small, and the groups of search contexts corresponding to the top preset number (or preset value range) of semantic feature values are selected as the groups of contexts semantically similar to the target context; alternatively, these semantic feature values are sorted from small to large, and the groups of search contexts corresponding to the last preset number (or preset value range) of semantic feature values are selected; alternatively, the groups of search contexts corresponding to all semantic feature values exceeding a preset threshold are selected as the groups of contexts semantically similar to the target context.
It should be noted that by using the co-occurrence words in the search contexts and the target context, the groups of contexts semantically similar to the target context can be filtered out more accurately from the groups of contexts searched out of the dialogue corpus as relevant to the target context, and these serve as the groups of corpus contexts.
S103: obtain the reply corpus corresponding to the problem corpus in each group of corpus contexts.
In the present embodiment, after the groups of corpus contexts semantically similar to the target context are obtained by step S102, the reply corpus corresponding to the problem corpus in each group of corpus contexts can be obtained from the pre-constructed dialogue corpus.
S104: choose at least one reply corpus as at least one candidate reply of the target problem.
In the present embodiment, after the reply corpus corresponding to the problem corpus in each group of corpus contexts is obtained by step S103, the semantic correlation between each reply corpus and the target context can be calculated, and then, according to the calculation results, at least one reply corpus can be selected as at least one candidate reply of the target problem.
It should be noted that the specific implementation of "choosing at least one reply corpus" in this step S104 will be introduced in the second embodiment.
To sum up, with the reply acquisition method provided in this embodiment, after the target context is obtained, the groups of corpus contexts semantically similar to the target context are obtained first; then, after the reply corpus corresponding to the problem corpus in each group of corpus contexts is obtained, at least one reply corpus can be chosen from them as at least one candidate reply of the target problem. It can be seen that the present embodiment obtains the candidate replies of the target problem based on the groups of corpus contexts semantically similar to the target context, so that the obtained candidate replies can respond to the key content of the target problem; the final reply of the target problem can then be filtered out from these candidate replies, thereby meeting the dialogue demand of the questioner and improving the reasonability of the reply acquisition result.
Second embodiment
The present embodiment introduces the specific implementation process of "choosing at least one reply corpus" in step S104 of the first embodiment.
It can be understood that only when the reply of a problem has a sufficiently high semantic correlation with the problem, or even with the context of the problem, can it be guaranteed that the reply is a reasonable reply that responds to the key content of the problem, rather than a meaningless, merely high-frequency reply such as "I don't know" that is only barely semantically relevant to the problem.
Based on this, in the present embodiment, one optional implementation is that the specific implementation process of "choosing at least one reply corpus" in step S104 may include: choosing at least one reply corpus by analyzing the correlation between the target context and each reply corpus.
In this implementation, a semantic correlation calculation method that exists now or appears in the future can be used to calculate the correlation between the target context and each reply corpus, and at least one reply corpus is then selected according to the calculation results. For example, a pre-constructed correlation model can be used, or a correlation calculation can be applied directly, to determine the correlation between the target context and each reply corpus, and at least one reply corpus is then selected according to the correlation determination results.
It should be noted that, in what follows, the present embodiment takes one of the reply corpora obtained by step S103 as an example to introduce how to determine the correlation between the target context and this reply corpus using the pre-constructed correlation model; the other reply corpora are processed similarly and will not be described one by one.
Specifically, the correlation model pre-constructed in the present embodiment can be composed of a multi-layer network. As shown in Fig. 2, the model structure includes an input layer, an embedding layer, a representation layer and a matching layer.
Wherein, the input layer is used to input the reply corpus and the target context. Specifically, as shown in Fig. 2, this reply corpus can be input at the "true reply" position on the left side of the input layer in Fig. 2, while the "target context" is input at the "context" position in the middle of the input layer in Fig. 2.
The function of the embedding layer is to convert each word in the reply corpus and the target context in the input layer into a corresponding term vector. Specifically, as shown in Fig. 2, in the embedding layer, the term vector corresponding to each word in the reply corpus and the target context can be obtained by looking up the pre-trained term vector dictionary (Embedding Matrix); that is, the respective word sequences of the reply corpus and the target context are converted into term vector sequences.
The function of the representation layer is to encode the term vector sequences, output by the embedding layer, corresponding to the reply corpus and the target context, so as to obtain the coding vectors corresponding to the reply corpus and the target context. For example, in the representation layer, a bag-of-words model (BOW for short), a convolutional neural network (CNN for short) or a recurrent neural network (RNN for short) can be used to encode the term vector sequences corresponding to the reply corpus and the target context output by the embedding layer, so as to obtain one coding vector for each of the two. The specific coding process of the model is consistent with existing methods and will not be described in detail here.
The function of the matching layer is to perform matching calculation on the coding vectors, output by the representation layer, corresponding to the reply corpus and the target context, and to determine the correlation between the target context and this reply corpus according to the calculation result. Specifically, the cosine distance value between the coding vectors corresponding to the reply corpus and the target context output by the representation layer can be calculated as the matching calculation result; the larger this result value, the higher the correlation between the target context and this reply corpus. Alternatively, a pre-trained multi-layer fully-connected network (Multi-Layer Perceptron, MLP for short) can be used to perform matching calculation on the coding vectors corresponding to the reply corpus and the target context output by the representation layer, so as to obtain the matching calculation result.
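A minimal sketch of the representation and matching layers described above, using the BOW variant (averaging term vectors) and cosine matching; the embedding table and function names are illustrative assumptions, and a real model would use trained embeddings and possibly a CNN, RNN or MLP as the text notes.

```python
import math

def bow_encode(word_seq, embeddings):
    # Representation layer (BOW variant): average the term vectors of the
    # sequence to obtain one coding vector.
    dim = len(next(iter(embeddings.values())))
    vecs = [embeddings[w] for w in word_seq if w in embeddings]
    if not vecs:
        return [0.0] * dim
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

def match_score(context_words, reply_words, embeddings):
    # Matching layer: cosine distance between the two coding vectors;
    # a larger value indicates a higher correlation.
    u = bow_encode(context_words, embeddings)
    v = bow_encode(reply_words, embeddings)
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0
```

The same `match_score` would be called once per candidate reply corpus, with the scores then sorted as described below.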
Furthermore, after the matching calculation results between the target context and each reply corpus are determined using the above pre-constructed correlation model, these matching calculation results can be sorted from large to small, and at least one reply corpus corresponding to the top preset number of matching calculation result values is selected as at least one candidate reply of the target problem; alternatively, these matching calculation results are sorted from small to large, and at least one reply corpus corresponding to the last preset number of matching calculation result values is selected as at least one candidate reply of the target problem; alternatively, at least one reply corpus corresponding to all matching calculation result values exceeding a preset threshold is selected as at least one candidate reply of the target problem.
In the present embodiment, one optional implementation is that the correlation model is obtained by training with model training data, wherein the model training data include each sample context, and the true reply and random reply of the sample problem included in the sample context; the sample context includes the sample problem and the historical dialogue text before the sample problem.
Next, the present embodiment will introduce the building process of the correlation model, which may specifically include the following steps C1 to C3:
Step C1: form the model training data.
In the present embodiment, in order to construct the correlation model, firstly, a large number of human dialogue contexts need to be collected in advance (these dialogue contexts may constitute the above-mentioned dialogue corpus); for example, a large amount of real dialogue data generated by people on social network platforms such as microblogs and forums can be collected in advance. Then, the last problem in each human dialogue context is taken as a sample problem, and the text consisting of the sample problem and the historical dialogue preceding it is defined as the sample context; each sample problem corresponds to a true reply content and a random reply content. The data thus collected serve as the model training data.
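The formation of training samples in step C1 can be sketched as follows; the data layout (a list of `(turns, true_reply)` pairs and a pool of replies to draw negatives from) is a hypothetical arrangement, not prescribed by the patent.

```python
import random

def build_training_samples(dialogues, reply_pool, seed=0):
    # Each element of `dialogues` is (turns, true_reply): the last turn is
    # the sample problem, and the turns together form the sample context.
    # A random reply is drawn from the pool as the negative example.
    rng = random.Random(seed)
    samples = []
    for turns, true_reply in dialogues:
        negatives = [r for r in reply_pool if r != true_reply]
        samples.append({
            "context": list(turns),       # history + sample problem
            "true_reply": true_reply,
            "random_reply": rng.choice(negatives),
        })
    return samples
```

Each resulting sample pairs one context with both a true and a random reply, matching the contrastive setup used in step C3.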
Step C2: construct the correlation model.
An initial correlation model can be constructed, and its model parameters can be initialized.
It should be noted that the present embodiment does not restrict the execution order of step C1 and step C2.
Step C3: train the correlation model using the model training data collected in advance.
In the present embodiment, after the model training data are collected by step C1, the model training data can be used to train the correlation model constructed by step C2; the training runs for multiple rounds until the training termination condition is met, at which point the trained correlation model is obtained.
Specifically, when carrying out the current round of training, a sample context needs to be selected from the above model training data. At this time, the target context in the above embodiment is replaced with this sample context, and the reply corpus obtained by step S103 in the above embodiment is replaced with the true reply content included in this sample context, so that the correlation between this sample context and the true reply content is determined in the way introduced in the above embodiment; for the detailed process, refer to the left and middle columns of the block diagram shown in Fig. 2.
Meanwhile the reply corpus got in above-described embodiment by step S103 can be replaced with to sample institute above
Including random reply content determined in the sample context and the random reply in the way of above-described embodiment introduction
Correlation between appearance, detailed process can refer among Fig. 2 and shown in two column block diagram of right side.
Then, on the basis of the correlation between this sample context and the true reply content and the correlation between this sample context and the random reply content, the parameters of the correlation model are updated by comparing the difference between the two, which completes the current round of training of the correlation model.
In the current round of training, one optional implementation is that an objective function can be used in the training process of the correlation model; for example, the hinge loss function or the binary cross entropy (binary_crossentropy) can be used as the objective function, so as to widen, on the basis of the correlation between the sample context and the true reply and the correlation between the sample context and the random reply, the gap between the two, so that the correlation model acquires the ability to distinguish reasonable replies from unreasonable ones.
Also, when an objective function such as hinge loss or binary_crossentropy is used to train the correlation model, the model parameters of the correlation model can be updated continuously according to the change of the objective function value; for example, the back-propagation algorithm can be used to update the model parameters until the value of the objective function meets the requirements (for example, it tends to 0 or its variation amplitude becomes very small), at which point the update of the model parameters is stopped, thereby completing the training of the correlation model.
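The hinge loss mentioned above, applied to the pair of correlation scores produced for the true and random replies of one sample context, can be written as a small sketch; the margin value is an illustrative assumption.

```python
def hinge_loss(pos_score, neg_score, margin=1.0):
    # Pairwise hinge loss: push the (context, true reply) score above the
    # (context, random reply) score by at least `margin`; the loss becomes
    # zero once the gap is wide enough, so training widens the gap.
    return max(0.0, margin - (pos_score - neg_score))
```

Summing this loss over training samples and back-propagating its gradient is the parameter update described in the paragraph above.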
To sum up, the present embodiment first determines the correlation between the target context and each reply corpus using the pre-constructed correlation model, and then selects at least one reply corpus according to the correlation determination results as the candidate replies of the target problem. This guarantees that the obtained candidate replies are semantically related to the target context in terms of content, and also guarantees that these candidate replies respond to the key content of the target problem rather than being meaningless replies, or merely high-frequency replies that are only barely semantically relevant to the target context, thereby meeting the dialogue demand of the questioner.
Third embodiment
For ease of understanding, the present embodiment introduces the whole realization process of the reply acquisition method provided by the embodiment of the present application in combination with the structural schematic diagram shown in Fig. 3.
As shown in Fig. 3, the structure includes a reply obtaining module, which is used to obtain at least one candidate reply of the target problem.
Specifically, the whole realization process of the embodiment of the present application is as follows. Firstly, the target context can be obtained, where the target context includes the target problem input by the questioner by way of voice or text and the historical dialogue text before the target problem. Then, the reply obtaining module can be used to filter out, from the pre-constructed dialogue corpus and according to the obtained target context, the groups of contexts semantically similar to the target context as the groups of corpus contexts, and to obtain the reply corpus corresponding to the problem corpus in these groups of corpus contexts. Then, the reply obtaining module can perform correlation calculation using these obtained reply corpora and the target context formed by the target problem and the historical dialogue text before it, so as to select at least one reply corpus according to the correlation determination results as the candidate replies of the target problem. This guarantees that the obtained candidate replies are semantically related to the target context in terms of content and are reasonable replies that respond to the key content of the target problem, rather than meaningless, merely high-frequency replies that are only barely semantically relevant to the target context, thereby meeting the dialogue demand of the questioner. Further, each obtained candidate reply can be output by way of voice and/or text, so that the final reply of the target problem can be selected from the candidate replies. It should be noted that the specific reply acquisition process refers to the detailed description of steps S101 to S104 in the first embodiment and the second embodiment.
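The overall flow of steps S101 to S104 described in this embodiment can be sketched end to end as follows; the corpus layout, threshold, and the pluggable `similarity_fn`/`relevance_fn` callables are illustrative assumptions standing in for the feature-based filtering and the correlation model.

```python
def get_candidate_replies(target_context, corpus, similarity_fn, relevance_fn, k=3):
    # S102: keep corpus contexts semantically similar to the target context
    # (`corpus` is a hypothetical list of (context, reply) pairs).
    similar = [(ctx, reply) for ctx, reply in corpus
               if similarity_fn(target_context, ctx) > 0.5]
    # S103/S104: rank the corresponding reply corpora by their correlation
    # with the target context and keep the top k as candidate replies.
    ranked = sorted(similar, key=lambda p: relevance_fn(target_context, p[1]),
                    reverse=True)
    return [reply for _, reply in ranked[:k]]
```

In a real system, `similarity_fn` would combine the co-occurrence and semantic features of formula (7), and `relevance_fn` would be the pre-constructed correlation model of the second embodiment.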
Fourth embodiment
The present embodiment introduces a reply acquisition device; for related content, refer to the above method embodiments.
Referring to Fig. 4, which is a composition schematic diagram of the reply acquisition device provided in this embodiment, the device includes:
A target context acquiring unit 401, configured to obtain a target context, where the target context includes a target problem proposed by a questioner and the historical dialogue text before the target problem;
A corpus context acquiring unit 402, configured to obtain groups of corpus contexts semantically similar to the target context, where each corpus context includes a problem corpus and the historical dialogue text before the problem corpus;
A reply corpus acquiring unit 403, configured to obtain the reply corpus corresponding to the problem corpus in each group of corpus contexts;
A reply corpus selection unit 404, configured to choose at least one reply corpus as at least one candidate reply of the target problem.
In one implementation of the present embodiment, the corpus context acquiring unit 402 includes:
A contextual search subunit, configured to search out, from the pre-constructed dialogue corpus, groups of contexts relevant to the target context;
A context screening subunit, configured to filter out, from the searched groups of contexts, the groups of contexts semantically similar to the target context as the groups of corpus contexts.
In one implementation of the present embodiment, the context screening subunit includes:
A search context definition subunit, configured to define each searched group of contexts as a search context;
A contextual feature generation subunit, configured to generate the contextual feature corresponding to the search context;
wherein the contextual feature includes a co-occurrence feature and/or a semantic feature, the co-occurrence feature characterizes the importance, in the search context and the target context, of the co-occurrence words in the search context and the target context, and the semantic feature characterizes the semantic similarity between the search context and the target context;
A context group screening subunit, configured to filter out, from the searched groups of contexts and according to the contextual feature corresponding to each searched group of contexts, the groups of contexts semantically similar to the target context.
In one implementation of the present embodiment, the reply corpus selection unit 404 is specifically configured to:
choose at least one reply corpus by analyzing the correlation between the target context and each reply corpus.
In one implementation of the present embodiment, the reply corpus selection unit 404 is specifically configured to:
select at least one reply corpus using the pre-constructed correlation model.
In one implementation of the present embodiment, the correlation model is obtained by training with model training data; the model training data include each sample context, and the true reply and random reply of the sample problem included in the sample context, wherein the sample context includes the sample problem and the historical dialogue text before the sample problem.
Further, the embodiment of the present application also provides a reply obtaining apparatus, comprising: a processor, a memory, and a system bus;
the processor and the memory are connected by the system bus;
the memory is used for storing one or more programs, the one or more programs include instructions, and the instructions, when executed by the processor, cause the processor to execute any implementation of the above reply acquisition method.
Further, the embodiment of the present application also provides a computer-readable storage medium in which instructions are stored; when the instructions are run on a terminal device, the terminal device is caused to execute any implementation of the above reply acquisition method.
Further, the embodiment of the present application also provides a computer program product which, when run on a terminal device, causes the terminal device to execute any implementation of the above reply acquisition method.
As can be seen from the above description of the embodiments, those skilled in the art can clearly understand that all or part of the steps in the above embodiment methods can be realized by means of software plus a necessary general hardware platform. Based on such understanding, the part of the technical solution of the application that essentially contributes to the existing technology can be embodied in the form of a software product; the computer software product can be stored in a storage medium such as ROM/RAM, magnetic disk or optical disc, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network communication device such as a media gateway, etc.) to execute the methods described in each embodiment of the application or in certain parts of the embodiments.
It should be noted that each embodiment in this specification is described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts among the embodiments may refer to each other. Since the device disclosed in the embodiments corresponds to the method disclosed in the embodiments, it is described relatively simply; for related details, refer to the description of the method part.
It should also be noted that, herein, relational terms such as first and second are only used to distinguish one entity or operation from another entity or operation, without necessarily requiring or implying any actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements, but also other elements not explicitly listed, or further includes elements intrinsic to this process, method, article or device. In the absence of more restrictions, an element limited by the sentence "including a ..." does not exclude the existence of other identical elements in the process, method, article or device including the element.
The foregoing description of the disclosed embodiments enables those skilled in the art to realize or use the application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the general principles defined herein can be realized in other embodiments without departing from the spirit or scope of the application. Therefore, the application is not intended to be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (13)
1. A reply acquisition method, characterized by comprising:
obtaining a target context, the target context including a target problem proposed by a questioner and the historical dialogue text before the target problem;
obtaining groups of corpus contexts semantically similar to the target context, each corpus context including a problem corpus and the historical dialogue text before the problem corpus;
obtaining the reply corpus corresponding to the problem corpus in each group of corpus contexts;
choosing at least one reply corpus as at least one candidate reply of the target problem.
2. The method according to claim 1, characterized in that the obtaining groups of corpus contexts semantically similar to the target context comprises:
searching out, from a pre-constructed dialogue corpus, groups of contexts relevant to the target context;
filtering out, from the searched groups of contexts, the groups of contexts semantically similar to the target context as the groups of corpus contexts.
3. The method according to claim 2, characterized in that the filtering out, from the searched groups of contexts, the groups of contexts semantically similar to the target context comprises:
defining each searched group of contexts as a search context;
generating a contextual feature corresponding to the search context;
wherein the contextual feature includes a co-occurrence feature and/or a semantic feature, the co-occurrence feature characterizes the importance, in the search context and the target context, of the co-occurrence words in the search context and the target context, and the semantic feature characterizes the semantic similarity between the search context and the target context;
filtering out, from the searched groups of contexts and according to the contextual feature corresponding to each searched group of contexts, the groups of contexts semantically similar to the target context.
4. The method according to any one of claims 1 to 3, characterized in that the choosing at least one reply corpus comprises:
choosing at least one reply corpus by analyzing the correlation between the target context and each reply corpus.
5. The method according to claim 4, characterized in that the choosing at least one reply corpus comprises:
selecting at least one reply corpus using a pre-constructed correlation model.
6. The method according to claim 5, characterized in that the correlation model is obtained by training with model training data, the model training data including each sample context and the true reply and random reply of the sample problem included in the sample context; wherein the sample context includes the sample problem and the historical dialogue text before the sample problem.
7. A reply acquisition device, characterized by comprising:
a target context acquiring unit, configured to obtain a target context, the target context including a target problem proposed by a questioner and the historical dialogue text before the target problem;
a corpus context acquiring unit, configured to obtain groups of corpus contexts semantically similar to the target context, each corpus context including a problem corpus and the historical dialogue text before the problem corpus;
a reply corpus acquiring unit, configured to obtain the reply corpus corresponding to the problem corpus in each group of corpus contexts;
a reply corpus selection unit, configured to choose at least one reply corpus as at least one candidate reply of the target problem.
8. The device according to claim 7, characterized in that the corpus context acquiring unit comprises:
a contextual search subunit, configured to search out, from a pre-constructed dialogue corpus, groups of contexts relevant to the target context;
a context screening subunit, configured to filter out, from the searched groups of contexts, the groups of contexts semantically similar to the target context as the groups of corpus contexts.
9. The device according to claim 8, wherein the context screening subunit comprises:
a search context defining subunit, configured to define each searched group of contexts as a search context;
a contextual feature generating subunit, configured to generate a contextual feature corresponding to the search context, wherein the contextual feature comprises a co-occurrence feature and/or a semantic feature, the co-occurrence feature characterizing the importance, in the search context and the target context, of the words co-occurring in the search context and the target context, and the semantic feature characterizing the semantic similarity between the search context and the target context;
a group screening subunit, configured to screen out, from the searched groups of contexts and according to the contextual feature corresponding to each searched group of contexts, the groups of contexts that are semantically similar to the target context.
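The two feature types of claim 9 can be illustrated as follows: a co-occurrence feature that weights the shared words by an importance score (IDF here, as one possible choice), and a semantic feature computed as cosine similarity over bag-of-words vectors. The particular weighting and names are illustrative assumptions; the patent does not pin down these formulas:

```python
import math
from collections import Counter

def co_occurrence_feature(search_ctx, target_ctx, idf):
    """Sum the importance (IDF) of words occurring in BOTH contexts,
    normalised by the target context length (illustrative weighting)."""
    shared = set(search_ctx) & set(target_ctx)
    if not target_ctx:
        return 0.0
    return sum(idf.get(w, 0.0) for w in shared) / len(target_ctx)

def semantic_feature(search_ctx, target_ctx):
    """Cosine similarity between bag-of-words vectors; a real system
    would typically use sentence embeddings instead."""
    a, b = Counter(search_ctx), Counter(target_ctx)
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

idf = {"refund": 2.0, "how": 0.1, "do": 0.1, "i": 0.1, "get": 0.5, "a": 0.1}
target = ["how", "do", "i", "get", "a", "refund"]
search = ["how", "do", "i", "request", "a", "refund"]
```

Feeding both features to the group screening subunit lets a surface-overlap signal and a distributional-similarity signal complement each other.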
10. The device according to any one of claims 7 to 9, wherein the reply corpus selection unit is specifically configured to:
select at least one reply corpus by analyzing the correlation between the target context and each reply corpus.
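The selection in claim 10 amounts to ranking the candidate reply corpora by a correlation score against the target context and keeping the top ones. A sketch with a word-overlap scorer standing in for the trained correlation model of claims 5 and 6 (all names are illustrative):

```python
def select_replies(target_context, reply_corpora, score_fn, top_k=1):
    """Rank candidate reply corpora by their correlation with the
    target context and return the top_k as candidate replies."""
    ranked = sorted(reply_corpora,
                    key=lambda r: score_fn(target_context, r),
                    reverse=True)
    return ranked[:max(1, top_k)]  # always return at least one reply

# Stand-in scorer: count of words shared with the target context.
def overlap_score(context, reply):
    ctx_words = set(" ".join(context).split())
    return len(ctx_words & set(reply.split()))

context = ["how do i reset my password"]
candidates = ["click reset under password settings",
              "our office opens at 9am",
              "contact support to reset it"]
best = select_replies(context, candidates, overlap_score, top_k=1)
```

Swapping `overlap_score` for a learned scorer changes nothing about the ranking logic, which is why the selection unit can be specified independently of the correlation model.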
11. A reply acquisition apparatus, comprising: a processor, a memory, and a system bus;
wherein the processor and the memory are connected via the system bus;
and the memory is configured to store one or more programs, the one or more programs comprising instructions that, when executed by the processor, cause the processor to perform the method according to any one of claims 1 to 6.
12. A computer-readable storage medium, wherein instructions are stored in the computer-readable storage medium, and when the instructions are run on a terminal device, the instructions cause the terminal device to perform the method according to any one of claims 1 to 6.
13. A computer program product, wherein when the computer program product is run on a terminal device, the computer program product causes the terminal device to perform the method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910351022.5A CN110069613A (en) | 2019-04-28 | 2019-04-28 | A kind of reply acquisition methods and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110069613A true CN110069613A (en) | 2019-07-30 |
Family
ID=67369389
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910351022.5A Pending CN110069613A (en) | 2019-04-28 | 2019-04-28 | A kind of reply acquisition methods and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110069613A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111914565A (en) * | 2020-07-15 | 2020-11-10 | 海信视像科技股份有限公司 | Electronic equipment and user statement processing method |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101566998A (en) * | 2009-05-26 | 2009-10-28 | 华中师范大学 | Chinese question-answering system based on neural network |
US20160306852A1 (en) * | 2015-03-11 | 2016-10-20 | International Business Machines Corporation | Answering natural language table queries through semantic table representation |
CN107305578A (en) * | 2016-04-25 | 2017-10-31 | 北京京东尚科信息技术有限公司 | Human-machine intelligence's answering method and device |
CN108170749A (en) * | 2017-12-21 | 2018-06-15 | 北京百度网讯科技有限公司 | Dialogue method, device and computer-readable medium based on artificial intelligence |
CN109033318A (en) * | 2018-07-18 | 2018-12-18 | 北京市农林科学院 | Intelligent answer method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111046132B (en) | Customer service question-answering processing method and system for searching multiple rounds of conversations | |
CN110175227B (en) | Dialogue auxiliary system based on team learning and hierarchical reasoning | |
CN106448670A (en) | Dialogue automatic reply system based on deep learning and reinforcement learning | |
CN110491416A (en) | It is a kind of based on the call voice sentiment analysis of LSTM and SAE and recognition methods | |
Sun et al. | Deep and shallow features fusion based on deep convolutional neural network for speech emotion recognition | |
CN105808590B (en) | Search engine implementation method, searching method and device | |
CN109597493A (en) | A kind of expression recommended method and device | |
CN111182162B (en) | Telephone quality inspection method, device, equipment and storage medium based on artificial intelligence | |
CN110727778A (en) | Intelligent question-answering system for tax affairs | |
CN111694940A (en) | User report generation method and terminal equipment | |
CN112989761B (en) | Text classification method and device | |
CN110008327A (en) | Law answers generation method and device | |
CN109325780A (en) | A kind of exchange method of the intelligent customer service system in E-Governance Oriented field | |
CN116070169A (en) | Model training method and device, electronic equipment and storage medium | |
CN109472030A (en) | A kind of system replys the evaluation method and device of quality | |
CN110597968A (en) | Reply selection method and device | |
CN110362651A (en) | Dialogue method, system, device and the storage medium that retrieval and generation combine | |
CN110069612A (en) | A kind of reply generation method and device | |
CN111783955A (en) | Neural network training method, neural network training device, neural network dialogue generating method, neural network dialogue generating device, and storage medium | |
CN112463944A (en) | Retrieval type intelligent question-answering method and device based on multi-model fusion | |
CN114153955A (en) | Construction method of multi-skill task type dialogue system fusing chatting and common knowledge | |
CN115497465A (en) | Voice interaction method and device, electronic equipment and storage medium | |
CN115221387A (en) | Enterprise information integration method based on deep neural network | |
CN113807103B (en) | Recruitment method, device, equipment and storage medium based on artificial intelligence | |
CN112100395B (en) | Expert cooperation feasibility analysis method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190730 |