CN110209783A - Chat response method and system, electronic device, and readable storage medium - Google Patents
Chat response method and system, electronic device, and readable storage medium
- Publication number
- Publication number: CN110209783A
- Application number: CN201910342725.1A
- Authority
- CN
- China
- Prior art keywords
- word
- chat
- output probability
- question
- response
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/33—Querying
- G06F16/332—Query formulation
- G06F16/3329—Natural language query formulation or dialogue systems
Abstract
The invention discloses a chat response method and system, an electronic device, and a readable storage medium. The method comprises: obtaining question information, and calling a trained chat model used to obtain response information; according to the question information and the chat model, applying a randomness transformation to the output probabilities of the words in the lexical set corresponding to the chat model; performing word selection based on the transformed target output probabilities to obtain the response information corresponding to the question information; and outputting the response information. Compared with the prior art, applying a randomness transformation to the output probabilities of the words in the lexical set corresponding to the chat model enhances the randomness of the selected words, and hence of the obtained response information, which effectively avoids excessively repetitive responses and improves the flexibility and interest of the responses.
Description
Technical field
The present invention relates to the computer field, and more particularly to a chat response method and system, an electronic device, and a readable storage medium.
Background technique
With the development of science and technology, artificial intelligence is gradually changing the way humans live; chat robots are one example. When a client interacts with a chat robot by text or voice, the chat robot can give the client an intelligent response.
However, current chat robots respond to questions in a relatively fixed manner, using greedy sampling, i.e., always selecting the most likely word. The defect of this approach is that the responses obtained for a given question have an excessively high repetition rate and hold no surprise for a human, so the flexibility and interest of the responses are low.
Summary of the invention
The main purpose of the present invention is to provide a chat response method and system, an electronic device, and a readable storage medium, intended to solve the technical problem in the prior art that responses have an excessively high repetition rate and low flexibility and interest.
To achieve the above object, a first aspect of the present invention provides a chat response method, comprising:
obtaining question information, and calling a trained chat model used to obtain response information;
according to the question information and the chat model, applying a randomness transformation to the output probabilities of the words in the lexical set corresponding to the chat model, and performing word selection based on the transformed target output probabilities, to obtain the response information corresponding to the question information;
outputting the response information corresponding to the question information.
To achieve the above object, a second aspect of the present invention provides a chat response system, comprising:
an obtaining and calling module, for obtaining question information and calling a trained chat model used to obtain response information;
a stochastic transformation module, for applying, according to the question information and the chat model, a randomness transformation to the output probabilities of the words in the lexical set corresponding to the chat model, and performing word selection based on the transformed target output probabilities, to obtain the response information corresponding to the question information;
an output module, for outputting the response information corresponding to the question information.
To achieve the above object, a third aspect of the present invention provides an electronic device, comprising: a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein when the processor executes the computer program, each step of the chat response method described in the first aspect is realized.
To achieve the above object, a fourth aspect of the present invention provides a readable storage medium on which a computer program is stored, wherein when the computer program is executed by a processor, each step of the chat response method described in the first aspect is realized.
The present invention provides a chat response method, comprising: obtaining question information and calling a trained chat model used to obtain response information; according to the question information and the chat model, applying a randomness transformation to the output probabilities of the words in the lexical set corresponding to the chat model; performing word selection based on the transformed target output probabilities to obtain the response information corresponding to the question information; and outputting the response information. Compared with the prior art, applying a randomness transformation to the output probabilities of the words in the lexical set corresponding to the chat model enhances the randomness of the selected words, and hence of the obtained response information, which effectively avoids excessively repetitive responses and improves the flexibility and interest of the responses.
Detailed description of the invention
In order to explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the accompanying drawings in the following description show only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flow diagram of the training method of the chat model in an embodiment of the present invention;
Fig. 2 is a flow diagram of the chat response method in an embodiment of the present invention;
Fig. 3 is a flow diagram of the refinement steps of step 202 in the embodiment shown in Fig. 2;
Fig. 4 is a flow diagram of the refinement steps of step 303 in the embodiment shown in Fig. 3;
Fig. 5 is a structural diagram of the chat response system in an embodiment of the present invention;
Fig. 6 is a structural diagram of the electronic device in an embodiment of the present invention.
Specific embodiment
In order to make the purpose, features, and advantages of the present invention more obvious and easy to understand, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative work shall fall within the protection scope of the present invention.
In the embodiments of the present invention, the above chat response method can be realized by a chat response system. The system can consist of program modules stored in the readable storage medium of a device with a chat response function; when a chat response is to be performed, the processor can call and run the chat response system in the storage medium to realize the chat response method. The chat response method uses a trained chat model for obtaining response information; in order to better understand the technical solutions in the embodiments of the present invention, the training method of the chat model is explained first.
Referring to Fig. 1, which is a flow diagram of the training method of the chat model in an embodiment of the present invention, it can be understood that the training method of the chat model belongs to a part of the chat response method, and comprises:
Step 101: obtaining multiple groups of question-and-answer sample pairs, wherein one group of question-and-answer sample pair includes question sample information and corresponding response sample information.
In the embodiments of the present invention, the model to be trained is referred to as the initial chat model. It can be an LSTM (Long Short-Term Memory) network or another recurrent neural network model; in practical applications, the specific type of chat model can be selected as needed, without limitation herein. By training this model, a chat model for obtaining response information is obtained.
Training a model requires first preparing samples. In the embodiments of the present application, multiple groups of question-and-answer sample pairs can be obtained, wherein one group of question-and-answer sample pair includes question sample information and corresponding response sample information; the question-and-answer sample pairs can be crawled from the network or collected manually in advance. It should be noted that the question sample information in a question-and-answer sample pair does not necessarily refer to a question: it can be an interrogative sentence or another type of sentence, for example a declarative sentence, as occurs in actual dialogue.
Step 102: for the h-th group of question-and-answer sample pair, performing word segmentation processing on the response sample information of the h-th group of question-and-answer sample pair, to obtain Q words;
wherein the value of h is in [1, H], H is the total number of question-and-answer sample pairs, and Q and H are positive integers.
Step 103: constituting an input set from the question sample information of the h-th group of question-and-answer sample pair and the preceding q words of the Q words, and taking the (q+1)-th word as output, to obtain Q training samples of the h-th group of question-and-answer sample pair;
wherein the value of q is in [1, Q-1].
The question-and-answer sample pairs cannot be used directly for training; instead, the training samples used for training are obtained by processing the question-and-answer sample pairs. Assuming the total number of question-and-answer sample pairs is H groups, the h-th group is taken as an example for ease of understanding, wherein the value of h is in [1, H], and h and H are positive integers.
For the h-th group of question-and-answer sample pair, word segmentation processing is performed on its response sample information to obtain Q words. After the Q words are obtained, the question sample information of the h-th group of question-and-answer sample pair and the preceding q of the Q words constitute an input set, and the (q+1)-th word serves as the output, yielding the Q training samples of the h-th group of sample pair.
For example, for a question-and-answer sample pair comprising question A and answer B, answer B is first segmented. If the segmentation result is {b1, b2, b3, b4, ..., bQ}, then question A serves as input and b1 as output to constitute one training sample; A+b1 serves as input and b2 as output to constitute another training sample; A+b1+b2 serves as input and b3 as output to constitute a further training sample; and so on, until A+b1+b2+b3+b4+...+b(Q-1) serves as input and bQ as output, at which point Q training samples have been obtained.
For another example, in one group of question-and-answer sample pair the question sample information is "Do you like me" and the response sample information is "I like you". The response sample information is first segmented into "I", "like", and "you"; then 3 training samples are available: input "Do you like me" with output "I"; input "Do you like me" + "I" with output "like"; and input "Do you like me" + "I" + "like" with output "you".
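The sample-construction procedure of steps 102-103 can be sketched as follows; the function name is illustrative, and segmentation is assumed to have already been performed.

```python
def build_training_samples(question, answer_words):
    """Build (input, output) training samples from one Q&A sample pair.

    question     -- the question sample information (a string)
    answer_words -- the Q words of the segmented response sample information

    The first sample uses the question alone as input with the first answer
    word as output; each subsequent sample appends one more answer word to
    the input set, exactly as in the worked examples above.
    """
    samples = []
    prefix = [question]
    for word in answer_words:
        samples.append((list(prefix), word))  # copy the current input set
        prefix.append(word)                   # grow the input by this word
    return samples

# The "Do you like me" / "I like you" example yields 3 training samples:
samples = build_training_samples("Do you like me", ["I", "like", "you"])
assert len(samples) == 3
assert samples[0] == (["Do you like me"], "I")
assert samples[2] == (["Do you like me", "I", "like"], "you")
```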
Step 104: training the initial chat model using the training samples of the multiple groups of question-and-answer sample pairs, to obtain the chat model.
In the embodiments of the present invention, after all training samples of the multiple groups of question-and-answer sample pairs are obtained, the initial chat model is trained using these training samples to obtain the chat model.
Further, the lexical set corresponding to the trained chat model is also obtained. Specifically, all response sample informations of the H groups of question-and-answer sample pairs are segmented and gathered into a set, and deduplication processing is performed on the words in that set to obtain the lexical set. For example, if the word "F" appears in N segmented response sample informations, the gathered set contains N copies of the word "F"; deduplication processing deletes the N-1 duplicate copies and leaves one word "F".
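The deduplication step can be sketched as follows; the function name is illustrative, and first-appearance order is one arbitrary but convenient choice.

```python
def build_lexical_set(segmented_responses):
    """Build the deduplicated lexical set from all segmented responses.

    segmented_responses -- one word list per response sample information.
    Duplicate words across responses are kept only once, in order of
    first appearance.
    """
    lexical_set = []
    seen = set()
    for words in segmented_responses:
        for w in words:
            if w not in seen:
                seen.add(w)
                lexical_set.append(w)
    return lexical_set

responses = [["I", "like", "you"], ["you", "like", "F"], ["F", "F"]]
vocab = build_lexical_set(responses)
assert vocab.count("F") == 1                 # N copies of "F" reduced to one
assert vocab == ["I", "like", "you", "F"]
```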
In the embodiments of the present invention, performing word segmentation processing and input-output training sample combination for each of the multiple groups of question-and-answer sample pairs effectively yields the training samples used for training the initial chat model, and the trained chat model can then be used, in a word-by-word prediction manner, to obtain the response information for a question input to it.
The process of realizing a chat response using the above chat model is described in detail below. Referring to Fig. 2, which is a flow diagram of the chat response method in an embodiment of the present invention, the method comprises:
Step 201: obtaining question information, and calling a trained chat model used to obtain response information;
Step 202: according to the question information and the chat model, applying a randomness transformation to the output probabilities of the words in the lexical set corresponding to the chat model, and performing word selection based on the transformed target output probabilities, to obtain the response information corresponding to the question information;
Step 203: outputting the response information corresponding to the question information.
There are many ways to obtain the question information: it can be entered by the questioner on the display screen of the device; or the questioner's voice can be collected and speech analysis performed on it to obtain the question information; or the question information sent by another device can be received over a wired or wireless network.
Further, the question information is input into the trained chat model, and the chat model outputs the output probability of each word in its corresponding lexical set. A randomness transformation is applied to the output probability of each word in the lexical set, and word selection is performed based on the transformed target output probabilities, so as to obtain the response information corresponding to the question information, which is finally output.
The randomness transformation transforms the output probability of each word; it enhances the randomness of the response, effectively avoids excessively repetitive responses, and improves the flexibility and interest of the responses.
It should be noted that, similarly to the training process of the chat model, the chat model in use obtains a complete response by selecting the next word one at a time; therefore, for one piece of question information, the chat model needs to be used at least once before the response information is obtained.
In the embodiments of the present invention, applying a randomness transformation to the output probabilities of the words in the lexical set corresponding to the chat model enhances the randomness of the selected words and effectively avoids excessively repetitive responses.
Referring to Fig. 3, which is a flow diagram of the refinement steps of step 202 in the embodiment shown in Fig. 2, comprising:
Step 301: inputting the question information and the response word set obtained so far into the chat model, to obtain the output probability of each word in the lexical set corresponding to the chat model;
Step 302: applying a randomness transformation to the output probability of each word in the lexical set, to obtain the transformed target output probability of each word;
Step 303: performing word selection based on the target output probability of each word, adding the selected word to the response word set, and returning to step 301.
For each piece of question information, a response word set is established, whose initial value is empty. The response word set holds, in order, the words selected based on the chat model and the randomness transformation, and the response information corresponding to the question information is obtained by ordering the words in the response word set according to their selection sequence.
Specifically, the question information and the response word set obtained so far are input into the chat model, to obtain the output probability of each word in the lexical set corresponding to the chat model. It can be understood that when the response word set is empty, the question information alone is input to the chat model.
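The word-by-word loop of steps 301-303 can be sketched as follows. The model interface (`model_probs`) and the stand-in selection rule (weighted sampling) are assumptions for illustration only; the patent's actual selection procedure, detailed in Fig. 4 below, traverses a sorted sequence instead.

```python
import random

def generate_response(question, model_probs, max_words=20):
    """Word-by-word response generation (steps 301-303), as a sketch.

    model_probs(question, response_words) stands in for the trained chat
    model: it must return a dict mapping each word of the lexical set
    (plus the empty word "") to its output probability.  As a simple
    stand-in for the randomness-transformed selection, we sample a word
    proportionally to its probability.
    """
    response_words = []              # the response word set starts empty
    for _ in range(max_words):
        probs = model_probs(question, response_words)
        words, weights = zip(*probs.items())
        chosen = random.choices(words, weights=weights, k=1)[0]
        if chosen == "":             # empty word: the response is complete
            break
        response_words.append(chosen)
    return " ".join(response_words)  # words joined in selection order

# Toy stand-in model that always answers "I like you" and then stops.
def toy_model(question, so_far):
    script = ["I", "like", "you", ""]
    return {script[len(so_far)]: 1.0}

assert generate_response("Do you like me", toy_model) == "I like you"
```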
A randomness transformation is then applied to the output probability of each word in the lexical set; the randomness transformation means transforming, on a random basis, the output probability of each word output by the chat model.
To realize the above randomness transformation, a random parameter value is needed; it can be randomly selected from a preset random parameter range, for example [0, 1]. After the random parameter value is obtained, the output probability of each word is transformed using a preset stochastic transformation algorithm and the random parameter value, to obtain the target output probability of each word.
The stochastic transformation algorithm is as follows:
For the n-th word, the transformed output probability of the n-th word is calculated using the following formula, where the initial value of n is 1, the maximum value of n is N, and N is the total number of words in the lexical set:
wherein P'_n denotes the transformed output probability of the n-th word, P_n denotes the output probability of the n-th word output by the chat model, and M denotes the random parameter value.
By using the above stochastic transformation algorithm, the target output probability of each word in the lexical set can be obtained.
Further, word selection is performed based on the target output probability of each word, and the selected word is added to the response word set; at this point the selection of one word is completed. It can be understood that if the word selected this time is not empty, word selection can continue, and the process returns to the above step 301 to select the next word. If the word selected this time is empty, the words in the response word set can form a response, and this round of word selection can end.
In the embodiments of the present invention, the above randomness transformation gives the chosen words a certain randomness, so the flexibility is stronger.
After the randomness transformation yields the target output probability of each word in the lexical set, word selection is performed based on the target output probability of each word, and the selected word is added to the response word set. Specifically, referring to Fig. 4, which is a flow diagram of the refinement steps of step 303 in the embodiment shown in Fig. 3, comprising:
Step 401: sorting the target output probabilities of the words in descending order, to obtain a sequence of the words;
Step 402: traversing the words in the sequence, and performing word selection on each traversed word according to its target output probability;
Step 403: if the traversed word is selected, adding the traversed word to the response word set, and returning to step 301;
Step 404: if no traversed word is selected and the traversed word is the last one in the sequence, constituting the response information from the words in the response word set according to the order of their selection.
In the embodiments of the present invention, the target output probabilities of the words can first be sorted, for example in descending order of target output probability, to obtain the sequence of the words in the lexical set.
Further, the sequence obtained by sorting is traversed, and word selection is performed on each traversed word according to its target output probability. For example, if the probability of the traversed word A is 0.6, the word's selection probability is 0.6, i.e., it is chosen with probability 0.6.
If the traversed word is selected, it is added to the response word set. It can be understood that the present application obtains the response information by predicting the next word one at a time; therefore, the words contained in the response word set are ordered according to the sequence in which they were selected.
If the currently traversed word is not selected, the next word is traversed and the above step 402 is returned to, with selection performed on it, until one word is chosen. Alternatively, if the currently traversed word is not selected and it is the last one of the sequence, the entire traversal has been completed and no word following the most recently added word in the response word set can be predicted; the words already in the response word set can then be used as the response information. Therefore, the words in the response word set are formed into the response information according to the order of their selection, and output. For example, if the response word set contains the words "I", "today", "am going", "on a business trip" in that order, the obtained response information is "I am going on a business trip today".
It can be understood that, in order to improve processing efficiency, after the words in the lexical set are sorted by target output probability to obtain the sequence of words, the words whose target output probability is 0 can first be rejected from the sequence.
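The sorted-traversal selection of steps 401-404, including the zero-probability pruning just mentioned, can be sketched as follows; the function name and the per-word acceptance test are illustrative readings of the text above.

```python
import random

def select_word(target_probs):
    """Select one word from the sorted sequence (steps 401-404).

    target_probs -- dict mapping word -> target output probability.
    Words are sorted in descending probability order, zero-probability
    words are rejected from the sequence first, and each traversed word
    is accepted with probability equal to its own target output
    probability.  If the traversal ends with no word selected, None is
    returned, signalling that the response is complete.
    """
    ordered = sorted(target_probs.items(), key=lambda kv: kv[1], reverse=True)
    ordered = [(w, p) for w, p in ordered if p > 0]   # prune zero-probability words
    for word, p in ordered:
        if random.random() < p:   # e.g. a word with probability 0.6 is chosen 60% of the time
            return word
    return None

random.seed(0)
picked = select_word({"A": 0.6, "B": 0.3, "C": 0.0})
assert picked in {"A", "B", None} and picked != "C"   # "C" was pruned from the sequence
```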
Based on the foregoing description, in the embodiments of the present invention, the chat response system obtains question information and calls a trained chat model used to obtain response information; according to the question information and the chat model, a randomness transformation is applied to the output probabilities of the words in the lexical set corresponding to the chat model, word selection is performed based on the transformed target output probabilities, the response information corresponding to the question information is obtained, and the response information is output. Compared with the prior art, applying a randomness transformation to the output probabilities of the words in the lexical set corresponding to the chat model enhances the randomness of the selected words, and hence of the obtained response information, which effectively avoids excessively repetitive responses and improves the flexibility and interest of the responses.
Referring to Fig. 5, which is a structural diagram of the chat response system in an embodiment of the present invention, comprising:
an obtaining and calling module 501, for obtaining question information and calling a trained chat model used to obtain response information;
a stochastic transformation module 502, for applying, according to the question information and the chat model, a randomness transformation to the output probabilities of the words in the lexical set corresponding to the chat model, and performing word selection based on the transformed target output probabilities, to obtain the response information corresponding to the question information;
an output module 503, for outputting the response information corresponding to the question information.
It can be understood that the steps performed by the modules in the embodiment shown in Fig. 5 are similar to those described in the method embodiments; for details, refer to the related descriptions in the method embodiments, which are not repeated here.
In the embodiments of the present invention, applying a randomness transformation to the output probabilities of the words in the lexical set corresponding to the chat model enhances the randomness of the selected words and of the obtained response information, effectively avoids excessively repetitive responses, and improves the flexibility and interest of the responses.
Further, the above stochastic transformation module 502 includes:
a probability output module, for inputting the question information and the response word set obtained so far into the chat model, to obtain the output probability of each word in the lexical set corresponding to the chat model, the initial value of the response word set being empty;
a transformation module, for applying a randomness transformation to the output probability of each word in the lexical set, to obtain the transformed target output probability of each word;
a selection module, for performing word selection based on the target output probability of each word, adding the selected word to the response word set, and returning to the step of inputting the question information and the response word set obtained so far into the chat model, until the response information corresponding to the question information is obtained.
The transformation module is specifically used for: randomly selecting a random parameter value from a preset random parameter range, and transforming the output probability of each word using a preset stochastic transformation algorithm and the random parameter value, to obtain the target output probability of each word.
Further, the transformation module is specifically used for: randomly selecting a random parameter value from a preset random parameter range, and, for the n-th word, calculating the transformed output probability of the n-th word using the following formula, where the initial value of n is 1, the maximum value of n is N, and N is the total number of words:
wherein P'_n denotes the transformed output probability of the n-th word, P_n denotes the output probability of the n-th word, and M denotes the random parameter value.
Further, the above selection module specifically includes:
a sorting module, for sorting the target output probabilities of the words in descending order, to obtain a sequence of the words;
a traversal selection module, for traversing the words in the sequence, and performing word selection on each traversed word according to its target output probability;
an adding and returning module, for, if the traversed word is selected, adding the traversed word to the response word set and returning to the above probability output module;
a constructing module, for, if no traversed word is selected and the traversed word is the last one in the sequence, constituting the response information from the words in the response word set according to the order of their selection.
In the embodiment of the present invention, the above chat model is obtained by training; therefore, the above chat response system further includes:
a sample acquisition module, configured to acquire multiple groups of question-answer sample pairs, where one group of question-answer samples includes question sample information and corresponding response sample information;
a word segmentation module, configured to, for the h-th group of question-answer sample pairs, perform word segmentation on the response sample information of the h-th group to obtain Q words, where h takes values in [1, H], H is the total number of question-answer sample pairs, and Q and H are positive integers;
a composition module, configured to combine the question sample information of the h-th group and the first q words of the Q words into an input set, take the (q+1)-th word as the output, and obtain the Q+1 training samples of the h-th group of question-answer sample pairs, where q takes values in [1, Q] and the (Q+1)-th word is empty;
a training module, configured to train an initial chat model using the training samples of the multiple groups of question-answer sample pairs to obtain the chat model.
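The sample construction described by the word segmentation and composition modules can be sketched as follows; treating the loop as zero-based (so that exactly Q+1 samples are produced) and using an empty string as the empty (Q+1)-th word are illustrative interpretations:

```python
def build_training_samples(question, answer_words):
    """Build the Q+1 training samples for one question-answer pair: the input
    is the question sample plus the first q answer words, the output is the
    (q+1)-th word, and the final sample's output is empty (end of response)."""
    samples = []
    q_total = len(answer_words)                          # Q
    for q in range(q_total + 1):                         # yields Q+1 samples
        input_set = [question] + answer_words[:q]
        output = answer_words[q] if q < q_total else ""  # (Q+1)-th word is empty
        samples.append((input_set, output))
    return samples
```

Training on these prefix samples teaches the model both which word follows a given partial response and when to emit the empty word that ends the response.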
In the embodiment of the present invention, the system acquires question information and calls a trained chat model for obtaining response information; according to the question information and the chat model, it applies a randomness transformation to the output probabilities of the words in the vocabulary set corresponding to the chat model, performs word selection based on the target output probabilities after the randomness transformation to obtain the response information corresponding to the question information, and outputs the response information. Compared with the prior art, applying a randomness transformation to the output probabilities of the words in the vocabulary set corresponding to the chat model enhances the randomness of the selected words and therefore the randomness of the response information, which effectively avoids excessively repetitive responses and improves the flexibility and interest of the responses.
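The overall flow summarized above can be sketched end-to-end; `chat_model` here is a stand-in for the trained model, and the temperature-style transform and probabilistic acceptance rule are illustrative assumptions rather than the patent's exact formulas:

```python
import random

def generate_response(question, chat_model, max_len=50):
    """Repeatedly feed the question plus the response words selected so far to
    the model, randomize the output probabilities, and select the next word
    until no word is selected or a length cap is reached."""
    response_words = []                          # response word set, initially empty
    for _ in range(max_len):
        vocab, probs = chat_model(question, response_words)
        m = random.uniform(0.5, 2.0)             # random parameter value M
        scaled = [p ** (1.0 / m) for p in probs] # assumed randomness transform
        total = sum(scaled)
        target = [s / total for s in scaled]     # target output probabilities
        chosen = None                            # traverse in descending order
        for i in sorted(range(len(target)), key=target.__getitem__, reverse=True):
            if random.random() < target[i]:      # probabilistic acceptance
                chosen = i
                break
        if chosen is None:                       # no word selected: response done
            break
        response_words.append(vocab[chosen])
    return "".join(response_words)
```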
The embodiment of the present invention also provides a readable storage medium on which a computer program is stored; when the computer program is executed by a processor, each step of the above chat response method is implemented.
The embodiment of the present invention also provides an electronic device, including a memory, a processor, and a computer program stored in the memory and runnable on the processor; when the processor executes the computer program, each step of the above chat response method is implemented.
Specifically, referring to Fig. 6, Fig. 6 is a structural schematic diagram of an electronic device provided by an embodiment of the present invention. The electronic device can be used for the chat response method in the method embodiments. As shown in Fig. 6, the electronic device mainly includes: a memory 601, a processor 602, a bus 603, and a computer program stored in the memory 601 and runnable on the processor 602, where the memory 601 and the processor 602 are connected via the bus 603. When the processor 602 executes the computer program, the chat response method in the embodiment shown in Fig. 1 or Fig. 2 is implemented. The number of processors may be one or more.
The memory 601 may be a high-speed random access memory (RAM, Random Access Memory) or a non-volatile memory, such as a disk memory. The memory 601 is used for storing executable program code, and the processor 602 is coupled to the memory 601.
The memory 601 may be referred to as a computer-readable storage medium on which a computer program is stored; when the computer program is executed by the processor, the chat response method in the method embodiments is implemented. Further, the computer-readable storage medium may also be a USB flash drive, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a RAM, a magnetic disk, an optical disk, or any other medium that can store program code.
In the several embodiments provided in this application, it should be understood that the disclosed device and method may be implemented in other ways. For example, the device embodiments described above are merely exemplary; the division of modules is only a logical functional division, and there may be other divisions in actual implementation: multiple modules or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or modules, and may be electrical, mechanical, or in other forms.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules; they may be located in one place or distributed over multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional modules in the embodiments of the present invention may be integrated into one processing module, or each module may exist physically alone, or two or more modules may be integrated into one module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module.
If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence or in the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, an optical disk, or any other medium that can store program code.
It should be noted that, for simplicity of description, the foregoing method embodiments are expressed as series of action combinations; however, those skilled in the art should understand that the present invention is not limited by the described order of actions, because according to the present invention, certain steps may be performed in other orders or simultaneously. Secondly, those skilled in the art should also know that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not described in detail in a certain embodiment, reference may be made to the relevant descriptions of the other embodiments.
The above is a description of a chat response method and system, an electronic device, and a readable storage medium provided by the present invention. For those skilled in the art, there will be changes in specific embodiments and application scope according to the ideas of the embodiments of the present invention. In summary, the contents of this specification should not be construed as limiting the present invention.
Claims (10)
1. A chat response method, characterized in that the method comprises:
acquiring question information, and calling a trained chat model for obtaining response information;
according to the question information and the chat model, performing a randomness transformation on the output probabilities of the words in the vocabulary set corresponding to the chat model, and performing word selection based on the target output probabilities after the randomness transformation, to obtain the response information corresponding to the question information;
outputting the response information corresponding to the question information.
2. The method according to claim 1, characterized in that performing, according to the question information and the chat model, the randomness transformation on the output probabilities of the words and the probability-based word selection to obtain the response information corresponding to the question information comprises:
inputting the question information and the obtained response word set into the chat model, to obtain the output probability of each word in the vocabulary set corresponding to the chat model, where the initial value of the response word set is empty;
performing a randomness transformation on the output probability of each word in the vocabulary set, to obtain the target output probability of each transformed word;
performing word selection based on the target output probability of each word, adding the selected word to the response word set, and returning to the step of inputting the question information and the obtained response word set into the chat model, until the response information corresponding to the question information is obtained.
3. The method according to claim 2, characterized in that performing the randomness transformation on the output probability of each word in the vocabulary set to obtain the target output probability of each transformed word comprises:
randomly selecting a random parameter value from a preset random parameter range;
transforming the output probability of each word using a preset stochastic transformation algorithm and the random parameter value, to obtain the target output probability of each word.
4. The method according to claim 3, characterized in that transforming the output probability of each word using the preset stochastic transformation algorithm and the random parameter value to obtain the target output probability of each word comprises:
for the n-th word, calculating the transformed output probability of the n-th word using the following formula, where the initial value of n is 1, the maximum value of n is N, and N is the total number of words:
where P'n denotes the transformed output probability of the n-th word, Pn denotes the output probability of the n-th word, and M denotes the random parameter value.
5. The method according to claim 2, characterized in that performing word selection based on the target output probability of each word and adding the selected word to the response word set comprises:
sorting the target output probabilities of the words in descending order to obtain a sequence of the words;
traversing the words in the sequence, and performing word selection on each traversed word according to the target output probability of that word;
if the traversed word is selected, adding the traversed word to the response word set, and returning to the step of inputting the question information and the obtained response word set into the chat model;
if the traversed word is not selected and is the last word in the sequence, assembling the words in the response word set into the response information in the order in which they were selected.
6. The method according to any one of claims 1 to 5, characterized in that before acquiring the question information and calling the trained chat model for obtaining response information, the method further comprises:
acquiring multiple groups of question-answer sample pairs, where one group of question-answer samples includes question sample information and corresponding response sample information;
for the h-th group of question-answer sample pairs, performing word segmentation on the response sample information of the h-th group of question-answer sample pairs to obtain Q words, where h takes values in [1, H], H is the total number of question-answer sample pairs, and Q and H are positive integers;
combining the question sample information of the h-th group of question-answer sample pairs and the first q words of the Q words into an input set, taking the (q+1)-th word as the output, and obtaining the Q+1 training samples of the h-th group of question-answer sample pairs, where q takes values in [1, Q] and the (Q+1)-th word is empty;
training an initial chat model using the training samples of the multiple groups of question-answer sample pairs to obtain the chat model.
7. A chat response system, characterized in that the system comprises:
an acquisition and calling module, configured to acquire question information and call a trained chat model for obtaining response information;
a stochastic transformation module, configured to, according to the question information and the chat model, perform a randomness transformation on the output probabilities of the words in the vocabulary set corresponding to the chat model, and perform word selection based on the target output probabilities after the randomness transformation, to obtain the response information corresponding to the question information;
an output module, configured to output the response information corresponding to the question information.
8. The system according to claim 7, characterized in that the stochastic transformation module comprises:
a probability output module, configured to input the question information and the obtained response word set into the chat model, to obtain the output probability of each word in the vocabulary set corresponding to the chat model, where the initial value of the response word set is empty;
a conversion module, configured to perform a randomness transformation on the output probability of each word in the vocabulary set, to obtain the target output probability of each transformed word;
a selection module, configured to perform word selection based on the target output probability of each word, add the selected word to the response word set, and return to the step of inputting the question information and the obtained response word set into the chat model, until the response information corresponding to the question information is obtained.
9. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, characterized in that when the processor executes the computer program, each step of the chat response method according to any one of claims 1 to 6 is implemented.
10. A readable storage medium on which a computer program is stored, characterized in that when the computer program is executed by a processor, each step of the chat response method according to any one of claims 1 to 6 is implemented.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910342725.1A CN110209783B (en) | 2019-04-26 | 2019-04-26 | Chat response method and system, electronic device and readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110209783A true CN110209783A (en) | 2019-09-06 |
CN110209783B CN110209783B (en) | 2024-03-15 |
Family
ID=67786544
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910342725.1A Active CN110209783B (en) | 2019-04-26 | 2019-04-26 | Chat response method and system, electronic device and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110209783B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060241944A1 (en) * | 2005-04-25 | 2006-10-26 | Microsoft Corporation | Method and system for generating spelling suggestions |
US20110173174A1 (en) * | 2010-01-13 | 2011-07-14 | Flitcroft Investments Ltd | Linguistically enhanced search engine and meta-search engine |
CN106547789A (en) * | 2015-09-22 | 2017-03-29 | 阿里巴巴集团控股有限公司 | A kind of lyrics generation method and device |
US20170286397A1 (en) * | 2016-03-30 | 2017-10-05 | International Business Machines Corporation | Predictive Embeddings |
US20180143965A1 (en) * | 2016-11-22 | 2018-05-24 | Microsoft Technology Licensing, Llc | Trained data input system |
CN108959271A (en) * | 2018-08-10 | 2018-12-07 | 广州太平洋电脑信息咨询有限公司 | Document creation method, device, computer equipment and readable storage medium storing program for executing are described |
CN109271503A (en) * | 2018-11-06 | 2019-01-25 | 北京猎户星空科技有限公司 | Intelligent answer method, apparatus, equipment and storage medium |
CN109493186A (en) * | 2018-11-20 | 2019-03-19 | 北京京东尚科信息技术有限公司 | The method and apparatus for determining pushed information |
Also Published As
Publication number | Publication date |
---|---|
CN110209783B (en) | 2024-03-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107861938B (en) | POI (Point of interest) file generation method and device and electronic equipment | |
CN108664632A (en) | A kind of text emotion sorting algorithm based on convolutional neural networks and attention mechanism | |
CN110188331A (en) | Model training method, conversational system evaluation method, device, equipment and storage medium | |
CN104424507B (en) | Prediction method and prediction device of echo state network | |
CN110019843A (en) | The processing method and processing device of knowledge mapping | |
CN106407178A (en) | Session abstract generation method and device | |
CN113590776B (en) | Knowledge graph-based text processing method and device, electronic equipment and medium | |
CN106682387A (en) | Method and device used for outputting information | |
CN109919183A (en) | A kind of image-recognizing method based on small sample, device, equipment and storage medium | |
CN109241268A (en) | A kind of analog information recommended method, device, equipment and storage medium | |
CN107729995A (en) | Method and system and neural network processor for accelerans network processing unit | |
CN110245228A (en) | The method and apparatus for determining text categories | |
CN109448703A (en) | In conjunction with the audio scene recognition method and system of deep neural network and topic model | |
CN113608881B (en) | Memory allocation method, device, equipment, readable storage medium and program product | |
CN113449878B (en) | Data distributed incremental learning method, system, equipment and storage medium | |
CN110197213A (en) | Image matching method, device and equipment neural network based | |
CN112257785A (en) | Serialized task completion method and system based on memory consolidation mechanism and GAN model | |
CN110209783A (en) | Chat answer method and system, electronic device and readable storage medium storing program for executing | |
CN111090740A (en) | Knowledge graph generation method for dialog system | |
CN113312445B (en) | Data processing method, model construction method, classification method and computing equipment | |
CN109934350A (en) | Mathematical problem a question multiresolution implementation method, device and platform | |
CN113747480B (en) | Processing method and device for 5G slice faults and computing equipment | |
CN106909894A (en) | Vehicle brand type identifier method and system | |
CN115203412A (en) | Emotion viewpoint information analysis method and device, storage medium and electronic equipment | |
CN112818084B (en) | Information interaction method, related device, equipment and computer readable medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||