CN107562816A - Method and device for automatically identifying user intention - Google Patents

Method and device for automatically identifying user intention

Info

Publication number
CN107562816A
CN107562816A (application CN201710701095.3A)
Authority
CN
China
Prior art keywords
user
emotion
classifier
coarse-grained
fine-grained
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710701095.3A
Other languages
Chinese (zh)
Other versions
CN107562816B (en)
Inventor
宋亚楠
王兰
王昊奋
邵浩
邱楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Green Bristlegrass Intelligence Science And Technology Ltd
Original Assignee
Shenzhen Green Bristlegrass Intelligence Science And Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Green Bristlegrass Intelligence Science And Technology Ltd
Priority to CN201710701095.3A
Publication of CN107562816A
Application granted
Publication of CN107562816B
Legal status: Active
Anticipated expiration

Abstract

The present invention relates to a method and device for automatically identifying user intention. The method includes: identifying a user behavior intention from an input sentence, and taking the user behavior intention as the user intention; if no user behavior intention can be identified from the input sentence, retrieving a sentence similar to the input sentence in a database, and taking the user query intention corresponding to the retrieved sentence as the user intention; and if no similar sentence is retrieved in the database, identifying another user intention with a user intention classifier, and taking the identified intention as the user intention. The method and device provided by the invention can identify user intention automatically and hierarchically, improve the accuracy of user intention identification, and, by additionally analyzing the user's emotion, deepen the robot's understanding of the user's intention.

Description

Method and device for automatically identifying user intention
Technical field
The present invention relates to the field of artificial intelligence, and in particular to a method and device for automatically identifying user intention.
Background art
As the informatization of society deepens, robots that communicate with users through natural language are gradually entering people's lives. Interaction between a robot and a user can be divided roughly into three classes: simple question answering, which has no context and is essentially an upgrade and iteration of conventional retrieval; goal-oriented dialogue, whose purpose is to satisfy a specific user demand such as booking tickets or reserving a hotel, and in which the robot must elicit the user's requirements in as few dialogue turns as possible; and chit-chat, a purposeless dialogue between the user and the robot. At present, man-machine dialogue systems, whether question answering, goal-oriented dialogue or chit-chat, rarely consider the user's intention and emotion.
One existing scheme for identifying user intention acquires the user's physiological indices through various sensors and infers the user's intention and emotion from changes in the user's physiological state. The drawbacks of this scheme are: its application is limited, since sensors must be in contact with the user, which is inconvenient; and the acquired physiological indices have no direct correspondence with user intention. For example, a rise in blood pressure and heart rate may indicate excited anxiety, but may equally be caused by a change in ambient temperature or in health condition. Moreover, not every user intention is reflected in physiological indices; for example, the user's physiological indices may be identical when ordering takeout and when looking for a designated driver. How to identify the user's real intention conveniently and accurately is therefore a problem that those skilled in the art urgently need to solve.
Summary of the invention
In view of the above defects in the prior art, the method and device for automatically identifying user intention provided by the present invention can identify user intention automatically and hierarchically, improve the accuracy of user intention identification, and, by additionally analyzing the user's emotion, deepen the robot's understanding of the user's intention.
In a first aspect, the present invention provides a method for automatically identifying user intention, comprising:
identifying a user behavior intention from an input sentence, and taking the user behavior intention as the user intention;
if no user behavior intention can be identified from the input sentence, retrieving a sentence similar to the input sentence in a database, and taking the user query intention corresponding to the retrieved sentence as the user intention;
if no similar sentence is retrieved in the database, identifying another user intention with a user intention classifier, and taking the identified intention as the user intention.
In the method provided by the invention, user intention is modeled according to how difficult it is for a computer to identify: user intention is divided into three layers (user behavior intention, user query intention and other user intentions), and a suitable identification method is adopted for each layer. Identifying user intention hierarchically in this way improves the accuracy of intention identification while reducing the consumption of manpower and the requirement for labeled data.
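Purely as an illustrative sketch (not part of the claimed subject matter), the three-layer fallback described above could be expressed in Python as follows; identify_behavior_intent, retrieve_similar and intent_classifier are hypothetical stand-ins for the layer-specific methods detailed later:

```python
def identify_user_intent(sentence,
                         identify_behavior_intent,  # layer 1: rule-based command recognizer
                         retrieve_similar,          # layer 2: "sentence -> query intent" retrieval
                         intent_classifier):        # layer 3: trained intent classifier
    """Hierarchical identification: try the cheap, precise layers first
    and fall back to the expensive trained classifier only when needed."""
    # Layer 1: user behavior intention (command-style sentences).
    intent = identify_behavior_intent(sentence)
    if intent is not None:
        return intent
    # Layer 2: user query intention, via similarity retrieval in the database.
    query_intent = retrieve_similar(sentence)
    if query_intent is not None:
        return query_intent
    # Layer 3: other user intentions, via the trained intent classifier.
    return intent_classifier.predict(sentence)
```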
Preferably, said identifying a user behavior intention from an input sentence comprises:
performing word segmentation on the input sentence;
performing part-of-speech tagging on the words obtained by segmentation, and performing named entity recognition and entity linking on the obtained words;
identifying the user behavior intention according to the named entity recognition result and the entity linking result.
Preferably, said identifying another user intention with the user intention classifier comprises:
inputting the words obtained by word segmentation, their parts of speech, the named entity recognition result and the entity linking result into the user intention classifier to obtain the other user intention.
Preferably, the method further comprises:
identifying the user's emotion according to the user's input information;
judging the intensity of the user's intention according to the user's emotion and the user intention.
Preferably, said identifying the user's emotion according to the user's input information comprises:
inputting the user's input information into a coarse-grained emotion classifier, and taking the emotion with the highest confidence output by the coarse-grained emotion classifier as the user's coarse-grained emotion value;
inputting the user's input information into a fine-grained emotion classifier, and taking the emotion with the highest confidence output by the fine-grained emotion classifier as the user's fine-grained emotion value.
Preferably, the method for constructing the coarse-grained emotion classifier and the fine-grained emotion classifier comprises:
acquiring sample data related to the user;
annotating the sample data with coarse-grained and fine-grained labels;
training the coarse-grained emotion classifier and the fine-grained emotion classifier on the annotated sample data.
Preferably, the method further comprises: dividing the annotated sample data into a training set and a test set;
wherein said training the coarse-grained emotion classifier and the fine-grained emotion classifier on the annotated sample data comprises:
training the coarse-grained emotion classifier and the fine-grained emotion classifier on the sample data in the training set;
and, after training, further comprises:
testing the trained coarse-grained emotion classifier and fine-grained emotion classifier with the sample data in the test set to obtain the confidence of the coarse-grained emotion classifier and of the fine-grained emotion classifier;
checking the training results of the coarse-grained emotion classifier and the fine-grained emotion classifier against the confidence.
Preferably, said acquiring sample data related to the user comprises:
acquiring text data produced by the user's interaction with the robot,
and/or
acquiring text data related to the user through the Internet.
In the present invention, the confidences of the coarse-grained emotion classifier and the fine-grained emotion classifier serve as the weights of the final emotion judgment and guide the robot's final behavior or action, so that the robot can judge the user's intention more finely according to the user's emotion, grasp the user's demands better during interaction, and thus better meet the user's expectations.
In a second aspect, the present invention provides a device for automatically identifying user intention, comprising:
a first intention identification module, configured to identify a user behavior intention from an input sentence and take the user behavior intention as the user intention;
a second intention identification module, configured to, if no user behavior intention can be identified from the input sentence, retrieve a sentence similar to the input sentence in a database and take the user query intention corresponding to the retrieved sentence as the user intention;
a third intention identification module, configured to, if no similar sentence is retrieved in the database, identify another user intention with a user intention classifier and take the other user intention as the user intention.
In the device provided by the invention, user intention is modeled according to how difficult it is for a computer to identify: user intention is divided into three layers (user behavior intention, user query intention and other user intentions), and a suitable identification method is adopted for each layer. Identifying user intention hierarchically in this way improves the accuracy of intention identification while reducing the consumption of manpower and the requirement for labeled data.
Preferably, the first intention identification module is specifically configured to:
perform word segmentation on the input sentence;
perform part-of-speech tagging on the words obtained by segmentation, and perform named entity recognition and entity linking on the obtained words;
identify the user behavior intention according to the named entity recognition result and the entity linking result.
Preferably, the third intention identification module is specifically configured to:
input the words obtained by word segmentation, their parts of speech, the named entity recognition result and the entity linking result into the user intention classifier to obtain the other user intention.
Preferably, the device further comprises an emotion identification module, configured to:
identify the user's emotion according to the user's input information;
judge the intensity of the user's intention according to the user's emotion and the user intention.
Preferably, the emotion identification module is specifically configured to:
input the user's input information into a coarse-grained emotion classifier, and take the emotion with the highest confidence output by the coarse-grained emotion classifier as the user's coarse-grained emotion value;
input the user's input information into a fine-grained emotion classifier, and take the emotion with the highest confidence output by the fine-grained emotion classifier as the user's fine-grained emotion value.
Preferably, the device further comprises a classifier construction module, configured to construct the coarse-grained emotion classifier and the fine-grained emotion classifier through the following steps:
acquiring sample data related to the user;
annotating the sample data with coarse-grained and fine-grained labels;
training the coarse-grained emotion classifier and the fine-grained emotion classifier on the annotated sample data.
Preferably, the classifier construction module is further configured to: divide the annotated sample data into a training set and a test set;
wherein said training the coarse-grained emotion classifier and the fine-grained emotion classifier on the annotated sample data comprises:
training the coarse-grained emotion classifier and the fine-grained emotion classifier on the sample data in the training set;
and, after training, further comprises:
testing the trained coarse-grained emotion classifier and fine-grained emotion classifier with the sample data in the test set to obtain the confidence of the coarse-grained emotion classifier and of the fine-grained emotion classifier;
checking the training results of the coarse-grained emotion classifier and the fine-grained emotion classifier against the confidence.
Preferably, in the classifier construction module, acquiring the sample data related to the user comprises:
acquiring text data produced by the user's interaction with the robot,
and/or
acquiring text data related to the user through the Internet.
In the present invention, the confidences of the coarse-grained emotion classifier and the fine-grained emotion classifier serve as the weights of the final emotion judgment and guide the robot's final behavior or action, so that the robot can judge the user's intention more finely according to the user's emotion, grasp the user's demands better during interaction, and thus better meet the user's expectations.
In a third aspect, the present invention provides a computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements any of the methods described in the first aspect.
Brief description of the drawings
Fig. 1 is a flowchart of the method for automatically identifying user intention provided by an embodiment of the present invention;
Fig. 2 is a structural block diagram of the device for automatically identifying user intention provided by an embodiment of the present invention.
Detailed description of the embodiments
The embodiments of the technical solution of the present invention are described in detail below with reference to the accompanying drawings. The following embodiments are only used to illustrate the technical solution of the present invention more clearly; they serve merely as examples and cannot be taken as limiting the protection scope of the present invention.
It should be noted that, unless otherwise indicated, the technical or scientific terms used in this application have the ordinary meaning understood by those of ordinary skill in the art to which the present invention belongs.
The terms involved in this embodiment are explained first. In this embodiment, user intention is divided into user behavior intention, user query intention and other user intentions.
1. User behavior intention: the user clearly expresses a mandatory or commanding intention, further subdivided into ordering the robot to control an external device and ordering the robot to control itself.
2. User query intention: the user's intention to query related information, similar to question answering or the demand of a goal-oriented dialogue.
3. Other user intentions: intentions other than the user behavior intention and the user query intention.
As shown in Fig. 1, this embodiment provides a method for automatically identifying user intention, comprising:
Step S1: identifying a user behavior intention from an input sentence, and taking the user behavior intention as the user intention.
Here, the input sentence refers to text information; the robot can obtain the input sentence by converting the voice signal input by the user into text. Step S1 identifies the user behavior intention from the input sentence alone, so the method is simple and efficient.
Natural language corresponding to a user behavior intention is generally in command form, for example: play music, turn on the air conditioner, learn to sing. Therefore, to improve the efficiency of intention identification, a preferred implementation of step S1 comprises: performing word segmentation on the input sentence; performing part-of-speech tagging on the words obtained by segmentation; performing named entity recognition and entity linking on the words; and identifying the user behavior intention according to the named entity recognition result and the entity linking result.
Here, a named entity is an entity with a specific meaning in the text, mainly including person names, place names, organization names, proper nouns and the like.
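As one possible illustration of step S1, assuming the open-source jieba toolkit for word segmentation and part-of-speech tagging, hypothetical recognize_entities / link_entities helpers standing in for named entity recognition and entity linking, and a toy verb table that is not prescribed by the embodiment:

```python
import jieba.posseg as pseg  # jieba offers Chinese word segmentation with POS tags

# Toy mapping from command verbs to behavior intentions (illustration only).
COMMAND_VERBS = {"播放": "play_media", "打开": "turn_on", "关闭": "turn_off"}

def identify_behavior_intent(sentence, recognize_entities, link_entities):
    """Step S1 sketch: segment and POS-tag the input sentence, recognize and
    link entities, then decide whether the sentence is a command."""
    pairs = [(p.word, p.flag) for p in pseg.cut(sentence)]  # (word, POS tag)
    words = [w for w, _ in pairs]
    entities = recognize_entities(words)   # e.g. device names, song titles
    linked = link_entities(entities)       # map mentions to known devices/objects
    for word, flag in pairs:
        # jieba marks verbs with POS tags starting with "v".
        if flag.startswith("v") and word in COMMAND_VERBS and linked:
            return {"intent": COMMAND_VERBS[word], "target": linked[0]}
    return None  # no behavior intention: fall through to step S2
```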
If the user intention cannot be identified by the method of step S1, it is identified by the method of step S2.
Step S2: if no user behavior intention can be identified from the input sentence, retrieving a sentence similar to the input sentence in a database, and taking the user query intention corresponding to the retrieved sentence as the user intention.
Here, the database is a pre-built store of "sentence - user query intention" pairs, i.e. each sentence stored in the database is labeled with its corresponding user query intention. It suffices to compare the input sentence against the sentences in the database for similarity, find a similar sentence, and take the user query intention corresponding to that similar sentence as the user intention. Step S2 identifies user intention by querying the database; it can identify user query intentions and handle input sentences that step S1 cannot, improving the precision of user intention identification.
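A minimal sketch of step S2, assuming the "sentence - user query intention" pairs sit in an in-memory dictionary and using the Python standard library's difflib for similarity; a production system would use an indexed database and a stronger similarity measure:

```python
from difflib import SequenceMatcher

# Hypothetical pre-built "sentence -> user query intention" store.
INTENT_DB = {
    "明天北京的天气怎么样": "weather_query",
    "帮我订一张去上海的机票": "flight_booking",
}

def retrieve_query_intent(sentence, threshold=0.7):
    """Step S2 sketch: compare the input sentence against every stored
    sentence and return the query intent of the best match, if any."""
    best, best_score = None, 0.0
    for stored, intent in INTENT_DB.items():
        score = SequenceMatcher(None, sentence, stored).ratio()
        if score > best_score:
            best, best_score = intent, score
    # Only accept sufficiently similar sentences; otherwise fall to step S3.
    return best if best_score >= threshold else None
```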
Intentions that cannot be identified by steps S1 and S2 are the other user intentions, which are identified by the method of step S3.
Step S3: if no similar sentence is retrieved in the database, identifying the other user intention with a user intention classifier.
Here, the user intention classifier is a classifier built by deep learning, by hand or by other means; it outputs the corresponding user intention according to the input information. Specifically, the words obtained by the word segmentation of step S1, their parts of speech, the named entity recognition result and the entity linking result are input into the user intention classifier to obtain the other user intention.
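One conceivable way (an assumption, not a prescribed implementation) to assemble these features for a trained classifier, here assumed to be a scikit-learn pipeline of DictVectorizer plus a linear model:

```python
def classify_other_intent(words, pos_tags, entities, linked_entities, clf):
    """Step S3 sketch: turn the step-S1 analysis results into a feature
    dict and let a trained classifier pick the 'other' user intention."""
    features = {}
    for w, p in zip(words, pos_tags):
        features[f"word={w}"] = 1.0                              # bag of words
        features[f"pos={p}"] = features.get(f"pos={p}", 0.0) + 1.0
    for e in entities:
        features[f"ner={e}"] = 1.0                               # NER result
    for l in linked_entities:
        features[f"link={l}"] = 1.0                              # entity linking result
    # `clf` is assumed already trained, e.g.
    # make_pipeline(DictVectorizer(), LogisticRegression()).
    return clf.predict([features])[0]
```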
Compared with the methods of steps S1 and S2, the user intention classifier identifies intention more accurately and can identify the user intention in complicated sentences; the objects it identifies are not limited to fixed sentence patterns, and no large database needs to be provided. However, building the classifier requires a larger sample size and a larger investment of manpower and material resources.
In the method provided by this embodiment, user intention is modeled according to how difficult it is for a computer to identify: user intention is divided into three layers (user behavior intention, user query intention and other user intentions), and a suitable method is adopted to identify each layer. Identifying user intention hierarchically in this way improves the accuracy of intention identification while reducing the consumption of manpower and the requirement for labeled data.
In many cases, recognizing the user intention only tells the robot which operation to perform; it still cannot know to what degree the user wishes the robot to perform it. For example, the robot knows that the user expects the air conditioner to be turned on, but does not know the temperature the user expects to set. To solve this problem, this embodiment combines the recognized user intention with the user's emotion to guide the robot's behavior.
Therefore, on the basis of any of the above method embodiments, the method for automatically identifying user intention provided by this embodiment further comprises the following steps:
Step S4: identifying the user's emotion according to the user's input information.
Here, the user's input information includes, but is not limited to, text information, voice information, expression information, action information and other information that can reflect the user's emotion; the user's emotion can be inferred by analyzing this input information. For example, the user's emotion can be obtained by analyzing features such as intonation, tone and volume in the user's voice information.
Step S5: judging the intensity of the user's intention according to the user's emotion and the user intention.
Here, the intention intensity refers to the degree to which the user wishes the robot to perform the relevant operation. For example, if the user wishes the robot to turn on the air conditioner, the intention intensity refers to the temperature the user wishes to be set; if the user wishes the robot to play music, the intention intensity refers to the kind of music and the volume the user wishes.
This embodiment models the user's emotion at two granularities, coarse and fine, simultaneously. The coarse-grained model divides the user's emotion into positive, negative and neutral. Drawing on the theory of emotional differentiation in psychology, the fine-grained model divides the user's emotion into the following types: inquisitiveness, happiness, disgust, impatience, displeasure, fear, liking, exasperation, sympathy, respect, love, moral feeling, aesthetic feeling, rational feeling, surprise, pain, interest, indignation, sadness, dread, shyness, attachment, separation anxiety, sorrow, fright, shame, pride, arrogance, anxiety, remorse and compassion.
Based on the above classification of user emotion, a preferred implementation of step S4 comprises:
Step S41: inputting the user's input information into the coarse-grained emotion classifier, and taking the emotion with the highest confidence output by the coarse-grained emotion classifier as the user's coarse-grained emotion value.
Step S42: inputting the user's input information into the fine-grained emotion classifier, and taking the emotion with the highest confidence output by the fine-grained emotion classifier as the user's fine-grained emotion value.
The behavior or action finally taken by the robot is a decision made by combining the user intention with the user's emotion. For example: if the user wants the air conditioner turned on and the user's current emotion is negative (coarse-grained emotion classifier confidence 90%) and anxious (fine-grained emotion classifier confidence 85%), the air conditioner temperature is set to a lower gear; if the user wants the air conditioner turned on and the user's current emotion is neutral, the air conditioner temperature is set to an ordinary gear.
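The following sketch illustrates such a decision; the gear threshold, the emotion names and the exact way the confidences weight the score are illustrative assumptions that go beyond the example above:

```python
def decide_ac_setting(coarse, coarse_conf, fine, fine_conf):
    """Fuse the intention (turn on the air conditioner) with the user's
    emotion; the classifier confidences weight the emotion evidence."""
    NEGATIVE_FINE = {"anxious", "impatient", "angry"}
    score = 0.0
    if coarse == "negative":
        score += coarse_conf              # e.g. negative at 90% confidence
    if fine in NEGATIVE_FINE:
        score += fine_conf                # e.g. anxious at 85% confidence
    # High combined evidence of negative emotion: cool more aggressively.
    return "low temperature gear" if score > 1.0 else "ordinary temperature gear"

# Matching the example in the text:
print(decide_ac_setting("negative", 0.90, "anxious", 0.85))  # low temperature gear
print(decide_ac_setting("neutral", 0.95, "calm", 0.80))      # ordinary temperature gear
```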
In this embodiment, the confidences of the coarse-grained emotion classifier and the fine-grained emotion classifier serve as the weights of the final emotion judgment and guide the robot's final behavior or action, so that the robot can judge the user's intention more finely according to the user's emotion, grasp the user's demands better during interaction, and thus better meet the user's expectations.
The method for constructing the coarse-grained emotion classifier and the fine-grained emotion classifier comprises:
Step S101: acquiring sample data related to the user.
Here, the source of the sample data may be text data produced by the user's interaction with the robot, or text data related to the user obtained through the Internet, for example the user's social media, web search history and similar information. Acquiring more comprehensive information about the user captures the user's characteristics more completely, so that the classification results of the constructed classifiers are more precise.
Step S102: annotating the sample data with coarse-grained and fine-grained labels, i.e. labeling each sample with its corresponding coarse-grained emotion and fine-grained emotion.
Step S103: training the coarse-grained emotion classifier and the fine-grained emotion classifier on the annotated sample data.
If personalized information about the user, such as the user's social media or web search history, cannot be obtained, general emotion classifiers can be trained on general user information instead. The training process is: acquire the general information of a large number of users as sample data, annotate the sample data by the methods of steps S102 and S103, and train the general emotion classifiers on the annotated sample data. The general emotion classifiers include a coarse-grained general emotion classifier and a fine-grained general emotion classifier, used respectively to judge the user's coarse-grained and fine-grained emotion labels. Because the general emotion classifiers are obtained from the data of a large number of users, they can identify the emotions of ordinary users fairly well; when personalized user information, and hence a personalized emotion classifier, cannot be obtained, step S4 (or steps S41 and S42) can use the general emotion classifiers to identify the user's emotion, as a supplement to the personalized emotion classifiers.
To improve the classification precision of the emotion classifiers, the sample data annotated in step S102 is divided into a training set and a test set. The sample data of the training set is used to train the coarse-grained and fine-grained emotion classifiers. The sample data of the test set is used to test the trained coarse-grained and fine-grained emotion classifiers; the confidences of the two classifiers are obtained from the test results, and the training results are checked against these confidences. If the requirements are not met, training of the classifiers continues.
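A compact sketch of steps S101-S103 together with the train/test split, using scikit-learn; treating test-set accuracy as the classifier's confidence, and choosing TF-IDF features with logistic regression, are assumptions, since the embodiment leaves the classifier family open:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

def train_emotion_classifier(texts, labels, min_confidence=0.8):
    """Steps S101-S103 sketch: one classifier, coarse- or fine-grained
    depending on which label set is passed in."""
    # Divide the annotated sample data into a training set and a test set.
    x_train, x_test, y_train, y_test = train_test_split(
        texts, labels, test_size=0.2, random_state=0)
    clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    clf.fit(x_train, y_train)                      # train on the training set
    # Test on the test set; use accuracy as the classifier's confidence.
    confidence = accuracy_score(y_test, clf.predict(x_test))
    if confidence < min_confidence:
        # Requirement not met: continue training (more data, other
        # features or models) before using the classifier.
        pass
    return clf, confidence

# Run twice: once with coarse-grained labels (positive/negative/neutral),
# once with fine-grained labels, yielding the two classifiers and their weights.
```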
To improve the precision of intention identification, the intention identification process can also use sensors that detect the user's physiology to obtain physiological parameters such as heartbeat, body temperature and blood pressure, and take these parameters as an auxiliary basis for judging the user intention. User information such as the user's personal knowledge graph, web search history and social network messages can also be combined, to deepen the robot's understanding of the user and increase the accuracy with which the robot identifies the user's intention and emotion.
A user generates a large amount of data while interacting with a robot, and this data contains rich information, including the user's demands, intentions and emotions. The method for automatically identifying user intention provided by this embodiment identifies, in various ways, the user's intention and emotion during interaction, which makes a chat robot more intelligent, better able to meet the user's expectations, and improves the user experience.
The user intention classifier and the various emotion classifiers mentioned in this embodiment can be realized by existing classifiers, such as decision tree classifiers, K-nearest-neighbor classifiers and naive Bayes classifiers, or can be obtained by training with neural network technology; these methods of realizing a classifier are prior art and are not repeated here.
Based on the same inventive concept as the above method for automatically identifying user intention, this embodiment provides a device for automatically identifying user intention, as shown in Fig. 2, comprising:
a first intention identification module, configured to identify a user behavior intention from an input sentence and take the user behavior intention as the user intention;
a second intention identification module, configured to, if no user behavior intention can be identified from the input sentence, retrieve a sentence similar to the input sentence in a database and take the user query intention corresponding to the retrieved sentence as the user intention;
a third intention identification module, configured to, if no similar sentence is retrieved in the database, identify another user intention with a user intention classifier and take the other user intention as the user intention.
Preferably, the first intention identification module is specifically configured to:
perform word segmentation on the input sentence;
perform part-of-speech tagging on the words obtained by segmentation, and perform named entity recognition and entity linking on the obtained words;
identify the user behavior intention according to the named entity recognition result and the entity linking result.
Preferably, the third intention identification module is specifically configured to:
input the words obtained by word segmentation, their parts of speech, the named entity recognition result and the entity linking result into the user intention classifier to obtain the other user intention.
Preferably, the device further comprises an emotion identification module, configured to:
identify the user's emotion according to the user's input information;
judge the intensity of the user's intention according to the user's emotion and the user intention.
Preferably, the emotion identification module is specifically configured to:
input the user's input information into a coarse-grained emotion classifier, and take the emotion with the highest confidence output by the coarse-grained emotion classifier as the user's coarse-grained emotion value;
input the user's input information into a fine-grained emotion classifier, and take the emotion with the highest confidence output by the fine-grained emotion classifier as the user's fine-grained emotion value.
Preferably, the device further comprises a classifier construction module, configured to construct the coarse-grained emotion classifier and the fine-grained emotion classifier through the following steps:
acquiring sample data related to the user;
annotating the sample data with coarse-grained and fine-grained labels;
training the coarse-grained emotion classifier and the fine-grained emotion classifier on the annotated sample data.
Preferably, the classifier construction module is further configured to: divide the annotated sample data into a training set and a test set;
wherein said training the coarse-grained emotion classifier and the fine-grained emotion classifier on the annotated sample data comprises:
training the coarse-grained emotion classifier and the fine-grained emotion classifier on the sample data in the training set;
and, after training, further comprises:
testing the trained coarse-grained emotion classifier and fine-grained emotion classifier with the sample data in the test set to obtain the confidence of the coarse-grained emotion classifier and of the fine-grained emotion classifier;
checking the training results of the coarse-grained emotion classifier and the fine-grained emotion classifier against the confidence.
Preferably, in the classifier construction module, acquiring the sample data related to the user comprises:
acquiring text data produced by the user's interaction with the robot,
and/or
acquiring text data related to the user through the Internet.
The device for automatically identifying user intention provided by this embodiment shares the same inventive concept as the above method for automatically identifying user intention and has the same beneficial effects, which are not repeated here.
Based on the same inventive concept as the above method for automatically identifying user intention, this embodiment provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements any method in the embodiments of the above method for automatically identifying user intention. The computer program in the computer-readable storage medium provided by this embodiment shares the same inventive concept as the above method and has the same beneficial effects, which are not repeated here.
Finally, it should be noted that the above embodiments are merely illustrative of the technical solution of the present invention and do not limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents; such modifications or replacements do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention, and shall all be covered by the claims and the specification of the present invention.

Claims (10)

  1. A method for automatically identifying user intention, characterized by comprising:
    identifying a user behavior intention from an input sentence, and taking the user behavior intention as the user intention;
    if no user behavior intention can be identified from the input sentence, retrieving a sentence similar to the input sentence in a database, and taking the user query intention corresponding to the retrieved sentence as the user intention;
    if no similar sentence is retrieved in the database, identifying another user intention with a user intention classifier, and taking the identified intention as the user intention.
  2. The method according to claim 1, characterized in that said identifying a user behavior intention from an input sentence comprises:
    performing word segmentation on the input sentence;
    performing part-of-speech tagging on the words obtained by segmentation, and performing named entity recognition and entity linking on the obtained words;
    identifying the user behavior intention according to the named entity recognition result and the entity linking result.
  3. The method according to claim 2, characterized in that said identifying another user intention with the user intention classifier comprises:
    inputting the words obtained by word segmentation, their parts of speech, the named entity recognition result and the entity linking result into the user intention classifier to obtain the other user intention.
  4. The method according to claim 1, characterized by further comprising:
    identifying the user's emotion according to the user's input information;
    judging the intensity of the user's intention according to the user's emotion and the user intention.
  5. The method according to claim 4, characterized in that said identifying the user's emotion according to the user's input information comprises:
    inputting the user's input information into a coarse-grained emotion classifier, and taking the emotion with the highest confidence output by the coarse-grained emotion classifier as the user's coarse-grained emotion value;
    inputting the user's input information into a fine-grained emotion classifier, and taking the emotion with the highest confidence output by the fine-grained emotion classifier as the user's fine-grained emotion value.
  6. The method according to claim 5, characterized in that the method for constructing the coarse-grained emotion classifier and the fine-grained emotion classifier comprises:
    acquiring sample data related to the user;
    annotating the sample data with coarse-grained and fine-grained labels;
    training the coarse-grained emotion classifier and the fine-grained emotion classifier on the annotated sample data.
  7. The method according to claim 6, characterized by further comprising: dividing the annotated sample data into a training set and a test set;
    wherein said training the coarse-grained emotion classifier and the fine-grained emotion classifier on the annotated sample data comprises:
    training the coarse-grained emotion classifier and the fine-grained emotion classifier on the sample data in the training set;
    and, after training, further comprising:
    testing the trained coarse-grained emotion classifier and fine-grained emotion classifier with the sample data in the test set to obtain the confidence of the coarse-grained emotion classifier and of the fine-grained emotion classifier;
    checking the training results of the coarse-grained emotion classifier and the fine-grained emotion classifier against the confidence.
  8. The method according to claim 6 or 7, characterized in that acquiring the sample data related to the user comprises:
    acquiring text data produced by the user's interaction with the robot,
    and/or
    acquiring text data related to the user through the Internet.
  9. A device for automatically identifying user intention, characterized by comprising:
    a first intention identification module, configured to identify a user behavior intention from an input sentence and take the user behavior intention as the user intention;
    a second intention identification module, configured to, if no user behavior intention can be identified from the input sentence, retrieve a sentence similar to the input sentence in a database and take the user query intention corresponding to the retrieved sentence as the user intention;
    a third intention identification module, configured to, if no similar sentence is retrieved in the database, identify another user intention with a user intention classifier and take the other user intention as the user intention.
  10. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the method of any one of claims 1 to 8.
CN201710701095.3A 2017-08-16 2017-08-16 Method and device for automatically identifying user intention Active CN107562816B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710701095.3A CN107562816B (en) 2017-08-16 2017-08-16 Method and device for automatically identifying user intention

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710701095.3A CN107562816B (en) 2017-08-16 2017-08-16 Method and device for automatically identifying user intention

Publications (2)

Publication Number Publication Date
CN107562816A (en) 2018-01-09
CN107562816B CN107562816B (en) 2021-02-09

Family

ID=60974585

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710701095.3A Active CN107562816B (en) 2017-08-16 2017-08-16 Method and device for automatically identifying user intention

Country Status (1)

Country Link
CN (1) CN107562816B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005327096A (en) * 2004-05-14 2005-11-24 Nippon Telegr & Teleph Corp <Ntt> Communication support method, communication support device, program for supporting communication, and recording medium stored with the program
CN103207855A (en) * 2013-04-12 2013-07-17 广东工业大学 Fine-grained sentiment analysis system and method specific to product comment information
CN106815000A (en) * 2015-11-30 2017-06-09 北京奇虎科技有限公司 A kind of code generating method and device
CN106462124A (en) * 2016-07-07 2017-02-22 深圳狗尾草智能科技有限公司 Method, system and robot for identifying and controlling household appliances based on intention
CN106372132A (en) * 2016-08-25 2017-02-01 北京百度网讯科技有限公司 Artificial intelligence-based query intention prediction method and apparatus

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108197115B (en) * 2018-01-26 2022-04-22 上海智臻智能网络科技股份有限公司 Intelligent interaction method and device, computer equipment and computer readable storage medium
CN110110169A (en) * 2018-01-26 2019-08-09 上海智臻智能网络科技股份有限公司 Man-machine interaction method and human-computer interaction device
US11373641B2 (en) 2018-01-26 2022-06-28 Shanghai Xiaoi Robot Technology Co., Ltd. Intelligent interactive method and apparatus, computer device and computer readable storage medium
CN108197115A (en) * 2018-01-26 2018-06-22 上海智臻智能网络科技股份有限公司 Intelligent interactive method, device, computer equipment and computer readable storage medium
CN108334583A (en) * 2018-01-26 2018-07-27 上海智臻智能网络科技股份有限公司 Affective interaction method and device, computer readable storage medium, computer equipment
US11226673B2 (en) 2018-01-26 2022-01-18 Institute Of Software Chinese Academy Of Sciences Affective interaction systems, devices, and methods based on affective computing user interface
CN110085211B (en) * 2018-01-26 2021-06-29 上海智臻智能网络科技股份有限公司 Voice recognition interaction method and device, computer equipment and storage medium
CN110085220A (en) * 2018-01-26 2019-08-02 上海智臻智能网络科技股份有限公司 Intelligent interaction device
CN110085262A (en) * 2018-01-26 2019-08-02 上海智臻智能网络科技股份有限公司 Voice mood exchange method, computer equipment and computer readable storage medium
CN110085221A (en) * 2018-01-26 2019-08-02 上海智臻智能网络科技股份有限公司 Speech emotional exchange method, computer equipment and computer readable storage medium
CN108509484B (en) * 2018-01-31 2022-03-11 腾讯科技(深圳)有限公司 Classifier construction and intelligent question and answer method, device, terminal and readable storage medium
CN108509484A (en) * 2018-01-31 2018-09-07 腾讯科技(深圳)有限公司 Grader is built and intelligent answer method, apparatus, terminal and readable storage medium storing program for executing
CN110309400A (en) * 2018-02-07 2019-10-08 鼎复数据科技(北京)有限公司 A kind of method and system that intelligent Understanding user query are intended to
WO2019153522A1 (en) * 2018-02-09 2019-08-15 卫盈联信息技术(深圳)有限公司 Intelligent interaction method, electronic device, and storage medium
CN109325106A (en) * 2018-07-31 2019-02-12 厦门快商通信息技术有限公司 A kind of U.S. chat robots intension recognizing method of doctor and device
CN109635080A (en) * 2018-11-15 2019-04-16 上海指旺信息科技有限公司 Acknowledgment strategy generation method and device
CN109508376A (en) * 2018-11-23 2019-03-22 四川长虹电器股份有限公司 It can online the error correction intension recognizing method and device that update
CN109635117A (en) * 2018-12-26 2019-04-16 零犀(北京)科技有限公司 A kind of knowledge based spectrum recognition user intention method and device
US11403345B2 (en) * 2019-02-28 2022-08-02 Naver Corporation Method and system for processing unclear intent query in conversation system
CN110232108A (en) * 2019-05-13 2019-09-13 华为技术有限公司 Interactive method and conversational system
CN110232108B (en) * 2019-05-13 2023-02-03 华为技术有限公司 Man-machine conversation method and conversation system
CN110187760A (en) * 2019-05-14 2019-08-30 北京百度网讯科技有限公司 Intelligent interactive method and device
CN110609903A (en) * 2019-08-01 2019-12-24 华为技术有限公司 Information presentation method and device
CN112632234A (en) * 2019-10-09 2021-04-09 科沃斯商用机器人有限公司 Human-computer interaction method and device, intelligent robot and storage medium
CN110888971B (en) * 2019-11-29 2022-05-24 支付宝(杭州)信息技术有限公司 Multi-round interaction method and device for robot customer service and user
CN110888971A (en) * 2019-11-29 2020-03-17 支付宝(杭州)信息技术有限公司 Multi-round interaction method and device for robot customer service and user
CN111221955A (en) * 2020-01-09 2020-06-02 厦门快商通科技股份有限公司 Visitor intention data pre-extraction method and system based on small amount of data
CN111625680A (en) * 2020-05-15 2020-09-04 青岛聚看云科技有限公司 Method and device for determining search result
CN111625680B (en) * 2020-05-15 2023-08-25 青岛聚看云科技有限公司 Method and device for determining search results
CN112035642A (en) * 2020-08-31 2020-12-04 康键信息技术(深圳)有限公司 Customer service matching method, device, equipment and storage medium
US20220044081A1 (en) * 2020-12-09 2022-02-10 Beijing Baidu Netcom Science And Technology Co., Ltd. Method for recognizing dialogue intention, electronic device and storage medium
CN113010784A (en) * 2021-03-17 2021-06-22 北京十一贝科技有限公司 Method, apparatus, electronic device, and medium for generating prediction information
CN113010784B (en) * 2021-03-17 2024-02-06 北京十一贝科技有限公司 Method, apparatus, electronic device and medium for generating prediction information
CN113705558A (en) * 2021-08-31 2021-11-26 平安普惠企业管理有限公司 Emotion recognition method, device and equipment based on context iteration and storage medium
CN115101074A (en) * 2022-08-24 2022-09-23 深圳通联金融网络科技服务有限公司 Voice recognition method, device, medium and equipment based on user speaking emotion
CN115101074B (en) * 2022-08-24 2022-11-11 深圳通联金融网络科技服务有限公司 Voice recognition method, device, medium and equipment based on user speaking emotion

Also Published As

Publication number Publication date
CN107562816B (en) 2021-02-09

Similar Documents

Publication Publication Date Title
CN107562816A (en) User view automatic identifying method and device
KR102222451B1 (en) An apparatus for predicting the status of user's psychology and a method thereof
Ramakrishnan et al. Speech emotion recognition approaches in human computer interaction
Janning et al. Perceived task-difficulty recognition from log-file information for the use in adaptive intelligent tutoring systems
Hema et al. Emotional speech recognition using cnn and deep learning techniques
CN112101044A (en) Intention identification method and device and electronic equipment
Wang et al. Significance of phonological features in speech emotion recognition
Somogyi The Application of Artificial Intelligence
CN113761942B (en) Semantic analysis method, device and storage medium based on deep learning model
CN114925212A (en) Relation extraction method and system for automatically judging and fusing knowledge graph
CN114360584A (en) Phoneme-level-based speech emotion layered recognition method and system
KR102586799B1 (en) Method, device and system for automatically processing creation of web book based on web novel using artificial intelligence model
Helaly et al. Deep convolution neural network implementation for emotion recognition system
Sasidharan Rajeswari et al. Speech Emotion Recognition Using Machine Learning Techniques
CN110472032A (en) More classification intelligent answer search methods of medical custom entities word part of speech label
Agrima et al. Emotion recognition from syllabic units using k-nearest-neighbor classification and energy distribution
Cummins et al. Latest advances in computational speech analysis for mobile sensing
Huang et al. Exploring the effect of emotions in human–machine dialog: an approach toward integration of emotional and rational information
Du et al. Composite Emotion Recognition and Feedback of Social Assistive Robot for Elderly People
Khalafi et al. A hybrid deep learning approach for phenotype prediction from clinical notes
Panditharathna et al. Question and answering system for investment promotion based on nlp
Dhaouadi et al. Speech Emotion Recognition: Models Implementation & Evaluation
Li et al. Research on Chorus Emotion Recognition and Intelligent Medical Application Based on Health Big Data
Shukla et al. Deep ganitrus algorithm for speech emotion recognition
Ganie et al. A VGG16 Based Hybrid Deep Convolutional Neural Network Based Real-Time Video Frame Emotion Detection System for Affective Human Computer Interaction

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Room 301, Building 39, 239 Renmin Road, Gusu District, Suzhou City, Jiangsu Province, 215000

Applicant after: Suzhou Dogweed Intelligent Technology Co., Ltd.

Address before: 518000 Dongfang Science and Technology Building 1307-09, 16 Keyuan Road, Yuehai Street, Nanshan District, Shenzhen City, Guangdong Province

Applicant before: Shenzhen green bristlegrass intelligence Science and Technology Ltd.

CB02 Change of applicant information
GR01 Patent grant
GR01 Patent grant
PP01 Preservation of patent right

Effective date of registration: 20220228

Granted publication date: 20210209

PP01 Preservation of patent right