CN109271508B - Emotion-based personalized region generation and display method - Google Patents
Emotion-based personalized region generation and display method
- Publication number: CN109271508B
- Application number: CN201810969038.8A
- Authority
- CN
- China
- Prior art keywords
- emotion
- user
- function
- Sta
- Sol
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/205—Parsing
Abstract
The present invention, an emotion-based personalized region generation and display method, collects a user's emotional factors, analyzes the user's emotion, and uses emotion learning to determine the individual user's emotion and the corresponding emotion the user tends toward. The emotions of the users within a region are then displayed collectively in a visual form, after which group-user regions whose concentrated emotion matches the user's emotional tendency are recommended to the individual user. The invention belongs to the intersecting field of graphics/imaging and software engineering.
Description
Technical field
The present invention, an emotion-based personalized region generation and display method, belongs to the intersecting field of graphics/imaging and software engineering.
Background art
Emotion detection, or emotional intelligence, is the interpretation of a person's voluntary or involuntary communication. It involves reading facial expressions, gestures, posture, intonation, vocabulary, speaking speed, breathing, and skin physiology in order to decode the person's affective state. This requires an understanding of social and cultural customs, environmental context, and familiarity with the individual; only when a system can take all of these factors into account does emotion recognition yield optimal computation and the best results. A customer's emotional state has thus become an important dimension for sellers to consider. Emotion recognition based on artificial intelligence brings emotional intelligence to the digital world, changing not only how humans interact with technology but also how humans interact with one another. The present invention, an emotion-based personalized region generation and display method, collects a user's emotional factors, analyzes the user's emotion, and uses emotion learning to determine the individual user's emotion and the corresponding emotion the user tends toward; the emotions of the users within a region are displayed collectively in a visual form, after which group-user regions whose concentrated emotion matches the user's emotional tendency are recommended to the individual user.
Summary of the invention
Architecture
The emotion-based personalized region generation and display method mainly comprises three parts: emotion analysis, a matching module, and personalized display. Fig. 1 shows the system diagram of the method. The concepts of the three parts are introduced first, followed by the definitions of each part.
(1) Emotion analysis (E): at this stage the sentiment-analysis factors (MoFa) of the user (U) are input, and the user's current emotion is analyzed from these emotional factors. The user's MoFa comprises facial expression, gesture, posture, intonation, vocabulary, speaking speed, breathing, skin physiology, and text analysis. The sentiment analysis EmoAnaly = (Sol, Gro) comprises two algorithms, described below and sketched in code after the list:

1) Sol(MoFa, Sta, γ_i) → (E, E_i): the user emotion classification function Sol compares the user's input sentiment-analysis factors MoFa with the factors Sta of a standard emotion; if the ratio of each MoFa to Sta falls within the machine-learned threshold range γ_i, the user's emotion is judged to be the emotion E corresponding to Sta, with degree E_i.

2) Gro(Sol, β) → E ∪ UE: the group emotion function Gro builds on the user emotion classification function Sol; when the differences among multiple users' emotions fall within the machine-learned threshold β, the users are judged to share the same emotion E, otherwise they have different emotions UE.
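The patent gives Sol and Gro only as set-valued function signatures; the following is a minimal Python sketch of one plausible reading. The factor names, the ratio-based comparison, and the mean-ratio definition of the degree E_i are illustrative assumptions, not part of the patent text.

```python
# A minimal sketch of Sol and Gro; factor names, thresholds, and the
# degree formula are illustrative assumptions, not patent text.

def sol(mofa: dict, standard: dict, gamma: dict):
    """Sol(MoFa, Sta, γ_i) → (E, E_i): compare the measured factors against
    each standard emotion's reference factors; if every ratio falls inside
    the learned range for that emotion, classify the user as that emotion E
    with degree E_i (here taken to be the mean ratio)."""
    best = None
    for emotion, sta in standard.items():
        ratios = [mofa[f] / sta[f] for f in mofa]
        lo, hi = gamma[emotion]                    # learned range γ_i
        if all(lo <= r <= hi for r in ratios):
            degree = sum(ratios) / len(ratios)     # assumed definition of E_i
            if best is None or degree > best[1]:
                best = (emotion, degree)
    return best                                    # None if no emotion matches


def gro(user_results: list, beta: float):
    """Gro(Sol, β) → E ∪ UE: if all users share one label and their degrees
    differ by at most the learned β, the group holds one emotion E;
    otherwise the group holds differing emotions UE."""
    labels = {emotion for emotion, _ in user_results}
    degrees = [degree for _, degree in user_results]
    if len(labels) == 1 and max(degrees) - min(degrees) <= beta:
        return ("E", labels.pop())
    return ("UE", labels)
```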
In the emotion analysis part, this patent adds the text analysis MoFa_TXT = (Ag, Spee) beyond the conventional methods; it comprises two algorithms, described below and sketched in code after the list:

1) Ag(word, key, γ) → (E): the text emotion analysis function Ag matches the user's input words and sentences word against the emotion keywords key; if the ratio of word to key falls within the machine-learned range γ, the user's emotion type E is judged to be the emotion type corresponding to key. For example, when the user enters "I ate a watermelon, how delightful", the keyword "delightful" indicates that the user's current emotion type E is happy.

2) Spee(fac, sta, α) → (E_i): the text speed analysis function Spee compares the user's input speed fac with the standard speed sta (of the standard emotion), with the threshold α obtained by machine learning; comparing the ratio fac/sta with α yields the user's specific emotion degree E_i within the current emotion type E. For example, if comparing the input speed fac of "how delightful" against sta and α shows that the user's current emotion is stronger than the standard, the user is judged to be elated within the happy emotion. The Spee function thus reflects the degree of the user's emotion.
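The patent defines Ag and Spee only by their signatures; the sketch below is one plausible Python reading, in which the keyword lexicon, the tokenization, and the hit-ratio definition are invented for illustration.

```python
# A minimal sketch of Ag and Spee; the keyword lexicon, tokenization,
# and ratio definitions are invented for illustration.

HAPPY_KEYWORDS = {"delightful", "wonderful", "great"}   # assumed key set for E = happy

def ag(text: str, keywords: set, gamma: tuple):
    """Ag(word, key, γ) → (E): match the user's words against the emotion
    keywords; if the hit ratio falls inside the learned range γ, return
    the emotion type tied to the keyword set."""
    words = text.lower().split()
    if not words:
        return None
    ratio = sum(1 for w in words if w in keywords) / len(words)
    lo, hi = gamma
    return "happy" if lo <= ratio <= hi else None

def spee(fac: float, sta: float, alpha: float) -> str:
    """Spee(fac, sta, α) → (E_i): compare the input speed fac with the
    standard speed sta; a ratio above the learned threshold α marks a
    strong degree of the current emotion (e.g. "elated" within "happy")."""
    return "strong" if fac / sta > alpha else "normal"
```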
(2) Matching module (Match): Match = (SolGro, DegGro) comprises two algorithms, described below and sketched in code after the list:

1) SolGro(Sol, Gro, ρ) → y ∪ n: the individual-group comparison function SolGro builds on the user emotion classification function Sol and the group emotion function Gro; when the difference between Sol and Gro falls within the machine-learned threshold ρ, the emotions of Sol and Gro are judged identical, denoted y, otherwise they are not identical, denoted n.

2) DegGro(num, num_sta) → (DegGro_i): the matching degree function DegGro compares the number of users in the group, num, with the standard grade counts num_sta; num_sta holds standard counts for several grades, e.g. grade one is 100 people, grade two is 300 people, and so on. When num is less than or equal to num_sta, the group sits at the grade of that num_sta; when num exceeds num_sta, the degree of Gro is judged to be the next grade after that num_sta, or several grades further on.
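A minimal sketch of the two matching algorithms follows; the grade table and the use of an absolute degree difference are assumptions made for illustration.

```python
# A minimal sketch of SolGro and DegGro; the grade table and the
# absolute degree difference are assumptions.

import bisect

def solgro(sol_degree: float, gro_degree: float, rho: float) -> str:
    """SolGro(Sol, Gro, ρ) → y ∪ n: the individual's emotion matches the
    group's ("y") when their degree difference lies within the learned ρ."""
    return "y" if abs(sol_degree - gro_degree) <= rho else "n"

GRADES = [100, 300, 1000]   # assumed num_sta table: grade 1 = 100 people, ...

def deggro(num: int, num_sta: list = GRADES) -> int:
    """DegGro(num, num_sta) → (DegGro_i): the group sits at the first grade
    whose standard count covers num; larger groups fall past the last grade."""
    i = bisect.bisect_left(num_sta, num)
    return i + 1 if i < len(num_sta) else len(num_sta) + 1   # 1-based grade
```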
(3) Personalized display (PerShow): PerShow = (wg, sw, tj) comprises three algorithms, described below and sketched in code after the list:

1) wg(DegGro, diy) → (cl): the appearance display function wg uses the matching degree function DegGro and the user's customization diy to derive the color variation cl shown on the map; the user customizes the colors representing similar emotions, and each chosen color is shaded from light to dark according to the degree given by DegGro.

2) sw(timeline, wg) → (PerMap): the time-varying display function sw uses the time axis timeline and the appearance display function wg; as the user drags the timeline, the user sees on PerMap the regional color changes determined by DegGro and diy, and can observe in real time the group regions matching his or her own emotional tendency.

3) tj(Sol, δ) → (Gro): the recommendation function tj uses the user emotion classification function Sol, learns the user's emotion preference δ by machine learning, and recommends the most suitable group emotion location Gro for the user. For example, if user A is heartbroken after a breakup, and the learned preference δ shows that A prefers to be consoled by sadness, A is recommended a place whose Gro emotion is E = {sad}, E_i = {sadness caused by a breakup}.
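A minimal sketch of the three display algorithms follows; the RGB shading model, the grade scale, and the preference table δ are assumptions, since the patent specifies only the function signatures.

```python
# A minimal sketch of wg, sw, and tj; the RGB shading model, grade
# scale, and preference table δ are assumptions.

def wg(grade: int, diy_rgb: tuple, max_grade: int = 4) -> tuple:
    """wg(DegGro, diy) → (cl): darken the user-chosen base color in
    proportion to the DegGro grade (light = low degree, dark = high)."""
    depth = grade / max_grade
    return tuple(int(c * (1.0 - 0.6 * depth)) for c in diy_rgb)

def sw(timeline: list, diy_rgb: tuple = (200, 60, 60)) -> list:
    """sw(timeline, wg) → (PerMap): recolor every region for each time
    step as the user drags the time axis; each frame maps region -> grade."""
    return [{region: wg(grade, diy_rgb) for region, grade in frame.items()}
            for frame in timeline]

def tj(user_emotion: str, delta: dict, regions: dict):
    """tj(Sol, δ) → (Gro): recommend the first region whose group emotion
    matches the learned preference δ for the user's current emotion."""
    target = delta.get(user_emotion)   # e.g. {"sad": "sad"} = prefers sad therapy
    for region, gro_emotion in regions.items():
        if gro_emotion == target:
            return region
    return None
```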
Beneficial effects:
The proposed emotion-based personalized region generation and display method has the following advantages:
1) The method adds text analysis to user emotion analysis; combined with the traditional analysis methods, this makes the judgment of the user's emotion more accurate.
2) The method can, based on the individual user's emotional habits and the emotional tendency learned for that user, select groups in which the user's preferred emotion is concentrated.
3) The method provides the user with a time axis, through which the user can quickly grasp the changes of emotion on the map and, combined with intelligent recommendation, intuitively and promptly select a preferred destination.
Brief description of the drawings
Fig. 1 is the system diagram of the emotion-based personalized region generation and display method;
Fig. 2 is the flow chart of the specific implementation of the emotion-based personalized region generation and display method.
Specific embodiment
The detailed process of the emotion-based personalized region generation and display method is as follows:

Step 1), shown at 001 in Fig. 2: at this stage the sentiment-analysis factors (MoFa) of the user (U) are input, and the user's current emotion is analyzed from these emotional factors. The user's MoFa comprises facial expression, gesture, posture, intonation, vocabulary, speaking speed, breathing, skin physiology, and text analysis. In the emotion analysis part this patent adds, beyond the conventional methods, the text analysis MoFa_TXT = (Ag, Spee), comprising two algorithms, as follows:

1) Ag(word, key, γ) → (E): the text emotion analysis function Ag matches the user's input words and sentences word against the emotion keywords key; if the ratio of word to key falls within the machine-learned range γ, the user's emotion type E is judged to be the emotion type corresponding to key. For example, when the user enters "I ate a watermelon, how delightful", the keyword "delightful" indicates that the user's current emotion type E is happy.

2) Spee(fac, sta, α) → (E_i): the text speed analysis function Spee compares the user's input speed fac with the standard speed sta (of the standard emotion), with the threshold α obtained by machine learning; comparing the ratio fac/sta with α yields the user's specific emotion degree E_i within the current emotion type E. For example, if comparing the input speed fac of "how delightful" against sta and α shows that the user's current emotion is stronger than the standard, the user is judged to be elated within the happy emotion. The Spee function thus reflects the degree of the user's emotion.
Step 2), shown at 002 in Fig. 2: the sentiment analysis EmoAnaly of the individual user is performed; EmoAnaly = (Sol, Gro) comprises two algorithms, as follows:

1) Sol(MoFa, Sta, γ_i) → (E, E_i): the user emotion classification function Sol compares the user's input sentiment-analysis factors MoFa with the factors Sta of a standard emotion; if the ratio of each MoFa to Sta falls within the machine-learned threshold range γ_i, the user's emotion is judged to be the emotion E corresponding to Sta, with degree E_i.

The second algorithm is given in step 3).
Step 3), shown at 003 in Fig. 2: group sentiment analysis is performed. Gro(Sol, β) → E ∪ UE: the group emotion function Gro builds on the user emotion classification function Sol; when the differences among multiple users' emotions fall within the machine-learned threshold β, the users are judged to share the same emotion E, otherwise they have different emotions UE.
Step 4), shown at 004 in Fig. 2: the matching module (Match) is entered; Match = (SolGro, DegGro) comprises two algorithms, as follows:

1) SolGro(Sol, Gro, ρ) → y ∪ n: the individual-group comparison function SolGro builds on the user emotion classification function Sol and the group emotion function Gro; when the difference between Sol and Gro falls within the machine-learned threshold ρ, the emotions of Sol and Gro are judged identical, denoted y, otherwise they are not identical, denoted n.

2) DegGro(num, num_sta) → (DegGro_i): the matching degree function DegGro compares the number of users in the group, num, with the standard grade counts num_sta; num_sta holds standard counts for several grades, e.g. grade one is 100 people, grade two is 300 people, and so on. When num is less than or equal to num_sta, the group sits at the grade of that num_sta; when num exceeds num_sta, the degree of Gro is judged to be the next grade after that num_sta, or several grades further on.
Step 5), shown at 005 in Fig. 2: the personalized display (PerShow) of the emotion map is performed; PerShow = (wg, sw, tj) comprises three algorithms, as follows:

1) wg(DegGro, diy) → (cl): the appearance display function wg uses the matching degree function DegGro and the user's customization diy to derive the color variation cl shown on the map; the user customizes the colors representing similar emotions, and each chosen color is shaded from light to dark according to the degree given by DegGro.

2) sw(timeline, wg) → (PerMap): the time-varying display function sw uses the time axis timeline and the appearance display function wg; as the user drags the timeline, the user sees on PerMap the regional color changes determined by DegGro and diy, and can observe in real time the group regions matching his or her own emotional tendency.

3) tj(Sol, δ) → (Gro): the recommendation function tj uses the user emotion classification function Sol, learns the user's emotion preference δ by machine learning, and recommends the most suitable group emotion location Gro for the user. For example, if user A is heartbroken after a breakup, and the learned preference δ shows that A prefers to be consoled by sadness, A is recommended a place whose Gro emotion is E = {sad}, E_i = {sadness caused by a breakup}.
Step 6), shown at 006 in Fig. 2: the user uses the emotion-based personalized region generation and display method, and the process ends. A short end-to-end sketch of this flow follows.
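To make the flow of Fig. 2 concrete, the following walk-through chains the sketch functions defined in the summary above (sol, gro, solgro, deggro, wg, tj); every input value is an invented example, not data from the patent.

```python
# End-to-end walk-through of steps 001-006, assuming the sketch functions
# sol, gro, solgro, deggro, wg, and tj from the summary are in scope;
# every input value here is an invented example, not data from the patent.

mofa = {"speech_speed": 1.3, "breath_rate": 1.05}            # 001: collect MoFa
standard = {"happy": {"speech_speed": 1.2, "breath_rate": 1.0}}
gamma = {"happy": (0.9, 1.3)}

emotion, degree = sol(mofa, standard, gamma)                 # 002: Sol → (E, E_i)
group = gro([(emotion, degree), ("happy", 1.1)], beta=0.2)   # 003: Gro → E ∪ UE
match = solgro(degree, 1.1, rho=0.2)                         # 004: SolGro → y/n
grade = deggro(num=150)                                      # 004: DegGro → grade 2
color = wg(grade, diy_rgb=(200, 60, 60))                     # 005: wg → map color
place = tj(emotion, delta={"happy": "happy"},                # 005: tj → region
           regions={"riverside park": "happy"})
print(emotion, group, match, grade, color, place)            # 006: flow ends
```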
Claims (1)
1. An emotion-based personalized region generation and display method, characterized in that the emotional factors of a user are collected, the user's emotion is analyzed, and emotion learning is used to determine the individual user's emotion and the corresponding emotion the user tends toward; the emotions of the users within a region are displayed collectively in a visual form, after which group-user regions whose concentrated emotion matches the user's emotional tendency are recommended to the individual user; the detailed process of the emotion-based personalized region generation and display method is as follows:

Step 1) inputs the sentiment-analysis factors MoFa of the user U at this stage and analyzes the user's current emotion from the user's emotional factors; the user's MoFa comprises facial expression, gesture, posture, intonation, vocabulary, speaking speed, breathing, skin physiology, and text analysis, where the text analysis MoFa_TXT comprises two algorithms, as follows:

1) Ag(word, key, γ) → (E): the text emotion analysis function Ag matches the user's input words and sentences word against the emotion keywords key; if the ratio of word to key falls within the machine-learned range γ, the user's emotion type E is judged to be the emotion type corresponding to key;

2) Spee(fac, sta, α) → (E_i): the text speed analysis function Spee compares the user's input speed fac with the standard speed sta, with the threshold α obtained by machine learning; comparing the ratio fac/sta with α yields the user's specific emotion degree E_i within the current emotion type E; the Spee function reflects the degree of the user's emotion;

Step 2) performs the sentiment analysis EmoAnaly of the individual user, comprising two algorithms, as follows:

1) Sol(MoFa, Sta, γ_i) → (E, E_i): the user emotion classification function Sol compares the user's input sentiment-analysis factors MoFa with the factors Sta of a standard emotion; if the ratio of each MoFa to Sta falls within the machine-learned threshold range γ_i, the user's emotion is judged to be the emotion E corresponding to Sta, with degree E_i;

the second algorithm is given in step 3);

Step 3) performs group sentiment analysis; Gro(Sol, β) → E ∪ UE: the group emotion function Gro builds on the user emotion classification function Sol; when the differences among multiple users' emotions fall within the machine-learned threshold β, the users are judged to share the same emotion E, otherwise they have different emotions UE;

Step 4) enters the matching module Match, comprising two algorithms, as follows:

1) SolGro(Sol, Gro, ρ) → y ∪ n: the individual-group comparison function SolGro builds on the user emotion classification function Sol and the group emotion function Gro; when the difference between Sol and Gro falls within the machine-learned threshold ρ, the emotions of Sol and Gro are judged identical, denoted y, otherwise they are not identical, denoted n;

2) DegGro(num, num_sta) → (DegGro_i): the matching degree function DegGro compares the number of users in the group, num, with the standard grade counts num_sta, where num_sta holds standard counts for several grades;

Step 5) performs the personalized display PerShow of the emotion map, comprising three algorithms, as follows:

1) wg(DegGro, diy) → (cl): the appearance display function wg uses the matching degree function DegGro and the user's customization diy to derive the color variation cl shown on the map; the user customizes the colors representing similar emotions, and each chosen color is shaded from light to dark according to the degree given by DegGro;

2) sw(timeline, wg) → (PerMap): the time-varying display function sw uses the time axis timeline and the appearance display function wg; as the user drags the timeline, the user sees on the personalized emotion map PerMap the regional color changes determined by DegGro and diy, and can observe in real time the group regions matching the user's own emotional tendency;

3) tj(Sol, δ) → (Gro): the recommendation function tj uses the user emotion classification function Sol, learns the user's emotion preference δ by machine learning, and recommends the most suitable group emotion location Gro for the user.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201810969038.8A | 2018-08-23 | 2018-08-23 | Emotion-based personalized region generation and display method |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN109271508A | 2019-01-25 |
| CN109271508B | 2019-11-15 |
Family

ID=65154238

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201810969038.8A | Emotion-based personalized region generation and display method | 2018-08-23 | 2018-08-23 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN109271508B (en) |
Families Citing this family (1)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111191554B | 2019-12-24 | 2022-11-15 | 中国科学院软件研究所 (Institute of Software, Chinese Academy of Sciences) | Video emotion analysis and visualization method and system based on metaphor map |
Citations (3)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105610884A | 2014-11-21 | 2016-05-25 | 阿里巴巴集团控股有限公司 (Alibaba Group Holding Ltd.) | Method and device for providing travel information |
| CN106202252A | 2016-06-29 | 2016-12-07 | 厦门趣处网络科技有限公司 | Travel recommendation method and system based on user emotion analysis |
| CN107679249A | 2017-10-27 | 2018-02-09 | 上海掌门科技有限公司 | Friend recommendation method and apparatus |
Family Cites Families (1)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10289641B2 | 2015-10-16 | 2019-05-14 | Accenture Global Services Limited | Cluster mapping based on measured neural activity and physiological data |

- 2018-08-23: CN application CN201810969038.8A filed; patent CN109271508B, status Active
Also Published As

| Publication number | Publication date |
|---|---|
| CN109271508A | 2019-01-25 |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |