CN109543039A - A natural language sentiment analysis method based on a deep network - Google Patents

A natural language sentiment analysis method based on a deep network

Info

Publication number
CN109543039A
CN109543039A (application CN201811409537.8A)
Authority
CN
China
Prior art keywords
moment
context
memory
module
loss
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811409537.8A
Other languages
Chinese (zh)
Other versions
CN109543039B (en)
Inventor
杨猛
林佩勤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sun Yat Sen University
National Sun Yat Sen University
Original Assignee
National Sun Yat Sen University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National Sun Yat Sen University filed Critical National Sun Yat Sen University
Priority to CN201811409537.8A priority Critical patent/CN109543039B/en
Publication of CN109543039A publication Critical patent/CN109543039A/en
Application granted granted Critical
Publication of CN109543039B publication Critical patent/CN109543039B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Machine Translation (AREA)

Abstract

The natural language sentiment analysis method based on a deep network provided by the invention builds on memory networks: semantic dependency information is introduced to guide the execution of the attention mechanism, and context moment information, which captures the overall sentiment of the sentence, supplies background information for the target word currently being analyzed. The full model comprises an embedding module, a memory sequence construction module, a semantic dependency mask attention module, a context moment sentiment learning module and an output module. In the model, the semantic dependency relations between the target word and its context, obtained from the dependency parse tree, are introduced into the memory network so that the memory sequence of each layer is generated dynamically, guiding the attention mechanism in the multi-layer modules of the memory network. In addition, to introduce the sentence's overall sentiment information, i.e. the relations among all target words in the same sentence, a learning task based on context moments is proposed, which assists the sentiment analysis of the specific target word through multi-task learning.

Description

A natural language sentiment analysis method based on a deep network
Technical field
The present invention relates to the field of sentiment analysis in computer natural language processing, and more particularly to a natural language sentiment analysis method based on a deep network.
Background technique
The main goal of the object-level sentiment analysis task is, for one or more evaluation targets present in a given sentence, to determine the sentiment polarity of each target in that sentence (e.g. positive, negative or neutral). For example, given the sentence "The price of this restaurant is very cheap, but the service is very poor" and the evaluation targets "price" and "service", the polarity for the target "price" is positive, while the polarity for the target "service" is negative. Clearly, different target words in the same sentence may receive different sentiment analysis results.
As attention mechanisms and memory networks have performed well in many natural language processing tasks such as machine translation and reading comprehension, methods that combine attention mechanisms with memory networks have become the main approach to the object-level sentiment analysis task. Representative methods of this kind include MemNet (Memory Network) and RAM (Recurrent Attention Network on Memory).
The MemNet model uses the word embedding matrix of the sentence as the memory sequence and applies the attention mechanism over it using the positional and content information of the context and the target word, finally obtaining the sentence's sentiment polarity with respect to the target word. On the basis of MemNet, the RAM model processes the word embedding matrix with an LSTM (Long Short-Term Memory) network to obtain a memory sequence containing the sentence's internal structural information, and then combines the outputs of each layer of the recurrent attention network in a non-linear way.
Existing methods weight the memory sequence using the positional relation between the target word and the context, so that different target words yield different memory sequences. However, weighting the memory sequence by positional relations alone cannot fully exploit the connections between the target word and the context, including the semantic relations between the target word and the context and the sentiment relations between the currently evaluated target word and the other target words in the sentence. A concrete analysis follows:
Lacking the semantic relation between the target word and the context, and relying purely on positional information, context words at different distances from the target on the semantic dependency tree but at the same text distance receive the same degree of attention, which is clearly unreasonable. Moreover, complex sentences often contain context words that are semantically strongly related to the target word but far away in text distance; with position weighting alone, the model cannot capture such distant context words, which can be highly important for determining the target word's sentiment.
Lacking consideration of the relations between target words, the polarity judgments for the different target words in a sentence are made independently, so the influence of the other target words in the same sentence on the current target word's sentiment judgment cannot be taken into account. In complex sentences containing comparison or coordination among multiple objects, judging the sentiment polarity of each target word in isolation, without these relations, is clearly less helpful than considering them and letting each target word's sentiment recognition be assisted by the other target words in the same sentence.
Summary of the invention
Since existing mainstream methods based on attention mechanisms and memory networks consider only the positional relation between the target word and the context when generating memory sequences, without taking into account the semantic relation between the target word and the context, the present invention proposes a natural language sentiment analysis method based on a deep network. The technical solution adopted by the present invention is:
A natural language sentiment analysis method based on a deep network, comprising an embedding module, a memory sequence construction module, a semantic dependency mask attention module, a context moment sentiment learning module and an output module;
The embedding module uses an embedding lookup table obtained by unsupervised pre-training to convert the words in the corpus into their corresponding word vectors; for out-of-vocabulary words absent from the lookup table, a Gaussian distribution is used to randomly initialize and convert them into low-dimensional word embeddings;
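As an illustrative sketch of the embedding module's lookup-with-fallback behavior (the vocabulary, dimension and the 0.1 scale below are assumptions for illustration, not taken from the patent):

```python
import numpy as np

rng = np.random.default_rng(0)
EMB_DIM = 4  # illustrative; the patent does not fix a dimension

# Hypothetical pre-trained lookup table (stand-in for unsupervised embeddings).
lookup = {
    "price":   rng.normal(size=EMB_DIM),
    "cheap":   rng.normal(size=EMB_DIM),
    "service": rng.normal(size=EMB_DIM),
}

def embed(tokens, lookup, scale=0.1):
    """Map tokens to vectors; out-of-vocabulary words get a random
    low-dimensional Gaussian embedding, as the embedding module describes."""
    vecs = []
    for tok in tokens:
        if tok not in lookup:
            lookup[tok] = rng.normal(0.0, scale, size=EMB_DIM)  # Gaussian init
        vecs.append(lookup[tok])
    return np.stack(vecs)

sent = ["price", "very", "cheap"]   # "very" is out-of-vocabulary here
E = embed(sent, lookup)
print(E.shape)  # (3, 4)
```

After the call, the newly seen word stays in the table, so repeated occurrences reuse the same random embedding.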
The memory sequence construction module converts the embedding sequence produced by the embedding module into a memory sequence through a bidirectional Long Short-Term Memory network; the converted memory sequence can be denoted M = {m_1, m_2, ..., m_n}, where n is the sequence length;
The semantic dependency mask attention module extracts semantic dependency information from the dependency parse tree of the sentence and then uses it to guide the execution of the attention mechanism, dynamically selecting different parts of the memory sequence according to the semantic dependency information, and obtains the target-word sentiment classification loss; at the same time, the context moment sentiment learning module constructs the context memory sequence through joint learning of the context moment regression task, and computes the context moment regression loss;
The output module is trained by simultaneously minimizing the target-word sentiment classification loss and the context moment regression loss, so as to predict the sentiment polarity of the target word.
The workflow of the semantic dependency mask attention module comprises the following steps:
Step 1: in the mask memory sequence of layer l, each memory cell is masked according to the semantic dependency information, i.e. only the memory cells corresponding to context words whose semantic distance to the target word is less than the current layer number l are selected. The specific formula is:
m_i^l = m_i, if dist(w_i, w_t) < l; m_i^l = 0, otherwise
where dist(w_i, w_t) is the semantic distance from the context word w_i to the current target word w_t, and l, a positive integer, is the layer number of the multi-layer deep memory network;
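A minimal sketch of this masking step, assuming the dependency tree is given as an undirected edge list and the semantic distance dist(w_i, w_t) is the number of tree edges between the two words (the toy sentence, edge list and dimensions are invented for illustration):

```python
from collections import deque
import numpy as np

def tree_distance(n, edges, target):
    """BFS distances from the target word over the (undirected) dependency tree."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    dist = [float("inf")] * n
    dist[target] = 0
    q = deque([target])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if dist[v] == float("inf"):
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def mask_memory(M, dist, l):
    """Layer-l mask: keep memory cell i only if dist(w_i, w_t) < l, else zero it."""
    keep = np.array([d < l for d in dist])[:, None]
    return M * keep

# Toy sentence of 5 words; the edges form a hypothetical dependency tree.
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
dist = tree_distance(5, edges, target=2)
M = np.ones((5, 3))
masked = mask_memory(M, dist, l=2)  # cells at distance 0 and 1 survive
```

Deeper layers use a larger l, so more of the context becomes visible as the network iterates, which is what makes each layer's memory sequence dynamic.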
Step 2: the mask memory sequence generated at each computation layer l is M^l = {m_1^l, m_2^l, ..., m_n^l}. From the mask memory sequence, the attention score of each memory cell in the layer-l mask memory sequence is obtained as:
u_i^l = W_AL [m_i^l ; r^(l-1) ; v_a] + b_AL
where W_AL is the attention weight and d_AL denotes the weight dimension used by the attention mechanism; m_i^l, r^(l-1) and v_a denote the memory cell, the output of the previous layer of the memory network and the target-word representation, respectively;
Step 3: the score u_i^l of each memory cell is then normalized by the softmax function to obtain α_i^l, giving the final output of this attention layer:
α_i^l = exp(u_i^l) / Σ_j exp(u_j^l), r̃^l = Σ_i α_i^l m_i^l
where α_i^l is the normalized score and m_i^l is the corresponding memory cell;
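A sketch of the score-and-normalize computation of steps 2 and 3, with randomly initialized stand-in parameters; the concatenation [m_i^l ; r^(l-1) ; v_a] scored by a single weight vector is an assumed form of the attention layer, not necessarily the patent's exact parameterization:

```python
import numpy as np

rng = np.random.default_rng(1)
d_mem, d_rep, d_asp = 3, 3, 3
n = 4

M = rng.normal(size=(n, d_mem))       # masked memory sequence at layer l
r_prev = rng.normal(size=d_rep)       # previous layer's output r^(l-1)
v_a = rng.normal(size=d_asp)          # target-word (aspect) representation

# Hypothetical attention parameters; d_AL is the weight dimension in the text.
W = rng.normal(size=(d_mem + d_rep + d_asp,))
b = 0.0

def attention(M, r_prev, v_a, W, b):
    """Score each memory cell on [m_i ; r_prev ; v_a], softmax-normalize,
    and return the attention-weighted summary of the memory sequence."""
    feats = np.hstack([M, np.tile(r_prev, (len(M), 1)), np.tile(v_a, (len(M), 1))])
    scores = feats @ W + b                # one scalar score per memory cell
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                  # softmax normalization
    return alpha @ M, alpha               # attended summary, weights

out, alpha = attention(M, r_prev, v_a, W, b)
```

The weights α sum to one, so the layer output is a convex combination of the surviving (unmasked) memory cells.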
Step 4: a transform gate and a carry gate are added to control, respectively, how much of the previous layer's output r^(l-1) is brought into the next layer after being transformed by the attention mechanism, and how much is carried to the next layer directly without transformation, yielding this layer's output r^l. Through the non-linear iteration of the multi-layer attention mechanism of the deep memory network, a sentence representation directed at the specific target word is obtained, and from it the target word's sentiment classification result; from this result the training step obtains the target-word sentiment classification loss.
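The transform/carry gating of step 4 resembles a highway connection. The sketch below couples the two gates as c = 1 - t, which is an assumption of this sketch (the patent does not state whether the gates are independent), and the gate parameterization W_t is invented for illustration:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def highway_step(r_prev, attn_out, W_t):
    """Transform/carry gating between layers: the transform gate decides how
    much of the attended result enters r^l; the carry gate (its complement
    in this sketch) passes r^(l-1) through unchanged."""
    t = sigmoid(W_t @ r_prev)   # transform gate
    c = 1.0 - t                 # carry gate (coupled to t, an assumption)
    return t * attn_out + c * r_prev

rng = np.random.default_rng(2)
r_prev = rng.normal(size=3)     # previous layer output r^(l-1)
attn_out = rng.normal(size=3)   # this layer's attended summary
W_t = rng.normal(size=(3, 3))
r_l = highway_step(r_prev, attn_out, W_t)
```

A useful property of the coupled form: if the attended output equals the carried input, the gate has no effect and r^l = r^(l-1).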
The workflow of the context moment sentiment learning module is as follows:
Step 1: with -1, 0 and 1 denoting the negative, neutral and positive sentiment polarities respectively, the model uses moments to describe the sentiment polarity distribution of all target words in a sentence. A moment is defined as:
μ_k = E((X - μ)^k)
where X is the sample value, E(·) denotes the expectation, and k is the order of the moment; when the order is odd, μ_k ∈ [-1, 1]; when the order is even, μ_k ∈ [0, 1]; all moments are normalized into [0, 1];
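A small sketch of the moment computation under the stated -1/0/1 coding. Taking the first moment as the mean and the second as the central moment is an assumption of this sketch, chosen because it matches the stated ranges μ_1 ∈ [-1, 1] and μ_2 ∈ [0, 1]; the odd moment is then normalized into [0, 1] as step 1 requires:

```python
import numpy as np

def sentence_moments(polarities):
    """First and second moments of the target-word polarity distribution,
    with polarities coded -1 (negative), 0 (neutral), 1 (positive).
    The first moment (mean) lies in [-1, 1] and is mapped into [0, 1];
    the second (central) moment already lies in [0, 1]."""
    x = np.asarray(polarities, dtype=float)
    mu1 = x.mean()                  # odd order, in [-1, 1]
    mu2 = ((x - mu1) ** 2).mean()   # even order, in [0, 1]
    return (mu1 + 1.0) / 2.0, mu2   # normalize the odd moment into [0, 1]

m1, m2 = sentence_moments([1, -1])  # e.g. "food" positive, "service" negative
print(m1, m2)  # 0.5 1.0
```

A mixed-polarity sentence thus yields a mid-range first moment and a large second moment, signalling contrast among the target words.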
Step 2: the first moment μ_1 and the second moment μ_2 are used as the targets of global moment learning. A sentence representation v_s is obtained with the attention mechanism, and the moment estimates are then produced by separate fully connected layers; for example, the estimate μ′_1 of μ_1 is obtained by applying a fully connected layer with weight W_{μ1} to the sentence representation v_s;
Step 3: the loss of each sample x on the global first moment μ_1 is then defined as:
l_{μ1}(x) = (μ′_1 - μ_1)²
Step 4: the global second-moment loss l_{μ2}(x) is obtained with the same computation, and the global loss is then obtained as l_global(x) = l_{μ1}(x) + l_{μ2}(x);
Step 5: the target-word sentiments of a sentence are divided into two parts, a left half and a right half; if the number of target words is odd, the median element is assigned to the left half. The first and second moments of the left half and of the right half are computed separately, and following the computation of the global moment loss, the local moment loss l_local(x) is obtained;
Step 6: a weighted sum of l_global(x) and l_local(x) is taken to obtain the total context moment learning loss.
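The split-and-sum structure of steps 5 and 6 can be sketched as follows. The per-moment squared-error form follows l_{μ1}(x) = (μ′_1 - μ_1)², while the global/local weights and function names are assumptions of this sketch:

```python
def moment_loss(pred, true):
    """Squared-error loss on a single moment, as in l_mu1(x) = (mu'_1 - mu_1)^2."""
    return (pred - true) ** 2

def split_halves(polarities):
    """Left/right halves of the target words; with an odd count the median
    element goes to the left half, as step 5 specifies."""
    mid = (len(polarities) + 1) // 2
    return polarities[:mid], polarities[mid:]

def context_moment_loss(pred_g, true_g, pred_l, true_l,
                        w_global=1.0, w_local=1.0):
    """Weighted sum of global and local moment losses (weights are assumed)."""
    lg = sum(moment_loss(p, t) for p, t in zip(pred_g, true_g))
    ll = sum(moment_loss(p, t) for p, t in zip(pred_l, true_l))
    return w_global * lg + w_local * ll

left, right = split_halves([1, 0, -1])
print(left, right)  # [1, 0] [-1]
total = context_moment_loss([0.5, 1.0], [0.5, 1.0], [0.4], [0.5])
```

Here the global estimates are exact, so the total reduces to the single local error (0.4 - 0.5)².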
For each sample x, the context moment sentiment learning module defines a context moment learning loss l_m(x) to assist the optimization of the target-word sentiment classification task; this loss is composed of two parts, the global loss l_global(x) and the local loss l_local(x), where n_a denotes the number of target words in a sentence, l_global(x) is the sum of the context moment losses over the entire sentence, and l_local(x) is the sum of the context moment losses of the left half and the right half of the sentence.
The specific formula of the output module is:
L = - Σ_{(x,y)∈D} Σ_{c∈C} y_c log f_c(x, θ) + λ_m Σ_{x∈D} l_m(x) + λ‖θ‖²
where C is the set of sentiment categories and D is the training set; y ∈ R^|C| is a one-hot vector, i.e. its only non-zero component is the one on the correct label; f_c(x, θ) is the model's prediction result, λ is the weight of the L2 regularization term, and λ_m is the weight of the context moment regression learning loss l_m(x);
During training, the loss L is minimized to obtain the parameters that optimize the model; during testing, the prediction result f_c(x, θ) is computed with the optimal parameters obtained in training, and the category corresponding to the component with the largest score is the predicted category.
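The joint objective of the output module, cross-entropy plus the weighted context moment loss plus L2 regularization, can be sketched as follows; the λ and λ_m values are illustrative, not from the patent:

```python
import numpy as np

def total_loss(probs, onehot, lm_values, theta, lam=1e-4, lam_m=0.1):
    """Joint objective: cross-entropy on the target-word classes, plus the
    context moment loss weighted by lam_m, plus L2 regularization weighted
    by lam.  The lam and lam_m values here are illustrative."""
    ce = -np.sum(onehot * np.log(probs))       # cross-entropy term
    reg = lam * np.sum(theta ** 2)             # L2 regularization term
    return ce + lam_m * np.sum(lm_values) + reg

probs = np.array([0.7, 0.2, 0.1])    # model prediction f_c(x, theta)
onehot = np.array([1.0, 0.0, 0.0])   # correct label: first class
theta = np.zeros(5)                  # stand-in parameter vector
L = total_loss(probs, onehot, [0.01], theta)
```

At test time, only the argmax over f_c(x, θ) is needed; here the first class wins, matching the label.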
Compared with the prior art, the beneficial effects of the technical solution of the present invention are:
To verify the effect experimentally, we tested on the Restaurant and Laptop datasets provided in SemEval 2014 Task 4 and evaluated the results with accuracy; the final experimental results are as follows:
Model Restaurant data set Laptop data set
MemNet 78.2% 70.3%
RAM 80.0% 74.1%
DMMN-SDCM 81.9% 75.1%
Table 1. Experimental comparison of DMMN-SDCM with the current mainstream methods MemNet and RAM. As the table shows, the DMMN-SDCM model outperforms the current mainstream methods MemNet and RAM on both the Restaurant and Laptop datasets, exceeding the accuracy of RAM, currently the best-performing of the two baselines, by about 2 and 1 percentage points on the two datasets respectively, which demonstrates that our improvement over current methods is meaningful.
Detailed description of the invention
Fig. 1 is the flow chart of the natural language sentiment analysis method based on a deep network provided by the invention.
Fig. 2 is the workflow chart of the semantic dependency mask attention module of the natural language sentiment analysis method based on a deep network.
Fig. 3 is the workflow chart of the context moment sentiment learning module of the natural language sentiment analysis method based on a deep network.
Specific embodiment
The technical solution in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present invention, given for illustration only, and shall not be understood as limiting this patent. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative labor shall fall within the protection scope of the present invention.
The technical solution of the present invention is further described below with reference to the drawings and embodiments.
Embodiment 1
As shown in Figs. 1 to 3, a natural language sentiment analysis method based on a deep network comprises an embedding module, a memory sequence construction module, a semantic dependency mask attention module, a context moment sentiment learning module and an output module;
The embedding module uses an embedding lookup table obtained by unsupervised pre-training to convert the words in the corpus into their corresponding word vectors; for out-of-vocabulary words absent from the lookup table, a Gaussian distribution is used to randomly initialize and convert them into low-dimensional word embeddings;
The memory sequence construction module converts the embedding sequence produced by the embedding module into a memory sequence through a bidirectional Long Short-Term Memory network; the converted memory sequence can be denoted M = {m_1, m_2, ..., m_n}, where n is the sequence length;
The semantic dependency mask attention module extracts semantic dependency information from the dependency parse tree of the sentence and then uses it to guide the execution of the attention mechanism, dynamically selecting different parts of the memory sequence according to the semantic dependency information, and obtains the target-word sentiment classification loss; at the same time, the context moment sentiment learning module constructs the context memory sequence through joint learning of the context moment regression task, and computes the context moment regression loss;
The output module is trained by simultaneously minimizing the target-word sentiment classification loss and the context moment regression loss, so as to predict the sentiment polarity of the target word.
The workflow of the semantic dependency mask attention module comprises the following steps:
Step 1: in the mask memory sequence of layer l, each memory cell is masked according to the semantic dependency information, i.e. only the memory cells corresponding to context words whose semantic distance to the target word is less than the current layer number l are selected. The specific formula is:
m_i^l = m_i, if dist(w_i, w_t) < l; m_i^l = 0, otherwise
where dist(w_i, w_t) is the semantic distance from the context word w_i to the current target word w_t, and l, a positive integer, is the layer number of the multi-layer deep memory network;
Step 2: the mask memory sequence generated at each computation layer l is M^l = {m_1^l, m_2^l, ..., m_n^l}. From the mask memory sequence, the attention score of each memory cell in the layer-l mask memory sequence is obtained as:
u_i^l = W_AL [m_i^l ; r^(l-1) ; v_a] + b_AL
where W_AL is the attention weight and d_AL denotes the weight dimension used by the attention mechanism; m_i^l, r^(l-1) and v_a denote the memory cell, the output of the previous layer of the memory network and the target-word representation, respectively;
Step 3: the score u_i^l of each memory cell is then normalized by the softmax function to obtain α_i^l, giving the final output of this attention layer:
α_i^l = exp(u_i^l) / Σ_j exp(u_j^l), r̃^l = Σ_i α_i^l m_i^l
where α_i^l is the normalized score and m_i^l is the corresponding memory cell;
Step 4: a transform gate and a carry gate are added to control, respectively, how much of the previous layer's output r^(l-1) is brought into the next layer after being transformed by the attention mechanism, and how much is carried to the next layer directly without transformation, yielding this layer's output r^l. Through the non-linear iteration of the multi-layer attention mechanism of the deep memory network, a sentence representation directed at the specific target word is obtained, and from it the target word's sentiment classification result; from this result the training step obtains the target-word sentiment classification loss.
The workflow of the context moment sentiment learning module is as follows:
Step 1: with -1, 0 and 1 denoting the negative, neutral and positive sentiment polarities respectively, the model uses moments to describe the sentiment polarity distribution of all target words in a sentence. A moment is defined as:
μ_k = E((X - μ)^k)
where X is the sample value, E(·) denotes the expectation, and k is the order of the moment; when the order is odd, μ_k ∈ [-1, 1]; when the order is even, μ_k ∈ [0, 1]; all moments are normalized into [0, 1];
Step 2: the first moment μ_1 and the second moment μ_2 are used as the targets of global moment learning. A sentence representation v_s is obtained with the attention mechanism, and the moment estimates are then produced by separate fully connected layers; for example, the estimate μ′_1 of μ_1 is obtained by applying a fully connected layer with weight W_{μ1} to the sentence representation v_s;
Step 3: the loss of each sample x on the global first moment μ_1 is then defined as:
l_{μ1}(x) = (μ′_1 - μ_1)²
Step 4: the global second-moment loss l_{μ2}(x) is obtained with the same computation, and the global loss is then obtained as l_global(x) = l_{μ1}(x) + l_{μ2}(x);
Step 5: the target-word sentiments of a sentence are divided into two parts, a left half and a right half; if the number of target words is odd, the median element is assigned to the left half. The first and second moments of the left half and of the right half are computed separately, and following the computation of the global moment loss, the local moment loss l_local(x) is obtained;
Step 6: a weighted sum of l_global(x) and l_local(x) is taken to obtain the total context moment learning loss.
For each sample x, the context moment sentiment learning module defines a context moment learning loss l_m(x) to assist the optimization of the target-word sentiment classification task; this loss is composed of two parts, the global loss l_global(x) and the local loss l_local(x), where n_a denotes the number of target words in a sentence, l_global(x) is the sum of the context moment losses over the entire sentence, and l_local(x) is the sum of the context moment losses of the left half and the right half of the sentence.
The specific formula of the output module is:
L = - Σ_{(x,y)∈D} Σ_{c∈C} y_c log f_c(x, θ) + λ_m Σ_{x∈D} l_m(x) + λ‖θ‖²
where C is the set of sentiment categories and D is the training set; y ∈ R^|C| is a one-hot vector, i.e. its only non-zero component is the one on the correct label; f_c(x, θ) is the model's prediction result, λ is the weight of the L2 regularization term, and λ_m is the weight of the context moment regression learning loss l_m(x);
During training, the loss L is minimized to obtain the parameters that optimize the model; during testing, the prediction result f_c(x, θ) is computed with the optimal parameters obtained in training, and the category corresponding to the component with the largest score is the predicted category.
Embodiment 2
In this embodiment, the sentence "Great food but the service was dreadful!" and the target words "food" and "service" are given.
Result: in the RAM model, both target words in this sentence are judged as positive, whereas the DMMN-SDCM model successfully identifies the respective sentiment polarity of each target word.
Analysis: because the DMMN-SDCM model replaces the traditional text-distance information with semantic dependency information, it can judge that the context word "dreadful" influences the target word "service" more deeply than the context word "Great" does, and contributes more to this target word's polarity judgment. In addition, thanks to the introduced context moment learning task, the model can simultaneously learn the relation between the target words "food" and "service", i.e. the contrastive relation, while constructing the context memory sequence, so that the constructed memory sequence is more principled and assists the polarity judgment of both target words.
Obviously, the above embodiments of the present invention are merely examples given to clearly illustrate the present invention and are not intended to limit the embodiments of the present invention. For those of ordinary skill in the art, other variations or changes in different forms can also be made on the basis of the above description. It is neither necessary nor possible to exhaust all embodiments here. Any modification, equivalent replacement and improvement made within the spirit and principle of the present invention shall be included within the protection scope of the claims of the present invention.

Claims (5)

1. A natural language sentiment analysis method based on a deep network, characterized by comprising an embedding module, a memory sequence construction module, a semantic dependency mask attention module, a context moment sentiment learning module and an output module;
The embedding module uses an embedding lookup table obtained by unsupervised pre-training to convert the words in the corpus into their corresponding word vectors; for out-of-vocabulary words absent from the lookup table, a Gaussian distribution is used to randomly initialize and convert them into low-dimensional word embeddings;
The memory sequence construction module converts the embedding sequence produced by the embedding module into a memory sequence through a bidirectional Long Short-Term Memory network; the converted memory sequence can be denoted M = {m_1, m_2, ..., m_n}, where n is the sequence length;
The semantic dependency mask attention module extracts semantic dependency information from the dependency parse tree of the sentence, then dynamically selects different parts of the memory sequence according to the semantic dependency information to guide the execution of the attention mechanism, and obtains the target-word sentiment classification loss; at the same time, the context moment sentiment learning module constructs the context memory sequence through joint learning of the context moment regression task, and computes the context moment regression loss;
The output module is trained by simultaneously minimizing the target-word sentiment classification loss and the context moment regression loss, so as to predict the sentiment polarity of the target word.
2. The natural language sentiment analysis method based on a deep network according to claim 1, characterized in that the workflow of the semantic dependency mask attention module comprises the following steps:
Step 1: in the mask memory sequence of layer l, each memory cell is masked according to the semantic dependency information, i.e. only the memory cells corresponding to context words whose semantic distance to the target word is less than the current layer number l are selected. The specific formula is:
m_i^l = m_i, if dist(w_i, w_t) < l; m_i^l = 0, otherwise
where dist(w_i, w_t) is the semantic distance from the context word w_i to the current target word w_t, and l, a positive integer, is the layer number of the multi-layer deep memory network;
Step 2: the mask memory sequence generated at each computation layer l is M^l = {m_1^l, m_2^l, ..., m_n^l}. From the mask memory sequence, the attention score of each memory cell in the layer-l mask memory sequence is obtained as:
u_i^l = W_AL [m_i^l ; r^(l-1) ; v_a] + b_AL
where W_AL is the attention weight and d_AL denotes the weight dimension used by the attention mechanism; m_i^l, r^(l-1) and v_a denote the memory cell, the output of the previous layer of the memory network and the target-word representation, respectively;
Step 3: the score u_i^l of each memory cell is then normalized by the softmax function to obtain α_i^l, giving the final output of this attention layer:
α_i^l = exp(u_i^l) / Σ_j exp(u_j^l), r̃^l = Σ_i α_i^l m_i^l
where α_i^l is the normalized score and m_i^l is the corresponding memory cell;
Step 4: a transform gate and a carry gate are added to control, respectively, how much of the previous layer's output r^(l-1) is brought into the next layer after being transformed by the attention mechanism, and how much is carried to the next layer directly without transformation, yielding this layer's output r^l. Through the non-linear iteration of the multi-layer attention mechanism of the deep memory network, a sentence representation directed at the specific target word is obtained, and from it the target word's sentiment classification result; from this result the training step obtains the target-word sentiment classification loss.
3. The natural language sentiment analysis method based on a deep network according to claim 2, characterized in that the workflow of the context moment sentiment learning module is as follows:
Step 1: with -1, 0 and 1 denoting the negative, neutral and positive sentiment polarities respectively, the model uses moments to describe the sentiment polarity distribution of all target words in a sentence. A moment is defined as:
μ_k = E((X - μ)^k)
where X is the sample value, E(·) denotes the expectation, and k is the order of the moment; when the order is odd, μ_k ∈ [-1, 1]; when the order is even, μ_k ∈ [0, 1]; all moments are normalized into [0, 1];
Step 2: the first moment μ_1 and the second moment μ_2 are used as the targets of global moment learning. A sentence representation v_s is obtained with the attention mechanism, and the moment estimates are then produced by separate fully connected layers; for example, the estimate μ′_1 of μ_1 is obtained by applying a fully connected layer with weight W_{μ1} to the sentence representation v_s;
Step 3: the loss of each sample x on the global first moment μ_1 is then defined as:
l_{μ1}(x) = (μ′_1 - μ_1)²
Step 4: the global second-moment loss l_{μ2}(x) is obtained with the same computation, and the global loss is then obtained as l_global(x) = l_{μ1}(x) + l_{μ2}(x);
Step 5: the target-word sentiments of a sentence are divided into two parts, a left half and a right half; if the number of target words is odd, the median element is assigned to the left half. The first and second moments of the left half and of the right half are computed separately, and following the computation of the global moment loss, the local moment loss l_local(x) is obtained;
Step 6: a weighted sum of l_global(x) and l_local(x) is taken to obtain the total context moment learning loss.
4. The natural language sentiment analysis method based on a deep network according to claim 3, characterized in that for each sample x, the context moment sentiment learning module defines a context moment learning loss l_m(x) to assist the optimization of the target-word sentiment classification task; this loss is composed of two parts, the global loss l_global(x) and the local loss l_local(x), where n_a denotes the number of target words in a sentence, l_global(x) is the sum of the context moment losses over the entire sentence, and l_local(x) is the sum of the context moment losses of the left half and the right half of the sentence.
5. The natural language sentiment analysis method based on a deep network according to claim 3, characterized in that the specific formula of the output module is:
L = - Σ_{(x,y)∈D} Σ_{c∈C} y_c log f_c(x, θ) + λ_m Σ_{x∈D} l_m(x) + λ‖θ‖²
where C is the set of sentiment categories and D is the training set; y ∈ R^|C| is a one-hot vector, i.e. its only non-zero component is the one on the correct label; f_c(x, θ) is the model's prediction result, λ is the weight of the L2 regularization term, and λ_m is the weight of the context moment regression learning loss l_m(x);
During training, the loss L is minimized to obtain the parameters that optimize the model; during testing, the prediction result f_c(x, θ) is computed with the optimal parameters obtained in training, and the category corresponding to the component with the largest score is the predicted category.
CN201811409537.8A 2018-11-23 2018-11-23 Natural language emotion analysis method based on deep network Active CN109543039B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811409537.8A CN109543039B (en) 2018-11-23 2018-11-23 Natural language emotion analysis method based on deep network


Publications (2)

Publication Number Publication Date
CN109543039A true CN109543039A (en) 2019-03-29
CN109543039B CN109543039B (en) 2022-04-08

Family

ID=65850342

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811409537.8A Active CN109543039B (en) 2018-11-23 2018-11-23 Natural language emotion analysis method based on deep network

Country Status (1)

Country Link
CN (1) CN109543039B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107092596A (en) * 2017-04-24 2017-08-25 重庆邮电大学 Text sentiment analysis method based on attention CNNs and CCR
CN107578785A (en) * 2017-09-05 2018-01-12 哈尔滨工业大学 Continuous music emotion feature analysis and evaluation method based on Gamma distribution analysis
CN107578092A (en) * 2017-09-01 2018-01-12 广州智慧城市发展研究院 Composite sentiment analysis method and system based on mood and opinion mining
CN108133038A (en) * 2018-01-10 2018-06-08 重庆邮电大学 Entity-level sentiment classification system and method based on dynamic memory network
CN108399158A (en) * 2018-02-05 2018-08-14 华南理工大学 Attribute sentiment classification method based on dependency tree and attention mechanism
CN108460009A (en) * 2017-12-14 2018-08-28 中山大学 Attention-mechanism recurrent neural network text sentiment analysis method with embedded sentiment dictionary
US10133791B1 (en) * 2014-09-07 2018-11-20 DataNovo, Inc. Data mining and analysis system and method for legal documents

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LIANG Bin et al.: "Targeted sentiment analysis based on multi-attention convolutional neural networks", Journal of Computer Research and Development *

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110110323A (en) * 2019-04-10 2019-08-09 北京明略软件系统有限公司 A kind of text sentiment classification method and device, computer readable storage medium
CN110110323B (en) * 2019-04-10 2022-11-11 北京明略软件系统有限公司 Text emotion classification method and device and computer readable storage medium
CN110134757A (en) * 2019-04-19 2019-08-16 杭州电子科技大学 Event argument role extraction method based on multi-head attention mechanism
CN110222827A (en) * 2019-06-11 2019-09-10 苏州思必驰信息科技有限公司 Training method for a text-based depression detection network model
CN110349620A (en) * 2019-06-28 2019-10-18 广州序科码生物技术有限责任公司 Method for accurately identifying molecular interactions and their polarity and directionality from PubMed documents
CN110321563A (en) * 2019-06-28 2019-10-11 浙江大学 Text sentiment analysis method based on a hybrid supervision model
CN111625652A (en) * 2019-07-12 2020-09-04 杭州电子科技大学 Attention neural network method based on multi-path dynamic mask
WO2021012183A1 (en) * 2019-07-23 2021-01-28 中山大学 Deducible machine learning reading comprehension system, and storage medium
CN110457480A (en) * 2019-08-16 2019-11-15 国网天津市电力公司 Construction method of a fine-grained sentiment classification model based on an interactive attention mechanism
CN110490136A (en) * 2019-08-20 2019-11-22 电子科技大学 Human behavior prediction method based on knowledge distillation
CN111160037A (en) * 2019-12-02 2020-05-15 广州大学 Fine-grained emotion analysis method supporting cross-language migration
CN110990552A (en) * 2019-12-18 2020-04-10 北京声智科技有限公司 Method and device for determining operation sequence of natural language formula
CN111259663A (en) * 2020-01-14 2020-06-09 北京百度网讯科技有限公司 Information processing method and device
US11775776B2 (en) 2020-01-14 2023-10-03 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for processing information
CN111259663B (en) * 2020-01-14 2023-05-26 北京百度网讯科技有限公司 Information processing method and device
CN111488734A (en) * 2020-04-14 2020-08-04 西安交通大学 Emotional feature representation learning system and method based on global interaction and syntactic dependency
CN111488734B (en) * 2020-04-14 2022-02-22 西安交通大学 Emotional feature representation learning system and method based on global interaction and syntactic dependency
CN111914185A (en) * 2020-07-06 2020-11-10 华中科技大学 Graph attention network-based text emotion analysis method in social network
CN111914185B (en) * 2020-07-06 2024-03-22 华中科技大学 Text emotion analysis method in social network based on graph attention network
CN112035661A (en) * 2020-08-24 2020-12-04 北京大学深圳研究生院 Text emotion analysis method and system based on graph convolution network and electronic device
CN112784573A (en) * 2021-01-25 2021-05-11 中南民族大学 Text emotion content analysis method, device and equipment and storage medium
CN112784573B (en) * 2021-01-25 2023-12-19 中南民族大学 Text emotion content analysis method, device, equipment and storage medium
CN112784532B (en) * 2021-01-29 2022-09-02 电子科技大学 Multi-head attention memory system for short text sentiment classification
CN112784532A (en) * 2021-01-29 2021-05-11 电子科技大学 Multi-head attention memory network for short text sentiment classification
CN113157919A (en) * 2021-04-07 2021-07-23 山东师范大学 Sentence text aspect level emotion classification method and system
CN113065331A (en) * 2021-04-15 2021-07-02 上海金融期货信息技术有限公司 Entity emotion recognition method and system based on entity context discrimination
CN113571097A (en) * 2021-09-28 2021-10-29 之江实验室 Speaker self-adaptive multi-view dialogue emotion recognition method and system

Also Published As

Publication number Publication date
CN109543039B (en) 2022-04-08

Similar Documents

Publication Publication Date Title
CN109543039A (en) A kind of natural language sentiment analysis method based on depth network
CN107239446B Intelligent relation extraction method based on neural network and attention mechanism
CN103207855B Fine-grained sentiment analysis system and method for product review information
Tay et al. Densely connected attention propagation for reading comprehension
CN106649275A (en) Relation extraction method based on part-of-speech information and convolutional neural network
CN106855853A (en) Entity relation extraction system based on deep neural network
CN105976056A (en) Information extraction system based on bidirectional RNN
CN105975555A (en) Bidirectional recursive neural network-based enterprise abbreviation extraction method
CN106598950A Named entity recognition method based on a hybrid stacking model
CN106202032A Sentiment analysis method and system for microblog short texts
CN110110754B (en) Method for classifying imbalance problems based on cost local generalization errors
CN109446331A Text emotion classification model building method and text emotion classification method
CN112561718A (en) Case microblog evaluation object emotion tendency analysis method based on BilSTM weight sharing
CN110415071A Automobile competing-product comparison method based on opinion mining analysis
Chen et al. cs@ DravidianLangTech-EACL2021: Offensive language identification based on multilingual BERT model
CN108427865A Method for predicting incidence relations between LncRNA and environmental factors
CN113177417A (en) Trigger word recognition method based on hybrid neural network and multi-stage attention mechanism
Luu et al. A Multiple Choices Reading Comprehension Corpus for Vietnamese Language Education
Qi et al. MS-transformer: introduce multiple structural priors into a unified transformer for encoding sentences
Li Application of an Internet of Things Oriented Network Education Platform in English Language Teaching
Xu et al. AHRNN: Attention‐Based Hybrid Robust Neural Network for emotion recognition
Zhao et al. Detection of Chinese Grammatical Errors with Context Representation
Xu et al. BiRNN-DKT: transfer bi-directional LSTM RNN for knowledge tracing
Chandrasekaran et al. Automating Transfer Credit Assessment in Student Mobility--A Natural Language Processing-based Approach
Umapathy et al. Segmentation of Floorplans and Heritage Sites: An Approach to Unbalanced Dataset

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant