CN108133038A - Entity-level sentiment classification system and method based on a dynamic memory network - Google Patents

Entity-level sentiment classification system and method based on a dynamic memory network Download PDF

Info

Publication number
CN108133038A
CN108133038A
Authority
CN
China
Prior art keywords
module
dynamic memory
input
entity
memory network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810022435.4A
Other languages
Chinese (zh)
Other versions
CN108133038B (en)
Inventor
张祖凡
汪露
邹阳
甘臣权
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University of Post and Telecommunications
Original Assignee
Chongqing University of Post and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University of Post and Telecommunications filed Critical Chongqing University of Post and Telecommunications
Priority to CN201810022435.4A priority Critical patent/CN108133038B/en
Publication of CN108133038A publication Critical patent/CN108133038A/en
Application granted granted Critical
Publication of CN108133038B publication Critical patent/CN108133038B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30: Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/35: Clustering; Classification
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/04: Architecture, e.g. interconnection topology
    • G06N3/044: Recurrent networks, e.g. Hopfield networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods

Abstract

An entity-level sentiment classification system and method based on a dynamic memory network is claimed in the present invention, belonging to the field of natural language processing. The method includes the following steps: 1. the entity-level sentiment classification task is modeled as a question-answering problem by introducing a dynamic memory network; 2. the input module of the dynamic memory network encodes the input text; 3. word position information and a residual structure are added to the input module to enhance the input representation; 4. a question targeting the entity word is designed in the question module; 5. the memory module performs memory retrieval over the input representation through two submodules; 6. the answer module performs sentiment-polarity prediction and model training on the features extracted by the memory module. Once trained, the model can perform entity-level sentiment classification into positive, neutral, and negative polarities. The present invention can correctly handle the sentiment classification of a target entity not only in simple sentences but also in complex clauses.

Description

Entity-level sentiment classification system and method based on a dynamic memory network
Technical field
The invention belongs to the field of natural language processing, and in particular to methods for classifying the sentiment expressed toward a specific entity in a sentence.
Background technology
With the rise of social platforms such as Twitter, Facebook, and Weibo, and of e-commerce platforms such as Amazon and Taobao, review texts on the Internet are growing day by day. Faced with large volumes of unstructured review text from microblogs and forums, there is an urgent need to analyze the sentiment toward specific entities in text using natural language processing techniques. Entity-level sentiment-polarity classification focuses on identifying, from data, a user's sentiment orientation toward a product, a service, or a matter of public opinion. In practice, entity-level sentiment analysis plays a very important role in government decision-making, public-opinion monitoring, and strategy formation by consumer-goods vendors. The vast majority of traditional sentiment analysis builds models by combining hand-crafted NLP features with machine learning. However, designing such features usually requires expert domain knowledge, labor costs are high, and the resulting systems generalize and transfer poorly. Deep learning methods, which have risen over the past two years, can compensate for these shortcomings fairly well: deep learning automatically learns feature representations that describe the essence of the data, avoiding the drawbacks of hand-engineered features.
A large number of models exist for the entity-level sentiment classification task, including methods based on hand-crafted features and conventional machine learning, methods based on neural networks, and methods based on memory networks. Each has problems: designing hand-crafted features requires extensive feature engineering and grammatical knowledge; LSTM-based methods have a rather simple structure and cannot handle the sentiment polarity of a specific entity in complex clauses, so their performance degrades on rhetorical questions, interrogative sentences, and comparative sentences; and memory-network-based methods exploit only low-level sentence features without considering high-level semantics. Proposing a more effective model is therefore an important task.
Summary of the invention
The present invention aims to address the above problems of the prior art. It proposes an entity-level sentiment classification system and method based on a dynamic memory network that can not only determine the sentiment toward an entity in a simple clause, but also effectively resolve the difficulty of determining the sentiment polarity of an entity in a complex clause. The technical scheme of the invention is as follows:
An entity-level sentiment classification system based on a dynamic memory network, comprising a dynamic memory network that mainly includes an input module, a question module, a memory module, and an answer module; the dynamic memory network models the entity-level sentiment classification problem as a question-answering model; wherein the input module encodes the input text for the designated target entity to obtain a text vector representation;
the question module designs a question targeting the entity word and provides the feature information against which each memory update is attention-aligned;
the memory module performs memory extraction and memory updating over the text input representation through two submodules, multi-hop attention and memory update, and passes the final updated features to the answer module;
the answer module performs sentiment-polarity prediction on the features extracted by the memory module and carries out model training; once the model is trained, it can perform entity-level sentiment classification into positive, neutral, and negative polarities.
Further, word position information and a residual structure are additionally added to the input module to enhance the input representation.
Further, the encoding performed by the input module on the input text to obtain the text vector representation specifically includes:
Given an input text sequence {w_1, w_2, ..., w_n} and a corresponding target entity {w_t1, ..., w_tm}, where n is the number of words in the text and w_ti denotes the i-th word of the target entity, the input text sequence is first mapped to a word-vector sequence {e_1, e_2, ..., e_n} using pre-trained word vectors, and the word vectors are stacked in order into a word-vector matrix, where d denotes the word-vector dimension;
The fused vectors are encoded with a single-layer bidirectional GRU, yielding the encoded representation h_i = [h_i^f; h_i^b], computed as follows:
h_i^f = GRU_f(s_i, h_{i-1}^f)
h_i^b = GRU_b(s_i, h_{i+1}^b)
where GRU_f denotes the forward GRU network, GRU_b the backward GRU network, and h_i the hidden vector output by the bidirectional GRU.
Further, word position information and a residual structure are additionally added to the input module to enhance the input representation, specifically including:
First, the relative distance between each word in the context and the entity word is computed and defined as p_i. Borrowing the idea of word-vector training, the relative position is mapped to a position vector l_i, treated as a parameter the network learns automatically. To merge the position vector with the word vector, element-wise addition is used: s_i = e_i + l_i, yielding the fused vector sequence {s_1, s_2, ..., s_n};
The residual structure is introduced into the input module to enhance the text representation; the final encoded output of the input module is:
u_i = h_i + e_i
where e_i denotes the word vector.
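As an illustration only, not part of the patented disclosure, the element-wise fusion s_i = e_i + l_i and the residual enhancement u_i = h_i + e_i can be sketched in plain Python with toy vectors; the bidirectional-GRU encoder output h_i is stubbed with made-up numbers:

```python
# Illustrative sketch only: element-wise fusion of word and position vectors
# (s_i = e_i + l_i) and the residual enhancement (u_i = h_i + e_i).
# Vectors are plain Python lists; all numbers are hypothetical.

def add_vecs(a, b):
    """Element-wise addition of two equal-length vectors."""
    return [x + y for x, y in zip(a, b)]

def fuse_position(e, l):
    """s_i = e_i + l_i: merge the word vector with the position vector."""
    return add_vecs(e, l)

def residual_output(h, e):
    """u_i = h_i + e_i: residual structure around the encoder output."""
    return add_vecs(h, e)

e_i = [0.5, -0.2, 0.1]   # pretrained word vector (toy values)
l_i = [0.1, 0.1, -0.1]   # learned position vector (toy values)
s_i = fuse_position(e_i, l_i)      # input to the bidirectional GRU
h_i = [0.3, 0.0, 0.2]    # stand-in for the bidirectional-GRU hidden state
u_i = residual_output(h_i, e_i)    # residual-enhanced final representation
```

Because both operations are plain element-wise additions, they add no trainable parameters beyond the position embeddings themselves.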
Further, the design of the question targeting the entity word in the question module specifically includes:
An entity-word feature representation is obtained by encoding a sentiment question designed for the target word. The designed question is first mapped to a question word-vector sequence, which is then encoded with a single-layer bidirectional GRU to obtain the encoded representation of the target word; the hidden state at the final time step of the GRU encoding is defined as q_0. In addition, to introduce a feature-space difference between the question representation and the input representation, a non-linear layer is added on top of the GRU-encoded features, so the final output of the question module is:
q = tanh(W^(q) q_0 + b^(q))
where q_0 is the final GRU hidden state and W^(q) and b^(q) are representation parameters.
Further, the multi-hop attention mechanism of the memory module includes: soft attention, an attention-based GRU network, and an inner-attention GRU network.
Further, after each attention step the memory module updates its memory with a ReLU structure, computed as follows:
m_0 = q
m_k = ReLU(W_k [m_{k-1}; c_k; q] + b_k)
where the memory m_0 is initialised with the question representation q, W_k and b_k are memory-update parameters, k denotes the k-th attention step, and c_k denotes the memory feature information extracted by the k-th attention step.
Further, after the multiple attention steps are completed, the answer module feeds the output of the memory module into a softmax layer for sentiment-polarity prediction; the output is a probability distribution over sentiment categories, computed as follows:
y_p = softmax(W^(o) m_k + b^(o))
where y_p denotes the predicted category distribution, W^(o) the output-layer parameter matrix, b^(o) the output-layer bias, and m_k the memory features after the k-th update. The model is trained by minimising the following loss function:
L = -Σ_D Σ_{c∈C} y_c log(y_p^c) + λ‖θ‖^2
where D denotes the training dataset, C the set of sentiment categories, θ the model parameters, y_c the true category label, and λ the L2 regularisation coefficient.
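As an illustrative sketch with toy, untrained parameters (all numeric values here are hypothetical), the softmax prediction and the L2-regularised cross-entropy loss described above can be written in plain Python:

```python
# Illustrative sketch only: y_p = softmax(W m_k + b) over the three polarity
# classes, and the L2-regularised cross-entropy loss. W, b and m_k are toy,
# untrained numbers.
import math

def softmax(z):
    m = max(z)                          # subtract the max for stability
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [v / s for v in exps]

def predict(W, b, m_k):
    """y_p = softmax(W m_k + b): probability over sentiment categories."""
    logits = [sum(w * x for w, x in zip(row, m_k)) + bb
              for row, bb in zip(W, b)]
    return softmax(logits)

def loss(y_true, y_pred, params, lam):
    """Cross-entropy for one example plus lambda * ||theta||^2."""
    ce = -sum(t * math.log(p) for t, p in zip(y_true, y_pred))
    return ce + lam * sum(p * p for p in params)

W = [[0.2, -0.1], [0.0, 0.3], [-0.2, 0.1]]  # 3 classes x 2-dim memory (toy)
b = [0.0, 0.0, 0.0]
m_k = [1.0, -1.0]
y_p = predict(W, b, m_k)                    # positive/neutral/negative probs
```

In training, the cross-entropy term would be summed over the dataset D and the L2 term taken over all model parameters θ, as in the loss above.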
An entity-level sentiment classification method based on a dynamic memory network, based on the above system, comprising the following steps:
Step 1: model the entity-level sentiment classification problem as a question-answering model using a dynamic memory network; the dynamic memory network mainly includes an input module, a question module, a memory module, and an answer module;
Step 2: the input module of the dynamic memory network encodes the input text to obtain a text vector representation;
Step 3: the question module of the dynamic memory network designs, for the specific entity in the sentence, its corresponding sentiment question;
Step 4: the memory module of the dynamic memory network processes the text vector representation encoded by the input module, extracting text features through the two submodules of multi-hop attention and memory update;
Step 5: the answer module of the dynamic memory network performs sentiment probability prediction on the text features extracted by the memory module, and the model is trained by minimising the corresponding loss function;
Step 6: once the model is trained, it solves the entity-level sentiment classification problem, covering positive, neutral, and negative sentiment polarities.
The advantages and beneficial effects of the present invention are as follows:
The present invention first converts the target-entity sentiment analysis task into a question-answering system, designs a sentiment question for the target entity, and represents the designed question with a bidirectional GRU, which is superior to the entity representation methods of existing inventions. The constructed dynamic memory network realises more accurate extraction of features relevant to the target entity's sentiment through its two submodules, multi-hop attention and memory update. Compared with existing inventions, the present invention can complete the target-entity sentiment analysis task more accurately and is better suited to complex social and business scenarios.
Description of the drawings
Fig. 1 is the system flow chart of a preferred embodiment of the present invention;
Fig. 2 is the system model diagram;
Fig. 3 is the structure diagram of the input module;
Fig. 4 is a schematic diagram of the relative-distance calculation;
Fig. 5 is the structure diagram of the memory module;
Fig. 6 is the structure diagram of the attention-based GRU network;
Fig. 7 is the structure diagram of the inner-attention GRU network.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described clearly and in detail below with reference to the accompanying drawings. The described embodiments are only some of the embodiments of the present invention.
The technical solution of the present invention for solving the above technical problems is:
As shown in Fig. 1, the entity-level sentiment classification method based on a dynamic memory network comprises:
Step 1: first, model the entity-level sentiment classification problem as a question-answering model using a dynamic memory network; as shown in Fig. 2, the dynamic memory network mainly includes an input module, a question module, a memory module, and an answer module;
Step 2: the input module of the dynamic memory network encodes the input text to obtain the sentence representation, as shown in Fig. 3. The detailed process is as follows. Given an input text sequence {w_1, w_2, ..., w_n} and a corresponding target entity {w_t1, ..., w_tm}, the input text sequence is first mapped to a word-vector sequence {e_1, e_2, ..., e_n} using pre-trained word vectors, and the word vectors are stacked in order into a word-vector matrix, where d denotes the word-vector dimension. It is worth noting that in many tasks the positional information of words is also very important. For example, in the sentence "Although the tables may be closely situated, the candle-light, food-quality and service overcompensate.", for the entity word "tables" the opinion word "situated" matters most for judging the entity's sentiment, whereas for the entity word "candle-light" it is "overcompensate" that dominates the sentiment orientation. From this it can be seen that words closer to the entity word are usually more important for judging the entity's sentiment polarity. To incorporate positional information into the word representation, the input module first computes the relative distance between each word in the context and the entity word, defined as p_i. Borrowing the idea of word-vector training, the relative position is mapped to a position vector l_i, treated as a parameter the network learns automatically. To merge the position vector with the word vector, the invention simply uses element-wise addition: s_i = e_i + l_i, yielding the fused vector sequence {s_1, s_2, ..., s_n}. The invention then encodes the fused vectors with a single-layer bidirectional GRU to obtain the encoded representation h_i = [h_i^f; h_i^b]. Without sacrificing performance, the GRU structure has fewer parameters, which significantly reduces model complexity and speeds up training. The encoding is as follows:
h_i^f = GRU_f(s_i, h_{i-1}^f)
h_i^b = GRU_b(s_i, h_{i+1}^b)
The residual structure has shown great advantages in the field of computer vision; the present invention introduces it into the input module to enhance the text representation. The final encoded output of the input module is:
u_i = h_i + e_i
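The relative-distance computation p_i described in this step can be sketched as follows. This is an illustrative sketch only; the assumption that the entity occupies a contiguous token span whose internal distance is 0 is mine, not stated verbatim in the patent:

```python
# Illustrative sketch only: the relative distance p_i of each token to the
# entity. Assumption of this sketch (not stated verbatim in the patent): the
# entity occupies a contiguous token span, and the distance inside it is 0.

def relative_distances(n_tokens, ent_start, ent_end):
    """p_i for each token index i, given the entity span [ent_start, ent_end]."""
    dists = []
    for i in range(n_tokens):
        if i < ent_start:
            dists.append(ent_start - i)   # tokens before the entity
        elif i > ent_end:
            dists.append(i - ent_end)     # tokens after the entity
        else:
            dists.append(0)               # token is part of the entity
    return dists

# "the tables may be closely situated" with entity "tables" at index 1
print(relative_distances(6, 1, 1))   # [1, 0, 1, 2, 3, 4]
```

Each distance would then index a row of the learned position-embedding table to produce l_i.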
Step 3: the question module of the dynamic memory network designs, for the specific entity in the sentence, its corresponding sentiment question. The concrete process is as follows. When an entity consists of multiple words, other methods simply take the average of the word vectors of the target words as the entity representation; such methods discriminate sentiment poorly when handling long target words. To address this, the present invention obtains the entity-word feature representation by encoding a sentiment question designed for the target word, such as "What is the sentiment to the $T$?". The designed question is first mapped to a question word-vector sequence, which is then encoded with a single-layer bidirectional GRU to obtain the encoded representation of the target word; the hidden state at the final time step of the GRU encoding is defined as q_0. In addition, to introduce a feature-space difference between the question representation and the input representation, the invention adds a non-linear layer on top of the GRU-encoded features. The final output of the question module is:
q = tanh(W^(q) q_0 + b^(q))
where q_0 is the final GRU hidden state and W^(q) and b^(q) are representation parameters.
Compared with simple averaging, the representation method proposed by the invention has two main advantages: first, it can produce a deep representation of entities composed of multiple words; second, the GRU structure can model the relationships between the words of the target entity.
Step 4: the memory module of the dynamic memory network extracts text features through multi-hop attention and memory update. The concrete processing procedure of step 4 is as follows:
Although currently existing models can handle sentences containing obvious opinion words, their performance is poor on complex sentences. For example, in the sentence "My response to the film is best described as lukewarm", when analysing the sentiment orientation toward the entity "film", existing models all tend to assign a higher weight to the opinion word "best". Yet once the contextual information is incorporated, it is found that "lukewarm" is decisive for the entity. Similarly, for the comparative clause "I have had better Japanese food at a mall food court", existing methods often wrongly classify the sentiment polarity toward the entity "Japanese food" as positive because of the opinion word "better". Since the clause is a comparative sentence, existing models fail to truly understand the sentence meaning and rely only on surface opinion words.
In the entity-level sentiment classification task, many models incorporate entity information into the input word vectors or the hidden states. However, such methods suffer from attention bias, and they consider only a single attention pass, failing to handle the sentiment classification of the target entity in complex clauses. To address these defects of existing models, the present invention repeatedly extracts relevant information from the encoded representation of the input module in the memory module. Unlike single-pass feature extraction, each attention pass can only extract one specific kind of information from the sentence and cannot capture multi-level feature information. By adding a multi-pass attention mechanism, the memory module can truly infer the sentiment polarity of the entity rather than relying only on opinion words. The memory module consists of two parts: the attention mechanism and the memory update. Attention mechanisms have achieved great success in many fields; this work explores three different attention mechanisms: soft attention, the attention-based GRU network, and the inner-attention GRU network.
A. Attention mechanisms
a) Soft attention is the most common attention mechanism in existing models. In each attention step, the method computes an attention score for each encoded representation vector output by the input module and obtains the memory features extracted in this step by weighted summation. It is computed as follows:
z_i^k = W^(2) tanh(W^(1) [h_i; m_{k-1}; q] + b^(1)) + b^(2)
α_i^k = exp(z_i^k) / Σ_{j=1..L_i} exp(z_j^k)
c_k = Σ_{i=1..L_i} α_i^k h_i
where h_i is the encoded output of the input module, q the question representation, m_{k-1} the memory information of step k-1, [;] denotes vector concatenation, L_i the sentence length, c_k the memory feature information of the k-th step, and W^(2), W^(1), b^(1) and b^(2) are network parameters.
This method has two advantages: first, it is simple to compute with low complexity; second, when the classification activation function is sharp, soft attention can approximate hard attention. However, it still has the following shortcomings: first, it ignores the order and position information of the sentence; second, it performs poorly on entity sentiment classification in complex clauses.
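As a toy illustration, one soft-attention step can be sketched in plain Python. For brevity the scorer here is a plain dot product between h_i and the question vector q; this simplification is an assumption of the sketch, since the patent uses a small two-layer network over [h_i; m_{k-1}; q]:

```python
# Illustrative sketch only: one soft-attention step. The scorer here is a
# plain dot product with the question vector q (an assumption of this sketch;
# the patent uses a small two-layer network over [h_i; m_{k-1}; q]).
import math

def soft_attention(H, q):
    """Return (alphas, c_k): softmax weights and the weighted sum of H."""
    scores = [sum(h_j * q_j for h_j, q_j in zip(h, q)) for h in H]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    alphas = [e / z for e in exps]
    dim = len(H[0])
    c_k = [sum(alphas[i] * H[i][j] for i in range(len(H)))
           for j in range(dim)]
    return alphas, c_k

H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]  # toy encoder outputs h_i
q = [1.0, 0.0]                            # toy question representation
alphas, c_k = soft_attention(H, q)        # weights and extracted memory
```

Note how the weighted sum discards the order of the h_i entirely, which is exactly the shortcoming the attention-based GRU below is designed to fix.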
b) For more complex sentences, therefore, the present invention explores the attention-based GRU structure. In the standard GRU structure, the update gate decides how much information passes through, but it considers only the current input and the memory of the previous time step, lacking the entity information and the information extracted in the previous attention step. To let the gate route the information flow better, we replace the update gate with a new weight gate, computed as follows:
g_i^k = softmax(W^(a2) tanh(W^(a1) [h_i; m_{k-1}; q] + b^(a1)) + b^(a2))
where h_i is the encoded output of the input module, q the question representation, m_{k-1} the memory information of step k-1, ⊙ denotes element-wise multiplication, [;] vector concatenation, L_i the sentence length, and c_k the memory feature information of the k-th step; W^(a2), W^(a1), b^(a1) and b^(a2) are network parameters.
Having obtained the weight gate, the attention-based GRU network computes its hidden state analogously to the standard GRU, using the following formulas:
r_i^k = σ(W^(r) h_i + U^(r) h_{i-1}^k + b^(r))
h̃_i^k = tanh(W h_i + r_i^k ⊙ U h_{i-1}^k + b^(h))
h_i^k = g_i^k ⊙ h̃_i^k + (1 - g_i^k) ⊙ h_{i-1}^k
where r_i^k denotes the reset gate of the attention-based GRU in the k-th attention pass, h̃_i^k its candidate state, and h_i^k its hidden state; W and U (with the corresponding superscripts) are network structure parameters.
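The final blending step of the attention-based GRU can be illustrated in plain Python. Only the blend h_i = g ⊙ h̃_i + (1 - g) ⊙ h_{i-1} is shown; the gate g and the candidate state are given toy values, and their computation is omitted:

```python
# Illustrative sketch only: the final blending step of the attention-based
# GRU, h_i = g * h_tilde_i + (1 - g) * h_{i-1}, with a scalar gate g. The
# gate and candidate state are given toy values; their computation is omitted.

def attn_gru_step(h_prev, h_tilde, g):
    """Blend candidate and previous state with attention gate g in [0, 1]."""
    return [g * c + (1.0 - g) * p for c, p in zip(h_tilde, h_prev)]

h_prev = [0.2, 0.4]           # previous hidden state (toy)
h_tilde = [1.0, -1.0]         # candidate state from the reset-gated update
print(attn_gru_step(h_prev, h_tilde, 0.0))  # gate closed: [0.2, 0.4]
print(attn_gru_step(h_prev, h_tilde, 1.0))  # gate open: [1.0, -1.0]
```

When the gate is driven by the attention score rather than by a learned update gate, words the question finds irrelevant simply pass the previous state through unchanged.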
c) The attention-based GRU structure requires an extra computation step and more parameters for the weight gate. To lighten the network structure, this work applies the inner-attention mechanism to the model. The inner-attention mechanism fuses the external information, namely the target-word representation and the information extracted in the previous attention step, directly into the update gate and the reset gate of the standard GRU:
z̃_i^k = σ(W^(z) h_i + U^(z) h_{i-1}^k + V^(z) q + P^(z) m_{k-1})
r̃_i^k = σ(W^(r) h_i + U^(r) h_{i-1}^k + V^(r) q + P^(r) m_{k-1})
where z̃_i^k and r̃_i^k denote the update gate and the reset gate of the inner-attention GRU network.
B. Memory update
After each attention step of the memory module, the extracted memory information covers one aspect of the input text. The original dynamic memory network uses a single-layer unidirectional GRU to update the memory. To perform the memory update better, the present invention updates the memory with a ReLU structure, computed as follows:
m_0 = q
m_k = ReLU(W_k [m_{k-1}; c_k; q] + b_k)
where the memory m_0 is initialised with the question representation q, and W_k and b_k are memory-update parameters. To reduce the number of model parameters and speed up training, the different layers of the memory module share parameters.
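As an illustration with toy parameters (the 2x6 weight matrix and every numeric value are hypothetical), the ReLU memory update m_k = ReLU(W_k[m_{k-1}; c_k; q] + b_k) can be sketched in plain Python with the concatenation written out explicitly:

```python
# Illustrative sketch only: the memory update m_k = ReLU(W_k[m_{k-1}; c_k; q]
# + b_k), with the concatenation written out explicitly. The 2x6 weight
# matrix and every numeric value here are hypothetical toy parameters.

def relu(v):
    return [x if x > 0.0 else 0.0 for x in v]

def memory_update(W, b, m_prev, c_k, q):
    """One ReLU update over the concatenated [m_{k-1}; c_k; q] vector."""
    z = m_prev + c_k + q                 # list concatenation = vector concat
    out = [sum(w * x for w, x in zip(row, z)) + bb
           for row, bb in zip(W, b)]
    return relu(out)

m0 = [1.0, 0.0]                          # memory initialised with q
c1 = [0.5, -0.5]                         # features from the first attention
q = [1.0, 0.0]
W = [[1.0, 0.0, 1.0, 0.0, 0.0, 0.0],
     [0.0, 1.0, 0.0, -1.0, 0.0, 0.0]]
b = [0.0, 0.0]
m1 = memory_update(W, b, m0, c1, q)      # updated memory
```

With parameter sharing, the same W and b would be reused for every hop k, as the description notes.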
Step 5: the answer module of the dynamic memory network performs sentiment probability prediction on the text features extracted by the memory module, and the model is trained by minimising the corresponding loss function. The concrete process is as follows: after the multiple attention steps are completed, the output of the memory module is fed into a softmax layer for sentiment-polarity prediction; the output is a probability distribution over sentiment categories, computed as follows:
y_p = softmax(W^(o) m_k + b^(o))
where y_p denotes the predicted category distribution. The model is trained by minimising the following loss function:
L = -Σ_D Σ_{c∈C} y_c log(y_p^c) + λ‖θ‖^2
where D denotes the training dataset, C the set of sentiment categories, y_c the true category label, and λ the L2 regularisation coefficient.
Step 6: after model training is completed, the trained model can judge the sentiment toward an unclassified entity in a sentence; the discriminated categories are: positive, neutral, and negative.
The above embodiments should be understood as merely illustrating the present invention rather than limiting its scope. After reading the content recorded herein, a person skilled in the art can make various changes or modifications to the present invention, and such equivalent changes and modifications likewise fall within the scope of the claims of the present invention.

Claims (9)

1. An entity-level sentiment classification system based on a dynamic memory network, characterised by comprising a dynamic memory network that mainly includes an input module, a question module, a memory module, and an answer module; the dynamic memory network models the entity-level sentiment classification problem as a question-answering model; wherein the input module encodes the input text for the designated target entity to obtain a text vector representation;
the question module designs a question targeting the entity word and provides the feature information against which each memory update is attention-aligned;
the memory module performs memory retrieval and memory updating over the text input representation through two submodules, multi-hop attention and memory update, and passes the final updated features to the answer module;
the answer module performs sentiment-polarity prediction on the features extracted by the memory module and carries out model training; once the model is trained, it can perform entity-level sentiment classification into positive, neutral, and negative polarities.
2. The entity-level sentiment classification system based on a dynamic memory network according to claim 1, characterised in that word position information and a residual structure are additionally added to the input module to enhance the input representation.
3. The entity-level sentiment classification system based on a dynamic memory network according to claim 2, characterised in that the encoding performed by the input module on the input text to obtain the text vector representation specifically includes:
given an input text sequence {w_1, w_2, ..., w_n} and a corresponding target entity {w_t1, ..., w_tm}, where n is the number of words in the text and w_ti denotes the i-th word of the target entity, the input text sequence is first mapped to a word-vector sequence {e_1, e_2, ..., e_n} using pre-trained word vectors, and the word vectors are stacked in order into a word-vector matrix, where d denotes the word-vector dimension;
the fused vectors are encoded with a single-layer bidirectional GRU, yielding the encoded representation h_i = [h_i^f; h_i^b], computed as follows:
h_i^f = GRU_f(s_i, h_{i-1}^f)
h_i^b = GRU_b(s_i, h_{i+1}^b)
where GRU_f denotes the forward GRU network, GRU_b the backward GRU network, and h_i the hidden vector output by the bidirectional GRU.
4. The entity-level sentiment classification system based on a dynamic memory network according to claim 3, characterised in that word position information and a residual structure are additionally added to the input module to enhance the input representation, specifically including:
first, the relative distance between each word in the context and the entity word is computed and defined as p_i; borrowing the idea of word-vector training, the relative position is mapped to a position vector l_i, treated as a parameter the network learns automatically; to merge the position vector with the word vector, element-wise addition is used: s_i = e_i + l_i, yielding the fused vector sequence {s_1, s_2, ..., s_n};
the residual structure is introduced into the input module to enhance the text representation; the final encoded output of the input module is:
u_i = h_i + e_i
where e_i denotes the word vector.
5. The entity-level sentiment classification system based on a dynamic memory network according to claim 1, characterised in that the design of the question targeting the entity word in the question module specifically includes:
an entity-word feature representation is obtained by encoding a sentiment question designed for the target word; the designed question is first mapped to a question word-vector sequence, which is then encoded with a single-layer bidirectional GRU to obtain the encoded representation of the target word, and the hidden state at the final time step of the GRU encoding is defined as q_0; in addition, to introduce a feature-space difference between the question representation and the input representation, a non-linear layer is added on top of the GRU-encoded features, so the final output of the question module is:
q = tanh(W^(q) q_0 + b^(q))
where q_0 is the final GRU hidden state and W^(q) and b^(q) are representation parameters.
6. The entity-level sentiment classification system based on a dynamic memory network according to claim 3, characterised in that the multi-hop attention mechanism of the memory module includes: soft attention, an attention-based GRU network, and an inner-attention GRU network.
7. the entity level emotional semantic classification system according to claim 6 based on dynamic memory network, which is characterized in that institute It states after each attentionstep of memory module using ReLU structures come fresh information, calculates as follows:
m0=q
mk=ReLU(Wk[mk-1;ck;q]+b)
wherein the memory m0 is initialized with the question coded representation q; Wk and b are the memory-update parameters, k denotes the k-th attention step, b denotes the bias parameter, and ck denotes the memory feature information extracted by the k-th attention step.
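One memory-update step of claim 7 can be sketched as follows (a minimal NumPy version; how ck is produced by the soft attention is left out, and the toy weight values are assumptions chosen to make the ReLU clipping visible):

```python
import numpy as np

def memory_update(m_prev, c_k, q, W_k, b):
    # m_k = ReLU(W_k [m_{k-1}; c_k; q] + b): fuse the previous memory,
    # the attended context feature c_k, and the question coding q
    z = np.concatenate([m_prev, c_k, q])
    return np.maximum(W_k @ z + b, 0.0)

m0 = np.ones(2)                   # m_0 = q (question coding)
W = np.zeros((2, 6))              # toy weights: update driven by the bias alone
m1 = memory_update(m0, np.ones(2), m0, W, np.array([-1.0, 2.0]))
# ReLU clips the negative component of the pre-activation [-1, 2]
```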
8. The entity-level sentiment classification system based on a dynamic memory network according to claim 6, wherein after multiple attention steps are completed, the answer module feeds the output of the memory module into a softmax layer for sentiment polarity prediction; the output is the sentiment-category probability distribution, calculated as follows:
yp=softmax(W(o)mk+b(o))
wherein yp denotes the category probability distribution, W(o) denotes the output-layer parameter matrix, b(o) denotes the output-layer bias parameter, and mk denotes the memory feature after the k-th update. The model is trained by minimizing the following loss function:
J(θ) = -Σ(d∈D) Σ(c∈C) yc·log(yp,c) + λ‖θ‖2
wherein D denotes the training dataset, C denotes the set of sentiment category types, θ denotes the model parameters, yc denotes the true class label, and λ is the L2 regularization parameter.
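The softmax prediction of claim 8 and its cross-entropy loss with L2 regularization can be sketched as follows (a minimal NumPy version with toy zero-valued parameters, which yield a uniform distribution; the epsilon inside the log is a numerical-stability assumption):

```python
import numpy as np

def softmax(z):
    ez = np.exp(z - z.max())          # shift for numerical stability
    return ez / ez.sum()

def predict(m_k, W_o, b_o):
    # y_p = softmax(W(o) m_k + b(o)): sentiment-category probabilities
    return softmax(W_o @ m_k + b_o)

def loss(y_p, y_true, params, lam):
    # cross-entropy over categories plus an L2 penalty on all parameters
    ce = -np.sum(y_true * np.log(y_p + 1e-12))
    return ce + lam * sum(np.sum(p * p) for p in params)

y_p = predict(np.zeros(4), np.zeros((3, 4)), np.zeros(3))  # uniform over 3 classes
j = loss(y_p, np.array([1.0, 0.0, 0.0]), [np.zeros((3, 4))], 0.01)
```

With zero weights the penalty vanishes and the loss reduces to -log(1/3), the entropy of a blind three-way guess.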
9. An entity-level sentiment classification method based on a dynamic memory network, using the system according to claim 1, characterized by comprising the following steps:
Step 1: model the entity-level sentiment classification problem as a question-answering model using a dynamic memory network, the dynamic memory network mainly comprising an input module, a question module, a memory module and an answer module;
Step 2: the input module of the dynamic memory network encodes the input text to obtain a text vector representation;
Step 3: the question module of the dynamic memory network designs the corresponding sentiment question for the specific entity in the sentence;
Step 4: the memory module of the dynamic memory network processes the text vector representation encoded by the input module, extracting text features through two sub-modules, multi-hop attention and memory updating;
Step 5: the answer module of the dynamic memory network performs sentiment probability prediction on the text features extracted by the memory module, and model training proceeds by minimizing the corresponding loss function;
Step 6: once the model is trained, it performs the entity-level sentiment classification task, outputting positive, neutral or negative sentiment polarity.
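Steps 1 to 6 can be sketched end-to-end in NumPy as follows. This is an illustrative skeleton only: all weights are random stand-ins for trained parameters, the input and question modules are mocked with random vectors instead of the GRU encoders of claims 3 and 5, and the soft-attention scoring form is an assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
dim, n_words, n_classes, hops = 8, 5, 3, 2

C = rng.normal(size=(n_words, dim))   # Step 2 (mock): coded context vectors c_i
q = rng.normal(size=dim)              # Step 3 (mock): question coding for the entity

W_att = rng.normal(size=(dim, 3 * dim)) * 0.1  # attention scoring (assumed form)
W_m   = rng.normal(size=(dim, 3 * dim)) * 0.1  # memory-update weights W_k
b_m   = np.zeros(dim)
W_o   = rng.normal(size=(n_classes, dim)) * 0.1
b_o   = np.zeros(n_classes)

def softmax(z):
    ez = np.exp(z - z.max())
    return ez / ez.sum()

m = q                                 # Step 4: initialize memory m_0 = q
for _ in range(hops):                 # multi-hop attention
    scores = np.array([W_att @ np.concatenate([c, m, q]) for c in C]).sum(axis=1)
    alpha = softmax(scores)           # soft attention over context words
    c_k = alpha @ C                   # attended context feature
    m = np.maximum(W_m @ np.concatenate([m, c_k, q]) + b_m, 0.0)  # ReLU update

y_p = softmax(W_o @ m + b_o)          # Step 5: polarity probability distribution
label = ["positive", "neutral", "negative"][int(y_p.argmax())]  # Step 6
```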
CN201810022435.4A 2018-01-10 2018-01-10 Entity level emotion classification system and method based on dynamic memory network Active CN108133038B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810022435.4A CN108133038B (en) 2018-01-10 2018-01-10 Entity level emotion classification system and method based on dynamic memory network


Publications (2)

Publication Number Publication Date
CN108133038A true CN108133038A (en) 2018-06-08
CN108133038B CN108133038B (en) 2022-03-22

Family

ID=62399643

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810022435.4A Active CN108133038B (en) 2018-01-10 2018-01-10 Entity level emotion classification system and method based on dynamic memory network

Country Status (1)

Country Link
CN (1) CN108133038B (en)



Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060122834A1 (en) * 2004-12-03 2006-06-08 Bennett Ian M Emotion detection device & method for use in distributed systems
WO2017042620A1 (en) * 2015-09-08 2017-03-16 Iacus Stefano Maria Isa: a fast, scalable and accurate algorithm for supervised opinion analysis
US9633007B1 (en) * 2016-03-24 2017-04-25 Xerox Corporation Loose term-centric representation for term classification in aspect-based sentiment analysis
CN107092596A (en) * 2017-04-24 2017-08-25 重庆邮电大学 Text emotion analysis method based on attention CNNs and CCR
CN107273348A (en) * 2017-05-02 2017-10-20 深圳大学 The topic and emotion associated detecting method and device of a kind of text
CN107247702A (en) * 2017-05-05 2017-10-13 桂林电子科技大学 A kind of text emotion analysis and processing method and system
CN107153642A (en) * 2017-05-16 2017-09-12 华北电力大学 A kind of analysis method based on neural network recognization text comments Sentiment orientation

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
M. Yang et al.: "Attention based LSTM for target dependent sentiment classification", Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence *
Y. Tay et al.: "Dyadic memory networks for aspect-based sentiment analysis", Proceedings of the ACM Conference on Information and Knowledge Management *
Yichun Yin et al.: "Document-Level Multi-Aspect Sentiment Classification as Machine Comprehension", Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing *
Liu Quan et al.: "A Deep Hierarchical Network Model for Aspect-Based Sentiment Analysis", Chinese Journal of Computers *
Tian Zhu: "Research on Text Sentiment Polarity Classification Based on Deep Feature Extraction", China Masters' Theses Full-text Database, Information Science and Technology *

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108921285A (en) * 2018-06-22 2018-11-30 西安理工大学 Single-element classification method in sequence based on bidirectional valve controlled Recognition with Recurrent Neural Network
CN108921285B (en) * 2018-06-22 2021-05-25 西安理工大学 Bidirectional gate control cyclic neural network-based classification method for power quality disturbance
CN108920587A (en) * 2018-06-26 2018-11-30 清华大学 Merge the open field vision answering method and device of external knowledge
CN108830507A (en) * 2018-06-29 2018-11-16 成都数之联科技有限公司 A kind of food safety risk method for early warning
CN109299429A (en) * 2018-07-11 2019-02-01 重庆邮电大学 A kind of dynamic society's relationship modeling method based on Wiener-Hopf equation
CN109033463A (en) * 2018-08-28 2018-12-18 广东工业大学 A kind of community's question and answer content recommendation method based on end-to-end memory network
CN109033463B (en) * 2018-08-28 2021-11-26 广东工业大学 Community question-answer content recommendation method based on end-to-end memory network
CN109558477B (en) * 2018-10-23 2021-03-23 深圳先进技术研究院 Community question-answering system and method based on multitask learning and electronic equipment
CN109558477A (en) * 2018-10-23 2019-04-02 深圳先进技术研究院 A kind of community's question answering system, method and electronic equipment based on multi-task learning
CN109472031A (en) * 2018-11-09 2019-03-15 电子科技大学 A kind of aspect rank sentiment classification model and method based on double memory attentions
CN109472031B (en) * 2018-11-09 2021-05-04 电子科技大学 Aspect level emotion classification model and method based on double memory attention
CN109543039B (en) * 2018-11-23 2022-04-08 中山大学 Natural language emotion analysis method based on deep network
CN109543039A (en) * 2018-11-23 2019-03-29 中山大学 A kind of natural language sentiment analysis method based on depth network
CN111241842B (en) * 2018-11-27 2023-05-30 阿里巴巴集团控股有限公司 Text analysis method, device and system
CN111241842A (en) * 2018-11-27 2020-06-05 阿里巴巴集团控股有限公司 Text analysis method, device and system
CN111368536A (en) * 2018-12-07 2020-07-03 北京三星通信技术研究有限公司 Natural language processing method, apparatus and storage medium therefor
CN109726903A (en) * 2018-12-19 2019-05-07 中国电子科技集团公司信息科学研究院 Distributed multi agent Collaborative Decision Making Method based on attention mechanism
CN109960725A (en) * 2019-01-17 2019-07-02 平安科技(深圳)有限公司 Text classification processing method, device and computer equipment based on emotion
CN109885670A (en) * 2019-02-13 2019-06-14 北京航空航天大学 A kind of interaction attention coding sentiment analysis method towards topic text
CN109902174B (en) * 2019-02-18 2023-06-20 山东科技大学 Emotion polarity detection method based on aspect-dependent memory network
CN109902174A (en) * 2019-02-18 2019-06-18 山东科技大学 A kind of feeling polarities detection method of the memory network relied on based on aspect
CN109933795A (en) * 2019-03-19 2019-06-25 上海交通大学 Based on context-emotion term vector text emotion analysis system
CN109933795B (en) * 2019-03-19 2023-07-28 上海交通大学 Text emotion analysis system based on context-emotion word vector
CN110287296A (en) * 2019-05-21 2019-09-27 平安科技(深圳)有限公司 A kind of problem answers choosing method, device, computer equipment and storage medium
CN110276076A (en) * 2019-06-25 2019-09-24 北京奇艺世纪科技有限公司 A kind of text mood analysis method, device and equipment
CN110472042A (en) * 2019-07-02 2019-11-19 桂林电子科技大学 A kind of fine granularity sensibility classification method
CN110298403B (en) * 2019-07-02 2023-12-12 北京金融大数据有限公司 Emotion analysis method and system for enterprise main body in financial news
CN110472042B (en) * 2019-07-02 2021-11-26 桂林电子科技大学 Fine-grained emotion classification method
CN110298403A (en) * 2019-07-02 2019-10-01 郭刚 The sentiment analysis method and system of enterprise dominant in a kind of financial and economic news
CN110457450A (en) * 2019-07-05 2019-11-15 平安科技(深圳)有限公司 Answer generation method and relevant device based on neural network model
CN110457450B (en) * 2019-07-05 2023-12-22 平安科技(深圳)有限公司 Answer generation method based on neural network model and related equipment
CN110377744A (en) * 2019-07-26 2019-10-25 北京香侬慧语科技有限责任公司 A kind of method, apparatus, storage medium and the electronic equipment of public sentiment classification
CN111222009A (en) * 2019-10-25 2020-06-02 汕头大学 Processing method of multi-modal personalized emotion based on long-time memory mechanism
CN111222009B (en) * 2019-10-25 2022-03-22 汕头大学 Processing method of multi-modal personalized emotion based on long-time memory mechanism
CN111079409B (en) * 2019-12-16 2023-04-25 东北大学秦皇岛分校 Emotion classification method utilizing context and aspect memory information
CN113434669A (en) * 2021-05-31 2021-09-24 华东师范大学 Natural language relation extraction method based on sequence marking strategy
CN113326378B (en) * 2021-06-16 2022-09-06 山西财经大学 Cross-domain text emotion classification method based on parameter migration and attention sharing mechanism
CN113326378A (en) * 2021-06-16 2021-08-31 山西财经大学 Cross-domain text emotion classification method based on parameter migration and attention sharing mechanism
CN113849651A (en) * 2021-09-28 2021-12-28 平安科技(深圳)有限公司 Document-level emotional tendency-based emotion classification method, device, equipment and medium
CN113849651B (en) * 2021-09-28 2024-04-09 平安科技(深圳)有限公司 Emotion classification method, device, equipment and medium based on document-level emotion tendencies
CN114494980B (en) * 2022-04-06 2022-07-15 中国科学技术大学 Diversified video comment generation method, system, equipment and storage medium
CN114494980A (en) * 2022-04-06 2022-05-13 中国科学技术大学 Diversified video comment generation method, system, equipment and storage medium
CN115331128A (en) * 2022-10-11 2022-11-11 松立控股集团股份有限公司 Viaduct crack detection method
CN115331128B (en) * 2022-10-11 2023-01-31 松立控股集团股份有限公司 Viaduct crack detection method

Also Published As

Publication number Publication date
CN108133038B (en) 2022-03-22

Similar Documents

Publication Publication Date Title
CN108133038A (en) A kind of entity level emotional semantic classification system and method based on dynamic memory network
CN106650789B (en) Image description generation method based on depth LSTM network
Chen et al. Knowedu: A system to construct knowledge graph for education
CN108647233B (en) Answer sorting method for question-answering system
CN104598611B (en) The method and system being ranked up to search entry
CN108009285B (en) Forest Ecology man-machine interaction method based on natural language processing
CN110390397B (en) Text inclusion recognition method and device
CN109597997A (en) Based on comment entity, aspect grade sensibility classification method and device and its model training
CN110427490A (en) A kind of emotion dialogue generation method and device based on from attention mechanism
CN110633730A (en) Deep learning machine reading understanding training method based on course learning
CN111061856A (en) Knowledge perception-based news recommendation method
CN108334891A (en) A kind of Task intent classifier method and device
CN106980650A (en) A kind of emotion enhancing word insertion learning method towards Twitter opinion classifications
CN111598118B (en) Visual question-answering task implementation method and system
CN110222770A (en) A kind of vision answering method based on syntagmatic attention network
CN113569001A (en) Text processing method and device, computer equipment and computer readable storage medium
CN108664512A (en) Text object sorting technique and device
CN110083702A (en) A kind of aspect rank text emotion conversion method based on multi-task learning
Zhang et al. A BERT fine-tuning model for targeted sentiment analysis of Chinese online course reviews
CN111914553B (en) Financial information negative main body judging method based on machine learning
CN112417118B (en) Dialog generation method based on marked text and neural network
CN110472063A (en) Social media data processing method, model training method and relevant apparatus
Reddy et al. Neural networks for prediction of loan default using attribute relevance analysis
CN104679492A (en) Computer-implemented technical support providing device and method
JP2018151892A (en) Model learning apparatus, information determination apparatus, and program therefor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant