CN109992780A - Specific-target sentiment classification method based on a deep neural network - Google Patents

Specific-target sentiment classification method based on a deep neural network

Info

Publication number
CN109992780A
CN109992780A
Authority
CN
China
Prior art keywords
specific target
word vector
neural network
training
specific aspect
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910249992.4A
Other languages
Chinese (zh)
Other versions
CN109992780B (en)
Inventor
谢金宝
王振东
马骏杰
战岭
吕世伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin University of Science and Technology
Original Assignee
Harbin University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin University of Science and Technology filed Critical Harbin University of Science and Technology
Priority to CN201910249992.4A priority Critical patent/CN109992780B/en
Publication of CN109992780A publication Critical patent/CN109992780A/en
Application granted granted Critical
Publication of CN109992780B publication Critical patent/CN109992780B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/20Natural language analysis
    • G06F40/279Recognition of textual entities
    • G06F40/289Phrasal analysis, e.g. finite state techniques or chunking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/30Semantic analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Machine Translation (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The present invention provides a specific-target sentiment classification method based on a deep neural network, belonging to the text sentiment classification field of natural language processing. The dataset is first preprocessed by Chinese word segmentation, stop-word removal, and punctuation removal; the word2vec algorithm is then used to train word vectors on the processed corpus. Next, the training set is input into a long short-term memory (LSTM) network model with a target attention mechanism. During attention-weight training, specific-target and specific-aspect embeddings are introduced, and the specific target is represented as a weighted sum of the specific-aspect embeddings, so that the model pays more accurate attention to the specific target and its aspects, better captures the true semantics of the target, and ultimately improves the accuracy of specific-target sentiment classification.

Description

Specific-target sentiment classification method based on a deep neural network
Technical field
The present invention relates to sentiment classification of comment text, and in particular to a specific-target sentiment classification method based on a deep neural network, belonging to the technical field of natural language processing.
Background technique
Sentiment analysis method mainly has rule-based method, the method based on machine learning and based on deep neural network Method.Rule-based method usually requires building sentiment dictionary or emotion collocation template, then by comparing in comment text The emotion word that is included or regular collocation calculate the Sentiment orientation of text, but construct more complete sentiment dictionary or correlation Collocation rule be existing main problem now.Method based on machine learning mainly carries out the training corpus with label Feature extraction and modeling, to automatically realize the judgement of feeling polarities with machine learning algorithm;Such methods mainly have branch Vector machine, naive Bayesian, maximum informational entropy, condition random field etc. are held, still, the effect of machine learning classification is often depending on The selection of feature, artificial selection feature is there is very big uncertainty, and such methods are used when modeling to corpus Function is generally fairly simple, it is difficult to capture profound feature, modeling ability and generalization ability have significant limitation.With The development of deep learning and the liberalization of Expression of language and diversification, the advantage of deep neural network technology are gradually convex It is aobvious, become the mainstream technology of natural language processing field, compared to rule-based sentiment analysis method and is based on engineering The sentiment analysis method of habit, the method for deep neural network is due to the complexity of its model and function, and facing, current complexity is more When the language phenomenon of change, can capture more comprehensively, the text feature of deeper, i.e., to text have better understand ability, Sentiment analysis field also can achieve better effect.
The LSTM neural network model, also called the long short-term memory network, is a variant of the RNN model. LSTM solves the vanishing-information and exploding-information problems that occur when an RNN transmits information over long distances. On top of the RNN, the LSTM adds several gates to each network node to control the flow of information at different time steps. To control this flow, a memory cell is specially designed inside each LSTM node, and the deletion or addition of information is controlled by gates; a gate is a mechanism for selectively passing information. Each LSTM node has three gates that protect and control the node state: the input gate, the forget gate, and the output gate. The attention mechanism is inspired by the way the human brain allocates more attention to the key components of what it perceives. Attention was first used in the field of visual images and later applied to natural language processing tasks with good results: by computing an attention probability distribution, it highlights the key inputs and thereby improves traditional models.
Specific-target sentiment analysis, an important subtask of sentiment analysis, is sentiment analysis at a deeper level. Unlike ordinary sentiment analysis, judging the sentiment polarity toward a specific target relies not only on the contextual information of the text but also on the characteristic information of the target itself. For example, in the sentence "The food in this restaurant is delicious, but the price is expensive, and the service is very attentive", the "taste" aspect of the target "food" carries positive sentiment, the "price" aspect of the target "food" carries negative sentiment, and the "service" aspect of the target "restaurant" carries positive sentiment. Thus one and the same sentence may express completely opposite sentiment polarities toward the same target, and different targets may carry different polarities. However, most neural-network-based text sentiment classification models cannot correctly attend to the sentiment of the specific aspects of a specific target, and their classification performance is poor. Better capturing the true semantics of the target, so that the semantic information in the text is richer, and thereby improving the accuracy of specific-target sentiment classification, is the main research direction of the present invention.
Summary of the invention
In view of the deficiencies of the prior art, the purpose of the present invention is to provide a specific-target sentiment classification method based on a deep neural network with a target attention mechanism, used to analyze the sentiment that text data in social networks expresses toward specific targets and specific aspects.
The present invention can be realized by the following scheme:
A specific-target sentiment classification method based on a deep neural network, characterized in that:
Step 1: collect a Chinese sentiment classification dataset and perform text preprocessing, and divide the sentiment classification dataset into a training set and a test set;
Step 2: train a word-vector model on the preprocessed dataset with the word2vec tool and map the text in the dataset to a set of word vectors;
Step 3: input the word-vector set of the training set into an LSTM, use the three gates of the LSTM, which have trainable parameters, to discard or transmit information, and output a series of hidden vectors h = {h1, h2, …, hn};
Step 4: feed the word-vector matrix of the training set, the word-vector matrix of the specific target, and the word-vector matrix of the specific aspects into the target attention mechanism, obtain an attention weight p_i for each h_i, and then obtain the sentence representation Z_S;
Step 5: according to the generated sentence representation Z_S, judge the sentiment polarity of the specific target with a fully connected layer and the softmax function.
Further, the text preprocessing is specifically: performing Chinese word segmentation on the polarity-labelled sentences, removing stop words, and removing punctuation; 80% of the dataset is randomly chosen as the training set and 20% as the test set.
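The preprocessing and dataset split described above can be sketched in Python. The stop-word list, the character-level fallback segmenter, and the toy corpus below are illustrative assumptions only; a real pipeline would use a proper Chinese word segmenter (e.g. jieba) and a published stop-word list.

```python
import random
import string

# Hypothetical stop-word list and punctuation set (illustrative, not the
# patent's actual resources).
STOP_WORDS = {"的", "了", "是"}
PUNCT = set("，。！？、；：" + string.punctuation)

def preprocess(sentence, segment=lambda s: list(s)):
    """Segment a labelled sentence, then drop stop words and punctuation.
    `segment` defaults to character splitting as a stand-in for a real
    Chinese word segmenter."""
    tokens = segment(sentence)
    return [t for t in tokens if t not in STOP_WORDS and t not in PUNCT]

def split_dataset(samples, train_ratio=0.8, seed=42):
    """Randomly split the dataset: 80% training set, 20% test set."""
    shuffled = samples[:]
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * train_ratio)
    return shuffled[:cut], shuffled[cut:]

corpus = [("这家餐厅的食物很好吃。", "positive")] * 10
train, test = split_dataset(corpus)
```

With ten labelled sentences the split yields eight training samples and two test samples, matching the 80/20 ratio described above.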
Further, training the word-vector model on the preprocessed dataset with the word2vec tool includes:
after word2vec training is complete, the word2vec model maps each word ω to a continuous feature vector e_ω ∈ R^d, where d is the dimension of the word vectors, finally producing a word-vector matrix E ∈ R^{V×d}, where V is the size of the vocabulary in the dataset.
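The word-to-vector mapping that word2vec produces can be illustrated as a lookup into the matrix E ∈ R^{V×d}. The tiny vocabulary and the random embedding matrix below are placeholders for illustration; in practice E would hold the vectors actually learned by word2vec (e.g. via gensim).

```python
import numpy as np

d = 8                                  # word-vector dimension (illustrative)
vocab = ["食物", "好吃", "价格", "贵"]   # toy vocabulary of size V
V = len(vocab)
word2id = {w: i for i, w in enumerate(vocab)}

rng = np.random.default_rng(0)
E = rng.standard_normal((V, d))        # stand-in for the trained matrix E in R^{V x d}

def embed(tokens):
    """Map a token sequence to its word-vector sequence (unknown words skipped)."""
    ids = [word2id[t] for t in tokens if t in word2id]
    return E[ids]

X = embed(["食物", "好吃"])             # word-vector set of a two-word text
```

Each row of E is one e_ω; mapping a sentence is just row selection, which is why the word-vector set of a sentence of n known words has shape (n, d).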
Further, inputting the word-vector set of the training set into the LSTM, using the three trainable gates of the LSTM to discard or transmit information, and outputting a series of hidden vectors h = {h1, h2, …, hn} specifically includes the following:
The three gates of the LSTM are the input gate, the forget gate, and the output gate. Let x_t be the input of the LSTM neural network node at time t, h_t the output at time t, W_x the weights on the input, and W_h the weights on the output; the process by which the LSTM neural network model controls information updates through its gates is divided into the following steps:
Compute the value i_t of the input gate at time t; the input gate controls the influence of the current input on the memory-cell state:
i_t = sigmoid(W_xi x_t + W_hi h_{t-1} + W_ci c_{t-1} + b_i)   (1)
Compute the value f_t of the forget gate at time t; the forget gate controls the influence of historical information on the memory-cell state:
f_t = sigmoid(W_xf x_t + W_hf h_{t-1} + W_cf c_{t-1} + b_f)   (2)
Compute the value c̃_t of the candidate memory cell at the current time, and update the value of the memory cell at the current time:
c̃_t = tanh(W_xc x_t + W_hc h_{t-1} + b_c)   (3)
c_t = f_t · c_{t-1} + i_t · c̃_t   (4)
Finally compute the output information h_t at time t; this output is determined by the output gate:
o_t = sigmoid(W_xo x_t + W_ho h_{t-1} + W_co c_{t-1} + b_o)   (5)
h_t = o_t · tanh(c_t)   (6)
Feeding the word-vector matrix of the training set, the word-vector matrix of the specific target, and the word-vector matrix of the specific aspects into the target attention mechanism specifically includes the following:
First the word-vector matrix of the training set and the word-vector matrix of the specific target are averaged together:
c_S = Average([E_s ; E_t])   (7)
where Average returns the mean of its input vectors, E_t is the word-vector matrix of the specific target, and E_s is the word-vector matrix of the training set; the role of c_S is to capture target information and contextual information simultaneously.
Compute the weight vector q_t over all K specific-aspect embeddings:
q_t = softmax(W_t · c_S + b_t)   (8)
where q_t is the weight vector over the K specific-aspect embeddings, each component of q_t indicating the likelihood that the specific target belongs to the corresponding aspect, and W_t and b_t are a weight matrix and a bias vector respectively.
Compute the vector t_s of the specific target:
t_s = T^T · q_t   (9)
where t_s is the vector of the specific target and T ∈ R^{K×d} is the word-vector matrix of the specific aspects, one aspect embedding per row, K being the number of specific aspects.
Compute the attention weight p_i:
p_i = exp(h_i^T W_a t_s) / Σ_j exp(h_j^T W_a t_s)   (10)
where W_a ∈ R^{d×d} is a trainable weight matrix.
Compute the sentence representation Z_S:
Z_S = Σ_i p_i · h_i   (11)
Each hidden vector h_i corresponds to an attention weight p_i computed by the target attention model; p_i can be interpreted as the probability that word ω_i is the word the model should attend to when judging the sentiment polarity of specific target a. Z_S represents the sentence used for sentiment classification.
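Formulas (7) through (11) of the target attention mechanism can be sketched in NumPy. Note the concatenate-then-average form of (7) and the transposed use of T in (9) are reconstructions from the surrounding definitions (the original inline formulas were lost in extraction), and all sizes and random values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, K = 5, 3, 4                   # sentence length, vector dim, number of aspects
H  = rng.standard_normal((n, d))    # hidden vectors h_i from the LSTM
Es = rng.standard_normal((n, d))    # word-vector matrix of the sentence (training set)
Et = rng.standard_normal((2, d))    # word-vector matrix of a two-word specific target
T  = rng.standard_normal((K, d))    # aspect embedding matrix, one aspect per row
Wt, bt = rng.standard_normal((K, d)) * 0.1, np.zeros(K)
Wa = rng.standard_normal((d, d)) * 0.1

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

c_S = np.concatenate([Es, Et]).mean(axis=0)  # (7) mean of context + target vectors
q_t = softmax(Wt @ c_S + bt)                 # (8) weights over the K aspects
t_s = T.T @ q_t                              # (9) target vector = weighted aspect sum
p   = softmax(H @ Wa @ t_s)                  # (10) attention weight p_i per h_i
Z_S = p @ H                                  # (11) sentence representation
```

Both q_t and p are softmax outputs, so each sums to one, and Z_S is a d-dimensional convex combination of the hidden vectors, as formula (11) requires.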
In conclusion, the present invention provides a specific-target sentiment classification method based on a deep neural network, belonging to the text sentiment classification field of natural language processing. The dataset is first preprocessed by Chinese word segmentation, stop-word removal, and punctuation removal; the word2vec algorithm is then used to train word vectors on the processed corpus. Next, the training set is input into a long short-term memory network model with a target attention mechanism. During attention-weight training, specific-target and specific-aspect embeddings are introduced, and the specific target is represented as a weighted sum of the specific-aspect embeddings, so that the model pays more accurate attention to the specific target and its aspects, better captures the true semantics of the target, and ultimately improves the accuracy of specific-target sentiment classification.
Compared with the prior art, the invention has the following beneficial effects:
The present invention introduces a target attention mechanism on top of the long short-term memory network and represents the specific target as a weighted sum of specific-aspect embeddings, so that the model better captures the true semantics of the specific target and pays more accurate attention to the specific target and its aspects while ignoring or weakening the influence of secondary information in the text, ultimately improving the accuracy of specific-target sentiment classification.
Detailed description of the invention
To explain the technical solution of the present invention more clearly, the drawings needed in the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments recorded in the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is the flow chart of sentiment classification model training in the present invention;
Fig. 2 is the internal structure diagram of the LSTM in the present invention;
Fig. 3 is the overall framework of the sentiment classification model of the present invention.
Specific embodiment
The present invention provides an embodiment of a specific-target sentiment classification method based on a deep neural network. To help those skilled in the art better understand the technical solution in the embodiment of the present invention, and to make the above purposes, features and advantages of the invention clearer and easier to understand, the technical solution is described in further detail below with reference to the drawings.
As shown in Fig. 1, the method includes:
S101, step 1: collect a Chinese sentiment classification dataset and preprocess the text, and divide the dataset into a training set and a test set;
S102, step 2: train a word-vector model on the preprocessed dataset with the word2vec tool and map the text in the dataset to a set of word vectors;
S103, step 3: input the word-vector set of the training set into an LSTM, use the three gates of the LSTM, which have trainable parameters, to discard or transmit information, and output a series of hidden vectors h = {h1, h2, …, hn};
S104, step 4: feed the word-vector matrix of the training set, the word-vector matrix of the specific target, and the word-vector matrix of the specific aspects into the target attention mechanism, obtain an attention weight p_i for each h_i, and then obtain the sentence representation Z_S;
S105, step 5: according to the generated sentence representation Z_S, judge the sentiment polarity of the specific target with a fully connected layer and the softmax function.
In step 1, preprocessing mainly consists of performing Chinese word segmentation on the polarity-labelled sentences, removing stop words, and removing punctuation; 80% of the dataset is randomly chosen as the training set and 20% as the test set.
In step 2, after word2vec training is complete, the word2vec model maps each word ω to a continuous feature vector e_ω ∈ R^d, where d is the dimension of the word vectors, finally producing a word-vector matrix E ∈ R^{V×d}, where V is the size of the vocabulary in the dataset.
The three gates of the LSTM in step 3, shown in Fig. 2, are the input gate, the forget gate, and the output gate. Let x_t be the input of the LSTM neural network node at time t, h_t the output at time t, W_x the weights on the input, and W_h the weights on the output. In the process by which the LSTM neural network model controls information updates through its gates, the value i_t of the input gate at time t is computed first; the input gate controls the influence of the current input on the memory-cell state:
i_t = sigmoid(W_xi x_t + W_hi h_{t-1} + W_ci c_{t-1} + b_i)   (1)
Then the value f_t of the forget gate at time t is computed; the forget gate controls the influence of historical information on the memory-cell state:
f_t = sigmoid(W_xf x_t + W_hf h_{t-1} + W_cf c_{t-1} + b_f)   (2)
Next the value c̃_t of the candidate memory cell is computed, and the value of the memory cell at the current time is updated:
c̃_t = tanh(W_xc x_t + W_hc h_{t-1} + b_c)   (3)
c_t = f_t · c_{t-1} + i_t · c̃_t   (4)
Finally the output information h_t at time t is computed; this output is determined by the output gate:
o_t = sigmoid(W_xo x_t + W_ho h_{t-1} + W_co c_{t-1} + b_o)   (5)
h_t = o_t · tanh(c_t)   (6)
In step 4, the word-vector matrix of the training set and the word-vector matrix of the specific target are first averaged together:
c_S = Average([E_s ; E_t])   (7)
where Average returns the mean of its input vectors, E_t is the word-vector matrix of the specific target, and E_s is the word-vector matrix of the training set; the role of c_S is to capture target information and contextual information simultaneously.
Next, c_S is fed into the softmax function to compute the weight vector q_t over all K specific-aspect embeddings:
q_t = softmax(W_t · c_S + b_t)   (8)
where q_t is the weight vector over the K specific-aspect embeddings, each component of q_t indicating the likelihood that the specific target belongs to the corresponding aspect, and W_t and b_t are a weight matrix and a bias vector respectively.
q_t is then multiplied with the word-vector matrix of the specific aspects to compute the vector t_s of the specific target:
t_s = T^T · q_t   (9)
where t_s is the vector of the specific target and T ∈ R^{K×d} is the word-vector matrix of the specific aspects, K being the number of specific aspects, which is much smaller than V.
The attention weight p_i is computed next:
p_i = exp(h_i^T W_a t_s) / Σ_j exp(h_j^T W_a t_s)   (10)
where W_a ∈ R^{d×d} is a trainable weight matrix.
Then the sentence representation Z_S is computed:
Z_S = Σ_i p_i · h_i   (11)
Each hidden vector h_i corresponds to an attention weight p_i computed by the target attention model; p_i can be interpreted as the probability that word ω_i is the word the model should attend to when judging the sentiment polarity of specific target a. Z_S represents the sentence used for sentiment classification.
Finally, according to the generated sentence representation Z_S, the sentiment polarity of the specific target is judged with a fully connected layer and the softmax function; the full computation flow is shown in Fig. 3.
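The final classification step can be sketched as a fully connected layer followed by softmax over Z_S. The three-class label set (positive / negative / neutral) and the random weights below are illustrative assumptions; the patent does not fix the number of polarity classes.

```python
import numpy as np

rng = np.random.default_rng(3)
d, n_classes = 3, 3                   # Z_S dimension; e.g. positive / negative / neutral

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

W_fc = rng.standard_normal((n_classes, d)) * 0.1  # fully connected layer weights
b_fc = np.zeros(n_classes)

Z_S = rng.standard_normal(d)          # sentence representation from the attention layer
probs = softmax(W_fc @ Z_S + b_fc)    # class probability distribution
polarity = int(np.argmax(probs))      # predicted sentiment polarity index
```

The softmax output is a proper probability distribution over the polarity classes, and the arg-max index is taken as the judged polarity of the specific target.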
In conclusion, the present invention provides a specific-target sentiment classification method based on a deep neural network, belonging to the text sentiment classification field of natural language processing. The dataset is first preprocessed by Chinese word segmentation, stop-word removal, and punctuation removal; the word2vec algorithm is then used to train word vectors on the processed corpus. Next, the training set is input into a long short-term memory network model with a target attention mechanism. During attention-weight training, specific-target and specific-aspect embeddings are introduced, and the specific target is represented as a weighted sum of the specific-aspect embeddings, so that the model pays more accurate attention to the specific target and its aspects, better captures the true semantics of the target, and ultimately improves the accuracy of specific-target sentiment classification.
The above embodiments illustrate rather than limit the technical solution of the present invention. Any modification or partial replacement that does not depart from the spirit and scope of the present invention is intended to fall within the scope of the claims of the present invention.

Claims (6)

1. A specific-target sentiment classification method based on a deep neural network, characterized by comprising the following steps:
Step 1: collect a Chinese sentiment classification dataset and perform text preprocessing, and divide the sentiment classification dataset into a training set and a test set;
Step 2: train a word-vector model on the preprocessed dataset with the word2vec tool and map the text in the dataset to a set of word vectors;
Step 3: input the word-vector set of the training set into an LSTM, use the three gates of the LSTM, which have trainable parameters, to discard or transmit information, and output a series of hidden vectors h = {h1, h2, …, hn};
Step 4: feed the word-vector matrix of the training set, the word-vector matrix of the specific target, and the word-vector matrix of the specific aspects into the target attention mechanism, obtain an attention weight p_i for each h_i, and then obtain the sentence representation Z_S;
Step 5: according to the generated sentence representation Z_S, judge the sentiment polarity of the specific target with a fully connected layer and the softmax function.
2. The specific-target sentiment classification method based on a deep neural network according to claim 1, characterized in that the text preprocessing is specifically: performing Chinese word segmentation on the polarity-labelled sentences, removing stop words, and removing punctuation; 80% of the dataset is randomly chosen as the training set and 20% as the test set.
3. The specific-target sentiment classification method based on a deep neural network according to claim 1, characterized in that training the word-vector model on the preprocessed dataset with the word2vec tool includes:
after word2vec training is complete, the word2vec model maps each word ω to a continuous feature vector e_ω ∈ R^d, where d is the dimension of the word vectors, finally producing a word-vector matrix E ∈ R^{V×d}, where V is the size of the vocabulary in the dataset.
4. The specific-target sentiment classification method based on a deep neural network according to claim 1, characterized in that inputting the word-vector set of the training set into the LSTM, using the three trainable gates of the LSTM to discard or transmit information, and outputting a series of hidden vectors h = {h1, h2, …, hn} specifically includes the following:
The three gates of the LSTM are the input gate, the forget gate, and the output gate. Let x_t be the input of the LSTM neural network node at time t, h_t the output at time t, W_x the weights on the input, and W_h the weights on the output; the process by which the LSTM neural network model controls information updates through its gates is divided into four steps:
(4.1) compute the value i_t of the input gate at time t; the input gate controls the influence of the current input on the memory-cell state:
i_t = sigmoid(W_xi x_t + W_hi h_{t-1} + W_ci c_{t-1} + b_i)   (1)
(4.2) compute the value f_t of the forget gate at time t; the forget gate controls the influence of historical information on the memory-cell state:
f_t = sigmoid(W_xf x_t + W_hf h_{t-1} + W_cf c_{t-1} + b_f)   (2)
(4.3) compute the value c̃_t of the candidate memory cell at the current time, and update the value of the memory cell at the current time:
c̃_t = tanh(W_xc x_t + W_hc h_{t-1} + b_c)   (3)
c_t = f_t · c_{t-1} + i_t · c̃_t   (4)
(4.4) finally compute the output information h_t at time t; this output is determined by the output gate:
o_t = sigmoid(W_xo x_t + W_ho h_{t-1} + W_co c_{t-1} + b_o)   (5)
h_t = o_t · tanh(c_t)   (6)
5. The specific-target sentiment classification method based on a deep neural network according to claim 1, characterized in that feeding the word-vector matrix of the training set, the word-vector matrix of the specific target, and the word-vector matrix of the specific aspects into the target attention mechanism specifically includes the following:
(5.1) average the word-vector matrix of the training set and the word-vector matrix of the specific target:
c_S = Average([E_s ; E_t])   (7)
where Average returns the mean of its input vectors, E_t is the word-vector matrix of the specific target, and E_s is the word-vector matrix of the training set; the role of c_S is to capture target information and contextual information simultaneously;
(5.2) compute the weight vector q_t over all K specific-aspect embeddings:
q_t = softmax(W_t · c_S + b_t)   (8)
where q_t is the weight vector over the K specific-aspect embeddings, each component of q_t indicating the likelihood that the specific target belongs to the corresponding aspect, and W_t and b_t are a weight matrix and a bias vector respectively;
(5.3) compute the vector t_s of the specific target:
t_s = T^T · q_t   (9)
where t_s is the vector of the specific target and T ∈ R^{K×d} is the word-vector matrix of the specific aspects, K being the number of specific aspects;
(5.4) compute the attention weight p_i:
p_i = exp(h_i^T W_a t_s) / Σ_j exp(h_j^T W_a t_s)   (10)
where W_a ∈ R^{d×d} is a trainable weight matrix;
(5.5) compute the sentence representation Z_S:
Z_S = Σ_i p_i · h_i   (11)
Each hidden vector h_i corresponds to an attention weight p_i computed by the target attention model; p_i can be interpreted as the probability that word ω_i is the word the model should attend to when judging the sentiment polarity of specific target a; Z_S represents the sentence used for sentiment classification.
6. The specific-target sentiment classification method based on a deep neural network according to claim 5, characterized in that: during attention-weight training, the specific-target and specific-aspect embeddings are introduced, and the specific target is represented as a weighted sum of the specific-aspect embeddings, ultimately improving the accuracy of specific-target sentiment classification.
CN201910249992.4A 2019-03-29 2019-03-29 Specific target emotion classification method based on deep neural network Expired - Fee Related CN109992780B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910249992.4A CN109992780B (en) 2019-03-29 2019-03-29 Specific target emotion classification method based on deep neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910249992.4A CN109992780B (en) 2019-03-29 2019-03-29 Specific target emotion classification method based on deep neural network

Publications (2)

Publication Number Publication Date
CN109992780A true CN109992780A (en) 2019-07-09
CN109992780B CN109992780B (en) 2022-07-01

Family

ID=67131875

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910249992.4A Expired - Fee Related CN109992780B (en) 2019-03-29 2019-03-29 Specific target emotion classification method based on deep neural network

Country Status (1)

Country Link
CN (1) CN109992780B (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110390017A (en) * 2019-07-25 2019-10-29 中国民航大学 Target sentiment analysis method and system based on attention gate convolutional network
CN110517121A (zh) * 2019-09-23 2019-11-29 重庆邮电大学 Commodity recommendation method and device based on sentiment analysis of comment text
CN110704622A (en) * 2019-09-27 2020-01-17 北京明略软件系统有限公司 Text emotion classification method and device and electronic equipment
CN110728298A (en) * 2019-09-05 2020-01-24 北京三快在线科技有限公司 Multi-task classification model training method, multi-task classification method and device
CN111191026A (en) * 2019-12-10 2020-05-22 央视国际网络无锡有限公司 Text classification method capable of calibrating specific segments
CN111291189A (en) * 2020-03-10 2020-06-16 北京芯盾时代科技有限公司 Text processing method and device and computer readable storage medium
CN111444728A (en) * 2020-04-20 2020-07-24 复旦大学 End-to-end aspect-based emotion analysis method
CN112115243A (en) * 2020-08-11 2020-12-22 南京理工大学 Session representation learning method by modeling time-series time correlation
CN112434161A (en) * 2020-11-24 2021-03-02 哈尔滨工程大学 Aspect-level emotion analysis method adopting bidirectional long-short term memory network
CN112464281A (en) * 2020-11-29 2021-03-09 哈尔滨工程大学 Network information analysis method based on privacy grouping and emotion recognition
CN112699237A (en) * 2020-12-24 2021-04-23 百度在线网络技术(北京)有限公司 Label determination method, device and storage medium
CN114357166A (en) * 2021-12-31 2022-04-15 北京工业大学 Text classification method based on deep learning
CN114417851A (en) * 2021-12-03 2022-04-29 重庆邮电大学 Emotion analysis method based on keyword weighted information

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107491490A (en) * 2017-07-19 2017-12-19 华东师范大学 Text sentiment classification method based on Emotion center
US20180165554A1 (en) * 2016-12-09 2018-06-14 The Research Foundation For The State University Of New York Semisupervised autoencoder for sentiment analysis
CN108363753A (en) * 2018-01-30 2018-08-03 南京邮电大学 Review text sentiment classification model training and sentiment classification method, device and equipment
CN108460009A (en) * 2017-12-14 2018-08-28 中山大学 Attention-mechanism recurrent neural network text sentiment analysis method with an embedded sentiment dictionary
CN108717439A (en) * 2018-05-16 2018-10-30 哈尔滨理工大学 Chinese text classification method based on fusion of an attention mechanism and feature strengthening


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
TAO CHEN et al.: "Improving sentiment analysis via sentence type classification using BiLSTM-CRF and CNN", 《EXPERT SYSTEMS WITH APPLICATIONS》 *
YEQUAN WANG et al.: "Attention-based LSTM for Aspect-level Sentiment Classification", 《PROCEEDINGS OF THE 2016 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING》 *
LIANG BIN et al.: "Targeted sentiment analysis based on multi-attention convolutional neural networks", 《Journal of Computer Research and Development》 *
WANG ZHENDONG: "Fine-grained Chinese sentiment analysis based on an improved attention mechanism", 《China Master's Theses Full-text Database (Information Science and Technology)》 *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110390017B (en) * 2019-07-25 2022-12-27 中国民航大学 Target emotion analysis method and system based on attention gating convolutional network
CN110390017A (en) * 2019-07-25 2019-10-29 中国民航大学 Target sentiment analysis method and system based on attention gate convolutional network
CN110728298A (en) * 2019-09-05 2020-01-24 北京三快在线科技有限公司 Multi-task classification model training method, multi-task classification method and device
CN110517121A (en) * 2019-09-23 2019-11-29 重庆邮电大学 Commodity recommendation method and device based on sentiment analysis of review text
CN110704622A (en) * 2019-09-27 2020-01-17 北京明略软件系统有限公司 Text emotion classification method and device and electronic equipment
CN111191026A (en) * 2019-12-10 2020-05-22 央视国际网络无锡有限公司 Text classification method capable of calibrating specific segments
CN111291189A (en) * 2020-03-10 2020-06-16 北京芯盾时代科技有限公司 Text processing method and device and computer readable storage medium
CN111444728A (en) * 2020-04-20 2020-07-24 复旦大学 End-to-end aspect-based emotion analysis method
CN112115243A (en) * 2020-08-11 2020-12-22 南京理工大学 Session representation learning method by modeling time-series time correlation
CN112115243B (en) * 2020-08-11 2023-06-16 南京理工大学 Session representation learning method by modeling time-series time correlation
CN112434161A (en) * 2020-11-24 2021-03-02 哈尔滨工程大学 Aspect-level emotion analysis method adopting bidirectional long-short term memory network
CN112434161B (en) * 2020-11-24 2023-01-03 哈尔滨工程大学 Aspect-level emotion analysis method adopting bidirectional long-short term memory network
CN112464281A (en) * 2020-11-29 2021-03-09 哈尔滨工程大学 Network information analysis method based on privacy grouping and emotion recognition
CN112699237A (en) * 2020-12-24 2021-04-23 百度在线网络技术(北京)有限公司 Label determination method, device and storage medium
CN114417851A (en) * 2021-12-03 2022-04-29 重庆邮电大学 Emotion analysis method based on keyword weighted information
CN114357166A (en) * 2021-12-31 2022-04-15 北京工业大学 Text classification method based on deep learning
CN114357166B (en) * 2021-12-31 2024-05-28 北京工业大学 Text classification method based on deep learning

Also Published As

Publication number Publication date
CN109992780B (en) 2022-07-01

Similar Documents

Publication Publication Date Title
CN109992780A (en) One kind being based on deep neural network specific objective sensibility classification method
Kuremoto et al. Time series forecasting using a deep belief network with restricted Boltzmann machines
CN108733792A (en) A kind of entity relation extraction method
CN109829541A (en) Deep neural network incremental training method and system based on learning automaton
CN107423442A (en) Method and system, storage medium and computer equipment are recommended in application based on user's portrait behavioural analysis
CN110534132A (en) A kind of speech-emotion recognition method of the parallel-convolution Recognition with Recurrent Neural Network based on chromatogram characteristic
CN109543722A (en) A kind of emotion trend forecasting method based on sentiment analysis model
CN108764540A (en) Water supply network pressure prediction method based on parallel LSTM series connection DNN
CN110084610A (en) A kind of network trading fraud detection system based on twin neural network
CN110413838A (en) A kind of unsupervised video frequency abstract model and its method for building up
CN110110372B (en) Automatic segmentation prediction method for user time sequence behavior
CN109902823A (en) A kind of model training method and equipment based on generation confrontation network
CN109472030A (en) A kind of system replys the evaluation method and device of quality
CN110162751A (en) Text generator training method and text generator training system
McDowell et al. Learning from omission
CN111914553A (en) Financial information negative subject judgment method based on machine learning
Wang et al. [Retracted] Sports Action Recognition Based on GB‐BP Neural Network and Big Data Analysis
CN109408896B (en) Multi-element intelligent real-time monitoring method for anaerobic sewage treatment gas production
CN110458215A Pedestrian attribute recognition method based on a multi-time-scale attention model
CN112000793B (en) Man-machine interaction oriented dialogue target planning method
CN116258504B (en) Bank customer relationship management system and method thereof
Iraji et al. Students classification with adaptive neuro fuzzy
Hong et al. A multi-angle hierarchical differential evolution approach for multimodal optimization problems
Zhu [Retracted] A Face Recognition System Using ACO‐BPNN Model for Optimizing the Teaching Management System
CN109033413A Neural-network-based method for matching requirement documents and service documents

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220701
