CN109165387A - A kind of Chinese comment sentiment analysis method based on GRU neural network - Google Patents
- Publication number: CN109165387A
- Application number: CN201811097770.7A
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G06F40/30 — Handling natural language data; Semantic analysis
- G06F40/289 — Phrasal analysis, e.g. finite state techniques or chunking
- G06N3/08 — Computing arrangements based on biological models; Neural networks; Learning methods
Abstract
The invention discloses a Chinese comment sentiment analysis method based on a GRU neural network, belonging to the fields of natural language processing and deep learning. The method comprises the following steps: (1) load the corpus data and segment it with the jieba word-segmentation tool; (2) remove useless stop words and divide the corpus into a training set and a test set in proportion; (3) train word vectors on the corpus with word2vec and average all word vectors of each sentence to generate the corresponding sentence vector, then perform back-propagation training with the word2vec network to compute the final word vectors; (4) feed the sentiment-bearing word vectors generated by word2vec into the GRU neural network model for training; (5) build the test set in the same way as the training set, feed it into the GRU neural network, and perform sentiment classification. The invention performs sentiment classification with a GRU-based classification model, achieving good accuracy while clearly improving model speed.
Description
Technical field
The present invention relates to a Chinese comment sentiment analysis method based on a GRU (gated recurrent unit) neural network, belonging to the fields of natural language processing and deep learning.
Background technique
Sentiment classification technology identifies and classifies the sentiment information that users express in comment text, in general into a positive class and a negative class: praise and affirmation are classified as positive, criticism and negation as negative. There are two main approaches to sentiment classification: the first is unsupervised classification using a sentiment dictionary, the second is supervised classification based on machine learning. Because machine-learning-based sentiment classification achieves excellent classification results, it has become the mainstream approach. In 2002, Pang et al. applied machine learning methods to sentiment classification for the first time. They extracted features with n-gram models and tested three machine-learning classifiers: NB (naive Bayes), ME (maximum entropy), and SVM (support vector machine). They found that SVM with unigram features achieved the best classification results. In 2006, experiments by Cui et al. showed that unigrams work well when the corpus is small, but that as the corpus grows, n-grams (n > 3) show better classification performance.
However, as the n of the n-gram model increases, the volume of features grows explosively, so in practice n is limited to about 3, which cannot capture the finer relations between a word and its surrounding words.
How to extract complex rather than simple features, and how to identify which kinds of features are valuable, are two main problems of current research. Many methods have appeared in recent years, mainly Single-Character N-gram models, Multi-Word N-gram models, and lexical-syntactic models. In 2011, Ahmed et al. proposed FRN (Feature Relation Network), a rule-based multivariate text feature selection method. Yao et al. selected features with statistical machine-learning methods to reduce the dimensionality of feature vectors. Features are also selected with DF (document frequency), IG (information gain), CHI (chi-squared statistic), and MI (mutual information).
In 2011, Mikolov et al. proposed RNNLM (a recurrent-neural-network language model) to handle variable-length sequences. But even on datasets of millions of items, training took several weeks on 40 CPUs to produce a good solution; training an NNLM (neural network language model) was almost infeasible. In 2013, Mikolov improved NNLM, and Google open-sourced word2vec, a tool for computing word vectors.
The GRU unit was derived by Cho et al. in 2014 from the LSTM (long short-term memory network). It differs from LSTM in that a single gate replaces the input gate and the forget gate. The benefit of this design is that the computation is simplified while expressive power improves, so GRU has become increasingly popular.
Conventional methods are limited to mining the lexical and syntactic features between words in a sentence, while language often contains implicit information and semantic features between words that can play a great role in recognizing sentiment. Most sentiment-representation methods can accurately identify sentences whose sentiment is stated plainly, but cannot accurately identify comments with an ironic color (excessive praise meant as criticism), because such methods can only classify distinct word features and cannot extract the sentiment implied in a sentence. With today's more individualistic young users, comments are also increasingly diverse. This is the problem for future classification and processing.
Summary of the invention
The invention proposes a Chinese comment sentiment analysis method based on a GRU neural network. It first trains word vectors on the corpus with word2vec (a word-vector generation model), clusters words by the cosine distance between their vectors, and extends the dictionary with highly similar domain words through clustering of similar features. Using the extended lexicon as the sentence-vector generator, comment sentences are converted into sentence vectors, and word2vec is then trained to generate the corresponding feature vectors.
To solve its technical problem, the present invention adopts the following technical scheme:
A Chinese comment sentiment analysis method based on a GRU neural network comprises the following steps:
(1) load the corpus data and segment it with the jieba word-segmentation tool;
(2) remove useless stop words and divide the corpus into a training set and a test set in proportion;
(3) train word vectors on the corpus with word2vec and average all word vectors of each sentence to generate the corresponding sentence vector, then perform back-propagation training with the word2vec network to compute the final word vectors;
(4) feed the sentiment-bearing word vectors generated by word2vec into the GRU neural network model for training;
(5) build the test set in the same way as the training set, feed it into the GRU neural network, and perform sentiment classification.
The corpus data in step (1) consists of user comments from the well-known Chinese e-commerce site JD.com; the training corpus contains 110,000 comment sentences about electronics and daily goods.
The corpus data is divided into a positive class and a negative class.
The corpus data is divided into a training set and a test set.
The ratio of the training set to the test set is 5:1.
The beneficial effects of the present invention are as follows:
1. The invention performs sentiment classification with a GRU-based classification model, achieving good accuracy while clearly improving model speed.
2. word2vec and GRU together complete the task of Chinese text sentiment classification well. Although the accuracy of the GRU model does not far exceed that of other deep-learning methods, it clearly reduces the time spent on training and testing. The GRU-based Chinese comment sentiment model therefore shows sufficiently superior performance.
Detailed description of the invention
Fig. 1 is the flow chart of the Chinese comment sentiment analysis method based on a GRU neural network.
Fig. 2 is the internal structure of a GRU unit.
Specific embodiment
The invention is described in further detail below with reference to the accompanying drawings.
As shown in Fig. 1, the method first loads the corpus data and segments it with jieba (a Chinese word-segmentation tool), then removes useless stop words and divides the corpus into a training set and a test set in a fixed proportion. To extract the semantic features between words and strengthen the classification judgement of words and phrases, this patent trains word vectors on the corpus with word2vec and averages all word vectors of each sentence to generate the word vectors of the corresponding sentence; the word2vec network then performs back-propagation training to compute the corresponding low-dimensional word vectors. The sentiment-bearing word vectors generated by word2vec are fed into the GRU neural network model for training. The test set is built in the same way and fed into the GRU neural network for sentiment classification.
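As a minimal sketch of this preprocessing stage, the following Python code tokenizes comments, removes stop words, and splits the corpus 5:1 into training and test sets. The `tokenize` callable stands in for the jieba segmenter named in the text (whitespace splitting is used in the toy example so the sketch stays self-contained); the stop-word set and the random seed are illustrative assumptions.

```python
import random

def preprocess(sentences, tokenize, stopwords, split_ratio=5, seed=0):
    """Tokenize each comment, drop stop words, and split into
    training and test sets at a split_ratio:1 proportion.
    `tokenize` stands in for the jieba segmenter used in the patent;
    any callable str -> list[str] works for this sketch."""
    tokenized = [[w for w in tokenize(s) if w not in stopwords]
                 for s in sentences]
    rng = random.Random(seed)
    rng.shuffle(tokenized)
    cut = len(tokenized) * split_ratio // (split_ratio + 1)
    return tokenized[:cut], tokenized[cut:]

# Toy usage with whitespace tokenization in place of jieba:
train, test = preprocess(
    ["很 好 的 手机", "质量 差 不 推荐", "非常 满意", "太 糟糕 了",
     "物流 很 快", "屏幕 不错"],
    tokenize=str.split,
    stopwords={"的", "了", "不"},
)
```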
The method trains a word vector for each word with word2vec and computes the cosine between vectors to judge their similarity, obtaining semantic similarity between texts. Words with high similarity are clustered. By clustering similar feature words in the training corpus and adding them to the sentiment dictionary, the sentiment dictionary is expanded. word2vec is then trained on the corpus to obtain vector representations of words, which serve as the mathematical representation of word-vector features. This representation reveals latent semantics between feature values, so feature-word matching is converted into operations on vector values. The GRU neural network model is then trained and tested, completing the design of the model. The specific steps are as follows:
Step 1: word2vec training needs a text corpus; the richer the training corpus, the better the training effect. Since there is currently no relatively complete open experimental dataset available to researchers in China, a large number of user comments had to be collected from the internet before the experiment. The training data is labeled by star rating: 4- and 5-star comments are labeled positive, and 1- and 2-star comments are labeled negative. This application uses user comments from the well-known Chinese e-commerce site JD.com; the training corpus consists of 110,000 comment sentences about electronics and daily goods. After labeling, the positive-to-negative ratio of the training set is about 5:2 (after manual filtering), so the data is relatively balanced. The corpus is divided into a training set and a test set at a ratio of 5:1.
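The star-to-label rule described above can be sketched as follows; treating 3-star comments as ambiguous and dropping them is an assumption, since the text only assigns 4–5 stars to the positive class and 1–2 stars to the negative class.

```python
def star_to_label(stars):
    """Map a review's star rating to a sentiment label:
    4-5 stars -> positive (1), 1-2 stars -> negative (0).
    3-star reviews are treated as ambiguous and dropped
    (an assumption; the patent does not mention them)."""
    if stars >= 4:
        return 1
    if stars <= 2:
        return 0
    return None

labels = [star_to_label(s) for s in [5, 4, 3, 2, 1]]
```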
Step 2: the collected electronics and daily-goods comment corpus is segmented and part-of-speech tagged with the jieba segmentation program, removing punctuation, emoticons, and malformed ("Martian") characters from the corpus. A stop-word corpus is loaded to remove stop words, and word2vec is trained on the corpus to obtain the corresponding model file. On the Windows platform, Anaconda3 (an open-source Python distribution) is used to configure the Python environment, and the gensim module (a tool for document semantic structure) is installed; its word2vec implementation is called for training with the following parameter settings: hidden layers: 1; iterations: 400; batch size: 20; loss function: cross entropy; descent method: stochastic gradient descent; learning rate: 0.0001; weight initialization: orthogonal; dropout rate: 0.2.
After training, the result is saved as a file of words paired with their vectors. The most-similar function of the word2vec module is used for similarity calculation, comparing similarity between words and thereby achieving clustering. Similar feature words in the training corpus are processed in this way and added to the sentiment dictionary to expand it.
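A minimal sketch of the similarity comparison used to expand the sentiment dictionary: cosine similarity between word vectors, with a `most_similar`-style ranking modeled on the word2vec lookup the text describes. The toy 2-D vectors are hypothetical; real word2vec embeddings would come from the trained model file.

```python
import math

def cosine(u, v):
    """Cosine similarity between two word vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def most_similar(word, vectors, top_n=2):
    """Rank other words by cosine similarity to `word`, mirroring
    the most-similar lookup performed with the word2vec module."""
    scores = [(w, cosine(vectors[word], v))
              for w, v in vectors.items() if w != word]
    return sorted(scores, key=lambda kv: kv[1], reverse=True)[:top_n]

# Hypothetical 2-D "embeddings" for illustration only:
vecs = {"好": [1.0, 0.1], "棒": [0.9, 0.2], "差": [-1.0, 0.0]}
ranked = most_similar("好", vecs)
```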
Step 3: the feature vectors of words obtained by training word2vec on the corpus serve as the mathematical representation of word-vector features. word2vec is a deep-learning method for converting words into vectors; it is a shallow neural network that represents text content with N-dimensional vector operations. A three-layer neural network models the language model, and the resulting word vectors serve as the mathematical representation of features while revealing latent semantics between feature values. Feature-word matching is thus converted into operations on vector values.
Step 4: the generated word feature vectors are fed into the GRU neural network model for training and testing. GRU is a variant of LSTM; by adding links between the simple nodes of the hidden layer and controlling the output of hidden neurons with gated recurrent units, it can effectively model changes in time series.
A traditional LSTM is generally composed of several gates: a forget gate, a memory (input) gate, and an output gate, which let the model discard useless information; LSTM is therefore widely used. As shown in Fig. 2, the GRU merges the forget gate and the memory gate into a single update gate on this basis, so it consists of only an update gate and a reset gate. For time step t, the update gate z_t is first computed with formula (1):
z_t = σ(W^(z) x_t + U^(z) h_{t-1})    (1)
where x_t is the input vector of the t-th time step, i.e. the t-th component of the input sequence X, which undergoes a linear transformation (multiplication by the weight matrix W^(z)); h_{t-1} holds the information of the previous t-1 time steps and is multiplied by the update matrix U^(z); σ is the sigmoid function. The update gate sums these two parts and puts them into the sigmoid activation, so the activation result is compressed to between 0 and 1. The update gate helps the model decide how much past information to pass on to the future. The reset gate essentially decides how much past information to forget; it is computed with the following formula:
r_t = σ(W^(r) x_t + U^(r) h_{t-1})    (2)
where r_t stores the reset-gate information, W^(r) is a weight matrix, and U^(r) is an update matrix.
In formula (2), h_{t-1} and x_t first undergo a linear transformation and are then summed and put into the sigmoid activation to output the activation value. The final question of the output is how the update gate and the reset gate affect the final output. When the reset gate is used, the new memory content uses the reset gate to store relevant past information; its expression is:
h'_t = tanh(W x_t + r_t ⊙ U h_{t-1})    (3)
where h'_t is the candidate information at the current time, W is a weight matrix, U is an update matrix, tanh is the nonlinear transformation function, and ⊙ is the Hadamard product.
In formula (3), the input x_t and the previous time-step information h_{t-1} first undergo linear transformations, i.e. right-multiplication by the matrices W and U respectively. The Hadamard product of the reset gate r_t and U h_{t-1} is then computed, i.e. the element-wise product of r_t and U h_{t-1}. Because the reset gate computed above is a vector of values between 0 and 1, it measures how far the gate is opened; the Hadamard product determines which past information to retain and which to forget. In the final step, the network computes h_t, which retains the information of the current unit and passes it to the next unit. In this process the update gate is used: it determines what to collect from the current memory content h'_t and from the previous time step h_{t-1}. This process can be expressed as:
h_t = z_t ⊙ h_{t-1} + (1 − z_t) ⊙ h'_t    (4)
where h_t is the output state of time step t.
In formula (4), z_t is the activation result of the update gate, which likewise controls the inflow of information in gated form. The Hadamard product of z_t and h_{t-1} represents the information from the previous time step retained in the final memory; added to the information that the current memory content contributes to the final memory, it equals the content that the gated recurrent unit finally outputs.
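Equations (1)–(4) can be checked with a minimal scalar GRU step in Python. Real layers use weight matrices and element-wise (Hadamard) products; the scalar weights and inputs here are illustrative only.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x_t, h_prev, W_z, U_z, W_r, U_r, W, U):
    """One scalar GRU step implementing equations (1)-(4) above.
    All weights are scalars for readability; with vectors the
    products r_t * (U h_prev) and z_t * h_prev become Hadamard
    products."""
    z_t = sigmoid(W_z * x_t + U_z * h_prev)           # update gate, eq. (1)
    r_t = sigmoid(W_r * x_t + U_r * h_prev)           # reset gate,  eq. (2)
    h_cand = math.tanh(W * x_t + r_t * U * h_prev)    # candidate,   eq. (3)
    return z_t * h_prev + (1.0 - z_t) * h_cand        # new state,   eq. (4)

# Run a short sequence through the cell with illustrative weights:
h = 0.0
for x in [1.0, -0.5, 0.3]:
    h = gru_step(x, h, W_z=0.5, U_z=0.5, W_r=0.5, U_r=0.5, W=1.0, U=1.0)
```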
The hyperparameters of the GRU neural network model are: sentence length, word-vector dimension, hidden-layer nodes, and dropout rate, set to 20, 80, 200 and 0.15 respectively. The input window is divided into left and right sides, with the left window set to 0 and the right window set to 3, so the four characters from t to t+3 are input at the same time. Training and testing with the GRU neural network model continually refines the model until an ideal model is obtained.
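For reference, the hyperparameters listed above can be collected in one configuration mapping; the key names are illustrative and do not come from any particular framework.

```python
# Hyperparameters as stated in the text; key names are illustrative.
GRU_CONFIG = {
    "sentence_length": 20,   # sentence handling size
    "embedding_dim": 80,     # word-vector dimension
    "hidden_units": 200,     # hidden-layer nodes
    "dropout": 0.15,         # dropout (loss) ratio
    "window_left": 0,        # left input window
    "window_right": 3,       # right input window
}
```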
Claims (5)
1. A Chinese comment sentiment analysis method based on a GRU neural network, characterized by comprising the following steps:
(1) load the corpus data and segment it with the jieba word-segmentation tool;
(2) remove useless stop words and divide the corpus into a training set and a test set in proportion;
(3) train word vectors on the corpus with word2vec and average all word vectors of each sentence to generate the corresponding sentence vector, then perform back-propagation training with the word2vec network to compute the final word vectors;
(4) feed the sentiment-bearing word vectors generated by word2vec into the GRU neural network model for training;
(5) build the test set in the same way as the training set, feed it into the GRU neural network, and perform sentiment classification.
2. The Chinese comment sentiment analysis method based on a GRU neural network according to claim 1, characterized in that the corpus data in step (1) consists of user comments from the well-known Chinese e-commerce site JD.com, the training corpus containing 110,000 comment sentences about electronics and daily goods.
3. The Chinese comment sentiment analysis method based on a GRU neural network according to claim 2, characterized in that the corpus data is divided into a positive class and a negative class.
4. The Chinese comment sentiment analysis method based on a GRU neural network according to claim 2, characterized in that the corpus data is divided into a training set and a test set.
5. The Chinese comment sentiment analysis method based on a GRU neural network according to claim 4, characterized in that the ratio of the training set to the test set is 5:1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811097770.7A CN109165387A (en) | 2018-09-20 | 2018-09-20 | A kind of Chinese comment sentiment analysis method based on GRU neural network |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109165387A true CN109165387A (en) | 2019-01-08 |
Family
ID=64879779
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811097770.7A Pending CN109165387A (en) | 2018-09-20 | 2018-09-20 | A kind of Chinese comment sentiment analysis method based on GRU neural network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109165387A (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080249764A1 (en) * | 2007-03-01 | 2008-10-09 | Microsoft Corporation | Smart Sentiment Classifier for Product Reviews |
CN107247702A (en) * | 2017-05-05 | 2017-10-13 | 桂林电子科技大学 | A kind of text emotion analysis and processing method and system |
CN108446813A (en) * | 2017-12-19 | 2018-08-24 | 清华大学 | A kind of method of electric business service quality overall merit |
- 2018-09-20: application CN201811097770.7A filed in CN; published as CN109165387A (status: Pending)
Non-Patent Citations (2)
Title |
---|
Zhang Yuhuan et al.: "Text Sentiment Analysis Based on Two LSTM Structures", Software * |
Xing Changzheng et al.: "Deep Learning Methods for Text Sentiment Analysis", Computer Applications and Software * |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110083825A (en) * | 2019-03-21 | 2019-08-02 | 昆明理工大学 | A kind of Laotian sentiment analysis method based on GRU model |
CN110059187A (en) * | 2019-04-10 | 2019-07-26 | 华侨大学 | A kind of deep learning file classification method of integrated shallow semantic anticipation mode |
CN110059187B (en) * | 2019-04-10 | 2022-06-07 | 华侨大学 | Deep learning text classification method integrating shallow semantic pre-judging mode |
CN110134947B (en) * | 2019-04-17 | 2021-03-26 | 中国科学院计算技术研究所 | Emotion classification method and system based on unbalanced multi-source data |
CN110134947A (en) * | 2019-04-17 | 2019-08-16 | 中国科学院计算技术研究所 | A kind of sensibility classification method and system based on uneven multi-source data |
CN110147452A (en) * | 2019-05-17 | 2019-08-20 | 北京理工大学 | A kind of coarseness sentiment analysis method based on level BERT neural network |
CN110377691A (en) * | 2019-07-23 | 2019-10-25 | 上海应用技术大学 | Method, apparatus, equipment and the storage medium of text classification |
CN110414219A (en) * | 2019-07-24 | 2019-11-05 | 长沙市智为信息技术有限公司 | Detection method for injection attack based on gating cycle unit Yu attention mechanism |
CN110852063A (en) * | 2019-10-30 | 2020-02-28 | 语联网(武汉)信息技术有限公司 | Word vector generation method and device based on bidirectional LSTM neural network |
CN110852063B (en) * | 2019-10-30 | 2023-05-05 | 语联网(武汉)信息技术有限公司 | Word vector generation method and device based on bidirectional LSTM neural network |
CN113065038A (en) * | 2020-01-02 | 2021-07-02 | 北京京东尚科信息技术有限公司 | Short text matching method, device and storage medium |
CN111563164A (en) * | 2020-05-07 | 2020-08-21 | 成都信息工程大学 | Specific target emotion classification method based on graph neural network |
CN111563164B (en) * | 2020-05-07 | 2022-06-28 | 成都信息工程大学 | Specific target emotion classification method based on graph neural network |
CN111858945A (en) * | 2020-08-05 | 2020-10-30 | 上海哈蜂信息科技有限公司 | Deep learning-based comment text aspect level emotion classification method and system |
CN111858945B (en) * | 2020-08-05 | 2024-04-23 | 上海哈蜂信息科技有限公司 | Deep learning-based comment text aspect emotion classification method and system |
CN112434143B (en) * | 2020-11-20 | 2022-12-09 | 西安交通大学 | Dialog method, storage medium and system based on hidden state constraint of GRU (generalized regression Unit) |
CN112434143A (en) * | 2020-11-20 | 2021-03-02 | 西安交通大学 | Dialog method, storage medium and system based on hidden state constraint of GRU (generalized regression Unit) |
CN113288050A (en) * | 2021-04-23 | 2021-08-24 | 山东师范大学 | Multidimensional enhanced epileptic seizure prediction system based on graph convolution network |
CN113095087A (en) * | 2021-04-30 | 2021-07-09 | 哈尔滨理工大学 | Chinese word sense disambiguation method based on graph convolution neural network |
CN114943290A (en) * | 2022-05-25 | 2022-08-26 | 盐城师范学院 | Biological invasion identification method based on multi-source data fusion analysis |
CN114943290B (en) * | 2022-05-25 | 2023-08-08 | 盐城师范学院 | Biological intrusion recognition method based on multi-source data fusion analysis |
CN115146059A (en) * | 2022-06-17 | 2022-10-04 | 东方合智数据科技(广东)有限责任公司 | Raw paper market data processing method based on corrugated paper industry and related equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109165387A (en) | A kind of Chinese comment sentiment analysis method based on GRU neural network | |
Zheng et al. | A hybrid bidirectional recurrent convolutional neural network attention-based model for text classification | |
Zulqarnain et al. | Efficient processing of GRU based on word embedding for text classification | |
Luo | Network text sentiment analysis method combining LDA text representation and GRU-CNN | |
CN109145112B (en) | Commodity comment classification method based on global information attention mechanism | |
CN104834747B (en) | Short text classification method based on convolutional neural networks | |
CN110807320B (en) | Short text emotion analysis method based on CNN bidirectional GRU attention mechanism | |
Srivastava et al. | Modeling documents with deep boltzmann machines | |
CN112364638B (en) | Personality identification method based on social text | |
Sari et al. | Text classification using long short-term memory with glove | |
CN112699960A (en) | Semi-supervised classification method and equipment based on deep learning and storage medium | |
CN110969020A (en) | CNN and attention mechanism-based Chinese named entity identification method, system and medium | |
CN111078833B (en) | Text classification method based on neural network | |
CN108875809A (en) | The biomedical entity relationship classification method of joint attention mechanism and neural network | |
CN113887643B (en) | New dialogue intention recognition method based on pseudo tag self-training and source domain retraining | |
CN110717330A (en) | Word-sentence level short text classification method based on deep learning | |
CN113516198B (en) | Cultural resource text classification method based on memory network and graphic neural network | |
CN111400494B (en) | Emotion analysis method based on GCN-Attention | |
CN111274790A (en) | Chapter-level event embedding method and device based on syntactic dependency graph | |
CN112199503B (en) | Feature-enhanced unbalanced Bi-LSTM-based Chinese text classification method | |
CN111753088A (en) | Method for processing natural language information | |
CN111309909A (en) | Text emotion classification method based on hybrid model | |
Sun et al. | Multi-channel CNN based inner-attention for compound sentence relation classification | |
CN114925205B (en) | GCN-GRU text classification method based on contrast learning | |
CN116543406A (en) | Multi-feature fusion double-target self-supervision medical problem text clustering method and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| CB02 | Change of applicant information | Address after: 210032 No. 219 Ningliu Road, Jiangbei New District, Nanjing, Jiangsu. Applicant after: Nanjing University of Information Science and Technology. Address before: 211500 Yuting Square, 59 Wangqiao Road, Liuhe District, Nanjing, Jiangsu. Applicant before: Nanjing University of Information Science and Technology. |
2019-01-08 | RJ01 | Rejection of invention patent application after publication | Application publication date: 20190108 |