CN108363753A - Comment-text sentiment classification model training and sentiment classification method, apparatus, and device - Google Patents
Comment-text sentiment classification model training and sentiment classification method, apparatus, and device
- Publication number
- CN108363753A (application number CN201810086816.9A)
- Authority
- CN
- China
- Prior art keywords
- text
- comment
- level
- sentence
- comment text
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/35—Clustering; Classification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/30—Semantic analysis
Abstract
The invention discloses a comment-text sentiment classification model training method and a sentiment classification method, apparatus, and device, belonging to the field of text sentiment classification in natural language processing. Model training includes: obtaining comment texts together with their associated subject (reviewer) and object (reviewed item) information; extracting sentence-level feature representations with a first-layer Bi-LSTM network that incorporates a comment subject-and-object attention mechanism; extracting document-level feature representations with a second-layer Bi-LSTM network that likewise incorporates the subject-and-object attention mechanism; mapping the document-level features into the sentiment-category space with a tanh nonlinear mapping function, classifying with softmax, and training the parameters of the model to obtain an optimal text sentiment classification model. By using a hierarchical bidirectional Bi-LSTM network model with an attention mechanism, the invention achieves robust context-aware semantic perception and representation of text, significantly improves the robustness of text sentiment classification, and improves classification accuracy.
Description
Technical field
The present invention relates to sentiment classification of comment text, and in particular to a comment-text sentiment model training and sentiment classification method, apparatus, and device based on a hierarchical bidirectional LSTM (Bidirectional Long Short-Term Memory, Bi-LSTM) network with an attention mechanism. It belongs to the field of natural language processing technology.
Background technology
The key problem in text sentiment classification is how to effectively represent the sentiment semantics of text. With the rapid development of Internet technology, users generate large volumes of valuable comment text about hot events, products, and so on, for example on microblogs, e-commerce platforms, and restaurant-review platforms. These comments carry people's rich emotional color and sentiment orientation. The purpose of sentiment analysis is to automatically extract and classify users' subjective sentiment toward a product or event from text, helping merchants or government departments with tasks such as data analysis and public-opinion monitoring. Sentiment analysis has therefore become one of the important topics in natural language processing. It is commonly divided into sentiment-information extraction, sentiment-information classification, and sentiment-information retrieval and summarization. The problem addressed here is mainly document-level sentiment classification. The document-level sentiment classification task is to automatically classify the sentiment orientation (positive or negative) or sentiment intensity (e.g., the 1-5 star ratings in movie or restaurant reviews) expressed in user-generated text about a product or event. Most current methods treat sentiment classification as a text-classification problem: using machine learning, with sentiment orientation or sentiment scores as supervised labels, training a classifier to classify text sentiment has become a mainstream approach. Feature representation is an important factor affecting classifier performance in machine learning; consequently, representing the sentiment semantics of text is a key and time-consuming step in the text sentiment classification problem.
Traditional feature representation methods include one-hot, n-gram, and effective features hand-designed by domain experts from the text or from additional sentiment lexicons. However, feature engineering is a labor-intensive task that requires substantial domain knowledge, so automatic feature learning has gradually become a research focus. Deep learning based on neural networks is one such method of automatic feature learning. With the successful application of deep learning in computer vision, speech recognition, natural language processing, and other fields, more and more text sentiment classification models based on deep learning have appeared. These models generally use word embedding (Word Embedding, WE) for feature representation. Such low-dimensional word-vector representations not only solve the excessive-dimensionality problem of word representations in conventional language models, but also effectively preserve the semantic information of words, so that semantically similar words lie closer together. On top of word embeddings, neural network models such as convolutional neural networks (CNN), recursive neural networks, and recurrent neural networks (RNN) can represent the semantic information of sentences or documents well. Because deep learning has strong automatic feature-extraction ability, it is widely used in the text sentiment classification problem.
However, most current neural-network-based text sentiment classification models consider only the sentiment semantics of the text content, ignoring the comment-subject information associated with the text and the comment-object information described by the content. Meanwhile, studies have shown that the preferences of the comment subject and the characteristics of the comment object have an important influence on the subject's rating. Achieving robust context-aware semantic perception and representation of text, while using an attention mechanism to fuse comment-subject and comment-object information with the textual semantic information so that the semantic information of the text becomes richer, is the main research direction of the present invention.
Summary of the invention
Object of the invention: In view of the deficiencies of the prior art, the present invention aims to provide a comment-text sentiment classification model training method based on a hierarchical Bi-LSTM with an attention mechanism, together with a sentiment classification method, apparatus, and device based on the trained model. Sentence-level and document-level semantic features are modeled with bidirectional LSTMs, where the forward LSTM unit captures preceding-context feature information and the backward LSTM unit captures following-context feature information; comment-subject and comment-object information are combined with the textual semantic information to capture rich semantic features and improve the robustness and accuracy of text sentiment classification.
Technical solution: To achieve the above object, the present invention adopts the following technical scheme:
A comment-text sentiment classification model training method includes the following steps:
(1) obtaining a training-set text comprising comment texts and the subject and object information associated with each comment text;
(2) converting the words in the training-set comment texts into word-vector representations and inputting them into a first-layer Bi-LSTM network; multiplying the forward and backward hidden-layer output vectors by weights trained via a word-level attention mechanism over the subject and object information associated with the comment text, thereby extracting sentence-level feature representations;
(3) inputting the sentence-level feature representations into a second-layer Bi-LSTM network; multiplying the forward and backward hidden-layer output vectors by weights trained via a sentence-level attention mechanism over the subject and object information associated with the comment text, thereby extracting the document-level feature representation of the comment text;
(4) mapping the resulting document-level semantic features into the sentiment-category space using a tanh nonlinear mapping function, classifying with softmax, and training the parameters of the model to obtain an optimal text sentiment classification model.
Preferably, in step (2), the sentence-level semantic feature of the i-th sentence extracted by the word-level attention mechanism is s_i = Σ_{j=1}^{T} α_j^i · h_j^i, where α_j^i denotes the importance of the j-th word w_j^i in the i-th sentence, T is the number of words in the i-th sentence, and h_j^i = [→h_j^i ; ←h_j^i] is the combination of the output →h_j^i of the forward LSTM model and the output ←h_j^i of the backward LSTM model in the first-layer Bi-LSTM network. The weight is α_j^i = exp(e(h_j^i, u, p)) / Σ_{k=1}^{T} exp(e(h_k^i, u, p)), where the formula e(h_j^i, u, p) = v^T tanh(W_H h_j^i + W_U u + W_P p + b) computes the importance of word w_j^i given the comment-subject text information u and the comment-object text information p; the parameter matrices W_H, W_U, W_P, the vector v, and the bias b are continually optimized during training to obtain optimal values.
Preferably, in step (3), the document-level feature extracted by the sentence-level attention mechanism is d = Σ_{i=1}^{M} β_i · h_i, where β_i denotes the importance of the i-th sentence y_i in the document, M is the number of sentences in the document, and h_i = [→h_i ; ←h_i] is the combination of the output →h_i of the forward LSTM model and the output ←h_i of the backward LSTM model in the second-layer Bi-LSTM network. The weight is β_i = exp(e(h_i, u, p)) / Σ_{k=1}^{M} exp(e(h_k, u, p)), where the formula e(h_i, u, p) = v^T tanh(W_H h_i + W_U u + W_P p + b) computes the importance of sentence y_i given the comment-subject text information u and the comment-object text information p; the parameter matrices W_H, W_U, W_P, the vector v, and the bias b are continually optimized during training to obtain optimal values.
Preferably, step (4) includes:
(4.1) mapping the resulting document-level semantic feature d into a sentiment-category space with C categories using a tanh nonlinear mapping function: d̂ = tanh(W_c d + b_c), where W_c is the parameter matrix for the document feature d and b_c is the bias vector; the parameter matrix and bias vector are optimized during training to obtain optimal values;
(4.2) deciding the text sentiment category with a softmax classifier, using the cross-entropy loss function as the optimization objective of model training, and continually updating the relevant parameters of the model by computing the loss-function gradient with the back-propagation (BP) algorithm, to obtain the optimal model.
Preferably, the calculation formula for deciding the text sentiment category with the softmax classifier is p_c = exp(d̂_c) / Σ_{k=1}^{C} exp(d̂_k), where d̂_c is the component of the document-feature mapping corresponding to sentiment category c, and p_c is the predicted probability that the text sentiment category is c.
Preferably, the calculation formula of the cross-entropy loss function is L = -Σ_{i=1}^{N} Σ_{c=1}^{C} y_i^c · log(p_c), where the training set is B = {b_1, ..., b_N}, b_i is the i-th sample text in the training set, y_i is the ground-truth sentiment label of sample b_i, and N is the training-set size; y_i^c takes the value 1 if the text sentiment category of b_i is c and 0 otherwise; p_c is the predicted probability that the text sentiment category is c.
Another aspect of the present invention provides a method for performing text sentiment classification with the text sentiment classification model obtained by the above comment-text sentiment classification model training method, including the following steps:
obtaining a test-set text comprising comment texts and the subject and object information associated with each comment text;
inputting the test-set text into the optimal text sentiment classification model to predict the sentiment category.
Another aspect of the present invention provides a comment-text sentiment classification model training apparatus, including:
an acquisition module for obtaining a training-set text comprising comment texts and the subject and object information associated with each comment text;
a sentence-level feature extraction module for converting the words in the training-set comment texts into word-vector representations, inputting them into a first-layer Bi-LSTM network, and multiplying the forward and backward hidden-state output vectors by weights trained via a word-level attention mechanism over the associated subject and object information, thereby extracting sentence-level feature representations;
a document-level feature extraction module for inputting the sentence-level feature representations into a second-layer Bi-LSTM network and multiplying the forward and backward hidden-state output vectors by weights trained via a sentence-level attention mechanism over the associated subject and object information, thereby extracting the document-level feature representation of the comment text;
and a training module for mapping the resulting document-level semantic features into the sentiment-category space using a tanh nonlinear mapping function, classifying with softmax, and training the parameters of the model to obtain an optimal text sentiment classification model.
Another aspect of the present invention provides an apparatus for performing text sentiment classification with the text sentiment classification model obtained by the above comment-text sentiment classification model training method, including:
an acquisition module for obtaining a test-set text comprising comment texts and the subject and object information associated with each comment text;
and a classification prediction module for inputting the test-set text into the optimal text sentiment classification model to predict the sentiment category.
The present invention also provides a computer device including a memory, a processor, and a computer program stored on the memory and runnable on the processor, wherein the processor, when executing the program, implements the comment-text sentiment classification model training method or the text sentiment classification method.
Advantageous effects: Compared with the prior art, the present invention has the following technical effects:
The document-level text sentiment classification method of the present invention, based on a hierarchical Bi-LSTM with an attention mechanism, models sentence-level and document-level semantic features with bidirectional LSTMs. The forward LSTM captures preceding-context semantic feature information and the backward LSTM captures following-context semantic feature information, while subject-object information associated with the comment text is incorporated to strengthen the semantic representation. As a result, robust context-aware semantic perception and representation of text is achieved, the robustness of text sentiment classification is significantly improved, and the accuracy of text sentiment classification is also effectively improved.
Description of the drawings
Fig. 1 is a flowchart of the comment-text sentiment classification model training method in an embodiment of the present invention.
Fig. 2 is a schematic diagram of the hierarchical Bi-LSTM model proposed in an embodiment of the present invention.
Fig. 3 is a schematic diagram of the principle of comment-text sentiment classification model training and sentiment classification in an embodiment of the present invention.
Fig. 4 is a flowchart of the comment-text sentiment classification method in an embodiment of the present invention.
Fig. 5 is a structural diagram of the comment-text sentiment classification model training apparatus in an embodiment of the present invention.
Fig. 6 is a structural diagram of the comment-text sentiment classification apparatus in an embodiment of the present invention.
Detailed description of the embodiments
The technical scheme of the present invention is described in detail below with reference to the accompanying drawings:
As shown in Fig. 1, a comment-text sentiment classification model training method disclosed in an embodiment of the present invention mainly includes the following steps:
(1) Obtain a training-set text, where each sample in the training set includes the comment text itself together with the comment subject (i.e., the reviewer or associated organization) and the comment object (the reviewed product, news item, or other object) associated with that comment text. The comment texts and their associated subject and object information can be obtained from the Internet.
(2) Convert the words in the training-set comment texts into word-vector representations and input them into the first-layer Bi-LSTM network; multiply the forward and backward hidden-layer output vectors by weights trained via a word-level attention mechanism over the associated subject and object information, and extract sentence-level feature representations.
(3) Input the sentence-level feature representations into the second-layer Bi-LSTM network; multiply the forward and backward hidden-layer output vectors by weights trained via a sentence-level attention mechanism over the associated subject and object information, and extract the document-level feature representation of the comment text.
In step (2), each word in the comment text is converted into a feature-vector representation using word embedding (Word Embedding, WE) and input into the Bi-LSTM network to extract word-level feature representations. Word embedding maps the words in the text to real-valued vectors in a space of much lower dimension than the vocabulary size. In the word-level semantic feature representation, all words are represented by a word-embedding matrix L_w ∈ R^{n×|V|}, where n is the dimension of the word vectors and |V| is the number of words in the vocabulary. The word vectors in L_w are pre-trained with the word2vec word-embedding method.
In the constructed hierarchical Bi-LSTM model, the bidirectional LSTM (Bi-LSTM) network introduces both forward and backward LSTM units in the hidden layer; the forward LSTM unit captures preceding-context feature information and the backward LSTM unit captures following-context feature information, so that more robust feature information can be captured than with a unidirectional LSTM. From the word-level feature representation of the text, the first Bi-LSTM network outputs the extracted sentence-level feature representation; with a similar cascaded strategy, the sentence-level feature representations of the text are input into the second Bi-LSTM network, whose output yields the document-level feature representation of the text, thereby realizing a hierarchical Bi-LSTM model that perceives document-level text features with two layers of Bi-LSTM networks.
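A minimal sketch of this bidirectional pass, with scalar toy dimensions and invented gate weights (a real implementation uses vector hidden states and learned weight matrices):

```python
import math

def lstm_step(x, h, c, W):
    # One LSTM step. W maps a gate name to (w_x, w_h, b); everything is a
    # scalar here to keep the sketch readable.
    def gate(name, act):
        w_x, w_h, b = W[name]
        return act(w_x * x + w_h * h + b)
    sigmoid = lambda s: 1.0 / (1.0 + math.exp(-s))
    i = gate("i", sigmoid)    # input gate
    f = gate("f", sigmoid)    # forget gate
    o = gate("o", sigmoid)    # output gate
    g = gate("g", math.tanh)  # candidate cell state
    c_new = f * c + i * g
    h_new = o * math.tanh(c_new)
    return h_new, c_new

def bilstm(seq, W_fwd, W_bwd):
    # Run forward and backward LSTMs over the sequence and pair the outputs
    # per position: h_t = [->h_t ; <-h_t], as in the patent's Bi-LSTM layers.
    fwd, h, c = [], 0.0, 0.0
    for x in seq:
        h, c = lstm_step(x, h, c, W_fwd)
        fwd.append(h)
    bwd, h, c = [], 0.0, 0.0
    for x in reversed(seq):
        h, c = lstm_step(x, h, c, W_bwd)
        bwd.append(h)
    bwd.reverse()
    return list(zip(fwd, bwd))
```

The same routine serves both layers of the hierarchy: once over the word vectors of a sentence, then over the sentence features of a document.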
The word-level feature representation of the text is input into the first-layer Bi-LSTM network. Suppose a sentence y_i contains T words and its j-th word is w_j^i; sentence y_i is treated as a sequence whose components are its words. The outputs of the forward LSTM and the backward LSTM models in the Bi-LSTM network, →h_j^i and ←h_j^i, are extracted separately and regarded as the sentence-level feature representation of the corresponding text; combining →h_j^i and ←h_j^i as h_j^i = [→h_j^i ; ←h_j^i] forms the semantic feature of sentence y_i.
In step (3), the resulting sentence-level feature representations are input into the second-layer Bi-LSTM network. Similarly, suppose a text b contains M sentences, each denoted y_i, i ∈ [1, M]; text b is treated as a sequence whose components are its sentences. The outputs of the forward LSTM and the backward LSTM models in this Bi-LSTM network, →h_i and ←h_i, are extracted separately and regarded as the document-level feature representation of the corresponding text; combining →h_i and ←h_i as h_i = [→h_i ; ←h_i] yields the semantic feature of the final text b.
To obtain a more accurate sentiment-semantic representation of the comment text, the descriptive text information related to the comment subject and object is used by the attention mechanism to train the weights of the word-level and sentence-level hidden-state outputs. When extracting the semantic feature representation of the comment text, words or sentences more strongly correlated with the comment subject and object are selected, yielding a richer textual sentiment-semantic representation.
In step (2), the word-level attention mechanism (word attention) further strengthens the sentence-level feature representation associated with sentiment factors, combining the comment subject and object preference information to assign larger weights to more semantically relevant words. Suppose vectors u ∈ R^{d_u} and p ∈ R^{d_p} represent the text information related to the comment subject and to the comment object respectively, where d_u and d_p are the dimensions of the word vectors of the subject-related and object-related text. As shown in Fig. 2, in the sentence-level feature representation, let s_i denote the sentence-level semantic feature extracted by the word-level attention mechanism; then s_i = Σ_{j=1}^{T} α_j^i · h_j^i, where α_j^i denotes the importance of the j-th word in the i-th sentence and is computed as α_j^i = exp(e(h_j^i, u, p)) / Σ_{k=1}^{T} exp(e(h_k^i, u, p)), with the importance-scoring function e(h_j^i, u, p) = v^T tanh(W_H h_j^i + W_U u + W_P p + b). Here W_H, W_U, and W_P are the parameter matrices associated with the hidden-layer output h_j^i, the comment-subject text information u, and the comment-object text information p respectively; v is the parameter vector characterizing word importance, v^T is its transpose, and b is the bias vector. The parameter matrices W_H, W_U, W_P, the vector v, and the bias b are continually optimized during training to obtain optimal values.
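The attention computation just described can be sketched as follows, with all quantities reduced to scalars for readability (in the model, h, u, and p are vectors and W_H, W_U, W_P are learned matrices):

```python
import math

def attention_weights(H, u, p, v, wH, wU, wP, b):
    # Score each hidden output: e(h_j, u, p) = v * tanh(wH*h_j + wU*u + wP*p + b)
    scores = [v * math.tanh(wH * h + wU * u + wP * p + b) for h in H]
    # alpha_j = softmax over the scores (max subtracted for numerical stability)
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def sentence_feature(H, alphas):
    # s_i = sum_j alpha_j * h_j: attention-weighted combination of the
    # Bi-LSTM hidden outputs of one sentence.
    return sum(a * h for a, h in zip(alphas, H))
```

With wH > 0 and v > 0, larger hidden outputs receive larger weights; setting wH = 0 makes the weights uniform, reducing attention to a plain average.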
In step (3), the sentence-level attention mechanism (sentence attention) further strengthens the document-level feature representation associated with sentiment factors, combining the comment subject and object preference information to assign larger weights to more semantically relevant sentences. In the document-level feature representation, let d denote the document-level feature extracted by the sentence-level attention mechanism; then d = Σ_{i=1}^{M} β_i · h_i, where β_i denotes the importance of the i-th sentence y_i in the document and is computed analogously to α_j^i: β_i = exp(e(h_i, u, p)) / Σ_{k=1}^{M} exp(e(h_k, u, p)), with e(h_i, u, p) = v^T tanh(W_H h_i + W_U u + W_P p + b), which scores the importance of sentence y_i given the hidden-layer output h_i, the comment-subject text information u, and the comment-object text information p. The corresponding parameters are W_H, W_U, and W_P; v is the parameter vector characterizing the importance of sentence y_i, and v^T is its transpose. All parameters and bias vectors in the formula are continually optimized during training to obtain optimal values. β_i is the softmax normalization of e(h_i, u, p). The resulting d is the final textual semantic feature representation.
In summary, the steps of text feature extraction are: first, convert the words in the text into real-vector representations using a word-embedding tool (such as word2vec) and input them into the first-layer Bi-LSTM network; extract that network's forward and backward hidden-layer output vectors →h_j^i and ←h_j^i and multiply them by the weights α_j^i trained by the word-level attention mechanism (word attention) to obtain the sentence-level feature representation of the corresponding text, s_i = Σ_j α_j^i · h_j^i. Then take the sentence-level features s_i as the input of the second-layer Bi-LSTM network, extract its forward and backward hidden-layer output vectors →h_i and ←h_i, and multiply them by the weights β_i trained by the sentence-level attention mechanism (sentence attention) to obtain the document-level feature representation of the corresponding text, d = Σ_i β_i · h_i.
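Putting the two layers together, the hierarchical extraction can be sketched end to end. This is a toy composition under stated assumptions: scalars instead of vectors, a simple bounded recurrence standing in for the Bi-LSTM encoder, invented parameter values, and one shared attention parameter set for both layers:

```python
import math

def softmax(scores):
    m = max(scores)
    e = [math.exp(s - m) for s in scores]
    z = sum(e)
    return [x / z for x in e]

def attend(H, u, p, v, wH, wU, wP, b):
    # alpha = softmax(v * tanh(wH*h + wU*u + wP*p + b)); return sum_j alpha_j*h_j
    scores = [v * math.tanh(wH * h + wU * u + wP * p + b) for h in H]
    alphas = softmax(scores)
    return sum(a * h for a, h in zip(alphas, H))

def encode(seq):
    # Stand-in for a Bi-LSTM layer: any sequence encoder producing one
    # hidden value per position; here a bounded running average via tanh.
    out, acc = [], 0.0
    for x in seq:
        acc = math.tanh(0.5 * acc + 0.5 * x)
        out.append(acc)
    return out

def document_feature(doc, u, p, params):
    # doc: list of sentences, each a list of word-vector scalars.
    # Layer 1: encode words, word-level attention  -> sentence features s_i.
    # Layer 2: encode sentences, sentence attention -> document feature d.
    sentence_feats = [attend(encode(sent), u, p, *params) for sent in doc]
    return attend(encode(sentence_feats), u, p, *params)
```

The resulting d would then be fed to the tanh mapping and softmax classifier of step (4).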
(4) Map the resulting document-level semantic features into the sentiment-category space using a tanh nonlinear mapping function, classify with softmax, and train the parameters of the model to obtain an optimal text sentiment classification model.
Step (3) yields the document-level feature representation d of the whole text, which can be used directly as the feature input of the text classifier. First, the document-level semantic feature d is mapped into the sentiment-category space with C categories using the tanh nonlinear mapping function, with the calculation formula d̂ = tanh(W_c d + b_c), where W_c is the parameter matrix for the document feature d and b_c is the bias vector. Then, the softmax classifier decides the text sentiment category with the calculation formula p_c = exp(d̂_c) / Σ_{k=1}^{C} exp(d̂_k), where d̂_c is the component of the document-feature mapping corresponding to sentiment category c, and p_c is the predicted probability that the text sentiment category is c. The cross-entropy loss function is used as the optimization objective of model training; the loss-function gradient is computed with the back-propagation (BP) algorithm and the model parameters are updated. The loss-function calculation formula is L = -Σ_{i=1}^{N} Σ_{c=1}^{C} y_i^c · log(p_c), where the training dataset is B = {b_1, ..., b_N}, b_i is the i-th sample text in the training set, y_i is the ground-truth sentiment label of sample b_i, and N is the training-set size; y_i^c takes the value 1 if the text sentiment category of b_i is c and 0 otherwise.
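The classification head and loss just described can be sketched as follows (C, the weights, and the example scores are invented; in the model d is a vector and W_c a matrix):

```python
import math

def map_to_categories(d, W_c, b_c):
    # d_hat = tanh(W_c d + b_c): map the document feature into the
    # C-dimensional sentiment-category space (scalar d for readability).
    return [math.tanh(w * d + b) for w, b in zip(W_c, b_c)]

def softmax(d_hat):
    # p_c = exp(d_hat_c) / sum_k exp(d_hat_k)
    m = max(d_hat)
    exps = [math.exp(x - m) for x in d_hat]
    z = sum(exps)
    return [e / z for e in exps]

def cross_entropy(probs, gold):
    # L = -sum_c y_c * log(p_c) with a one-hot label y: reduces to -log(p_gold).
    return -math.log(probs[gold])
```

The predicted category is the argmax of the softmax probabilities, and the loss is small exactly when the probability assigned to the ground-truth category is large.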
The classification model is built according to the above steps, and the training-set text in the dataset is input into the built model. The model parameters are optimized with the adaptive-learning-rate AdaDelta method, and the model parameters that perform best on the validation set are taken as the optimal model parameters, yielding the optimal text sentiment classification model. Detailed schematic diagrams of model training and classification are shown in Fig. 3.
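The AdaDelta update mentioned here can be sketched as follows. This is a one-parameter toy with rho and eps set to commonly used defaults; the patent does not specify these hyperparameters:

```python
import math

def adadelta_step(grad, state, rho=0.95, eps=1e-6):
    # AdaDelta update: decaying averages of squared gradients (Eg2) and
    # squared updates (Edx2), so no global learning rate is required.
    Eg2, Edx2 = state
    Eg2 = rho * Eg2 + (1 - rho) * grad * grad
    dx = -math.sqrt(Edx2 + eps) / math.sqrt(Eg2 + eps) * grad
    Edx2 = rho * Edx2 + (1 - rho) * dx * dx
    return dx, (Eg2, Edx2)

def minimize(grad_fn, x0, steps=500):
    # Toy loop minimizing a 1-D loss whose gradient is grad_fn.
    x, state = x0, (0.0, 0.0)
    for _ in range(steps):
        dx, state = adadelta_step(grad_fn(x), state)
        x += dx
    return x
```

In the full model, the same per-parameter update is applied to every weight matrix and bias, with gradients supplied by back-propagation through the hierarchical network.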
As shown in Fig. 5, a comment-text sentiment classification model training apparatus disclosed in an embodiment of the present invention includes: an acquisition module for obtaining a training-set text comprising comment texts and the subject and object information associated with each comment text; a sentence-level feature extraction module for converting the words in the training-set comment texts into word-vector representations, inputting them into a first-layer Bi-LSTM network, and multiplying the forward and backward hidden-state output vectors by weights trained via a word-level attention mechanism over the associated subject and object information, thereby extracting sentence-level feature representations; a document-level feature extraction module for inputting the sentence-level feature representations into a second-layer Bi-LSTM network and multiplying the forward and backward hidden-state output vectors by weights trained via a sentence-level attention mechanism over the associated subject and object information, thereby extracting the document-level feature representation of the comment text; and a training module for mapping the resulting document-level semantic features into the sentiment-category space using a tanh nonlinear mapping function, classifying with softmax, and training the parameters of the model to obtain an optimal text sentiment classification model.
As shown in Fig. 6, a comment-text sentiment classification apparatus disclosed in an embodiment of the present invention includes: an acquisition module for obtaining a test-set text comprising comment texts and the subject and object information associated with each comment text; and a classification prediction module for inputting the test-set text into the optimal text sentiment classification model to predict the sentiment category.
The above embodiment of the comment-text sentiment classification model training apparatus can be used to execute the above embodiment of the comment-text sentiment classification model training method, and the embodiment of the comment-text sentiment classification apparatus can be used to execute the above embodiment of the comment-text sentiment classification method; their technical principles, the technical problems solved, and the technical effects produced are similar. For the specific working processes and related explanations of the comment-text sentiment classification and model training described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
Those skilled in the art will understand that the modules in the embodiments can be adaptively changed and arranged in one or more devices different from those of the embodiments. The modules, units, or components in the embodiments can be combined into one module, unit, or component, and can furthermore be divided into multiple sub-modules, sub-units, or sub-components.
Based on the same technical concept as the method embodiments, an embodiment of the present invention further provides a computer device, which may include a memory, a processor, and a computer program stored on the memory and runnable on the processor. When the computer program is loaded by the processor, each step of the above comment-text sentiment classification method embodiment or of the comment-text sentiment classification model training method embodiment is implemented.
The above embodiments merely illustrate the technical idea of the present invention and do not limit its scope of protection. Any change made on the basis of the technical solution according to the technical idea proposed by the present invention falls within the scope of protection of the present invention.
Claims (10)
1. A comment text sentiment classification model training method, characterized by comprising the following steps:
(1) obtaining a training-set text comprising a comment text and the subject and object information associated with the comment text;
(2) converting the words in the training-set comment texts into word-vector representations and inputting them into a first-layer Bi-LSTM network; combining the forward and backward hidden-layer output vectors, multiplying them by the weights trained by the word-level attention mechanism over the subject and object information associated with the comment text, and extracting the sentence-level feature representations;
(3) inputting the sentence-level feature representations into a second-layer Bi-LSTM network; combining the forward and backward hidden-layer output vectors, multiplying them by the weights trained by the sentence-level attention mechanism over the subject and object information associated with the comment text, and extracting the document-level feature representation of the comment text;
(4) mapping the obtained document-level semantic feature into the emotion-category space using the tanh non-linear mapping function, classifying with softmax, and training the parameters of the model to obtain the optimal text sentiment classification model.
2. The comment text sentiment classification model training method according to claim 1, characterized in that, in step (2), the sentence-level semantic feature of the i-th sentence extracted by the word-level attention mechanism is s_i = Σ_{j=1}^{T} α_ij · h_ij, where α_ij indicates the importance of the j-th word in the i-th sentence, T is the number of words in the i-th sentence, and h_ij is the combination of the output of the forward LSTM model and the output of the backward LSTM model in the first-layer Bi-LSTM network; the weights are given by α_ij = exp(e(h_ij, u, p)) / Σ_{k=1}^{T} exp(e(h_ik, u, p)), where the formula e(h_ij, u, p) = v^T · tanh(W_H · h_ij + W_U · u + W_P · p + b) expresses the computation of the importance of the word h_ij with respect to the comment-subject text information u and the comment-object text information p; the parameter matrices W_H, W_U, W_P, the vector v, and the bias b are continuously optimized during training to reach their optimal values.
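The word-level attention of claim 2 can be computed directly. The following is a minimal numerical sketch with random, untrained parameters (all dimensions are illustrative assumptions): the score e(h_ij, u, p) = v^T tanh(W_H h_ij + W_U u + W_P p + b) is normalized by softmax over the words of one sentence and used to pool the Bi-LSTM states into a sentence vector.

```python
import numpy as np

rng = np.random.default_rng(0)
T, H, D = 5, 8, 6   # words in the sentence, hidden size, subject/object embedding size (illustrative)

h = rng.normal(size=(T, H))   # Bi-LSTM outputs h_ij (forward and backward already combined)
u = rng.normal(size=D)        # comment-subject (user) information
p = rng.normal(size=D)        # comment-object (product) information

W_H = rng.normal(size=(H, H))
W_U = rng.normal(size=(H, D))
W_P = rng.normal(size=(H, D))
b = rng.normal(size=H)
v = rng.normal(size=H)

# e(h_ij, u, p) = v^T tanh(W_H h_ij + W_U u + W_P p + b), one score per word
scores = np.array([v @ np.tanh(W_H @ h_j + W_U @ u + W_P @ p + b) for h_j in h])

# alpha_ij: softmax over the T words of the sentence
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()

# s_i = sum_j alpha_ij * h_ij: the sentence-level feature
s = alpha @ h
```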
3. The comment text sentiment classification model training method according to claim 1, characterized in that, in step (3), the document-level feature extracted by the sentence-level attention mechanism is d = Σ_{i=1}^{M} β_i · h_i, where β_i indicates the importance of the i-th sentence y_i in the document, M is the number of sentences in the document, and h_i is the combination of the output of the forward LSTM model and the output of the backward LSTM model in the second-layer Bi-LSTM network; the weights are given by β_i = exp(e(h_i, u, p)) / Σ_{k=1}^{M} exp(e(h_k, u, p)), where the formula e(h_i, u, p) = v^T · tanh(W_H · h_i + W_U · u + W_P · p + b) expresses the computation of the importance of the sentence y_i with respect to the comment-subject text information u and the comment-object text information p; the parameter matrices W_H, W_U, W_P, the vector v, and the bias b are continuously optimized during training to reach their optimal values.
4. The comment text sentiment classification model training method according to claim 1, characterized in that step (4) comprises:
(4.1) mapping the obtained document-level semantic feature d into the emotion-category space whose number of categories is C using the tanh non-linear mapping function, x = tanh(W_c · d + b_c), where W_c is the parameter matrix for the document feature d and b_c is the bias vector; the parameter matrix and the bias vector are optimized during training to reach their optimal values;
(4.2) deciding the text emotion category with the softmax classifier, taking the cross-entropy loss function as the optimization objective of model training, computing the loss-function gradients with the back-propagation (BP) algorithm, and continuously updating the relevant parameters of the model to obtain the optimal model.
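Steps (4.1) and (4.2) above can be sketched as follows, with random stand-in parameters in place of the trained W_c and b_c, and C = 3 classes assumed purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
H, C = 8, 3                      # document-feature size and number of emotion categories (illustrative)

d = rng.normal(size=H)           # document-level feature from the second Bi-LSTM layer
W_c = rng.normal(size=(C, H))    # parameter matrix, learned during training
b_c = rng.normal(size=C)         # bias vector, learned during training

# (4.1) non-linear map into the C-dimensional emotion-category space
x = np.tanh(W_c @ d + b_c)

# (4.2) softmax decision: p_c = exp(x_c) / sum_k exp(x_k)
probs = np.exp(x - x.max())
probs /= probs.sum()
predicted = int(np.argmax(probs))   # the decided emotion category
```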
5. The comment text sentiment classification model training method according to claim 4, characterized in that the calculation formula by which the softmax classifier decides the text emotion category is p_c = exp(x_c) / Σ_{k=1}^{C} exp(x_k), where x_c is the component of the document semantic feature mapped to the representation of emotion category c, and p_c is the predicted probability that the text emotion category is c.
6. The comment text sentiment classification model training method according to claim 5, characterized in that the calculation formula of the cross-entropy loss function is L = -Σ_{i=1}^{N} Σ_{c=1}^{C} y_i^c · log(p_i^c), where the training set is {(b_i, y_i)}, i = 1, ..., N; b_i is the i-th sample text in the training set, y_i is the benchmark emotion label of sample b_i, and N is the training-set size; y_i^c is the indicator of the benchmark label, i.e., if the text emotion category of sample b_i is c, the value of y_i^c is 1, otherwise the value is 0; p_i^c is the predicted probability that the emotion category of sample b_i is c.
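A worked instance of the cross-entropy loss in claim 6, with N = 2 samples and C = 3 categories; the labels and probabilities are invented numbers for illustration only:

```python
import numpy as np

# One-hot benchmark labels y_i^c and predicted probabilities p_i^c
# for N = 2 samples and C = 3 emotion categories (invented numbers).
y = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])
p = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.3, 0.6]])

# L = - sum_i sum_c y_i^c * log(p_i^c): only the probability assigned
# to each sample's benchmark category contributes to the loss.
loss = -np.sum(y * np.log(p))   # = -(log 0.7 + log 0.6) ≈ 0.8675
```

During training this loss is differentiated with respect to the model parameters and minimized by back-propagation, as recited in step (4.2).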
7. A method for performing text sentiment classification with the text sentiment classification model obtained by the comment text sentiment classification model training method according to claim 1, characterized by comprising the following steps:
obtaining a test-set text comprising a comment text and the subject and object information associated with the comment text;
inputting the test-set text into the optimal text sentiment classification model to predict the emotion category.
8. A comment text sentiment classification model training device, characterized by comprising:
an acquisition module for obtaining a training-set text comprising a comment text and the subject and object information associated with the comment text;
a sentence-level feature extraction module for converting the words in the training-set comment texts into word-vector representations and inputting them into the first-layer Bi-LSTM network, combining the forward and backward hidden-state output vectors, multiplying them by the weights trained by the word-level attention mechanism over the subject and object information associated with the comment text, and extracting the sentence-level feature representations;
a document-level feature extraction module for inputting the sentence-level feature representations into the second-layer Bi-LSTM network, combining the forward and backward hidden-state output vectors, multiplying them by the weights trained by the sentence-level attention mechanism over the subject and object information associated with the comment text, and extracting the document-level feature representation of the comment text;
and a training module for mapping the obtained document-level semantic feature into the emotion-category space using the tanh non-linear mapping function, classifying with softmax, and training the parameters of the model to obtain the optimal text sentiment classification model.
9. A device for performing text sentiment classification with the text sentiment classification model obtained by the comment text sentiment classification model training method according to claim 1, characterized by comprising:
an acquisition module for obtaining a test-set text comprising a comment text and the subject and object information associated with the comment text;
and a classification prediction module for inputting the test-set text into the optimal text sentiment classification model to predict the emotion category.
10. A computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that, when executing the program, the processor implements the comment text sentiment classification model training method according to any one of claims 1 to 6, or implements the text sentiment classification method according to claim 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810086816.9A CN108363753B (en) | 2018-01-30 | 2018-01-30 | Comment text emotion classification model training and emotion classification method, device and equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108363753A true CN108363753A (en) | 2018-08-03 |
CN108363753B CN108363753B (en) | 2020-05-19 |
Family
ID=63007586
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810086816.9A Active CN108363753B (en) | 2018-01-30 | 2018-01-30 | Comment text emotion classification model training and emotion classification method, device and equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108363753B (en) |
Cited By (81)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109145112A (en) * | 2018-08-06 | 2019-01-04 | 北京航空航天大学 | A kind of comment on commodity classification method based on global information attention mechanism |
CN109241283A (en) * | 2018-08-08 | 2019-01-18 | 广东工业大学 | A kind of file classification method based on multi-angle capsule network |
CN109284506A (en) * | 2018-11-29 | 2019-01-29 | 重庆邮电大学 | A kind of user comment sentiment analysis system and method based on attention convolutional neural networks |
CN109472470A (en) * | 2018-10-23 | 2019-03-15 | 重庆誉存大数据科技有限公司 | In conjunction with the corporate news data classification of risks method of deep learning and logic rules |
CN109492229A (en) * | 2018-11-23 | 2019-03-19 | 中国科学技术大学 | A kind of cross-cutting sensibility classification method and relevant apparatus |
CN109508375A (en) * | 2018-11-19 | 2019-03-22 | 重庆邮电大学 | A kind of social affective classification method based on multi-modal fusion |
CN109543030A (en) * | 2018-10-12 | 2019-03-29 | 平安科技(深圳)有限公司 | Customer service machine conference file classification method and device, equipment, storage medium |
CN109558585A (en) * | 2018-10-26 | 2019-04-02 | 深圳点猫科技有限公司 | A kind of answer Automatic-searching method and electronic equipment based on educational system |
CN109597997A (en) * | 2018-12-07 | 2019-04-09 | 上海宏原信息科技有限公司 | Based on comment entity, aspect grade sensibility classification method and device and its model training |
CN109597995A (en) * | 2018-12-04 | 2019-04-09 | 国网江西省电力有限公司信息通信分公司 | A kind of document representation method based on BM25 weighted combination term vector |
CN109670542A (en) * | 2018-12-11 | 2019-04-23 | 田刚 | A kind of false comment detection method based on comment external information |
CN109670169A (en) * | 2018-11-16 | 2019-04-23 | 中山大学 | A kind of deep learning sensibility classification method based on feature extraction |
CN109784280A (en) * | 2019-01-18 | 2019-05-21 | 江南大学 | Human bodys' response method based on Bi-LSTM-Attention model |
CN109858034A (en) * | 2019-02-25 | 2019-06-07 | 武汉大学 | A kind of text sentiment classification method based on attention model and sentiment dictionary |
CN109918501A (en) * | 2019-01-18 | 2019-06-21 | 平安科技(深圳)有限公司 | Method, apparatus, equipment and the storage medium of news article classification |
CN109992780A (en) * | 2019-03-29 | 2019-07-09 | 哈尔滨理工大学 | One kind being based on deep neural network specific objective sensibility classification method |
CN110032645A (en) * | 2019-04-17 | 2019-07-19 | 携程旅游信息技术(上海)有限公司 | Text emotion recognition methods, system, equipment and medium |
CN110046353A (en) * | 2019-04-22 | 2019-07-23 | 重庆理工大学 | Aspect level emotion analysis method based on multi-language level mechanism |
CN110083702A (en) * | 2019-04-15 | 2019-08-02 | 中国科学院深圳先进技术研究院 | A kind of aspect rank text emotion conversion method based on multi-task learning |
CN110134966A (en) * | 2019-05-21 | 2019-08-16 | 中电健康云科技有限公司 | A kind of sensitive information determines method and device |
CN110175237A (en) * | 2019-05-14 | 2019-08-27 | 华东师范大学 | It is a kind of towards multi-class secondary sensibility classification method |
CN110188202A (en) * | 2019-06-06 | 2019-08-30 | 北京百度网讯科技有限公司 | Training method, device and the terminal of semantic relation identification model |
CN110222178A (en) * | 2019-05-24 | 2019-09-10 | 新华三大数据技术有限公司 | Text sentiment classification method, device, electronic equipment and readable storage medium storing program for executing |
CN110263165A (en) * | 2019-06-14 | 2019-09-20 | 中山大学 | A kind of user comment sentiment analysis method based on semi-supervised learning |
CN110309503A (en) * | 2019-05-21 | 2019-10-08 | 昆明理工大学 | A kind of subjective item Rating Model and methods of marking based on deep learning BERT--CNN |
CN110348016A (en) * | 2019-07-15 | 2019-10-18 | 昆明理工大学 | Text snippet generation method based on sentence association attention mechanism |
CN110399547A (en) * | 2018-04-17 | 2019-11-01 | 百度在线网络技术(北京)有限公司 | For updating the method, apparatus, equipment and storage medium of model parameter |
CN110517121A (en) * | 2019-09-23 | 2019-11-29 | 重庆邮电大学 | Method of Commodity Recommendation and the device for recommending the commodity based on comment text sentiment analysis |
CN110598786A (en) * | 2019-09-09 | 2019-12-20 | 京东方科技集团股份有限公司 | Neural network training method, semantic classification method and semantic classification device |
CN110600047A (en) * | 2019-09-17 | 2019-12-20 | 南京邮电大学 | Perceptual STARGAN-based many-to-many speaker conversion method |
CN110751208A (en) * | 2018-10-29 | 2020-02-04 | 山东大学 | Criminal emotion recognition method for multi-mode feature fusion based on self-weight differential encoder |
CN110765769A (en) * | 2019-08-27 | 2020-02-07 | 电子科技大学 | Entity attribute dependency emotion analysis method based on clause characteristics |
CN110826340A (en) * | 2019-11-06 | 2020-02-21 | 广东三维家信息科技有限公司 | Evaluation text generation method and device and electronic equipment |
CN110851601A (en) * | 2019-11-08 | 2020-02-28 | 福州大学 | Cross-domain emotion classification system and method based on layered attention mechanism |
CN110879938A (en) * | 2019-11-14 | 2020-03-13 | 中国联合网络通信集团有限公司 | Text emotion classification method, device, equipment and storage medium |
CN111046233A (en) * | 2019-12-24 | 2020-04-21 | 浙江大学 | Video label determination method based on video comment text |
CN111078833A (en) * | 2019-12-03 | 2020-04-28 | 哈尔滨工程大学 | Text classification method based on neural network |
CN111125354A (en) * | 2018-10-31 | 2020-05-08 | 北京国双科技有限公司 | Text classification method and device |
CN111159400A (en) * | 2019-12-19 | 2020-05-15 | 苏州大学 | Product comment emotion classification method and system |
CN111241842A (en) * | 2018-11-27 | 2020-06-05 | 阿里巴巴集团控股有限公司 | Text analysis method, device and system |
CN111275521A (en) * | 2020-01-16 | 2020-06-12 | 华南理工大学 | Commodity recommendation method based on user comment and satisfaction level embedding |
CN111291189A (en) * | 2020-03-10 | 2020-06-16 | 北京芯盾时代科技有限公司 | Text processing method and device and computer readable storage medium |
CN111325027A (en) * | 2020-02-19 | 2020-06-23 | 东南大学 | Sparse data-oriented personalized emotion analysis method and device |
CN111339440A (en) * | 2020-02-19 | 2020-06-26 | 东南大学 | Social emotion ordering method for news text based on hierarchical state neural network |
CN111368082A (en) * | 2020-03-03 | 2020-07-03 | 南京信息工程大学 | Emotion analysis method for domain adaptive word embedding based on hierarchical network |
CN111414122A (en) * | 2019-12-26 | 2020-07-14 | 腾讯科技(深圳)有限公司 | Intelligent text processing method and device, electronic equipment and storage medium |
CN111522956A (en) * | 2020-05-08 | 2020-08-11 | 河南理工大学 | Text emotion classification method based on double channels and hierarchical attention network |
CN111639183A (en) * | 2020-05-19 | 2020-09-08 | 民生科技有限责任公司 | Financial industry consensus public opinion analysis method and system based on deep learning algorithm |
CN111858944A (en) * | 2020-07-31 | 2020-10-30 | 电子科技大学 | Entity aspect level emotion analysis method based on attention mechanism |
CN111859978A (en) * | 2020-06-11 | 2020-10-30 | 南京邮电大学 | Emotion text generation method based on deep learning |
CN111984931A (en) * | 2020-08-20 | 2020-11-24 | 上海大学 | Public opinion calculation and deduction method and system for social event web text |
CN111985214A (en) * | 2020-08-19 | 2020-11-24 | 四川长虹电器股份有限公司 | Human-computer interaction negative emotion analysis method based on bilstm and attention |
CN112347269A (en) * | 2020-11-11 | 2021-02-09 | 重庆邮电大学 | Method for recognizing argument pairs based on BERT and Att-BilSTM |
CN112434516A (en) * | 2020-12-18 | 2021-03-02 | 安徽商信政通信息技术股份有限公司 | Self-adaptive comment emotion analysis system and method fusing text information |
CN112597302A (en) * | 2020-12-18 | 2021-04-02 | 东北林业大学 | False comment detection method based on multi-dimensional comment representation |
CN112685558A (en) * | 2019-10-18 | 2021-04-20 | 普天信息技术有限公司 | Emotion classification model training method and device |
CN112699244A (en) * | 2021-03-16 | 2021-04-23 | 成都信息工程大学 | Deep learning-based method and system for classifying defect texts of power transmission and transformation equipment |
CN112905739A (en) * | 2021-02-05 | 2021-06-04 | 北京邮电大学 | False comment detection model training method, detection method and electronic equipment |
CN112950019A (en) * | 2021-03-01 | 2021-06-11 | 昆明电力交易中心有限责任公司 | Electricity selling company evaluation emotion classification method based on combined attention mechanism |
CN113051367A (en) * | 2021-03-22 | 2021-06-29 | 北京智慧星光信息技术有限公司 | Deep learning early warning method and system based on semantic feature enhancement and electronic equipment |
CN113065577A (en) * | 2021-03-09 | 2021-07-02 | 北京工业大学 | Multi-modal emotion classification method for targets |
CN113159831A (en) * | 2021-03-24 | 2021-07-23 | 湖南大学 | Comment text sentiment analysis method based on improved capsule network |
CN113255360A (en) * | 2021-04-19 | 2021-08-13 | 国家计算机网络与信息安全管理中心 | Document rating method and device based on hierarchical self-attention network |
CN113297383A (en) * | 2021-06-22 | 2021-08-24 | 苏州大学 | Knowledge distillation-based speech emotion classification method |
CN113297838A (en) * | 2021-05-21 | 2021-08-24 | 华中科技大学鄂州工业技术研究院 | Relationship extraction method based on graph neural network |
CN113361617A (en) * | 2021-06-15 | 2021-09-07 | 西南交通大学 | Aspect level emotion analysis modeling method based on multivariate attention correction |
WO2021174824A1 (en) * | 2020-03-05 | 2021-09-10 | 苏州浪潮智能科技有限公司 | Sentence-level convolution lstm training method, and device and readable medium |
CN113407721A (en) * | 2021-06-29 | 2021-09-17 | 哈尔滨工业大学(深圳) | Method, device and computer storage medium for detecting log sequence abnormity |
CN113553828A (en) * | 2021-07-21 | 2021-10-26 | 南京邮电大学 | Semantic code-based hierarchical level remote supervision relation extraction method |
CN113762343A (en) * | 2021-08-04 | 2021-12-07 | 德邦证券股份有限公司 | Method, device and storage medium for processing public opinion information and training classification model |
CN113779249A (en) * | 2021-08-31 | 2021-12-10 | 华南师范大学 | Cross-domain text emotion classification method and device, storage medium and electronic equipment |
CN113806545A (en) * | 2021-09-24 | 2021-12-17 | 重庆理工大学 | Comment text emotion classification method based on label description generation |
CN113849651A (en) * | 2021-09-28 | 2021-12-28 | 平安科技(深圳)有限公司 | Document-level emotional tendency-based emotion classification method, device, equipment and medium |
CN113869065A (en) * | 2021-10-15 | 2021-12-31 | 梧州学院 | Emotion classification method and system based on 'word-phrase' attention mechanism |
US20220223144A1 (en) * | 2019-05-14 | 2022-07-14 | Dolby Laboratories Licensing Corporation | Method and apparatus for speech source separation based on a convolutional neural network |
CN114791951A (en) * | 2022-05-13 | 2022-07-26 | 青岛文达通科技股份有限公司 | Emotion classification method and system based on capsule network |
CN115544260A (en) * | 2022-12-05 | 2022-12-30 | 湖南工商大学 | Comparison optimization coding and decoding model and method for text emotion analysis |
WO2023069017A3 (en) * | 2021-10-19 | 2023-06-01 | Grabtaxi Holdings Pte. Ltd. | System and method for recognizing sentiment of user's feedback |
CN116484005A (en) * | 2023-06-25 | 2023-07-25 | 北京中关村科金技术有限公司 | Classification model construction method, device and storage medium |
WO2023204759A1 (en) * | 2022-04-22 | 2023-10-26 | Lemon Inc. | Attribute and rating co-extraction |
CN113869065B (en) * | 2021-10-15 | 2024-04-12 | 梧州学院 | Emotion classification method and system based on 'word-phrase' attention mechanism |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103678607A (en) * | 2013-12-16 | 2014-03-26 | 合肥工业大学 | Building method of emotion marking system |
CN106570179A (en) * | 2016-11-10 | 2017-04-19 | 中国科学院信息工程研究所 | Evaluative text-oriented kernel entity identification method and apparatus |
CN107133211A (en) * | 2017-04-26 | 2017-09-05 | 中国人民大学 | A kind of composition methods of marking based on notice mechanism |
CN107316654A (en) * | 2017-07-24 | 2017-11-03 | 湖南大学 | Emotion identification method based on DIS NV features |
CN107451118A (en) * | 2017-07-21 | 2017-12-08 | 西安电子科技大学 | Sentence-level sensibility classification method based on Weakly supervised deep learning |
CN107544957A (en) * | 2017-07-05 | 2018-01-05 | 华北电力大学 | A kind of Sentiment orientation analysis method of business product target word |
CN107590134A (en) * | 2017-10-26 | 2018-01-16 | 福建亿榕信息技术有限公司 | Text sentiment classification method, storage medium and computer |
- 2018-01-30: application CN201810086816.9A filed; granted as patent CN108363753B (en), status Active
Cited By (116)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110399547A (en) * | 2018-04-17 | 2019-11-01 | 百度在线网络技术(北京)有限公司 | For updating the method, apparatus, equipment and storage medium of model parameter |
CN109145112B (en) * | 2018-08-06 | 2021-08-06 | 北京航空航天大学 | Commodity comment classification method based on global information attention mechanism |
CN109145112A (en) * | 2018-08-06 | 2019-01-04 | 北京航空航天大学 | A kind of comment on commodity classification method based on global information attention mechanism |
CN109241283A (en) * | 2018-08-08 | 2019-01-18 | 广东工业大学 | A kind of file classification method based on multi-angle capsule network |
CN109543030A (en) * | 2018-10-12 | 2019-03-29 | 平安科技(深圳)有限公司 | Customer service machine conference file classification method and device, equipment, storage medium |
CN109543030B (en) * | 2018-10-12 | 2023-04-07 | 平安科技(深圳)有限公司 | Method, device, equipment and storage medium for classifying session texts of customer service robot |
CN109472470A (en) * | 2018-10-23 | 2019-03-15 | 重庆誉存大数据科技有限公司 | In conjunction with the corporate news data classification of risks method of deep learning and logic rules |
CN109558585A (en) * | 2018-10-26 | 2019-04-02 | 深圳点猫科技有限公司 | A kind of answer Automatic-searching method and electronic equipment based on educational system |
CN110751208B (en) * | 2018-10-29 | 2020-06-30 | 山东大学 | Criminal emotion recognition method for multi-mode feature fusion based on self-weight differential encoder |
CN110751208A (en) * | 2018-10-29 | 2020-02-04 | 山东大学 | Criminal emotion recognition method for multi-mode feature fusion based on self-weight differential encoder |
CN111125354A (en) * | 2018-10-31 | 2020-05-08 | 北京国双科技有限公司 | Text classification method and device |
CN109670169A (en) * | 2018-11-16 | 2019-04-23 | 中山大学 | A kind of deep learning sensibility classification method based on feature extraction |
CN109508375A (en) * | 2018-11-19 | 2019-03-22 | 重庆邮电大学 | A kind of social affective classification method based on multi-modal fusion |
CN109492229A (en) * | 2018-11-23 | 2019-03-19 | 中国科学技术大学 | A kind of cross-cutting sensibility classification method and relevant apparatus |
CN109492229B (en) * | 2018-11-23 | 2020-10-27 | 中国科学技术大学 | Cross-domain emotion classification method and related device |
CN111241842A (en) * | 2018-11-27 | 2020-06-05 | 阿里巴巴集团控股有限公司 | Text analysis method, device and system |
CN111241842B (en) * | 2018-11-27 | 2023-05-30 | 阿里巴巴集团控股有限公司 | Text analysis method, device and system |
CN109284506B (en) * | 2018-11-29 | 2023-09-29 | 重庆邮电大学 | User comment emotion analysis system and method based on attention convolution neural network |
CN109284506A (en) * | 2018-11-29 | 2019-01-29 | 重庆邮电大学 | A kind of user comment sentiment analysis system and method based on attention convolutional neural networks |
CN109597995A (en) * | 2018-12-04 | 2019-04-09 | 国网江西省电力有限公司信息通信分公司 | A kind of document representation method based on BM25 weighted combination term vector |
CN109597997A (en) * | 2018-12-07 | 2019-04-09 | 上海宏原信息科技有限公司 | Based on comment entity, aspect grade sensibility classification method and device and its model training |
CN109597997B (en) * | 2018-12-07 | 2023-05-02 | 上海宏原信息科技有限公司 | Comment entity and aspect-level emotion classification method and device and model training thereof |
CN109670542A (en) * | 2018-12-11 | 2019-04-23 | 田刚 | A kind of false comment detection method based on comment external information |
CN109918501A (en) * | 2019-01-18 | 2019-06-21 | 平安科技(深圳)有限公司 | Method, apparatus, equipment and the storage medium of news article classification |
CN109784280A (en) * | 2019-01-18 | 2019-05-21 | 江南大学 | Human bodys' response method based on Bi-LSTM-Attention model |
CN109858034B (en) * | 2019-02-25 | 2023-02-03 | 武汉大学 | Text emotion classification method based on attention model and emotion dictionary |
CN109858034A (en) * | 2019-02-25 | 2019-06-07 | 武汉大学 | A kind of text sentiment classification method based on attention model and sentiment dictionary |
CN109992780A (en) * | 2019-03-29 | 2019-07-09 | 哈尔滨理工大学 | One kind being based on deep neural network specific objective sensibility classification method |
CN109992780B (en) * | 2019-03-29 | 2022-07-01 | 哈尔滨理工大学 | Specific target emotion classification method based on deep neural network |
CN110083702A (en) * | 2019-04-15 | 2019-08-02 | 中国科学院深圳先进技术研究院 | A kind of aspect rank text emotion conversion method based on multi-task learning |
CN110083702B (en) * | 2019-04-15 | 2021-04-09 | 中国科学院深圳先进技术研究院 | Aspect level text emotion conversion method based on multi-task learning |
CN110032645A (en) * | 2019-04-17 | 2019-07-19 | 携程旅游信息技术(上海)有限公司 | Text emotion recognition methods, system, equipment and medium |
CN110032645B (en) * | 2019-04-17 | 2021-02-09 | 携程旅游信息技术(上海)有限公司 | Text emotion recognition method, system, device and medium |
CN110046353B (en) * | 2019-04-22 | 2022-05-13 | 重庆理工大学 | Aspect level emotion analysis method based on multi-language level mechanism |
CN110046353A (en) * | 2019-04-22 | 2019-07-23 | 重庆理工大学 | Aspect level emotion analysis method based on multi-language level mechanism |
CN110175237B (en) * | 2019-05-14 | 2023-02-03 | 华东师范大学 | Multi-category-oriented secondary emotion classification method |
CN110175237A (en) * | 2019-05-14 | 2019-08-27 | 华东师范大学 | It is a kind of towards multi-class secondary sensibility classification method |
US20220223144A1 (en) * | 2019-05-14 | 2022-07-14 | Dolby Laboratories Licensing Corporation | Method and apparatus for speech source separation based on a convolutional neural network |
CN110134966A (en) * | 2019-05-21 | 2019-08-16 | 中电健康云科技有限公司 | A kind of sensitive information determines method and device |
CN110309503A (en) * | 2019-05-21 | 2019-10-08 | 昆明理工大学 | A kind of subjective item Rating Model and methods of marking based on deep learning BERT--CNN |
CN110222178A (en) * | 2019-05-24 | 2019-09-10 | 新华三大数据技术有限公司 | Text sentiment classification method, device, electronic equipment and readable storage medium storing program for executing |
CN110188202A (en) * | 2019-06-06 | 2019-08-30 | 北京百度网讯科技有限公司 | Training method, device and the terminal of semantic relation identification model |
CN110263165A (en) * | 2019-06-14 | 2019-09-20 | 中山大学 | A kind of user comment sentiment analysis method based on semi-supervised learning |
CN110348016B (en) * | 2019-07-15 | 2022-06-14 | 昆明理工大学 | Text abstract generation method based on sentence correlation attention mechanism |
CN110348016A (en) * | 2019-07-15 | 2019-10-18 | 昆明理工大学 | Text snippet generation method based on sentence association attention mechanism |
CN110765769A (en) * | 2019-08-27 | 2020-02-07 | 电子科技大学 | Entity attribute dependency emotion analysis method based on clause characteristics |
US11934790B2 (en) * | 2019-09-09 | 2024-03-19 | Boe Technology Group Co., Ltd. | Neural network training method and apparatus, semantic classification method and apparatus and medium |
US20220075955A1 (en) * | 2019-09-09 | 2022-03-10 | Boe Technology Group Co., Ltd. | Neural network training method and apparatus, semantic classification method and apparatus and medium |
CN110598786B (en) * | 2019-09-09 | 2022-01-07 | 京东方科技集团股份有限公司 | Neural network training method, semantic classification method and semantic classification device |
CN110598786A (en) * | 2019-09-09 | 2019-12-20 | 京东方科技集团股份有限公司 | Neural network training method, semantic classification method and semantic classification device |
CN110600047A (en) * | 2019-09-17 | 2019-12-20 | 南京邮电大学 | Perceptual STARGAN-based many-to-many speaker conversion method |
CN110517121A (en) * | 2019-09-23 | 2019-11-29 | 重庆邮电大学 | Method of Commodity Recommendation and the device for recommending the commodity based on comment text sentiment analysis |
CN112685558A (en) * | 2019-10-18 | 2021-04-20 | 普天信息技术有限公司 | Emotion classification model training method and device |
CN110826340A (en) * | 2019-11-06 | 2020-02-21 | 广东三维家信息科技有限公司 | Evaluation text generation method and device and electronic equipment |
CN110851601A (en) * | 2019-11-08 | 2020-02-28 | 福州大学 | Cross-domain emotion classification system and method based on layered attention mechanism |
CN110879938A (en) * | 2019-11-14 | 2020-03-13 | 中国联合网络通信集团有限公司 | Text emotion classification method, device, equipment and storage medium |
CN111078833B (en) * | 2019-12-03 | 2022-05-20 | 哈尔滨工程大学 | Text classification method based on neural network |
CN111078833A (en) * | 2019-12-03 | 2020-04-28 | 哈尔滨工程大学 | Text classification method based on neural network |
CN111159400B (en) * | 2019-12-19 | 2023-09-26 | 苏州大学 | Product comment emotion classification method and system |
CN111159400A (en) * | 2019-12-19 | 2020-05-15 | 苏州大学 | Product comment emotion classification method and system |
CN111046233A (en) * | 2019-12-24 | 2020-04-21 | 浙江大学 | Video label determination method based on video comment text |
CN111046233B (en) * | 2019-12-24 | 2022-05-13 | 浙江大学 | Video label determination method based on video comment text |
CN111414122A (en) * | 2019-12-26 | 2020-07-14 | 腾讯科技(深圳)有限公司 | Intelligent text processing method and device, electronic equipment and storage medium |
CN111275521B (en) * | 2020-01-16 | 2022-06-14 | 华南理工大学 | Commodity recommendation method based on user comment and satisfaction level embedding |
CN111275521A (en) * | 2020-01-16 | 2020-06-12 | 华南理工大学 | Commodity recommendation method based on user comment and satisfaction level embedding |
CN111339440B (en) * | 2020-02-19 | 2024-01-23 | 东南大学 | Social emotion sequencing method based on hierarchical state neural network for news text |
CN111339440A (en) * | 2020-02-19 | 2020-06-26 | 东南大学 | Social emotion ordering method for news text based on hierarchical state neural network |
CN111325027A (en) * | 2020-02-19 | 2020-06-23 | 东南大学 | Sparse data-oriented personalized emotion analysis method and device |
CN111368082A (en) * | 2020-03-03 | 2020-07-03 | 南京信息工程大学 | Emotion analysis method for domain adaptive word embedding based on hierarchical network |
WO2021174824A1 (en) * | 2020-03-05 | 2021-09-10 | 苏州浪潮智能科技有限公司 | Sentence-level convolution lstm training method, and device and readable medium |
CN111291189A (en) * | 2020-03-10 | 2020-06-16 | 北京芯盾时代科技有限公司 | Text processing method and device and computer readable storage medium |
CN111522956A (en) * | 2020-05-08 | 2020-08-11 | 河南理工大学 | Text emotion classification method based on double channels and hierarchical attention network |
CN111639183A (en) * | 2020-05-19 | 2020-09-08 | 民生科技有限责任公司 | Financial industry public opinion analysis method and system based on deep learning algorithms |
CN111639183B (en) * | 2020-05-19 | 2023-11-28 | 民生科技有限责任公司 | Financial industry public opinion analysis method and system based on deep learning algorithms |
CN111859978B (en) * | 2020-06-11 | 2023-06-20 | 南京邮电大学 | Deep learning-based emotion text generation method |
CN111859978A (en) * | 2020-06-11 | 2020-10-30 | 南京邮电大学 | Emotion text generation method based on deep learning |
CN111858944B (en) * | 2020-07-31 | 2022-11-22 | 电子科技大学 | Entity aspect level emotion analysis method based on attention mechanism |
CN111858944A (en) * | 2020-07-31 | 2020-10-30 | 电子科技大学 | Entity aspect level emotion analysis method based on attention mechanism |
CN111985214A (en) * | 2020-08-19 | 2020-11-24 | 四川长虹电器股份有限公司 | Human-computer interaction negative emotion analysis method based on BiLSTM and attention |
CN111984931A (en) * | 2020-08-20 | 2020-11-24 | 上海大学 | Public opinion calculation and deduction method and system for social event web text |
CN111984931B (en) * | 2020-08-20 | 2022-06-03 | 上海大学 | Public opinion calculation and deduction method and system for social event web text |
CN112347269A (en) * | 2020-11-11 | 2021-02-09 | 重庆邮电大学 | Method for recognizing argument pairs based on BERT and Att-BiLSTM |
CN112434516A (en) * | 2020-12-18 | 2021-03-02 | 安徽商信政通信息技术股份有限公司 | Self-adaptive comment emotion analysis system and method fusing text information |
CN112597302A (en) * | 2020-12-18 | 2021-04-02 | 东北林业大学 | False comment detection method based on multi-dimensional comment representation |
CN112597302B (en) * | 2020-12-18 | 2022-04-29 | 东北林业大学 | False comment detection method based on multi-dimensional comment representation |
CN112905739A (en) * | 2021-02-05 | 2021-06-04 | 北京邮电大学 | False comment detection model training method, detection method and electronic equipment |
CN112950019A (en) * | 2021-03-01 | 2021-06-11 | 昆明电力交易中心有限责任公司 | Electricity selling company evaluation emotion classification method based on combined attention mechanism |
CN112950019B (en) * | 2021-03-01 | 2024-03-29 | 昆明电力交易中心有限责任公司 | Electricity selling company evaluation emotion classification method based on joint attention mechanism |
CN113065577A (en) * | 2021-03-09 | 2021-07-02 | 北京工业大学 | Multi-modal emotion classification method for targets |
CN112699244A (en) * | 2021-03-16 | 2021-04-23 | 成都信息工程大学 | Deep learning-based method and system for classifying defect texts of power transmission and transformation equipment |
CN113051367B (en) * | 2021-03-22 | 2023-11-21 | 北京智慧星光信息技术有限公司 | Deep learning early warning method and system based on semantic feature reinforcement and electronic equipment |
CN113051367A (en) * | 2021-03-22 | 2021-06-29 | 北京智慧星光信息技术有限公司 | Deep learning early warning method and system based on semantic feature enhancement and electronic equipment |
CN113159831A (en) * | 2021-03-24 | 2021-07-23 | 湖南大学 | Comment text sentiment analysis method based on improved capsule network |
CN113255360A (en) * | 2021-04-19 | 2021-08-13 | 国家计算机网络与信息安全管理中心 | Document rating method and device based on hierarchical self-attention network |
CN113297838A (en) * | 2021-05-21 | 2021-08-24 | 华中科技大学鄂州工业技术研究院 | Relationship extraction method based on graph neural network |
CN113361617A (en) * | 2021-06-15 | 2021-09-07 | 西南交通大学 | Aspect level emotion analysis modeling method based on multivariate attention correction |
CN113297383B (en) * | 2021-06-22 | 2023-08-04 | 苏州大学 | Speech emotion classification method based on knowledge distillation |
CN113297383A (en) * | 2021-06-22 | 2021-08-24 | 苏州大学 | Knowledge distillation-based speech emotion classification method |
CN113407721A (en) * | 2021-06-29 | 2021-09-17 | 哈尔滨工业大学(深圳) | Method, device and computer storage medium for detecting log sequence abnormity |
CN113553828B (en) * | 2021-07-21 | 2023-06-16 | 南京邮电大学 | Hierarchical remote supervision relation extraction method based on semantic coding |
CN113553828A (en) * | 2021-07-21 | 2021-10-26 | 南京邮电大学 | Hierarchical remote supervision relation extraction method based on semantic coding |
CN113762343A (en) * | 2021-08-04 | 2021-12-07 | 德邦证券股份有限公司 | Method, device and storage medium for processing public opinion information and training classification model |
CN113762343B (en) * | 2021-08-04 | 2024-03-15 | 德邦证券股份有限公司 | Method, device and storage medium for processing public opinion information and training classification model |
CN113779249A (en) * | 2021-08-31 | 2021-12-10 | 华南师范大学 | Cross-domain text emotion classification method and device, storage medium and electronic equipment |
CN113806545A (en) * | 2021-09-24 | 2021-12-17 | 重庆理工大学 | Comment text emotion classification method based on label description generation |
CN113806545B (en) * | 2021-09-24 | 2022-06-17 | 重庆理工大学 | Comment text emotion classification method based on label description generation |
CN113849651A (en) * | 2021-09-28 | 2021-12-28 | 平安科技(深圳)有限公司 | Document-level emotional tendency-based emotion classification method, device, equipment and medium |
CN113849651B (en) * | 2021-09-28 | 2024-04-09 | 平安科技(深圳)有限公司 | Emotion classification method, device, equipment and medium based on document-level emotion tendencies |
CN113869065A (en) * | 2021-10-15 | 2021-12-31 | 梧州学院 | Emotion classification method and system based on 'word-phrase' attention mechanism |
CN113869065B (en) * | 2021-10-15 | 2024-04-12 | 梧州学院 | Emotion classification method and system based on 'word-phrase' attention mechanism |
WO2023069017A3 (en) * | 2021-10-19 | 2023-06-01 | Grabtaxi Holdings Pte. Ltd. | System and method for recognizing sentiment of user's feedback |
WO2023204759A1 (en) * | 2022-04-22 | 2023-10-26 | Lemon Inc. | Attribute and rating co-extraction |
CN114791951A (en) * | 2022-05-13 | 2022-07-26 | 青岛文达通科技股份有限公司 | Emotion classification method and system based on capsule network |
CN115544260A (en) * | 2022-12-05 | 2022-12-30 | 湖南工商大学 | Contrastive-optimization encoder-decoder model and method for text sentiment analysis |
CN116484005B (en) * | 2023-06-25 | 2023-09-08 | 北京中关村科金技术有限公司 | Classification model construction method, device and storage medium |
CN116484005A (en) * | 2023-06-25 | 2023-07-25 | 北京中关村科金技术有限公司 | Classification model construction method, device and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN108363753B (en) | 2020-05-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108363753A (en) | Comment text sentiment classification model training and sentiment classification method, device and equipment | |
Abdullah et al. | SEDAT: sentiment and emotion detection in Arabic text using CNN-LSTM deep learning | |
CN108492200B (en) | User attribute inference method and device based on convolutional neural network | |
CN109933664B (en) | Fine-grained emotion analysis improvement method based on emotion word embedding | |
Lai et al. | Fine-grained emotion classification of Chinese microblogs based on graph convolution networks | |
CN109145112A (en) | Commodity comment classification method based on global information attention mechanism | |
CN109284506A (en) | User comment sentiment analysis system and method based on attention convolutional neural networks | |
CN109614487B (en) | Sentiment classification method based on tensor fusion mode | |
Sivakumar et al. | Review on word2vec word embedding neural net | |
CN107247702A (en) | Text sentiment analysis and processing method and system | |
CN110096575B (en) | Psychological portrait method facing microblog user | |
CN111274398A (en) | Method and system for analyzing comment emotion of aspect-level user product | |
CN103034626A (en) | Emotion analyzing system and method | |
CN108052505A (en) | Text emotion analysis method and device, storage medium, terminal | |
CN110765769B (en) | Clause feature-based entity attribute dependency emotion analysis method | |
CN105809186A (en) | Emotion classification method and system | |
CN107688870A (en) | Classification factor visual analysis method and device for deep neural networks based on text stream input | |
CN112256866A (en) | Text fine-grained emotion analysis method based on deep learning | |
CN110297986A (en) | Sentiment orientation analysis method for hot microblog topics | |
CN110909529A (en) | User emotion analysis and prejudgment system of company image promotion system | |
CN114417851A (en) | Emotion analysis method based on keyword weighted information | |
CN113934835B (en) | Retrieval type reply dialogue method and system combining keywords and semantic understanding representation | |
Palani et al. | T-BERT--Model for Sentiment Analysis of Micro-blogs Integrating Topic Model and BERT | |
Leng et al. | Deepreviewer: Collaborative grammar and innovation neural network for automatic paper review | |
CN111859955A (en) | Public opinion data analysis model based on deep learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||