CN113849651B - Emotion classification method, device, equipment and medium based on document-level emotion tendencies

Emotion classification method, device, equipment and medium based on document-level emotion tendencies

Info

Publication number
CN113849651B
CN113849651B (application number CN202111158076.3A)
Authority
CN
China
Prior art keywords
sentence
target
vector
emotion classification
coding
Prior art date
Legal status
Active
Application number
CN202111158076.3A
Other languages
Chinese (zh)
Other versions
CN113849651A (en)
Inventor
于凤英
王健宗
Current Assignee
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN202111158076.3A
Publication of CN113849651A
Application granted
Publication of CN113849651B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 - Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/35 - Clustering; Classification
    • G06F16/353 - Clustering; Classification into predefined classes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/25 - Fusion techniques
    • G06F18/253 - Fusion techniques of extracted features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 - Handling natural language data
    • G06F40/20 - Natural language analysis
    • G06F40/205 - Parsing
    • G06F40/211 - Syntactic parsing, e.g. based on context-free grammar [CFG] or unification grammars

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Machine Translation (AREA)

Abstract

The application relates to the technical field of artificial intelligence and discloses an emotion classification method, device, equipment and medium based on document-level emotion tendencies. The method comprises the following steps: determining a sentence set for the target evaluation text; extracting a target entity attribute pair corresponding to the evaluation object from the sentence set; determining a target question according to the target question template and the target entity attribute pair; splicing each sentence in the sentence set with the target question to obtain a target sentence pair set; inputting the target entity attribute pair and the target sentence pair set into an emotion classification model for emotion classification probability prediction to obtain an emotion classification probability vector, wherein the emotion classification model is a model obtained by joint modeling based on an attention mechanism, internal aspect consistency and aspect tendency; and obtaining the target emotion classification according to the emotion classification probability vector. Because emotion classification probability prediction is performed with a model obtained by joint modeling of the attention mechanism, internal aspect consistency and aspect tendency, the accuracy of emotion analysis is improved.

Description

Emotion classification method, device, equipment and medium based on document-level emotion tendencies
Technical Field
The application relates to the technical field of artificial intelligence, in particular to an emotion classification method, device, equipment and medium based on document-level emotion tendencies.
Background
Conventional emotion analysis of evaluation content treats aspect-level emotion classification as an independent, aspect-by-aspect sentence-level classification problem. Document-level emotion preference information is therefore largely ignored, and aspect-level emotion classification information is lost. In practice, sentences in evaluation content do not appear independently; sentences with more concentrated meanings and more consistent emotions appear together. Moreover, the sentence composition of evaluation content is often relatively casual, and sometimes the sentences themselves cannot provide enough information, so the emotion of a sentence can only be understood by referring to the content and even the emotion tendencies of other sentences. Treating aspect-level emotion classification as an independent, aspect-by-aspect sentence classification problem therefore reduces the accuracy of emotion analysis.
Disclosure of Invention
The main purpose of the application is to provide an emotion classification method, device, equipment and medium based on document-level emotion tendencies, aiming to solve the technical problem that, in prior-art emotion analysis of evaluation content, aspect-level emotion classification is treated as an independent, aspect-by-aspect sentence-level classification problem, so that document-level emotion preference information is ignored, aspect-level emotion classification information is lost, and the accuracy of emotion analysis is reduced.
In order to achieve the above object, the present application proposes an emotion classification method based on document-level emotion tendencies, the method comprising:
acquiring a target evaluation text and an aspect emotion extraction rule, wherein the aspect emotion extraction rule comprises an evaluation object and an evaluation direction;
sentence extraction is carried out on the target evaluation text, and a sentence set is obtained;
extracting a target entity attribute pair corresponding to the evaluation object from the sentence set;
acquiring a target question template corresponding to the evaluation direction, and constructing a question according to the target question template and the target entity attribute pair to obtain a target question;
each sentence in the sentence set is spliced with the target question sentence respectively to obtain a target sentence pair set;
inputting the target entity attribute pair and the target sentence pair set into an emotion classification model to perform emotion classification probability prediction to obtain an emotion classification probability vector, wherein the emotion classification model is a model obtained by joint modeling based on an attention mechanism, internal aspect consistency and aspect tendency;
and carrying out emotion classification determination according to the emotion classification probability vector to obtain target emotion classification.
The application also provides an emotion classification device based on document-level emotion tendencies, which comprises:
the data acquisition module is used for acquiring a target evaluation text and an aspect emotion extraction rule, wherein the aspect emotion extraction rule comprises an evaluation object and an evaluation direction;
the sentence set determining module is used for extracting sentences from the target evaluation text to obtain a sentence set;
the target entity attribute pair determining module is used for extracting target entity attribute pairs corresponding to the evaluation objects from the sentence sets;
the target question determination module is used for acquiring a target question template corresponding to the evaluation direction, and constructing a question according to the target question template and the target entity attribute pair to obtain a target question;
the target sentence pair set determining module is used for respectively splicing each sentence in the sentence set with the target question sentences to obtain a target sentence pair set;
the emotion classification probability vector determining module is used for inputting the target entity attribute pair and the target sentence pair set into an emotion classification model to perform emotion classification probability prediction to obtain an emotion classification probability vector, wherein the emotion classification model is a model obtained by joint modeling based on an attention mechanism, internal aspect consistency and aspect tendency;
And the target emotion classification determining module is used for carrying out emotion classification determination according to the emotion classification probability vector to obtain target emotion classification.
The present application also proposes a computer device comprising a memory storing a computer program and a processor implementing the steps of any of the methods described above when the processor executes the computer program.
The present application also proposes a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the steps of the method of any of the above.
According to the emotion classification method, device, equipment and medium based on document-level emotion tendencies, sentence extraction is first performed on the target evaluation text to obtain a sentence set, a target entity attribute pair corresponding to the evaluation object is extracted from the sentence set, a target question template corresponding to the evaluation direction is obtained, and question construction is performed according to the target question template and the target entity attribute pair to obtain a target question; each sentence in the sentence set is then spliced with the target question to obtain a target sentence pair set; the target entity attribute pair and the target sentence pair set are then input into an emotion classification model for emotion classification probability prediction to obtain an emotion classification probability vector, wherein the emotion classification model is a model obtained by joint modeling based on an attention mechanism, internal aspect consistency and aspect tendency; finally, emotion classification is determined according to the emotion classification probability vector to obtain the target emotion classification. Because emotion classification probability prediction is performed with a model obtained by joint modeling based on the attention mechanism, internal aspect consistency and aspect tendency, the attention mechanism is used to learn aspect-related sentence representations to improve prediction accuracy, internal aspect consistency allows a correct judgment to be given through other sentences, and aspect tendency allows the emotion towards an object to be given through the overall emotion judgment of a text even when that emotion is implicit and difficult to judge, so the accuracy of emotion classification probability prediction by the emotion classification model is improved.
Drawings
FIG. 1 is a schematic flow chart of an emotion classification method based on document-level emotion tendencies according to an embodiment of the present application;
FIG. 2 is a schematic block diagram of an emotion classification device based on document-level emotion tendencies according to an embodiment of the present application;
fig. 3 is a block diagram schematically illustrating a structure of a computer device according to an embodiment of the present application.
The realization, functional characteristics and advantages of the present application will be further described with reference to the embodiments, referring to the attached drawings.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
Referring to fig. 1, an embodiment of the present application provides an emotion classification method based on document-level emotion tendencies, where the method includes:
s1: acquiring a target evaluation text and an aspect emotion extraction rule, wherein the aspect emotion extraction rule comprises an evaluation object and an evaluation direction;
s2: sentence extraction is carried out on the target evaluation text, and a sentence set is obtained;
s3: extracting a target entity attribute pair corresponding to the evaluation object from the sentence set;
S4: acquiring a target question template corresponding to the evaluation direction, and constructing a question according to the target question template and the target entity attribute pair to obtain a target question;
s5: each sentence in the sentence set is spliced with a target question sentence respectively to obtain a target sentence pair set;
s6: inputting the target entity attribute pair and the target sentence pair set into an emotion classification model for emotion classification probability prediction to obtain an emotion classification probability vector, wherein the emotion classification model is a model obtained by joint modeling based on an attention mechanism, internal aspect consistency and aspect tendency;
s7: and carrying out emotion classification determination according to the emotion classification probability vector to obtain target emotion classification.
According to this embodiment, sentence extraction is performed on the target evaluation text to obtain a sentence set, a target entity attribute pair corresponding to the evaluation object is extracted from the sentence set, a target question template corresponding to the evaluation direction is obtained, and question construction is performed according to the target question template and the target entity attribute pair to obtain a target question; each sentence in the sentence set is spliced with the target question to obtain a target sentence pair set; the target entity attribute pair and the target sentence pair set are then input into an emotion classification model for emotion classification probability prediction to obtain an emotion classification probability vector, wherein the emotion classification model is a model obtained by joint modeling based on an attention mechanism, internal aspect consistency and aspect tendency; finally, emotion classification is determined according to the emotion classification probability vector to obtain the target emotion classification. Emotion classification probability prediction is thus performed with a model obtained by joint modeling based on the attention mechanism, internal aspect consistency and aspect tendency: the attention mechanism is used to learn aspect-related sentence representations to improve prediction accuracy, internal aspect consistency allows a correct judgment to be given through other sentences, and aspect tendency allows the emotion towards an object to be given through the overall emotion judgment of the text even when that emotion is implicit and difficult to judge, thereby improving the accuracy of emotion analysis.
For S1, the target evaluation text input by the user may be obtained, the target evaluation text may be obtained from a database, or the target evaluation text may be obtained from a third party application system.
The target evaluation text is the evaluation text on which emotion classification needs to be performed. Evaluation text includes, but is not limited to: medical service evaluation content, commodity purchase evaluation content, and insurance purchase evaluation content.
It is understood that the evaluation text is a text of one evaluation object by one evaluator.
The evaluation object includes: an entity.
The aspect emotion extraction rule specifies the aspect emotion to be extracted when emotion classification is performed on the target evaluation text. The aspect emotion extraction rule includes: the evaluation object and the evaluation direction. For example, when the target evaluation text is medical service evaluation content, the aspect emotion extraction rule may specify extracting, as the aspect emotion, the emotion about the fields of disease that a doctor (the evaluation object) is good at treating (the evaluation direction); the example is not specifically limited here.
And S2, extracting sentences from the target evaluation text, and taking all the extracted sentences as a sentence set.
The method for extracting sentences from the target evaluation text is not described herein.
For S3, an entity is extracted from the sentence set as the target entity according to the evaluation object in the aspect emotion extraction rule, and then an attribute corresponding to the target entity is extracted from the sentence set as the target attribute; the target entity and the target attribute are spliced into an entity attribute pair using a preset splicing format, and the spliced data is taken as the target entity attribute pair.
That is, aspects of the present application include entities and attributes.
It can be understood that the splicing format of the target entity attribute pair is:
[CLS] e_entity [SEP] e_attribute [SEP]
where e_entity is the target entity, e_attribute is the target attribute, [CLS] is the flag token, and [SEP] is the separator.
For example, if the entity is "A insurance" and the attribute is "company", the target entity attribute pair is: [CLS] A insurance [SEP] company [SEP]; the example is not specifically limited here.
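As a concrete illustration of the splicing format above, the following Python sketch assembles a target entity attribute pair string; the helper name is illustrative and not part of the patent.

```python
# Minimal sketch (helper name is an assumption): build the
# [CLS] entity [SEP] attribute [SEP] splicing format described above.
def build_entity_attribute_pair(entity: str, attribute: str) -> str:
    # [CLS] is the flag token and [SEP] is the separator token.
    return f"[CLS]{entity}[SEP]{attribute}[SEP]"

# Example from the text: entity "A insurance", attribute "company".
print(build_entity_attribute_pair("A insurance", "company"))
# -> [CLS]A insurance[SEP]company[SEP]
```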
It can be understood that the sentence set includes a plurality of entities and attributes; when the target entity attribute pair corresponding to the evaluation object is extracted from the sentence set, only the entities and attributes corresponding to the evaluation object are spliced. A sentence in the sentence set may or may not contain entities and attributes.
Specifically, to extract the target entity attribute pair corresponding to the evaluation object from the sentence set, a classification search is performed in a knowledge graph according to the evaluation object, the entities corresponding to the found classification are taken as a reference entity set, entity recognition is performed on the sentence set to obtain a candidate entity set, the entities in the candidate entity set that correspond to the reference entity set are obtained, and each obtained entity is taken as a target entity.
For S4, a target question template corresponding to the evaluation direction in the aspect emotion extraction rule may be obtained from the database, and a target question template corresponding to the evaluation direction may be obtained from the third party application system.
The evaluation directions in the aspect emotion extraction rules are searched in a database, and a question template corresponding to the searched evaluation directions is used as a target question template.
For example, if the evaluation direction is service attitude, the target question template may be: "How do you feel about the service attitude of [entity placeholder] [attribute placeholder]?"; the example is not specifically limited here.
The entity placeholder in the target question template is replaced according to the entity in the target entity attribute pair, the attribute placeholder in the target question template is replaced according to the attribute in the target entity attribute pair, and the filled-in template is taken as the target question.
For example, if the target entity in the target entity attribute pair is Zhang San, the target attribute in the target entity attribute pair is doctor, and the target question template is "How do you feel about the service attitude of [entity placeholder] [attribute placeholder]?", then constructing the question according to the target question template and the target entity attribute pair gives the target question "How do you feel about the service attitude of doctor Zhang San?"; the example is not specifically limited here.
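The following sketch shows one way the placeholder replacement could be implemented; the placeholder tokens and function name are assumptions, since the patent only describes the substitution informally.

```python
# Illustrative sketch: fill the entity and attribute placeholders of a question
# template with the target entity attribute pair to obtain the target question.
def build_target_question(template: str, entity: str, attribute: str) -> str:
    return template.replace("[entity]", entity).replace("[attribute]", attribute)

template = "How do you feel about the service attitude of [attribute] [entity]?"
print(build_target_question(template, "Zhang San", "doctor"))
# -> How do you feel about the service attitude of doctor Zhang San?
```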
For S5, a preset sentence splicing rule is adopted, the target question is spliced with each sentence in the sentence set respectively, and all sentence pairs obtained by splicing are taken as the target sentence pair set.
It can be appreciated that the splicing format of the target sentence pairs in the target sentence pair set is:
[CLS] s_i [SEP] question(a_k) [SEP]
where s_i is the i-th sentence in the sentence set, question(a_k) is the target question, [CLS] is the flag token, and [SEP] is the separator.
That is, sentences in the sentence set correspond one-to-one to target sentence pairs in the target sentence pair set.
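A short sketch of this sentence-pair construction follows; the helper name is an assumption.

```python
# Sketch: splice every sentence in the sentence set with the target question
# using the [CLS] s_i [SEP] question(a_k) [SEP] format described above.
def build_sentence_pairs(sentences: list[str], target_question: str) -> list[str]:
    return [f"[CLS]{s}[SEP]{target_question}[SEP]" for s in sentences]

question = "How do you feel about the service attitude of doctor Zhang San?"
for pair in build_sentence_pairs(["Doctor Zhang San was very patient.",
                                  "The queue was long."], question):
    print(pair)
```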
For S6, the target entity attribute pair and the target sentence pair set are input into the emotion classification model. The model first performs encoding, then performs internal aspect consistency encoding combined with the attention mechanism and aspect tendency encoding combined with the attention mechanism, then fuses the two encodings, and finally predicts the emotion classification probability according to the sentence fusion result. Emotion classification probability prediction is thereby performed, for the aspect emotion extraction rule applied to the target evaluation text, with a model obtained by joint modeling based on the attention mechanism, internal aspect consistency and aspect tendency, and the vector obtained from the emotion classification probability prediction is taken as the emotion classification probability vector.
And S7, finding out the maximum probability from the emotion classification probability vector, and determining the emotion classification corresponding to the found maximum probability as the target emotion classification.
The target emotion classification is emotion classification of target evaluation texts aiming at aspect emotion extraction rules.
Optionally, the step of performing emotion classification determination according to the emotion classification probability vector to obtain a target emotion classification includes: finding out the maximum probability from the emotion classification probability vector, and taking the found maximum probability as a candidate probability; acquiring a preset probability threshold, and taking the candidate probability as a target probability when the candidate probability is greater than or equal to the preset probability threshold; and determining the emotion classification corresponding to the target probability as the target emotion classification. Therefore, the emotion classification corresponding to the excessively low probability is prevented from being determined as the target emotion classification, and the accuracy of the determined target emotion classification is improved.
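The optional threshold rule above can be sketched as follows; the label names and threshold value are assumptions used only for illustration.

```python
# Sketch of the decision rule: take the most probable class and accept it only
# if it reaches a preset probability threshold (0.5 here is an assumed value).
def decide_emotion(prob_vector: list[float], labels: list[str], threshold: float = 0.5):
    best = max(range(len(prob_vector)), key=lambda i: prob_vector[i])
    if prob_vector[best] >= threshold:
        return labels[best]   # target emotion classification
    return None               # maximum probability too low; no classification is returned

print(decide_emotion([0.1, 0.2, 0.7], ["negative", "neutral", "positive"]))  # -> positive
```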
Emotion classification is the process of analyzing and reasoning over subjective text with emotional coloring, that is, analyzing the speaker's attitude and whether it leans positive or negative.
In one embodiment, before the step of inputting the target entity attribute pair and the target sentence pair set into the emotion classification model to perform emotion classification probability prediction, the method further includes:
S61: obtaining a plurality of training samples, each training sample comprising: an entity attribute pair sample, a sentence pair sample set and an emotion classification probability calibration value;
s62: taking one training sample in a plurality of training samples as a training sample to be trained;
s63: inputting the entity attribute pair sample of the training sample to be trained into the coding layer of the initial model for encoding to obtain a direction coding vector;
s64: respectively inputting each sentence pair sample in the sentence pair sample set of training samples to be trained into a coding layer to code so as to obtain a sentence code vector set;
s65: adopting the internal aspect consistency coding layer of the initial model, respectively carrying out attention weight calculation between sentence vertices and the aspect and coding of the sentence coding vectors in the sentence coding vector set according to the direction coding vector, to obtain a first sentence vector set;
s66: adopting the aspect tendency coding layer of the initial model, respectively carrying out attention weight calculation between sentence vertices and coding of the sentence coding vectors according to the sentence coding vector set, to obtain a second sentence vector set;
s67: adopting a sentence fusion layer of the initial model, and carrying out feature extraction and feature fusion according to the first sentence vector set and the second sentence vector set to obtain a target sentence vector set;
S68: adopting an emotion classification layer of the initial model, and respectively carrying out emotion classification probability prediction according to each target sentence vector in the target sentence vector set to obtain an emotion classification probability prediction value;
s69: training the initial model according to the emotion classification probability calibration value and the emotion classification probability prediction value of the training sample to be trained;
s610: repeating the step of taking one training sample of the plurality of training samples as the training sample to be trained until reaching the preset model training ending condition, and determining the initial model reaching the preset model training ending condition as the emotion classification model.
With the emotion classification model obtained by joint modeling of the attention mechanism, internal aspect consistency coding and aspect tendency coding, the attention mechanism is used to learn aspect-related sentence representations to improve prediction accuracy, internal aspect consistency allows a correct judgment to be given through other sentences, and aspect tendency allows the emotion towards an object to be given through the overall emotion judgment of a text even when that emotion is implicit and difficult to judge, thereby improving the accuracy of emotion classification probability prediction by the emotion classification model.
For S61, a plurality of training samples input by the user may be acquired, a plurality of training samples may be acquired from the database, or a plurality of training samples may be acquired from the third party application system.
The splicing format of the entity attribute pair sample is:
[CLS] e_entity [SEP] e_attribute [SEP]
The splicing format of each sentence pair sample in the sentence pair sample set is:
[CLS] s_i [SEP] question(a_k) [SEP]
in the same training sample, the entity attribute pair sample and the sentence pair sample set are data obtained from the same evaluation text, and the emotion classification probability calibration value is a correct result for the emotion classification calibration of the entity attribute pair sample and the sentence pair sample set.
Sentence pair sample sets are data obtained from a set of all sentences in an evaluation text.
For S62, a training sample is obtained from the plurality of training samples, and the obtained training sample is used as the training sample to be trained.
For S63, the coding layer of the initial model is the coding layer obtained based on the BERT (Bidirectional Encoder Representations from Transformers) model.
The entity attribute pair sample of the training sample to be trained is input into the coding layer of the initial model for encoding, and the output vector corresponding to the flag token [CLS] in the coding layer is obtained as the direction coding vector.
For S64, each sentence pair sample in the sentence pair sample set of the training sample to be trained is input into the coding layer for encoding, and the output vectors corresponding to the flag token [CLS] in the coding layer are obtained as the sentence coding vector set.
The sentence coding vectors in the sentence coding vector set are in one-to-one correspondence with the sentences in the sentence-to-sample set of training samples to be trained.
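As a rough illustration of this encoding step, the sketch below obtains the [CLS] output vector with a Hugging Face BERT model; the checkpoint name and the reliance on the tokenizer's own special-token handling are assumptions, since the patent only states that the coding layer is based on BERT.

```python
# Sketch, assuming a Hugging Face BERT checkpoint; not the patent's exact coding layer.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
encoder = BertModel.from_pretrained("bert-base-chinese")

def cls_vector(segment_a: str, segment_b: str) -> torch.Tensor:
    # The tokenizer inserts [CLS]/[SEP] itself, reproducing the
    # "[CLS] segment_a [SEP] segment_b [SEP]" layout described in the text.
    inputs = tokenizer(segment_a, segment_b, return_tensors="pt")
    with torch.no_grad():
        outputs = encoder(**inputs)
    return outputs.last_hidden_state[:, 0, :]   # output vector at the [CLS] position

direction_vector = cls_vector("A insurance", "company")   # direction coding vector
sentence_vector = cls_vector("Doctor Zhang San was very patient.",
                             "How do you feel about the service attitude of doctor Zhang San?")
```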
For S65, an internal aspect consistency coding layer of the initial model is adopted, first, a sentence coding vector with the same aspect (i.e., an evaluation object) as a direction coding vector in a sentence coding vector set is used as a sentence vertex, then, attention weight of the aspect corresponding to the direction coding vector in the sentence vertex is calculated, each sentence coding vector in the sentence coding vector set is coded according to the attention weight obtained by calculation, each vector obtained by coding is used as a first sentence vector, and all the first sentence vectors are used as a first sentence vector set.
That is, the first sentence vector in the first sentence vector set corresponds one-to-one to the sentence-encoded vectors in the sentence-encoded vector set.
It is understood that sentence vertices are expressed as sentence-encoding vectors.
For S66, an aspect tendency coding layer of an initial model is adopted, attention weight calculation between sentence vertexes and sentence vertexes is respectively carried out according to direction coding vectors and sentence coding vector sets, each sentence coding vector in the sentence coding vector sets is coded according to the attention weight obtained through calculation, each vector obtained through coding is used as one second sentence vector, and all second sentence vectors are used as a second sentence vector set.
That is, the second sentence vectors in the second sentence vector set are in one-to-one correspondence with the sentence-encoded vectors in the sentence-encoded vector set.
For S67, the sentence fusion layer of the initial model is adopted: first, the sentence vectors corresponding to each sentence pair sample are spliced according to the first sentence vector set and the second sentence vector set; then feature extraction and feature fusion are performed on each spliced vector; each vector obtained by feature fusion is taken as a target sentence vector, and all target sentence vectors are taken as the target sentence vector set.
That is, the target sentence vectors in the target sentence vector set are in one-to-one correspondence with the sentence-encoded vectors in the sentence-encoded vector set.
And S68, adopting an emotion classification layer of the initial model, respectively carrying out emotion classification probability prediction according to each target sentence vector in the target sentence vector set, and taking data obtained by emotion classification probability prediction as an emotion classification probability prediction value.
The emotion classification layer adopts a softmax regression classifier. The softmax regression classifier is a classifier that uses a softmax function (normalized exponential function).
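A minimal sketch of such an emotion classification layer is given below; the hidden dimension and number of classes are assumptions.

```python
# Sketch: a linear layer followed by softmax over the fused target sentence vector.
import torch
import torch.nn as nn

class EmotionClassificationLayer(nn.Module):
    def __init__(self, hidden_dim: int = 768, num_classes: int = 3):
        super().__init__()
        self.linear = nn.Linear(hidden_dim, num_classes)

    def forward(self, target_sentence_vector: torch.Tensor) -> torch.Tensor:
        # Softmax regression over the target sentence vector.
        return torch.softmax(self.linear(target_sentence_vector), dim=-1)

probs = EmotionClassificationLayer()(torch.randn(1, 768))  # emotion classification probability vector
```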
And for S69, training the initial model according to the emotion classification probability calibration value and the emotion classification probability prediction value of the training sample to be trained.
For S610, steps S62 to S610 are repeatedly performed until the preset model training end condition is reached, and the initial model reaching the preset model training end condition is a model conforming to the expected target, so that the initial model reaching the preset model training end condition can be determined as the emotion classification model.
The preset model training ending condition is that the loss value of the initial model reaches a first convergence condition or the iteration number reaches a second convergence condition.
The first convergence condition means that the magnitude of the change between the loss values of the initial model in two adjacent calculations satisfies the Lipschitz condition (Lipschitz continuity condition).
The iteration count refers to the number of times the initial model has been trained; that is, each time the model is trained once, the iteration count increases by 1.
The second convergence condition is a specific value.
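The training procedure S61 to S610 can be sketched at a high level as follows; the optimizer, learning rate and negative log-likelihood loss are assumptions, since the patent only refers to a loss value together with the two stopping conditions above.

```python
# High-level sketch of S62-S610; the model is treated as a black box that maps
# (entity attribute pair sample, sentence pair sample set) to a probability vector.
import torch

def train(model, training_samples, max_iterations: int = 10000, tolerance: float = 1e-4):
    optimizer = torch.optim.Adam(model.parameters(), lr=2e-5)   # assumed optimizer
    previous_loss = None
    for iteration, (pair_sample, sentence_pairs, label) in enumerate(training_samples):
        probs = model(pair_sample, sentence_pairs)   # S63-S68: encode, attend, fuse, classify
        # label: class-index tensor derived from the calibration value (assumption).
        loss = torch.nn.functional.nll_loss(torch.log(probs + 1e-12), label)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()                             # S69: update the initial model
        # S610: stop when the loss change is small enough (first convergence condition)
        # or the iteration count reaches its budget (second convergence condition).
        if previous_loss is not None and abs(previous_loss - loss.item()) < tolerance:
            break
        if iteration + 1 >= max_iterations:
            break
        previous_loss = loss.item()
    return model
```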
In one embodiment, the step of performing, by the internal aspect consistency coding layer using the initial model, internal aspect consistency coding of the sentence coding vectors according to the direction coding vectors and the sentence coding vector set, respectively, to obtain a first sentence vector set includes:
s651: adopting an internal aspect consistency coding layer, and determining sentence vertex of a sentence coding vector set according to the direction coding vector to obtain a sentence vertex set;
S652: adopting an internal aspect consistency coding layer, and constructing a consistency graph of sentence vertex and aspect according to the direction coding vector and the sentence vertex set to obtain a consistency graph of target aspect;
s653: adopting an internal aspect consistency coding layer, and carrying out attention weight calculation between sentence vertexes and aspects according to a target aspect consistency graph to obtain a first attention weight set;
s654: and adopting an internal aspect consistency coding layer, and respectively coding each sentence vertex according to the first attention weight set and the sentence vertex set to obtain a first sentence vector set.
In this embodiment, the internal aspect consistency coding layer of the initial model is adopted: sentence vertices are first determined according to the direction coding vector and the sentence coding vector set, a consistency graph of sentence vertices and the aspect is then constructed, attention weights between sentence vertices and the aspect are calculated, and finally each sentence vertex is encoded. The attention mechanism is thus used to learn aspect-related sentence representations to improve prediction accuracy, and internal aspect consistency allows a correct judgment to be given through other sentences, improving the accuracy of emotion classification probability prediction by the emotion classification model.
For S651, the "internal" in internal aspect consistency refers to being within the same evaluation text.
Internal aspect consistency means that if sentence vertices in the same evaluation text share an aspect vertex, those sentence vertices are considered to have the same emotion with respect to that aspect vertex.
It is understood that aspect vertices are expressed as directional encoding vectors.
And finding out sentence coding vectors which are identical to the direction coding vectors in aspect from the sentence coding vector set by adopting an internal aspect consistency coding layer, taking each sentence coding vector as a sentence vertex, and taking all sentence vertices as a sentence vertex set.
That is, the sentence vertices where the sentence vertices are aggregated share aspects corresponding to the directional encoding vector.
For S652, an internal aspect consistency coding layer is adopted, a consistency graph of sentence vertices and aspects is constructed according to the direction coding vector and the sentence vertex set, and the constructed aspect consistency graph is used as a target aspect consistency graph.
The method comprises the steps of taking a direction coding vector as one node of a target aspect consistency graph, taking each sentence vertex in a sentence vertex set as one node of the target aspect consistency graph, and connecting the node corresponding to each sentence vertex with the node corresponding to the direction coding vector.
For S653, the internal aspect consistency coding layer is adopted, and the preference degree is measured based on the graph attention mechanism, so that the attention weight calculation between each sentence vertex and aspect is respectively performed according to the target aspect consistency graph, each calculated attention weight is used as the first attention weight, and all the first attention weights are used as the first attention weight set.
That is, the first attention weights in the first set of attention weights are in one-to-one correspondence with the nodes corresponding to the sentence vertices in the target aspect consistency graph.
For S654, the internal aspect consistency coding layer is adopted: each sentence vertex in the sentence vertex set is encoded into an aspect-related sentence vector according to the first attention weight set, each vector obtained by encoding is taken as a first sentence vector, and all first sentence vectors are taken as the first sentence vector set.
That is, the first sentence vectors in the first set of sentence vectors are in one-to-one correspondence with the nodes corresponding to the sentence vertices in the target aspect consistency graph.
In one embodiment, the calculation formula of the first attention weight a_{ik} in the first attention weight set is:

a_{ik} = \frac{\exp\big(f\big(w_1^{T}[\,w_v v_i\,;\,w_e e_k\,]\big)\big)}{\sum_{j=1}^{I} \exp\big(f\big(w_1^{T}[\,w_v v_j\,;\,w_e e_k\,]\big)\big)}

where a_{ik} is the attention weight between the i-th sentence vertex in the sentence vertex set and the direction coding vector, v_i is the i-th sentence vertex in the sentence vertex set, e_k is the direction coding vector, exp() is the exponential function with the natural constant e as its base, f() is the LeakyReLU activation function, w_1, w_v and w_e are parameters of the internal aspect consistency coding layer that need to be trained, I is the number of sentence vertices in the sentence vertex set, T denotes the vector transpose, and [w_v v_i ; w_e e_k] is the concatenation of the vectors w_v v_i and w_e e_k;
the calculation formula of the first sentence vector \hat{v}_i in the first sentence vector set is:

\hat{v}_i = \tanh\Big(w_2 \sum_{j=1}^{I} a_{jk}\, v_j + b_1\Big)

where \hat{v}_i is the first sentence vector corresponding to the i-th sentence vertex in the sentence vertex set, a_{jk} is the attention weight between the j-th sentence vertex in the sentence vertex set and the direction coding vector, v_j is the j-th sentence vertex in the sentence vertex set, tanh() is the hyperbolic tangent function, and w_2 and b_1 are parameters of the internal aspect consistency coding layer that need to be trained.
In this embodiment, the internal aspect consistency coding layer is adopted to calculate attention weights between sentence vertices and the aspect according to the target aspect consistency graph to obtain the first attention weight set, and to encode each sentence vertex according to the first attention weight set and the sentence vertex set to obtain the first sentence vector set. Aspect-related sentence representations are thus learned with the attention mechanism to improve prediction accuracy, and internal aspect consistency allows a correct judgment to be given through other sentences, improving the accuracy of emotion classification probability prediction by the emotion classification model.
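To make the two reconstructed formulas concrete, the sketch below implements them with PyTorch; the module name, parameter shapes, and the reading that every sentence vertex sharing the aspect receives the same aggregated representation are assumptions rather than the patent's exact layer.

```python
# Sketch of the internal aspect consistency coding layer following the formulas above.
import torch
import torch.nn as nn
import torch.nn.functional as F

class IntraAspectConsistencyLayer(nn.Module):
    def __init__(self, dim: int = 768):
        super().__init__()
        self.w_v = nn.Linear(dim, dim, bias=False)
        self.w_e = nn.Linear(dim, dim, bias=False)
        self.w_1 = nn.Linear(2 * dim, 1, bias=False)
        self.w_2 = nn.Linear(dim, dim)            # its bias plays the role of b_1

    def forward(self, sentence_vertices: torch.Tensor, aspect_vertex: torch.Tensor):
        # sentence_vertices: (I, dim) vertices v_i sharing the aspect; aspect_vertex: (dim,) e_k.
        num_vertices = sentence_vertices.size(0)
        aspect = aspect_vertex.unsqueeze(0).expand(num_vertices, -1)
        scores = self.w_1(torch.cat([self.w_v(sentence_vertices), self.w_e(aspect)], dim=-1))
        a = F.softmax(F.leaky_relu(scores), dim=0)     # a_{ik}: one weight per sentence vertex
        pooled = (a * sentence_vertices).sum(dim=0)    # sum_j a_{jk} v_j
        shared = torch.tanh(self.w_2(pooled))          # tanh(w_2 (...) + b_1)
        # Under this reading, every vertex sharing the aspect gets the same consistency vector.
        return a.squeeze(-1), shared.unsqueeze(0).expand(num_vertices, -1)
```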
In one embodiment, the step of adopting the aspect tendency coding layer of the initial model to perform aspect tendency encoding of each sentence coding vector according to the sentence coding vector set to obtain the second sentence vector set includes:
s661: adopting an aspect tendency coding layer, and constructing an aspect tendency graph according to the sentence vertex set to obtain a target aspect tendency graph;
s662: adopting an aspect tendency coding layer, and carrying out attention weight calculation among sentence vertices according to a target aspect tendency graph to obtain a second attention weight vector set;
s663: and adopting an aspect tendency coding layer, and respectively coding each sentence vertex according to the second attention weight vector set and the sentence vertex set to obtain a second sentence vector set.
In this embodiment, the aspect tendency graph is first constructed, attention weights between sentence vertices are then calculated, and finally each sentence vertex is encoded, so that the aspect tendency can be used to give the emotion towards an object through the overall emotion judgment of a text even when that emotion is implicit and difficult to judge, improving the accuracy of emotion analysis.
In S661, the aspect tendency means that there is an emotion tendency between the sentence vertex and the adjacent sentence vertex in the same evaluation text.
And adopting an aspect tendency coding layer, constructing tendency graphs of sentence vertexes and sentence vertexes according to the sentence vertex set, and taking the constructed aspect tendency graph as a target aspect tendency graph.
And taking each sentence vertex in the sentence vertex set as one node of the target aspect tendency chart, and connecting the nodes corresponding to the adjacent sentence vertices.
For S662, the aspect tendency coding layer is adopted, and the degree of preference is measured based on a graph attention mechanism: attention weights between sentence vertices are calculated according to the target aspect tendency graph, all attention weights calculated for each sentence vertex are taken as a second attention weight vector, and all second attention weight vectors are taken as the second attention weight vector set. That is, each element of a second attention weight vector represents the attention weight between two sentence vertices.
That is, the second attention weight vectors in the second set of attention weight vectors are in one-to-one correspondence with the nodes corresponding to the sentence vertices in the target aspect tendency chart.
For S663, the aspect tendency coding layer is adopted: each sentence vertex in the sentence vertex set is encoded into a sentence vector according to the second attention weight vector set, each vector obtained by encoding is taken as a second sentence vector, and all second sentence vectors are taken as the second sentence vector set.
That is, the second sentence vectors in the second sentence vector set are in one-to-one correspondence with sentence vertices in the sentence vertex set.
In one embodiment, the calculation formula of the second attention weight a_{ij} in the second attention weight vector set is:

a_{ij} = \frac{\exp\big(f\big(w_3^{T}[\,w_4 v_i\,;\,w_5 v_j\,]\big)\big)}{\sum_{j'=1}^{I} \exp\big(f\big(w_3^{T}[\,w_4 v_i\,;\,w_5 v_{j'}\,]\big)\big)}

where a_{ij} is the attention weight between the i-th sentence vertex and the j-th sentence vertex in the sentence vertex set, v_i is the i-th sentence vertex in the sentence vertex set, v_j is the j-th sentence vertex in the sentence vertex set, exp() is the exponential function with the natural constant e as its base, f() is the LeakyReLU activation function, w_3, w_4 and w_5 are parameters of the aspect tendency coding layer that need to be trained, I is the number of sentence vertices in the sentence vertex set, T denotes the vector transpose, and [w_4 v_i ; w_5 v_j] is the concatenation of the vectors w_4 v_i and w_5 v_j;
the calculation formula of the second sentence vector \tilde{v}_i in the second sentence vector set is:

\tilde{v}_i = \tanh\Big(w_6 \sum_{j=1}^{I} a_{ij}\, v_j + b_2\Big)

where \tilde{v}_i is the second sentence vector corresponding to the i-th sentence vertex in the sentence vertex set, tanh() is the hyperbolic tangent function, and w_6 and b_2 are parameters of the aspect tendency coding layer that need to be trained.
This embodiment thus realizes the calculation of attention weights between sentence vertices and the encoding of each sentence vertex, so that the aspect tendency can be used to give the emotion towards an object through the overall emotion judgment of the text even when that emotion is implicit and difficult to judge, improving the accuracy of emotion analysis.
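A corresponding sketch of the aspect tendency coding layer, following the reconstructed formulas above, is given below; the adjacency mask used to connect adjacent sentence vertices and the parameter shapes are assumptions.

```python
# Sketch of the aspect tendency coding layer following the formulas above.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AspectTendencyLayer(nn.Module):
    def __init__(self, dim: int = 768):
        super().__init__()
        self.w_4 = nn.Linear(dim, dim, bias=False)
        self.w_5 = nn.Linear(dim, dim, bias=False)
        self.w_3 = nn.Linear(2 * dim, 1, bias=False)
        self.w_6 = nn.Linear(dim, dim)            # its bias plays the role of b_2

    def forward(self, v: torch.Tensor, adjacency: torch.Tensor) -> torch.Tensor:
        # v: (I, dim) sentence vertices; adjacency: (I, I) mask, 1 where vertices are connected.
        num_vertices = v.size(0)
        vi = self.w_4(v).unsqueeze(1).expand(num_vertices, num_vertices, -1)   # w_4 v_i
        vj = self.w_5(v).unsqueeze(0).expand(num_vertices, num_vertices, -1)   # w_5 v_j
        scores = F.leaky_relu(self.w_3(torch.cat([vi, vj], dim=-1))).squeeze(-1)
        scores = scores.masked_fill(adjacency == 0, float("-inf"))
        a = F.softmax(scores, dim=-1)             # a_{ij}: second attention weight vectors
        aggregated = a @ v                        # sum_j a_{ij} v_j
        return torch.tanh(self.w_6(aggregated))   # second sentence vectors
```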
In one embodiment, the step of performing sentence vector fusion according to the first sentence vector set and the second sentence vector set to obtain the target sentence vector set by using the sentence fusion layer of the initial model includes:
s671: the method comprises the steps of adopting a vector splicing sub-layer of a sentence fusion layer, respectively carrying out sentence vector splicing corresponding to each sentence pair sample according to a first sentence vector set and a second sentence vector set, and obtaining a sentence vector splicing result set;
s672: adopting a plurality of feature extraction sublayers of the sentence fusion layer, and carrying out feature extraction according to each sentence vector splicing result of the sentence vector splicing result set to obtain a sentence vector set to be fused corresponding to each sentence vector splicing result;
S673: adopting a self-adaptive fusion sub-layer of a sentence fusion layer to respectively fuse each sentence vector set to be fused to obtain a target sentence vector set;
wherein the m-th sentence vector to be fused in the sentence vector set to be fused corresponding to the i-th sentence vertex of the sentence vertex set is produced by the m-th of the plurality of feature extraction sub-layers, with M being the total number of feature extraction sub-layers and tanh being the hyperbolic tangent function; the target sentence vector r_i corresponding to the i-th sentence vertex is obtained by adaptively fusing these M sentence vectors to be fused; and w_m, b_m, W_r, b_r and q_i are parameters of the sentence fusion layer that need to be trained.
In this embodiment, the sentence vectors corresponding to each sentence pair sample are spliced according to the first sentence vector set and the second sentence vector set, and feature extraction and feature fusion are then performed on each spliced vector, so that the correct judgment given through other sentences by internal aspect consistency and the emotion of an object given through the overall emotion judgment by aspect tendency, even in implicit and difficult-to-judge text, are fused together, providing a basis for accurate emotion classification.
For S671, a sentence pair sample is obtained from the sentence pair sample set of the training sample to be trained as the target sentence pair sample; the first sentence vector corresponding to the target sentence pair sample in the first sentence vector set is taken as the target first sentence vector; the second sentence vector corresponding to the target sentence pair sample in the second sentence vector set is taken as the target second sentence vector; the target first sentence vector and the target second sentence vector are spliced to obtain the sentence vector splicing result corresponding to the target sentence pair sample; the step of obtaining a sentence pair sample from the sentence pair sample set of the training sample to be trained as the target sentence pair sample is repeated until all sentence pair samples in the sentence pair sample set of the training sample to be trained have been processed; and all sentence vector splicing results are taken as the sentence vector splicing result set. It can be appreciated that the above steps are implemented by the vector splicing sub-layer of the sentence fusion layer.
For S672, the plurality of feature extraction sublayers form a pyramid. Each sentence vector splicing result of the sentence vector splicing result set is input into the bottom of a pyramid formed by a plurality of feature extraction sublayers.
A sentence vector splicing result is obtained from the sentence vector splicing result set as the target sentence vector splicing result; the target sentence vector splicing result is input into the plurality of feature extraction sub-layers for feature extraction, and the feature vector extracted by each feature extraction sub-layer is taken as one sentence vector to be fused in the sentence vector set to be fused corresponding to the target sentence vector splicing result; the step of obtaining a sentence vector splicing result from the sentence vector splicing result set as the target sentence vector splicing result is repeated until all sentence vector splicing results in the sentence vector splicing result set have been processed. That is, each sentence vector to be fused is a feature vector extracted by one feature extraction sub-layer.
That is, the sentence vector sets to be fused are in one-to-one correspondence with the sentence vector splicing results in the sentence vector splicing result set.
For S673, an adaptive fusion sub-layer of the sentence fusion layer is adopted, each sentence vector to be fused in the sentence vector set to be fused is fused, each vector obtained by the fusion process is used as a target sentence vector, and all the target sentence vectors are used as a target sentence vector set.
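Because the fusion formulas themselves are not fully recoverable from the text above, the following sketch only follows the description of S671 to S673: splice the two sentence vectors, pass them through a pyramid of feature extraction sub-layers, and fuse the resulting vectors adaptively. The number of sub-layers and the attention-style fusion weights are assumptions.

```python
# Rough sketch of the sentence fusion layer (S671-S673); not the patent's exact formulas.
import torch
import torch.nn as nn

class SentenceFusionLayer(nn.Module):
    def __init__(self, dim: int = 768, num_sublayers: int = 3):
        super().__init__()
        # The spliced vector (first sentence vector + second sentence vector) has size 2*dim.
        self.extractors = nn.ModuleList(
            [nn.Linear(2 * dim if m == 0 else dim, dim) for m in range(num_sublayers)]
        )
        self.fusion_score = nn.Linear(dim, 1)     # stands in for W_r, b_r and q_i

    def forward(self, first_vec: torch.Tensor, second_vec: torch.Tensor) -> torch.Tensor:
        x = torch.cat([first_vec, second_vec], dim=-1)   # S671: vector splicing sub-layer
        features = []
        for layer in self.extractors:                    # S672: feature extraction sub-layers
            x = torch.tanh(layer(x))                     # roughly tanh(w_m x + b_m)
            features.append(x)
        stacked = torch.stack(features, dim=0)           # (M, dim) sentence vectors to be fused
        weights = torch.softmax(self.fusion_score(stacked), dim=0)
        return (weights * stacked).sum(dim=0)            # S673: adaptive fusion -> r_i
```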
Referring to fig. 2, the application further provides an emotion classification device based on document-level emotion tendencies, where the device includes:
the data acquisition module 100 is configured to acquire a target evaluation text and an aspect emotion extraction rule, where the aspect emotion extraction rule includes an evaluation object and an evaluation direction;
the sentence set determining module 200 is configured to extract sentences from the target evaluation text to obtain a sentence set;
the target entity attribute pair determining module 300 is configured to extract a target entity attribute pair corresponding to the evaluation object from the sentence set;
the target question determination module 400 is configured to obtain a target question template corresponding to the evaluation direction, and perform question construction according to the target question template and the target entity attribute pair to obtain a target question;
the target sentence pair set determining module 500 is configured to splice each sentence in the sentence set with a target question sentence, so as to obtain a target sentence pair set;
the emotion classification probability vector determining module 600 is configured to input the target entity attribute pair and the target sentence pair set into an emotion classification model to perform emotion classification probability prediction to obtain an emotion classification probability vector, where the emotion classification model is a model obtained by joint modeling based on an attention mechanism, internal aspect consistency and aspect tendency;
The target emotion classification determination module 700 is configured to perform emotion classification determination according to the emotion classification probability vector, so as to obtain a target emotion classification.
According to this embodiment, sentence extraction is performed on the target evaluation text to obtain a sentence set, a target entity attribute pair corresponding to the evaluation object is extracted from the sentence set, a target question template corresponding to the evaluation direction is obtained, and question construction is performed according to the target question template and the target entity attribute pair to obtain a target question; each sentence in the sentence set is spliced with the target question to obtain a target sentence pair set; the target entity attribute pair and the target sentence pair set are then input into an emotion classification model for emotion classification probability prediction to obtain an emotion classification probability vector, wherein the emotion classification model is a model obtained by joint modeling based on an attention mechanism, internal aspect consistency and aspect tendency; finally, emotion classification is determined according to the emotion classification probability vector to obtain the target emotion classification. Emotion classification probability prediction is thus performed with a model obtained by joint modeling based on the attention mechanism, internal aspect consistency and aspect tendency: the attention mechanism is used to learn aspect-related sentence representations to improve prediction accuracy, internal aspect consistency allows a correct judgment to be given through other sentences, and aspect tendency allows the emotion towards an object to be given through the overall emotion judgment of the text even when that emotion is implicit and difficult to judge, thereby improving the accuracy of emotion analysis.
Referring to fig. 3, a computer device is further provided in an embodiment of the present application. The computer device may be a server, and its internal structure may be as shown in fig. 3. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used to store data for the emotion classification method based on document-level emotion tendencies. The network interface of the computer device is used to communicate with an external terminal through a network connection. The computer program, when executed by the processor, implements an emotion classification method based on document-level emotion tendencies, comprising the following steps: acquiring a target evaluation text and an aspect emotion extraction rule, wherein the aspect emotion extraction rule comprises an evaluation object and an evaluation direction; performing sentence extraction on the target evaluation text to obtain a sentence set; extracting a target entity attribute pair corresponding to the evaluation object from the sentence set; acquiring a target question template corresponding to the evaluation direction, and constructing a question according to the target question template and the target entity attribute pair to obtain a target question; splicing each sentence in the sentence set with the target question respectively to obtain a target sentence pair set; inputting the target entity attribute pair and the target sentence pair set into an emotion classification model for emotion classification probability prediction to obtain an emotion classification probability vector, wherein the emotion classification model is a model obtained by joint modeling based on an attention mechanism, internal aspect consistency and aspect tendency; and performing emotion classification determination according to the emotion classification probability vector to obtain the target emotion classification.
According to the embodiment, a sentence set is obtained by extracting sentences from the target evaluation text, a target entity attribute pair corresponding to the evaluation object is extracted from the sentence set, a target question template corresponding to the evaluation direction is obtained, question construction is carried out according to the target question template and the target entity attribute pair to obtain a target question, and each sentence in the sentence set is spliced with the target question to obtain a target sentence pair set; the target entity attribute pair and the target sentence pair set are then input into an emotion classification model for emotion classification probability prediction to obtain an emotion classification probability vector, wherein the emotion classification model is a model obtained by joint modeling based on an attention mechanism, internal aspect consistency and aspect tendency; finally, emotion classification determination is carried out according to the emotion classification probability vector to obtain a target emotion classification. Because the emotion classification probability prediction adopts a model obtained by joint modeling based on the attention mechanism, internal aspect consistency and aspect tendency, the attention mechanism characterizes the correlation between sentences so that a sentence whose polarity is hard to judge on its own is supported by related sentences, and the overall emotional tendency toward the evaluation object assists the aspect-level judgment, thereby improving the accuracy of aspect-level emotion classification.
An embodiment of the present application further provides a computer readable storage medium having stored thereon a computer program, which when executed by a processor, implements an emotion classification method based on document-level emotion tendencies, including the steps of: acquiring a target evaluation text and an aspect emotion extraction rule, wherein the aspect emotion extraction rule comprises an evaluation object and an evaluation direction; sentence extraction is carried out on the target evaluation text, and a sentence set is obtained; extracting a target entity attribute pair corresponding to the evaluation object from the sentence set; acquiring a target question template corresponding to the evaluation direction, and constructing a question according to the target question template and the target entity attribute pair to obtain a target question; each sentence in the sentence set is spliced with a target question sentence respectively to obtain a target sentence pair set; inputting the target entity attribute pair and the target sentence pair set into an emotion classification model for emotion classification probability prediction to obtain an emotion classification probability vector, wherein the emotion classification model is a model obtained by joint modeling based on an attention mechanism, internal aspect consistency and aspect tendency; and carrying out emotion classification determination according to the emotion classification probability vector to obtain target emotion classification.
According to the emotion classification method based on document-level emotion tendencies executed above, a sentence set is first obtained by extracting sentences from the target evaluation text, a target entity attribute pair corresponding to the evaluation object is extracted from the sentence set, a target question template corresponding to the evaluation direction is obtained, question construction is carried out according to the target question template and the target entity attribute pair to obtain a target question, and each sentence in the sentence set is spliced with the target question to obtain a target sentence pair set; the target entity attribute pair and the target sentence pair set are then input into an emotion classification model for emotion classification probability prediction to obtain an emotion classification probability vector, wherein the emotion classification model is a model obtained by joint modeling based on an attention mechanism, internal aspect consistency and aspect tendency; finally, emotion classification determination is carried out according to the emotion classification probability vector to obtain a target emotion classification. Because emotion classification probability prediction adopts a model obtained by joint modeling based on the attention mechanism, internal aspect consistency and aspect tendency, the attention mechanism characterizes the correlation between sentences so that the judgment of a single sentence is supported by related sentences, and the overall emotional tendency toward the evaluation object assists the aspect-level judgment, thereby improving the accuracy of aspect-level emotion classification.
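The data-preparation steps described above (sentence extraction, question construction from the template and the entity attribute pair, and sentence-pair splicing) can be illustrated with a minimal Python sketch. The punctuation-based sentence splitter, the {entity}/{attribute} template placeholders and all function names below are assumptions made for illustration, not the literal implementation of this application.

```python
# Minimal sketch of the data-preparation steps: sentence extraction,
# question construction and sentence-pair splicing. Function names and the
# template format are illustrative assumptions.
import re

def split_sentences(evaluation_text: str) -> list[str]:
    # Sentence extraction: split the evaluation text on end punctuation.
    return [s.strip() for s in re.split(r"[。！？.!?]", evaluation_text) if s.strip()]

def build_question(template: str, entity: str, attribute: str) -> str:
    # Question construction: fill the entity attribute pair into the
    # question template corresponding to the evaluation direction.
    return template.format(entity=entity, attribute=attribute)

def build_sentence_pairs(sentences: list[str], question: str) -> list[tuple[str, str]]:
    # Splice every sentence with the target question to form the
    # target sentence pair set fed to the emotion classification model.
    return [(sentence, question) for sentence in sentences]

if __name__ == "__main__":
    text = "The battery lasts two days. The screen scratches easily."
    question = build_question(
        "What is the sentiment toward the {attribute} of the {entity}?",
        entity="phone", attribute="battery")
    print(build_sentence_pairs(split_sentences(text), question))
```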
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-volatile computer readable storage medium, which, when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium provided herein and used in the embodiments may include non-volatile and/or volatile memory. The non-volatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM), among others.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, apparatus, article or method that comprises the element.
The foregoing description is only of the preferred embodiments of the present application, and is not intended to limit the scope of the claims, and all equivalent structures or equivalent processes using the descriptions and drawings of the present application, or direct or indirect application in other related technical fields are included in the scope of the claims of the present application.

Claims (9)

1. An emotion classification method based on document-level emotion tendencies, the method comprising:
acquiring a target evaluation text and an aspect emotion extraction rule, wherein the aspect emotion extraction rule comprises an evaluation object and an evaluation direction;
Sentence extraction is carried out on the target evaluation text, and a sentence set is obtained;
extracting a target entity attribute pair corresponding to the evaluation object from the sentence set;
acquiring a target question template corresponding to the evaluation direction, and constructing a question according to the target question template and the target entity attribute pair to obtain a target question;
each sentence in the sentence set is spliced with the target question sentence respectively to obtain a target sentence pair set;
inputting the target entity attribute pair and the target sentence pair set into an emotion classification model for emotion classification probability prediction, so as to obtain an emotion classification probability vector, wherein the emotion classification model is a model obtained by joint modeling based on an attention mechanism, internal aspect consistency and aspect tendency;
carrying out emotion classification determination according to the emotion classification probability vector to obtain target emotion classification;
before the step of inputting the target entity attribute pair and the target sentence pair set into the emotion classification model to perform emotion classification probability prediction, the method further comprises the following steps:
obtaining a plurality of training samples, each of the training samples comprising: entity attribute calibration values for samples, sentence sample sets and emotion classification probabilities;
Taking one training sample of a plurality of training samples as a training sample to be trained;
inputting the entity attribute pair sample of the training sample to be trained into a coding layer of an initial model for coding, so as to obtain a direction coding vector;
respectively inputting each sentence pair sample in the sentence pair sample set of the training sample to be trained into the coding layer to code, so as to obtain a sentence coding vector set;
adopting an internal aspect consistency coding layer of the initial model, and respectively carrying out attention weight calculation and coding between the sentence vertices and the aspect on the sentence coding vectors in the sentence coding vector set according to the direction coding vector, so as to obtain a first sentence vector set;
adopting an aspect tendency coding layer of the initial model, respectively carrying out attention weight calculation and coding between sentence vertices according to sentence coding vectors in the sentence coding vector set to obtain a second sentence vector set;
adopting a sentence fusion layer of the initial model, and carrying out feature extraction and feature fusion according to the first sentence vector set and the second sentence vector set to obtain a target sentence vector set;
Adopting an emotion classification layer of the initial model, and respectively carrying out emotion classification probability prediction according to each target sentence vector in the target sentence vector set to obtain an emotion classification probability prediction value;
training the initial model according to the emotion classification probability calibration value and the emotion classification probability prediction value of the training sample to be trained;
and repeating the step of taking one training sample among the training samples as the training sample to be trained until reaching a preset model training ending condition, and determining the initial model reaching the preset model training ending condition as the emotion classification model.
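As an illustration of the training procedure recited in claim 1, the following runnable sketch wires the per-sample steps together around a dummy initial model, so only the control flow is shown. The three-class label layout, the cross-entropy loss and the fixed-epoch stopping rule stand in for the unspecified calibration format and the preset model training ending condition; they are assumptions, not part of the claim.

```python
# Sketch of the training loop: one training sample at a time, predicted
# probabilities compared against the calibration value. The real initial
# model (coding, internal aspect consistency, aspect tendency, fusion and
# classification layers) would replace `initial_model`.
import numpy as np

rng = np.random.default_rng(0)
NUM_CLASSES = 3  # assumed: negative / neutral / positive

def initial_model(entity_attribute_pair, sentence_pair_set):
    # Placeholder model: one probability vector per sentence pair.
    probs = rng.random((len(sentence_pair_set), NUM_CLASSES))
    return probs / probs.sum(axis=1, keepdims=True)

def cross_entropy(pred, target):
    # Loss between predicted probabilities and the calibration values.
    return float(-(target * np.log(pred + 1e-9)).sum(axis=1).mean())

training_samples = [
    {"entity_attribute_pair": "phone#battery",
     "sentence_pair_set": [("The battery lasts two days.", "How is the battery of the phone?")],
     "calibration": np.array([[0.0, 0.0, 1.0]])},
]

MAX_EPOCHS = 3  # stands in for the preset model training ending condition
for epoch in range(MAX_EPOCHS):
    for sample in training_samples:
        pred = initial_model(sample["entity_attribute_pair"], sample["sentence_pair_set"])
        loss = cross_entropy(pred, sample["calibration"])
        # ...backpropagation and parameter updates of every layer would go here...
        print(f"epoch={epoch} loss={loss:.4f}")
```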
2. The emotion classification method based on document-level emotion tendencies as set forth in claim 1, wherein said step of adopting an internal aspect consistency coding layer of said initial model and respectively carrying out attention weight calculation and coding between the sentence vertices and the aspect on the sentence coding vectors in said sentence coding vector set according to said direction coding vector, so as to obtain a first sentence vector set, comprises:
adopting the internal aspect consistency coding layer, and determining sentence vertices from the sentence coding vector set according to the direction coding vector, so as to obtain a sentence vertex set;
adopting the internal aspect consistency coding layer, and constructing a consistency graph between the sentence vertices and the aspect according to the direction coding vector and the sentence vertex set, so as to obtain a target aspect consistency graph;
adopting the internal aspect consistency coding layer, and carrying out attention weight calculation between the sentence vertex and the aspect according to the target aspect consistency graph to obtain a first attention weight set;
and adopting the internal aspect consistency coding layer, and respectively coding each sentence vertex according to the first attention weight set and the sentence vertex set to obtain the first sentence vector set.
3. The emotion classification method based on document-level emotion tendencies as recited in claim 2, wherein a calculation formula of a first attention weight $\alpha_i$ in said first attention weight set is:

$$\alpha_i = \frac{\exp\!\left(f\!\left(a_c^{\mathrm{T}}\left[W_{c1} v_i \,\|\, W_{c2} u\right]\right)\right)}{\sum_{k=1}^{n} \exp\!\left(f\!\left(a_c^{\mathrm{T}}\left[W_{c1} v_k \,\|\, W_{c2} u\right]\right)\right)}$$

wherein $\alpha_i$ is the attention weight between the i-th sentence vertex in the sentence vertex set and the direction coding vector, $v_i$ is the i-th sentence vertex in the sentence vertex set, $u$ is the direction coding vector, $\exp(\cdot)$ is an exponential function with the natural constant e as the base, $f(\cdot)$ is a LeakyReLU activation function, $a_c$, $W_{c1}$ and $W_{c2}$ are parameters to be trained of the internal aspect consistency coding layer, $n$ is the number of the sentence vertices in the sentence vertex set, $\mathrm{T}$ is the vector transpose calculation, and $[\,\cdot \,\|\, \cdot\,]$ denotes splicing the two vectors;

a calculation formula of a first sentence vector $\tilde{v}_i^{\,c}$ in the first sentence vector set is:

$$\tilde{v}_i^{\,c} = \tanh\!\left(W_c \sum_{j=1}^{n} \alpha_j v_j + b_c\right)$$

wherein $\tilde{v}_i^{\,c}$ is the first sentence vector corresponding to the i-th sentence vertex in the sentence vertex set, $\alpha_j$ is the attention weight between the j-th sentence vertex in the sentence vertex set and the direction coding vector, $v_j$ is the j-th sentence vertex in the sentence vertex set, $\tanh(\cdot)$ is a hyperbolic tangent function, and $W_c$, $b_c$ are parameters to be trained of the internal aspect consistency coding layer.
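A runnable numpy sketch consistent with the attention form recited in claim 3: each sentence vertex is scored against the direction coding vector with a LeakyReLU-activated attention, the scores are softmax-normalised into the first attention weights, and the weighted aggregation is re-encoded with tanh. The dimensions and the randomly initialised parameters below are placeholders, not trained values.

```python
# Intra-aspect consistency attention: sentence vertices attend to the aspect
# (direction coding vector) over the target aspect consistency graph.
import numpy as np

rng = np.random.default_rng(0)
n, d = 4, 8                                    # sentence vertices, hidden size
V = rng.standard_normal((n, d))                # sentence vertex set v_1..v_n
u = rng.standard_normal(d)                     # direction coding vector
W1, W2 = rng.standard_normal((d, d)), rng.standard_normal((d, d))
a = rng.standard_normal(2 * d)                 # attention parameter a_c
Wc, bc = rng.standard_normal((d, d)), rng.standard_normal(d)

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

# First attention weights: score every sentence vertex against the aspect,
# then softmax over the vertices.
scores = np.array([leaky_relu(a @ np.concatenate([W1 @ V[i], W2 @ u])) for i in range(n)])
alpha = np.exp(scores) / np.exp(scores).sum()

# First sentence vectors: tanh re-encoding of the attention-weighted
# aggregation, assigned to every sentence vertex.
aggregated = (alpha[:, None] * V).sum(axis=0)  # sum_j alpha_j * v_j
first_sentence_vectors = np.tile(np.tanh(Wc @ aggregated + bc), (n, 1))
print(alpha, first_sentence_vectors.shape)
```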
4. The emotion classification method based on document-level emotion tendencies as set forth in claim 2, wherein said step of adopting the aspect tendency coding layer of said initial model and respectively carrying out attention weight calculation and coding among the sentence vertices according to the sentence coding vectors in said sentence coding vector set, so as to obtain a second sentence vector set, comprises:
adopting the aspect tendency coding layer, and constructing a tendency graph of the aspect according to the sentence vertex set, so as to obtain a target aspect tendency graph;
Adopting the aspect tendency coding layer, and carrying out attention weight calculation among the sentence vertices according to the target aspect tendency graph to obtain a second attention weight vector set;
and adopting the aspect tendency coding layer to respectively code each sentence vertex according to the second attention weight vector set and the sentence vertex set to obtain the second sentence vector set.
5. The emotion classification method based on document-level emotion tendencies of claim 4, wherein a calculation formula of a second attention weight $\beta_{ij}$ in the second attention weight vector set is:

$$\beta_{ij} = \frac{\exp\!\left(f\!\left(a_t^{\mathrm{T}}\left[W_{t1} v_i \,\|\, W_{t2} v_j\right]\right)\right)}{\sum_{k=1}^{n} \exp\!\left(f\!\left(a_t^{\mathrm{T}}\left[W_{t1} v_i \,\|\, W_{t2} v_k\right]\right)\right)}$$

wherein $\beta_{ij}$ is the attention weight between the i-th sentence vertex and the j-th sentence vertex in the sentence vertex set, $v_i$ is the i-th sentence vertex in the sentence vertex set, $v_j$ is the j-th sentence vertex in the sentence vertex set, $\exp(\cdot)$ is an exponential function with the natural constant e as the base, $f(\cdot)$ is a LeakyReLU activation function, $a_t$, $W_{t1}$ and $W_{t2}$ are parameters to be trained of the aspect tendency coding layer, $n$ is the number of the sentence vertices in the sentence vertex set, $\mathrm{T}$ is the vector transpose calculation, and $[\,\cdot \,\|\, \cdot\,]$ denotes splicing the two vectors;

a calculation formula of a second sentence vector $\tilde{v}_i^{\,t}$ in the second sentence vector set is:

$$\tilde{v}_i^{\,t} = \tanh\!\left(W_t \sum_{j=1}^{n} \beta_{ij} v_j + b_t\right)$$

wherein $\tilde{v}_i^{\,t}$ is the second sentence vector corresponding to the i-th sentence vertex in the sentence vertex set, $\tanh(\cdot)$ is a hyperbolic tangent function, and $W_t$, $b_t$ are parameters to be trained of the aspect tendency coding layer.
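The pairwise attention of claim 5 differs from claim 3 mainly in that the sentence vertices attend to one another over the target aspect tendency graph rather than to the aspect. A short numpy sketch, again with placeholder dimensions and randomly initialised parameters standing in for trained values:

```python
# Aspect tendency attention: every sentence vertex attends to every sentence
# vertex, and each vertex is re-encoded from its weighted neighbours.
import numpy as np

rng = np.random.default_rng(1)
n, d = 4, 8
V = rng.standard_normal((n, d))                # sentence vertex set
W1, W2 = rng.standard_normal((d, d)), rng.standard_normal((d, d))
a = rng.standard_normal(2 * d)                 # attention parameter a_t
Wt, bt = rng.standard_normal((d, d)), rng.standard_normal(d)

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

# Second attention weights beta[i, j], softmax-normalised over j for each i.
scores = np.array([[leaky_relu(a @ np.concatenate([W1 @ V[i], W2 @ V[j]]))
                    for j in range(n)] for i in range(n)])
beta = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)

# Second sentence vectors: tanh re-encoding of each vertex's weighted neighbours.
second_sentence_vectors = np.tanh(beta @ V @ Wt.T + bt)
print(second_sentence_vectors.shape)           # (n, d)
```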
6. The emotion classification method based on document-level emotion tendencies as set forth in claim 4, wherein said step of performing feature extraction and feature fusion by using said sentence fusion layer of said initial model according to said first sentence vector set and said second sentence vector set to obtain a target sentence vector set includes:
adopting a sentence vector splicing sub-layer of the sentence fusion layer, and respectively carrying out sentence vector splicing corresponding to each sentence pair sample according to the first sentence vector set and the second sentence vector set, so as to obtain a sentence vector splicing result set;
adopting a plurality of feature extraction sublayers of the sentence fusion layer, and carrying out feature extraction according to each sentence vector splicing result of the sentence vector splicing result set to obtain a sentence vector set to be fused corresponding to each sentence vector splicing result;
Adopting a self-adaptive fusion sub-layer of the sentence fusion layer to fuse each sentence vector set to be fused respectively to obtain the target sentence vector set;
wherein, a calculation formula of a sentence vector to be fused $z_i^{(m)}$ in the sentence vector set to be fused is:

$$z_i^{(m)} = \tanh\!\left(W_f^{(m)}\left[\tilde{v}_i^{\,c} \,\|\, \tilde{v}_i^{\,t}\right] + b_f^{(m)}\right)$$

and a calculation formula of the target sentence vector $\hat{v}_i$ corresponding to the i-th sentence vertex in the sentence vertex set is:

$$\hat{v}_i = \sum_{m=1}^{M} g_i^{(m)} z_i^{(m)}, \qquad g_i^{(m)} = \frac{\exp\!\left(w_g^{\mathrm{T}} z_i^{(m)}\right)}{\sum_{m'=1}^{M} \exp\!\left(w_g^{\mathrm{T}} z_i^{(m')}\right)}$$

wherein $z_i^{(m)}$ is the m-th sentence vector to be fused in the sentence vector set to be fused corresponding to the i-th sentence vertex in the sentence vertex set, m is the m-th layer of the feature extraction sublayers, M is the total layer number of the feature extraction sublayers, $[\tilde{v}_i^{\,c} \,\|\, \tilde{v}_i^{\,t}]$ is the sentence vector splicing result of the first sentence vector and the second sentence vector corresponding to the i-th sentence vertex, $g_i^{(m)}$ is the adaptive fusion weight of the m-th sentence vector to be fused, $\tanh(\cdot)$ is a hyperbolic tangent function, and $W_f^{(m)}$, $b_f^{(m)}$ and $w_g$ are parameters to be trained of the sentence fusion layer.
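The sentence fusion layer of claim 6 can be sketched as follows: splice the first and second sentence vectors, pass the splice through M feature extraction sublayers (modelled here as parallel tanh projections), and adaptively fuse the M candidates per vertex with a softmax gate. The gate parameterisation, the parallel-projection choice and all shapes are assumptions made for illustration.

```python
# Sentence fusion: splicing sub-layer, M feature extraction sublayers,
# adaptive fusion sub-layer producing one target sentence vector per vertex.
import numpy as np

rng = np.random.default_rng(2)
n, d, M = 4, 8, 3
first = rng.standard_normal((n, d))            # first sentence vector set
second = rng.standard_normal((n, d))           # second sentence vector set

spliced = np.concatenate([first, second], axis=1)   # sentence vector splicing sub-layer
Ws = [rng.standard_normal((2 * d, d)) for _ in range(M)]
bs = [rng.standard_normal(d) for _ in range(M)]
wg = rng.standard_normal(d)                    # adaptive fusion parameter

# Feature extraction sublayers: M tanh projections of the splicing result.
to_fuse = np.stack([np.tanh(spliced @ Ws[m] + bs[m]) for m in range(M)], axis=1)  # (n, M, d)

# Adaptive fusion sub-layer: softmax gate over the M candidates per vertex.
gate_logits = to_fuse @ wg                     # (n, M)
gates = np.exp(gate_logits) / np.exp(gate_logits).sum(axis=1, keepdims=True)
target_sentence_vectors = (gates[:, :, None] * to_fuse).sum(axis=1)   # (n, d)
print(target_sentence_vectors.shape)
```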
7. An emotion classification device based on document-level emotion tendencies, the device comprising:
the data acquisition module is used for acquiring a target evaluation text and an aspect emotion extraction rule, wherein the aspect emotion extraction rule comprises an evaluation object and an evaluation direction;
the sentence set determining module is used for extracting sentences from the target evaluation text to obtain a sentence set;
The target entity attribute pair determining module is used for extracting target entity attribute pairs corresponding to the evaluation objects from the sentence sets;
the target question determination module is used for acquiring a target question template corresponding to the evaluation direction, and constructing a question according to the target question template and the target entity attribute pair to obtain a target question;
the target sentence pair set determining module is used for respectively splicing each sentence in the sentence set with the target question sentences to obtain a target sentence pair set;
the emotion classification probability vector determining module is used for inputting the target entity attribute pair and the target sentence pair set into an emotion classification model for emotion classification probability prediction to obtain an emotion classification probability vector, wherein the emotion classification model is a model obtained by joint modeling based on an attention mechanism, internal aspect consistency and aspect tendency;
the target emotion classification determining module is used for performing emotion classification determination according to the emotion classification probability vector to obtain target emotion classification;
before the step of inputting the target entity attribute pair and the target sentence pair set into the emotion classification model to perform emotion classification probability prediction, the method further comprises the following steps:
obtaining a plurality of training samples, each of the training samples comprising: an entity attribute pair sample, a sentence pair sample set and an emotion classification probability calibration value;
taking one training sample of a plurality of training samples as a training sample to be trained;
inputting the entity attribute pair sample of the training sample to be trained into a coding layer of an initial model for coding, so as to obtain a direction coding vector;
respectively inputting each sentence pair sample in the sentence pair sample set of the training sample to be trained into the coding layer to code, so as to obtain a sentence coding vector set;
adopting an internal aspect consistency coding layer of the initial model, and respectively carrying out attention weight calculation and coding between the sentence vertices and the aspect on the sentence coding vectors in the sentence coding vector set according to the direction coding vector, so as to obtain a first sentence vector set;
adopting an aspect tendency coding layer of the initial model, respectively carrying out attention weight calculation and coding between sentence vertices according to sentence coding vectors in the sentence coding vector set to obtain a second sentence vector set;
adopting a sentence fusion layer of the initial model, and carrying out feature extraction and feature fusion according to the first sentence vector set and the second sentence vector set to obtain a target sentence vector set;
Adopting an emotion classification layer of the initial model, and respectively carrying out emotion classification probability prediction according to each target sentence vector in the target sentence vector set to obtain an emotion classification probability prediction value;
training the initial model according to the emotion classification probability calibration value and the emotion classification probability prediction value of the training sample to be trained;
and repeating the step of taking one training sample among the training samples as the training sample to be trained until reaching a preset model training ending condition, and determining the initial model reaching the preset model training ending condition as the emotion classification model.
8. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 6 when the computer program is executed.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 6.
CN202111158076.3A 2021-09-28 2021-09-28 Emotion classification method, device, equipment and medium based on document-level emotion tendencies Active CN113849651B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111158076.3A CN113849651B (en) 2021-09-28 2021-09-28 Emotion classification method, device, equipment and medium based on document-level emotion tendencies

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111158076.3A CN113849651B (en) 2021-09-28 2021-09-28 Emotion classification method, device, equipment and medium based on document-level emotion tendencies

Publications (2)

Publication Number Publication Date
CN113849651A CN113849651A (en) 2021-12-28
CN113849651B true CN113849651B (en) 2024-04-09

Family

ID=78977289

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111158076.3A Active CN113849651B (en) 2021-09-28 2021-09-28 Emotion classification method, device, equipment and medium based on document-level emotion tendencies

Country Status (1)

Country Link
CN (1) CN113849651B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108133038A (en) * 2018-01-10 2018-06-08 重庆邮电大学 A kind of entity level emotional semantic classification system and method based on dynamic memory network
CN108363753A (en) * 2018-01-30 2018-08-03 南京邮电大学 Comment text sentiment classification model is trained and sensibility classification method, device and equipment
WO2019174423A1 (en) * 2018-03-16 2019-09-19 北京国双科技有限公司 Entity sentiment analysis method and related apparatus
CN110472042A (en) * 2019-07-02 2019-11-19 桂林电子科技大学 A kind of fine granularity sensibility classification method
CN113297352A (en) * 2021-06-07 2021-08-24 苏州大学 Attribute-level emotion classification method and device based on multitask network

Also Published As

Publication number Publication date
CN113849651A (en) 2021-12-28

Similar Documents

Publication Publication Date Title
CN110765265B (en) Information classification extraction method and device, computer equipment and storage medium
CN111950269A (en) Text statement processing method and device, computer equipment and storage medium
CN109376222B (en) Question-answer matching degree calculation method, question-answer automatic matching method and device
CN113010693A (en) Intelligent knowledge graph question-answering method fusing pointer to generate network
CN110598206A (en) Text semantic recognition method and device, computer equipment and storage medium
CN112052684A (en) Named entity identification method, device, equipment and storage medium for power metering
CN114492423B (en) False comment detection method, system and medium based on feature fusion and screening
CN111078847A (en) Power consumer intention identification method and device, computer equipment and storage medium
CN111178358A (en) Text recognition method and device, computer equipment and storage medium
CN112131883A (en) Language model training method and device, computer equipment and storage medium
US11645363B2 (en) Automatic identification of misclassified elements of an infrastructure model
CN114064852A (en) Method and device for extracting relation of natural language, electronic equipment and storage medium
CN113704436A (en) User portrait label mining method and device based on session scene
CN113849648A (en) Classification model training method and device, computer equipment and storage medium
CN113704392A (en) Method, device and equipment for extracting entity relationship in text and storage medium
CN110413994B (en) Hot topic generation method and device, computer equipment and storage medium
CN112905793B (en) Case recommendation method and system based on bilstm+attention text classification
CN108304568B (en) Real estate public expectation big data processing method and system
CN113849651B (en) Emotion classification method, device, equipment and medium based on document-level emotion tendencies
CN113673225A (en) Method and device for judging similarity of Chinese sentences, computer equipment and storage medium
CN111859979A (en) Ironic text collaborative recognition method, ironic text collaborative recognition device, ironic text collaborative recognition equipment and computer readable medium
CN116257632A (en) Unknown target position detection method and device based on graph comparison learning
CN112989022B (en) Intelligent virtual text selection method and device and computer equipment
CN112364620B (en) Text similarity judging method and device and computer equipment
CN113657496A (en) Information matching method, device, equipment and medium based on similarity matching model

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant