CN113762381A - Emotion classification method, system, electronic device and medium - Google Patents
- Publication number
- CN113762381A (application CN202111043917.6A)
- Authority
- CN
- China
- Prior art keywords
- matrix
- entity
- angle
- emotion classification
- interaction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06F18/24 — Pattern recognition; analysing; classification techniques
- G06F40/216 — Natural language analysis; parsing using statistical methods
- G06F40/295 — Natural language analysis; named entity recognition
- Y02D10/00 — Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The application discloses an emotion classification method, system, electronic device, and medium, wherein the emotion classification method comprises the following steps: an entity vector obtaining step: obtaining an angle matrix of an entity angle from a named entity recognition model, and converting the angle matrix into an entity vector; an interaction matrix obtaining step: splicing the angle matrix and the entity vector to obtain a spliced matrix, and then scaling the spliced matrix to obtain an interaction matrix; an emotion classification obtaining step: performing Attention interaction on the interaction matrix, and calculating the emotion classification of the entity angle. Through fine-grained sentiment analysis oriented to target entity angles, the method more deeply mines the emotions directed at the different angles of an entity.
Description
Technical Field
The present application relates to the field of deep learning technologies, and in particular, to an emotion classification method, system, electronic device, and medium.
Background
With the rapid development of the internet and electronic commerce, more and more consumers post product reviews on online platforms. Faced with such direct user feedback, integrating that feedback and responding to it quickly has become a major challenge for enterprises. The rise of social media platforms such as Weibo, WeChat, and Xiaohongshu provides a broad data basis for activities such as public opinion analysis and opinion polling based on network data. Unlike overall sentiment analysis, aspect-based sentiment analysis has a finer analysis granularity; its main purpose is to produce, from review information, a series of concise statements describing how much a consumer group likes each attribute of a given product. The prior art, however, cannot mine entity and angle emotions at a finer granularity, so fine-grained emotions cannot be analyzed in depth.
Disclosure of Invention
The embodiments of the application provide an emotion classification method, system, electronic device, and medium, which at least solve the problem that fine-grained emotion mining cannot deeply analyze the emotions attached to a specific entity and angle.
The invention provides an emotion classification method, which comprises the following steps:
an entity vector obtaining step: obtaining an angle matrix of an entity angle from a named entity recognition model, and converting the angle matrix into an entity vector;
an interaction matrix obtaining step: splicing the angle matrix and the entity vector to obtain a spliced matrix, and then scaling the spliced matrix to obtain an interaction matrix;
an emotion classification obtaining step: performing Attention interaction on the interaction matrix, and calculating the emotion classification of the entity angle.
In the emotion classification method, the entity vector obtaining step includes: after the entity angle categories of interest are predefined, obtaining the angle matrix from the named entity recognition model according to the entity angle, and converting the angle matrix into the entity vector through a maximum pooling or average pooling operation.
In the emotion classification method, the interaction matrix obtaining step includes:
a splicing matrix obtaining step: splicing each entity angle in the angle matrix with the entity vector to obtain a splicing matrix of the entity angle;
an interaction matrix generation step: scaling the splicing matrix through a fully-connected neural network to generate the interaction matrix.
In the emotion classification method, the emotion classification obtaining step includes performing the Attention interaction on the interaction matrix and calculating the emotion classification of the entity angle through the corresponding calculation formula.
The invention also provides an emotion classification system, which applies the emotion classification method described above and comprises:
an entity vector acquisition unit: obtaining an angle matrix of an entity angle from a named entity recognition model, and converting the angle matrix into an entity vector;
an interaction matrix acquisition unit: splicing the angle matrix and the entity vector to obtain a spliced matrix, and then scaling the spliced matrix to obtain an interaction matrix;
an emotion classification acquisition unit: performing Attention interaction on the interaction matrix, and calculating the emotion classification of the entity angle.
In the emotion classification system, after the entity angle categories of interest are predefined, the angle matrix is obtained from the named entity recognition model according to the entity angle and converted into the entity vector through maximum pooling or average pooling; the entity vector is then obtained by the entity vector acquisition unit.
In the emotion classification system, the interaction matrix obtaining unit includes:
a splicing matrix acquisition module: splicing each entity angle in the angle matrix with the entity vector to obtain a splicing matrix of the entity angle;
an interaction matrix generation module: generating the interaction matrix by scaling the splicing matrix through the fully-connected neural network.
In the emotion classification system, Attention interaction is performed on the interaction matrix, the emotion classification of the entity angle is calculated through the corresponding calculation formula, and the emotion classification of the entity angle is then obtained by the emotion classification acquisition unit.
The invention also provides an electronic device, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor implements any one of the emotion classification methods when executing the computer program.
The invention also provides an electronic device readable storage medium, which stores computer program instructions that, when executed by a processor, implement the emotion classification method according to any one of the above items.
Compared with the related art, the emotion classification method, system, electronic device, and medium solve the problem that prior-art fine-grained emotion mining cannot deeply analyze entity and angle emotions, and improve natural language processing capability.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below to provide a more thorough understanding of the application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a flow diagram of a sentiment classification method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an angle matrix according to an embodiment of the present application;
FIG. 3 is a representation of angle matrix vectors according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a splicing matrix according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an interaction matrix according to an embodiment of the application;
FIG. 6 is a schematic diagram of the emotion classification system of the present invention;
fig. 7 is a frame diagram of an electronic device according to an embodiment of the present application.
Wherein the reference numerals are:
an entity vector acquisition unit: 51;
an interaction matrix acquisition unit: 52;
an emotion classification acquisition unit: 53;
a splicing matrix acquisition module: 521;
an interaction matrix generation module: 522;
a bus: 80;
a processor: 81;
a memory: 82;
a communication interface: 83.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application.
It is obvious that the drawings in the following description are only examples or embodiments of the present application, and that it is also possible for a person skilled in the art to apply the present application to other similar contexts on the basis of these drawings without inventive effort. Moreover, it should be appreciated that such a development effort might be complex and tedious, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure, and thus should not be construed as a limitation of this disclosure.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms referred to herein shall have the ordinary meaning as understood by those of ordinary skill in the art to which this application belongs. Reference to "a," "an," "the," and similar words throughout this application are not to be construed as limiting in number, and may refer to the singular or the plural. The present application is directed to the use of the terms "including," "comprising," "having," and any variations thereof, which are intended to cover non-exclusive inclusions; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. Reference to "connected," "coupled," and the like in this application is not intended to be limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. The term "plurality" as referred to herein means two or more. "and/or" describes an association relationship of associated objects, meaning that three relationships may exist, for example, "A and/or B" may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. Reference herein to the terms "first," "second," "third," and the like, are merely to distinguish similar objects and do not denote a particular ordering for the objects.
Sentiment analysis (also known as opinion mining or emotion AI) refers to the systematic identification, extraction, quantification, and study of emotional states and subjective information using natural language processing, text analysis, computational linguistics, and biometrics. One of the basic tasks of sentiment analysis is to classify the polarity of a given text at the document, sentence, or feature/aspect level, determining whether the opinion expressed in the document, sentence, or entity feature/aspect is positive, negative, or neutral. Advanced, "beyond polarity" sentiment classification looks at emotional states such as anger, sadness, and happiness. Open-source tools and a range of free and paid sentiment analysis services employ machine learning, statistics, and natural language processing techniques to perform automatic sentiment analysis on large volumes of text, including web pages, online news, internet discussion groups, online reviews, blogs, and social media. Knowledge-based systems, on the other hand, use public resources to extract the semantic and affective information associated with natural language concepts. Sentiment analysis can also be performed on visual content, i.e., images and video (see multimodal sentiment analysis).
Current sentiment analysis can be divided into three types according to granularity. Word-level sentiment analysis: building a large-scale sentiment dictionary, in which, for example, the word "car accident" maps to a negative sentiment and the word "birthday" to a positive one. Sentence/document-level sentiment analysis: given a sentence or document, returning its overall positive or negative sentiment, without distinguishing which entity or object in the sentence the sentiment is directed at. Target-level sentiment analysis: finer-grained sentiment analysis directed at the entities or attributes (angles) within a sentence.
As enterprises' public opinion analysis requirements continue to rise, current fine-grained emotion recognition can only locate the emotion associated with either a certain entity or a certain angle; when finer-grained localization down to the combined entity-and-angle level is needed, business requirements cannot be well met.
According to the method, the emotion of different angles acting on the entity is better and deeply mined through fine-grained emotion analysis facing to the target entity angle.
The present invention will be described with reference to specific examples.
Example one
The embodiment provides an emotion classification method. Referring to fig. 1 to 5: fig. 1 is a flowchart of the emotion classification method according to an embodiment of the present application; fig. 2 is a schematic diagram of an angle matrix; fig. 3 is a representation of angle matrix vectors; fig. 4 is a schematic diagram of a splicing matrix; fig. 5 is a schematic diagram of an interaction matrix. As shown in fig. 1 to 5, the emotion classification method includes the following steps:
entity vector acquisition step S1: obtaining an angle matrix of an entity angle from a named entity recognition model, and converting the angle matrix into an entity vector;
interaction matrix acquisition step S2: splicing the angle matrix and the entity vector to obtain a spliced matrix, and then scaling the spliced matrix to obtain an interaction matrix;
emotion classification obtaining step S3: performing Attention interaction on the interaction matrix, and calculating the emotion classification of the entity angle.
In an embodiment, the entity vector obtaining step S1 includes, after an entity angle category that needs to be focused is predefined, obtaining the angle matrix from the named entity recognition model according to the entity angle, and then converting the angle matrix into the entity vector through a maximum pooling or average pooling operation.
In a specific implementation, the entity angle categories of interest are predefined, for example cost performance, appearance, and performance, and the related emotion angle matrix is initialized: if the number of angle categories is M and the angle Embedding dimension is N, an M × N matrix is initialized. The default emotion classification is 3-class (Positive, Neutral, Negative). After the entity angles are predefined, the Span matrix of size L × N corresponding to the relevant entity is obtained from a Named Entity Recognition (NER) model according to the entity angle, and is converted into a 1 × N entity vector representation through maximum pooling or average pooling, where L is the entity length and N is the Embedding dimension.
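The pooling step above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the function name and the concrete numbers are assumptions, and only the L × N → 1 × N pooling behavior comes from the description.

```python
import numpy as np

def span_to_entity_vector(span_matrix, mode="max"):
    """Collapse an L x N entity Span matrix into a 1 x N entity vector.

    L is the entity length and N the Embedding dimension, as in the
    description above; the function name is an illustrative assumption.
    """
    if mode == "max":
        return span_matrix.max(axis=0)    # max pooling over the span tokens
    if mode == "mean":
        return span_matrix.mean(axis=0)   # average pooling over the span tokens
    raise ValueError(f"unknown pooling mode: {mode}")

# A hypothetical 3-token entity span with N = 4 embedding dimensions
span = np.array([[0.1, 0.5, 0.2, 0.0],
                 [0.3, 0.1, 0.4, 0.2],
                 [0.2, 0.6, 0.1, 0.1]])
entity_vec = span_to_entity_vector(span, "max")   # shape (4,): column-wise maxima
```

Either pooling mode yields a fixed-size entity vector regardless of the entity length L, which is what lets entities of different lengths be spliced with the angle matrix in the next step.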
In an embodiment, the interaction matrix obtaining step S2 includes:
a mosaic matrix acquisition step S21: splicing each entity angle in the angle matrix with the entity vector to obtain a splicing matrix of the entity angle;
an interaction matrix generation step S22: scaling the splicing matrix through the fully-connected neural network to generate the interaction matrix.
In a specific implementation, after the angle Embedding in the angle matrix is spliced with the entity vector to obtain the splicing matrix of the entity and the angle, the splicing matrix is scaled to an M × N interaction matrix through a fully-connected neural network.
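A minimal sketch of this splicing-and-scaling step, assuming an M × N angle matrix and a length-N entity vector. The random weights stand in for the trained fully-connected layer, the tanh activation is an assumption (the patent does not name one), and all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 5, 8                              # M angle categories, N Embedding dimension (assumed)

angle_matrix = rng.normal(size=(M, N))   # one Embedding row per entity angle
entity_vec = rng.normal(size=(N,))       # entity vector from the pooling step

# Splice (concatenate) the entity vector onto every angle Embedding: (M, 2N)
splice = np.concatenate([angle_matrix, np.tile(entity_vec, (M, 1))], axis=1)

# Scale back to M x N with one fully-connected layer; random weights stand in
# for trained parameters
W = rng.normal(size=(2 * N, N)) * 0.1
b = np.zeros(N)
interaction = np.tanh(splice @ W + b)    # interaction matrix, shape (M, N)
```

The single dense layer maps the 2N-wide spliced rows back to width N, so the interaction matrix keeps one row per angle while mixing in the entity information.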
In an embodiment, the emotion classification obtaining step S3 includes performing the Attention interaction on the interaction matrix, and calculating the emotion classification of the entity angle through a correlation calculation formula.
In a specific implementation, Attention interaction is performed on the interaction matrix, and the emotion classification of each entity angle is calculated through the corresponding Attention calculation formula.
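Since the Attention formula itself is not reproduced in this text, the following is a hedged sketch using standard scaled dot-product self-attention over the interaction matrix followed by a 3-class linear head (Positive/Neutral/Negative); the exact formula in the specification may differ, and all weights and sizes are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(1)
M, N, C = 5, 8, 3                        # angles, Embedding dim, 3 emotion classes
interaction = rng.normal(size=(M, N))    # interaction matrix from the previous step

# Scaled dot-product self-Attention over the interaction matrix
scores = interaction @ interaction.T / np.sqrt(N)    # (M, M) angle-to-angle scores
attended = softmax(scores, axis=-1) @ interaction    # (M, N) attended features

# Linear head mapping each angle to Positive / Neutral / Negative
W_cls = rng.normal(size=(N, C)) * 0.1
probs = softmax(attended @ W_cls, axis=-1)           # (M, C) class probabilities
labels = probs.argmax(axis=-1)                       # one emotion label per angle
```

The attention pass lets each angle's representation weigh the other angles before classification, which matches the patent's intent of computing a per-angle emotion from the interaction matrix.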
Example two
Referring to fig. 6, fig. 6 is a schematic structural diagram of an emotion classification system according to the present invention. As shown in fig. 6, the emotion classification system according to the present invention is applied to the emotion classification method described above, and includes:
the entity vector acquisition unit 51: obtaining an angle matrix of an entity angle from a named entity recognition model, and converting the angle matrix into an entity vector;
the interaction matrix acquisition unit 52: splicing the angle matrix and the entity vector to obtain a spliced matrix, and then scaling the spliced matrix to obtain an interaction matrix;
emotion classification acquisition unit 53: performing Attention interaction on the interaction matrix, and calculating the emotion classification of the entity angle.
In an embodiment, after the entity angle categories of interest are predefined, the angle matrix is obtained from the named entity recognition model according to the entity angle and converted into the entity vector through a maximum pooling or average pooling operation; the entity vector is then obtained by the entity vector acquisition unit 51.
In an embodiment, the interaction matrix obtaining unit 52 includes:
the splicing matrix acquisition module 521: splicing each entity angle in the angle matrix with the entity vector to obtain the splicing matrix of the entity angle;
the interaction matrix generation module 522: generating the interaction matrix by scaling the splicing matrix through the fully-connected neural network.
In an embodiment, after the Attention interaction is performed on the interaction matrix and the emotion classification of the entity angle is calculated through the corresponding calculation formula, the emotion classification of the entity angle is obtained through the emotion classification acquisition unit 53.
EXAMPLE III
Referring to fig. 7, this embodiment discloses an embodiment of an electronic device. The electronic device may include a processor 81 and a memory 82 storing computer program instructions.
Specifically, the processor 81 may include a Central Processing Unit (CPU) or an Application-Specific Integrated Circuit (ASIC), or may be configured as one or more integrated circuits implementing the embodiments of the present application.
The memory 82 may be used to store or cache various data files for processing and/or communication use, as well as possible computer program instructions executed by the processor 81.
The processor 81 implements any of the emotion classification methods in the embodiments described above by reading and executing computer program instructions stored in the memory 82.
In some of these embodiments, the electronic device may also include a communication interface 83 and a bus 80. As shown in fig. 7, the processor 81, the memory 82, and the communication interface 83 are connected via the bus 80 to complete communication therebetween.
The communication interface 83 is used for implementing communication between the modules, devices, units, and/or equipment in the embodiment of the present application. The communication interface 83 may also carry out data communication with external components, such as external devices, image/abnormal-data monitoring devices, databases, external storage, and image/abnormal-data monitoring workstations.
The bus 80 includes hardware, software, or both to couple the components of the electronic device to one another. The bus 80 includes, but is not limited to, at least one of the following: a data bus, an address bus, a control bus, an expansion bus, and a local bus. By way of example and not limitation, the bus 80 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front-Side Bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association Local Bus (VLB), another suitable bus, or a combination of two or more of these. The bus 80 may include one or more buses, where appropriate. Although specific buses are described and shown in the embodiments of the application, any suitable buses or interconnects are contemplated by the application.
The electronic device may be coupled to an emotion classification system to implement the method of FIGS. 1-5.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
In summary, compared with current fine-grained target-level sentiment analysis, the invention provides an even finer-grained sentiment analysis method oriented to target entity angles, which can more deeply mine the emotions directed at the different angles of an entity.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent application shall be subject to the protection scope of the appended claims.
Claims (10)
1. An emotion classification method, comprising:
an entity vector obtaining step: obtaining an angle matrix of an entity angle from a named entity recognition model, and converting the angle matrix into an entity vector;
an interaction matrix obtaining step: splicing the angle matrix and the entity vector to obtain a spliced matrix, and then scaling the spliced matrix to obtain an interaction matrix;
an emotion classification obtaining step: performing Attention interaction on the interaction matrix, and calculating the emotion classification of the entity angle.
2. The emotion classification method of claim 1, wherein the entity vector obtaining step includes, after the entity angle category to be paid attention to is predefined, obtaining the angle matrix from the named entity recognition model according to the entity angle, and then converting the angle matrix into the entity vector through a maximum pooling or average pooling operation.
3. The emotion classification method of claim 1, wherein the interaction matrix acquisition step includes:
acquiring a splicing matrix: splicing each entity angle in the angle matrix with the entity vector to obtain a splicing matrix of the entity angle;
and an interactive matrix generation step, namely scaling the splicing matrix through a fully-connected neural network to generate the interactive matrix.
4. The emotion classification method of claim 1, wherein the emotion classification obtaining step includes performing the Attention interaction on the interaction matrix and calculating the emotion classification of the entity angle through the corresponding calculation formula.
5. An emotion classification system, which is applied to the emotion classification method according to any one of claims 1 to 4, the emotion classification system comprising:
an entity vector obtaining unit: obtaining an angle matrix of an entity angle from a named entity recognition model, and converting the angle matrix into an entity vector;
an interaction matrix obtaining unit: concatenating the angle matrix and the entity vector to obtain a concatenated matrix, and then scaling the concatenated matrix to obtain an interaction matrix;
an emotion classification obtaining unit: performing Attention interaction on the interaction matrix, and calculating the emotion classification of the entity angle.
6. The emotion classification system of claim 5, wherein, after the entity angle categories of interest are predefined, the entity vector obtaining unit obtains the angle matrix from the named entity recognition model according to the entity angle and converts the angle matrix into the entity vector through a maximum pooling or average pooling operation.
7. The emotion classification system of claim 6, wherein the interaction matrix obtaining unit comprises:
a concatenated matrix obtaining module: concatenating each entity angle in the angle matrix with the entity vector to obtain a concatenated matrix of the entity angle;
an interaction matrix generating module: scaling the concatenated matrix through the fully-connected neural network to generate the interaction matrix.
8. The emotion classification system of claim 7, wherein the emotion classification obtaining unit performs the Attention interaction on the interaction matrix and calculates the emotion classification of the entity angle through a correlation calculation formula.
9. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the emotion classification method according to any one of claims 1 to 4 when executing the computer program.
10. An electronic device readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the emotion classification method according to any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111043917.6A CN113762381B (en) | 2021-09-07 | 2021-09-07 | Emotion classification method, system, electronic equipment and medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113762381A true CN113762381A (en) | 2021-12-07 |
CN113762381B CN113762381B (en) | 2023-12-19 |
Family
ID=78793416
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111043917.6A Active CN113762381B (en) | 2021-09-07 | 2021-09-07 | Emotion classification method, system, electronic equipment and medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113762381B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109213868A (en) * | 2018-11-21 | 2019-01-15 | 中国科学院自动化研究所 | Entity-level sentiment classification method based on a convolutional attention mechanism network
CN109948165A (en) * | 2019-04-24 | 2019-06-28 | 吉林大学 | Fine-grained sentiment polarity prediction method based on a mixed attention network
CN110704622A (en) * | 2019-09-27 | 2020-01-17 | 北京明略软件系统有限公司 | Text emotion classification method and device and electronic equipment
CN112232070A (en) * | 2020-10-20 | 2021-01-15 | 北京明略昭辉科技有限公司 | Natural language processing model construction method, system, electronic device and storage medium
CN112231447A (en) * | 2020-11-21 | 2021-01-15 | 杭州投知信息技术有限公司 | Method and system for extracting Chinese document events
CN112256866A (en) * | 2020-09-25 | 2021-01-22 | 东北大学 | Text fine-grained sentiment analysis method based on deep learning
CN112732920A (en) * | 2021-01-15 | 2021-04-30 | 北京明略昭辉科技有限公司 | BERT-based multi-feature fusion entity sentiment analysis method and system
Similar Documents
Publication | Title |
---|---|
JP6661790B2 | Method, apparatus and device for identifying text type |
JP2019519019A5 | |
CN111159409B | Text classification method, device, equipment and medium based on artificial intelligence |
CN115982376B | Method and device for training model based on text, multimode data and knowledge |
CN111950279A | Entity relationship processing method, device, equipment and computer readable storage medium |
Alexandridis et al. | A knowledge-based deep learning architecture for aspect-based sentiment analysis |
CN115917613A | Semantic representation of text in a document |
CN111767714B | Text smoothness determination method, device, equipment and medium |
CN112183102A | Named entity identification method based on attention mechanism and graph attention network |
CN113887213A | Event detection method and device based on multilayer graph attention network |
CN110909768B | Method and device for acquiring marked data |
CN113222022A | Webpage classification identification method and device |
US11250299B2 | Learning representations of generalized cross-modal entailment tasks |
CN114048288A | Fine-grained emotion analysis method and system, computer equipment and storage medium |
CN111523301B | Contract document compliance checking method and device |
CN113569118A | Self-media pushing method and device, computer equipment and storage medium |
CN112765357A | Text classification method and device and electronic equipment |
CN110929647B | Text detection method, device, equipment and storage medium |
CN109918661B | Synonym acquisition method and device |
CN110472058B | Entity searching method, related equipment and computer storage medium |
CN116561320A | Method, device, equipment and medium for classifying automobile comments |
Baniata et al. | Sentence representation network for Arabic sentiment analysis |
CN108021609B | Text emotion classification method and device, computer equipment and storage medium |
CN113762381B | Emotion classification method, system, electronic equipment and medium |
CN114091451A | Text classification method, device, equipment and storage medium |
Legal Events
Code | Title |
---|---|
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
GR01 | Patent grant |