CN113762381B - Emotion classification method, system, electronic equipment and medium - Google Patents
- Legal status: Active (the legal status is an assumption by Google Patents and is not a legal conclusion)
Classifications
- G06F18/24 — Pattern recognition: classification techniques
- G06F40/216 — Natural language analysis: parsing using statistical methods
- G06F40/295 — Natural language analysis: named entity recognition
- Y02D10/00 — Energy efficient computing (climate change mitigation in information and communication technologies)
Abstract
The application discloses an emotion classification method, system, electronic device, and medium. The emotion classification method comprises the following steps. An entity vector acquisition step: obtaining an angle matrix of the entity angles from a named entity recognition model and converting it into an entity vector. An interaction matrix acquisition step: splicing the angle matrix with the entity vector to obtain a splicing matrix, and scaling the splicing matrix to obtain an interaction matrix. An emotion classification acquisition step: performing Attention interaction on the interaction matrix and calculating the emotion classification of each entity angle. By performing fine-grained emotion analysis oriented to the angles of a target entity, the invention more deeply mines the emotions directed at different angles of an entity.
Description
Technical Field
The application relates to the technical field of deep learning, and in particular to an emotion classification method, system, electronic device, and medium.
Background
With the rapid development of the Internet and electronic commerce, more and more consumers post product reviews on Internet platforms. Faced with this increasingly direct user feedback, integrating it and responding to it quickly has become a major challenge for enterprises. The rise of social media platforms such as Weibo, WeChat, and Xiaohongshu provides a broad data foundation for activities such as public opinion analysis and polling based on web data. Unlike overall emotion analysis, aspect-based emotion analysis has a finer analysis granularity: its main purpose is to derive, from review text, a series of concise statements describing how strongly a consumer group prefers each attribute of a given product. However, the prior art cannot mine emotion at the finer granularity of both entity and angle, so fine-grained emotion cannot be deeply analyzed.
Disclosure of Invention
The embodiments of the present application provide an emotion classification method, system, electronic device, and medium, so as to solve the problem that fine-grained emotion mining cannot deeply analyze the emotion attached to both an entity and an angle.
The invention provides an emotion classification method, which comprises the following steps:
an entity vector acquisition step: obtaining an angle matrix of the entity angle from the named entity recognition model, and converting the angle matrix into an entity vector;
an interaction matrix acquisition step: splicing the angle matrix and the entity vector to obtain a spliced matrix, and scaling the spliced matrix to obtain an interaction matrix;
the emotion classification acquisition step: performing Attention interaction on the interaction matrix, and calculating the emotion classification of each entity angle.
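The three steps above can be sketched end-to-end with toy matrices. This is a minimal pure-Python sketch under stated assumptions: the shapes (an M*N angle matrix, an L*N entity span, a 1*N entity vector) follow the embodiments described later, but the random weights, the softmax attention, and the three-way linear head are illustrative placeholders, not the patent's trained parameters or exact formulas.

```python
import math
import random

random.seed(0)
M, N, L = 3, 4, 2          # angle categories, embedding dimension, entity length

angle_matrix = [[random.random() for _ in range(N)] for _ in range(M)]
span_matrix = [[random.random() for _ in range(N)] for _ in range(L)]

# Step 1: max-pool the L*N span matrix into a 1*N entity vector
entity_vec = [max(row[j] for row in span_matrix) for j in range(N)]

# Step 2: concatenate each angle row with the entity vector (M*2N),
# then scale back to M*N with a fully-connected layer (placeholder weights)
W = [[random.gauss(0, 0.1) for _ in range(N)] for _ in range(2 * N)]
interaction = []
for row in angle_matrix:
    cat = row + entity_vec
    interaction.append([sum(cat[i] * W[i][j] for i in range(2 * N))
                        for j in range(N)])

def softmax(xs):
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

# Step 3: attention over the interaction matrix, then a 3-way classifier head
W_cls = [[random.gauss(0, 0.1) for _ in range(3)] for _ in range(N)]
for k, row in enumerate(interaction):
    attn = softmax([sum(r[j] * row[j] for j in range(N)) for r in interaction])
    ctx = [sum(attn[i] * interaction[i][j] for i in range(M)) for j in range(N)]
    logits = [sum(ctx[i] * W_cls[i][c] for i in range(N)) for c in range(3)]
    probs = softmax(logits)
    label = ["positive", "neutral", "negative"][probs.index(max(probs))]
    print(f"angle {k}: {label}")
```

Each of the three steps is detailed in the individual embodiments below; here the point is only how the shapes flow from the NER output to one polarity per predefined angle.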
In the above emotion classification method, the entity vector acquisition step comprises: after predefining the entity angle categories of interest, obtaining the angle matrix from the named entity recognition model according to the entity angles, and converting the angle matrix into the entity vector through a maximum pooling or average pooling operation.
In the above emotion classification method, the interaction matrix obtaining step includes:
a splicing matrix acquisition step: splicing each entity angle in the angle matrix with the entity vector to obtain the splicing matrix of the entity angle;
an interaction matrix generation step: scaling the splicing matrix through a fully-connected neural network to generate the interaction matrix.
In the above emotion classification method, the emotion classification acquisition step comprises performing the Attention interaction on the interaction matrix and calculating the emotion classification of each entity angle through the corresponding calculation formula.
The invention also provides an emotion classification system adapted to the above emotion classification method, the emotion classification system comprising:
entity vector acquisition unit: obtaining an angle matrix of the entity angle from the named entity recognition model, and converting the angle matrix into an entity vector;
interaction matrix acquisition unit: splicing the angle matrix and the entity vector to obtain a spliced matrix, and scaling the spliced matrix to obtain an interaction matrix;
emotion classification acquisition unit: performing Attention interaction on the interaction matrix, and calculating the emotion classification of each entity angle.
In the above emotion classification system, after the entity angle categories of interest are predefined, the angle matrix is obtained from the named entity recognition model according to the entity angles and converted into the entity vector through a maximum pooling or average pooling operation; the entity vector is then obtained by the entity vector acquisition unit.
In the above emotion classification system, the interaction matrix acquisition unit includes:
a splicing matrix acquisition module: splicing each entity angle in the angle matrix with the entity vector to obtain the splicing matrix of the entity angle;
and the interaction matrix generation module is used for generating the interaction matrix by scaling the splicing matrix through the fully-connected neural network.
In the above emotion classification system, the Attention interaction is performed on the interaction matrix and the emotion classification of each entity angle is calculated through the corresponding calculation formula; the emotion classification of the entity angle is then obtained by the emotion classification acquisition unit.
The invention also provides an electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements any one of the above emotion classification methods when executing the computer program.
The invention also provides a readable storage medium for an electronic device, the readable storage medium storing computer program instructions which, when executed by a processor, implement the above emotion classification method.
Compared with the related art, the emotion classification method, system, electronic device, and medium provided by the invention solve the problem that fine-grained entity-and-angle emotion mining in the prior art cannot deeply analyze emotion, and improve natural language processing capability.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below, so that the other features, objects, and advantages of the application will be more readily understood.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
FIG. 1 is a flow chart of an emotion classification method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an angle matrix according to an embodiment of the present application;
FIG. 3 is a representation of an angle matrix vector according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a splice matrix according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an interaction matrix according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an emotion classification system according to the present invention;
fig. 7 is a frame diagram of an electronic device according to an embodiment of the present application.
Wherein, the reference numerals are as follows:
entity vector acquisition unit: 51;
interaction matrix acquisition unit: 52;
emotion classification acquisition unit: 53;
splicing matrix acquisition module: 521;
interaction matrix generation module: 522;
bus: 80;
processor: 81;
memory: 82;
communication interface: 83.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described and illustrated below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden on the person of ordinary skill in the art based on the embodiments provided herein, are intended to be within the scope of the present application.
It is apparent that the drawings in the following description are only some examples or embodiments of the present application, and those of ordinary skill in the art may apply the present application to other similar situations according to these drawings without inventive effort. Moreover, it should be appreciated that while such a development effort might be complex and time-consuming, it would nevertheless be a routine undertaking of design, fabrication, or manufacture for those of ordinary skill having the benefit of this disclosure, and should not be construed as undue experimentation.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is to be expressly and implicitly understood by those of ordinary skill in the art that the embodiments described herein can be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs. Reference to "a," "an," "the," and similar terms herein do not denote a limitation of quantity, but rather denote the singular or plural. The terms "comprising," "including," "having," and any variations thereof, are intended to cover a non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to only those steps or elements but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. The terms "connected," "coupled," and the like in this application are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. The term "plurality" as used herein refers to two or more. "and/or" describes an association relationship of an association object, meaning that there may be three relationships, e.g., "a and/or B" may mean: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship. The terms "first," "second," "third," and the like, as used herein, are merely distinguishing between similar objects and not representing a particular ordering of objects.
Emotion analysis (also known as opinion mining or emotion AI) refers to the use of natural language processing, text analysis, computational linguistics, and biometrics to systematically identify, extract, quantify, and study emotional states and subjective information. One of the basic tasks of emotion analysis is to classify the polarity of a given text at the document, sentence, or feature/aspect level, judging whether the opinion expressed in a document, a sentence, or an entity feature/aspect is positive, negative, or neutral. Advanced, "beyond polarity" emotion classification focuses on emotional states such as "anger", "sadness", and "happiness". Open-source software tools, as well as a range of free and paid emotion analysis tools, employ machine learning, statistics, and natural language processing techniques to automatically analyze large amounts of text, including web pages, online news, Internet discussion groups, online reviews, web blogs, and social media. Knowledge-based systems, on the other hand, exploit publicly available resources to extract the semantic and emotional information associated with natural language concepts. Emotion analysis may also be performed on visual content, i.e., images and videos (see multimodal emotion analysis).
At present, emotion analysis can be divided into three types according to granularity. Word-level emotion analysis resembles the construction of an emotion dictionary: the question is how to build a large-scale emotion dictionary in which, for example, the word "car accident" maps to a negative emotion and the word "birthday" maps to a positive one. Sentence/document-level emotion analysis takes a sentence or document as input and returns the corresponding emotion polarity, but does not distinguish which entity or object in the sentence the polarity is aimed at. Target-level emotion analysis is finer-grained emotion analysis directed at the entities or attributes (angles) in a sentence.
With the continuously rising demands of enterprise public opinion analysis, current fine-grained emotion recognition can only locate the emotion related to a certain entity or a certain angle; when even finer-grained recognition is required that locates emotion at the level of both the entity and the angle, existing methods cannot satisfy the business demand well.
According to the invention, through fine-grained emotion analysis oriented to the angles of a target entity, the emotions directed at different angles of an entity are more deeply mined.
The invention will now be described with reference to specific examples.
Example 1
The embodiment provides an emotion classification method. Referring to fig. 1 to 5, fig. 1 is a flowchart of an emotion classification method according to an embodiment of the present application; FIG. 2 is a schematic diagram of an angle matrix according to an embodiment of the present application; FIG. 3 is a representation of an angle matrix vector according to an embodiment of the present application; FIG. 4 is a schematic diagram of a splice matrix according to an embodiment of the present application; fig. 5 is a schematic diagram of an interaction matrix according to an embodiment of the present application, and as shown in fig. 1 to 5, the emotion classification method includes the following steps:
entity vector acquisition step S1: obtaining an angle matrix of the entity angle from the named entity recognition model, and converting the angle matrix into an entity vector;
interaction matrix acquisition step S2: splicing the angle matrix and the entity vector to obtain a spliced matrix, and scaling the spliced matrix to obtain an interaction matrix;
emotion classification acquisition step S3: performing Attention interaction on the interaction matrix, and calculating the emotion classification of each entity angle.
In an embodiment, the entity vector obtaining step S1 includes, after predefining an entity angle class to be focused, obtaining the angle matrix from the named entity recognition model according to the entity angle, and converting the angle matrix into the entity vector through a maximum pooling or average pooling operation.
In a specific implementation, the categories of entity angles of interest are predefined, such as cost performance, appearance, and performance, and the related emotion angle matrix is initialized; for example, when the number of angle categories is M and the dimension of the angle embedding is N, an M*N matrix is initialized. By default, the emotion classification has three categories (positive, neutral, negative). After the entity angles are predefined, the Span matrix of size L*N corresponding to the related entity is obtained from the Named Entity Recognition (NER) model according to the entity angle, and is converted into an entity vector representation of size 1*N through a maximum pooling or average pooling operation, where L is the entity length and N is the embedding dimension.
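The pooling conversion of step S1 can be illustrated with a toy span matrix. This sketch assumes nothing beyond what the paragraph above states: an L*N span matrix is reduced column-wise to a 1*N entity vector by either maximum or average pooling; the numeric values are arbitrary.

```python
# Convert an L*N span matrix (L: entity length, N: embedding dimension)
# into a 1*N entity vector, as in the entity vector acquisition step S1.
span_matrix = [
    [0.25, 1.0, 0.0, 0.5],    # token 1 of the entity span
    [0.75, 0.5, 0.5, 0.25],   # token 2 of the entity span
]
L = len(span_matrix)
N = len(span_matrix[0])

# Maximum pooling: keep the strongest activation per embedding dimension
max_pooled = [max(row[j] for row in span_matrix) for j in range(N)]

# Average pooling: mean activation per embedding dimension
avg_pooled = [sum(row[j] for row in span_matrix) / L for j in range(N)]

print(max_pooled)   # [0.75, 1.0, 0.5, 0.5]
print(avg_pooled)   # [0.5, 0.75, 0.25, 0.375]
```

Either pooling yields a fixed-size 1*N vector regardless of the entity length L, which is what allows entities of different token lengths to be spliced uniformly with the M*N angle matrix in step S2.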
In an embodiment, the interaction matrix obtaining step S2 includes:
a splice matrix acquisition step S21: splicing each entity angle in the angle matrix with the entity vector to obtain the splicing matrix of the entity angle;
an interaction matrix generation step S22: scaling the splicing matrix through a fully-connected neural network to generate the interaction matrix.
In a specific implementation, each angle component in the angle matrix is spliced with the entity vector to obtain the splicing matrix of the entity and its angles, and the splicing matrix is then scaled back to an M*N interaction matrix through a fully-connected neural network.
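The splice-then-scale operation of step S2 can be sketched as follows. The shapes (M*N angle matrix, 1*N entity vector, M*2N splicing matrix, M*N interaction matrix) come from the description above; the fully-connected layer's random weights and zero bias are placeholders for whatever the trained network would hold.

```python
import random

random.seed(1)
M, N = 3, 4                      # angle categories, embedding dimension
angle_matrix = [[random.random() for _ in range(N)] for _ in range(M)]
entity_vec = [random.random() for _ in range(N)]

# Splice: each 1*N angle row concatenated with the 1*N entity vector -> M*2N
splice_matrix = [row + entity_vec for row in angle_matrix]

# Scale: one fully-connected layer maps 2N back down to N, producing the
# M*N interaction matrix (weights are random placeholders, not trained)
W = [[random.gauss(0, 0.1) for _ in range(N)] for _ in range(2 * N)]
b = [0.0] * N
interaction_matrix = [
    [sum(row[i] * W[i][j] for i in range(2 * N)) + b[j] for j in range(N)]
    for row in splice_matrix
]

assert all(len(row) == 2 * N for row in splice_matrix)
assert len(interaction_matrix) == M
assert all(len(row) == N for row in interaction_matrix)
```

Concatenation followed by a learned down-projection is a standard way to let every angle row carry entity information at the original embedding width, ready for the attention step.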
In an embodiment, the emotion classification obtaining step S3 includes performing the Attention interaction on the interaction matrix, and calculating the emotion classification of the entity angle according to a correlation calculation formula.
In a specific implementation, Attention interaction is performed on the interaction matrix, and the emotion classification of each entity angle is calculated through the corresponding calculation formula, thereby obtaining the emotion classification of each entity angle.
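The text leaves the exact attention formula to the original figures, so the sketch below substitutes a standard scaled dot-product self-attention over the M angle rows, followed by a linear head onto the three default polarities; both the attention form and the random classifier weights are assumptions, not the patent's stated formula.

```python
import math
import random

random.seed(2)
M, N = 3, 4                      # angle categories, embedding dimension
interaction = [[random.random() for _ in range(N)] for _ in range(M)]

def softmax(xs):
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

# Self-attention across the M angle rows of the interaction matrix
scores = [[sum(q[j] * k[j] for j in range(N)) / math.sqrt(N)
           for k in interaction] for q in interaction]
weights = [softmax(row) for row in scores]
context = [[sum(weights[i][t] * interaction[t][j] for t in range(M))
            for j in range(N)] for i in range(M)]

# Linear head onto the three default polarities (placeholder weights)
LABELS = ["positive", "neutral", "negative"]
W_cls = [[random.gauss(0, 0.5) for _ in range(3)] for _ in range(N)]
predictions = []
for ctx in context:
    logits = [sum(ctx[j] * W_cls[j][c] for j in range(N)) for c in range(3)]
    predictions.append(LABELS[max(range(3), key=lambda c: logits[c])])

print(predictions)               # one polarity per predefined angle
```

The attention step lets each angle's representation absorb information from every other angle of the same entity before classification, which is the interaction the patent is after regardless of the exact formula used.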
Example two
Referring to fig. 6, fig. 6 is a schematic structural diagram of an emotion classification system according to the present invention. As shown in fig. 6, the emotion classification system of the present invention is applied to the emotion classification method described above, and includes:
entity vector acquisition unit 51: obtaining an angle matrix of the entity angle from the named entity recognition model, and converting the angle matrix into an entity vector;
interaction matrix acquisition unit 52: splicing the angle matrix and the entity vector to obtain a spliced matrix, and scaling the spliced matrix to obtain an interaction matrix;
emotion classification acquisition unit 53: performing Attention interaction on the interaction matrix, and calculating the emotion classification of each entity angle.
In an embodiment, after the entity angle categories of interest are predefined, the angle matrix is obtained from the named entity recognition model according to the entity angles and converted into the entity vector through a maximum pooling or average pooling operation; the entity vector is then obtained by the entity vector acquisition unit 51.
In an embodiment, the interaction matrix acquisition unit 52 includes:
splice matrix acquisition module 521: splicing each entity angle in the angle matrix with the entity vector to obtain the splicing matrix of the entity angle;
and the interaction matrix generation module 522 is used for generating the interaction matrix by scaling the splicing matrix through the fully-connected neural network.
In an embodiment, the Attention interaction is performed on the interaction matrix and the emotion classification of each entity angle is calculated through the corresponding calculation formula; the emotion classification of the entity angle is then obtained by the emotion classification acquisition unit 53.
Example III
Referring to fig. 7, a specific implementation of an electronic device is disclosed in this embodiment. The electronic device may include a processor 81 and a memory 82 storing computer program instructions.
In particular, the processor 81 may include a Central Processing Unit (CPU), or an application specific integrated circuit (Application Specific Integrated Circuit, abbreviated as ASIC), or may be configured to implement one or more integrated circuits of embodiments of the present application.
Memory 82 may include mass storage for data or instructions. By way of example, and not limitation, memory 82 may comprise a Hard Disk Drive (HDD), a floppy disk drive, a Solid State Drive (SSD), flash memory, an optical disk, a magneto-optical disk, tape, a Universal Serial Bus (USB) drive, or a combination of two or more of the foregoing. The memory 82 may include removable or non-removable (or fixed) media, where appropriate. The memory 82 may be internal or external to the data monitoring apparatus, where appropriate. In a particular embodiment, the memory 82 is a Non-Volatile memory. In a particular embodiment, the memory 82 includes Read-Only Memory (ROM) and Random Access Memory (RAM). Where appropriate, the ROM may be a mask-programmed ROM, a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), an Electrically Alterable ROM (EAROM), or FLASH memory, or a combination of two or more of these. Where appropriate, the RAM may be Static Random-Access Memory (SRAM) or Dynamic Random-Access Memory (DRAM), where the DRAM may be Fast Page Mode DRAM (FPM DRAM), Extended Data Out DRAM (EDO DRAM), Synchronous DRAM (SDRAM), or the like.
Memory 82 may be used to store or cache various data files that need to be processed and/or communicated, as well as possible computer program instructions for execution by processor 81.
The processor 81 implements any of the emotion classification methods in the above embodiments by reading and executing computer program instructions stored in the memory 82.
In some of these embodiments, the electronic device may also include a communication interface 83 and a bus 80. As shown in fig. 7, the processor 81, the memory 82, and the communication interface 83 are connected to each other through the bus 80 and perform communication with each other.
The communication interface 83 is used to implement communication between the modules, apparatuses, units, and/or devices in the embodiments of the present application. The communication interface 83 may also enable data communication with other components, such as external devices, image/abnormal-data monitoring devices, databases, external storage, and image/abnormal-data monitoring workstations.
Bus 80 includes hardware, software, or both, coupling the components of the electronic device to one another. Bus 80 includes, but is not limited to, at least one of: a data bus, an address bus, a control bus, an expansion bus, or a local bus. By way of example, and not limitation, bus 80 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Extended Industry Standard Architecture (EISA) bus, a Front Side Bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association Local Bus (VLB), or another suitable bus, or a combination of two or more of the foregoing. Bus 80 may include one or more buses, where appropriate. Although the embodiments of the present application describe and illustrate a particular bus, the present application contemplates any suitable bus or interconnect.
The electronic device may be connected to an emotion classification system to implement the method in connection with fig. 1-5.
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of these technical features have been described; nevertheless, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this description.
In summary, compared with the current fine-grained target-level emotion analysis, the invention provides a finer-grained target-entity-angle-oriented emotion analysis method, which can better deeply mine emotion acting on different angles of an entity.
The above examples represent only a few embodiments of the present application; their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that those skilled in the art could make various modifications and improvements without departing from the spirit of the present application, and these would fall within the protection scope of the present application. The scope of protection of the present application is therefore defined by the appended claims.
Claims (6)
1. An emotion classification method, comprising:
an entity vector acquisition step: obtaining an angle matrix of the entity angle from the named entity recognition model, and converting the angle matrix into an entity vector;
an interaction matrix acquisition step: splicing the angle matrix and the entity vector to obtain a splicing matrix, and scaling the splicing matrix to obtain an interaction matrix;
the emotion classification acquisition step: performing Attention interaction on the interaction matrix, and calculating to obtain emotion classification of the entity angle;
the entity vector acquisition step comprises: after predefining the entity-angle categories of interest, obtaining the angle matrix from the named entity recognition model according to the entity angles, and converting the angle matrix into the entity vector through a maximum-pooling or average-pooling operation;
the step of obtaining the interaction matrix comprises the following steps:
a splicing matrix acquisition step: splicing each entity angle in the angle matrix with the entity vector to obtain the splicing matrix of the entity angle;
an interaction matrix generation step: scaling the splicing matrix through a fully connected neural network to generate the interaction matrix.
2. The emotion classification method of claim 1, wherein said emotion classification acquisition step includes performing said Attention interaction on said interaction matrix, and calculating said emotion classification of said entity angle by a correlation calculation formula.
3. An emotion classification system suitable for use in the emotion classification method of any one of claims 1 to 2, said emotion classification system comprising:
an entity vector acquisition unit, configured to obtain an angle matrix of the entity angle from the named entity recognition model and convert the angle matrix into an entity vector;
an interaction matrix acquisition unit, configured to splice the angle matrix and the entity vector to obtain a splicing matrix, and to scale the splicing matrix to obtain an interaction matrix;
an emotion classification acquisition unit, configured to perform Attention interaction on the interaction matrix and calculate the emotion classification of the entity angle;
wherein the entity vector acquisition unit is configured to: after the entity-angle categories of interest are predefined, obtain the angle matrix from the named entity recognition model according to the entity angles, and convert the angle matrix into the entity vector through a maximum-pooling or average-pooling operation;
wherein the interaction matrix acquisition unit includes:
a splicing matrix acquisition module: splicing each entity angle in the angle matrix with the entity vector to obtain the splicing matrix of the entity angle;
an interaction matrix generation module, configured to scale the splicing matrix through the fully connected neural network to generate the interaction matrix.
4. The emotion classification system of claim 3, wherein said emotion classification acquisition unit performs said Attention interaction on said interaction matrix and calculates said emotion classification of said entity angle by a correlation calculation formula.
5. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the emotion classification method of any of claims 1-2 when executing the computer program.
6. An electronic device readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the emotion classification method of any of claims 1 to 2.
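The "Attention interaction" of claims 1 and 2 can be sketched as scaled dot-product self-attention over the interaction matrix, followed by a classifier head. The shapes, the softmax-based attention form, and the three-way polarity head are assumptions for illustration; the patent does not disclose the exact correlation calculation formula:

```python
# Hedged sketch: self-attention over the interaction matrix, then a linear
# classifier producing one emotion distribution per entity angle.
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(1)
num_angles, hidden, num_classes = 3, 4, 3

M = rng.normal(size=(num_angles, hidden))              # interaction matrix (from the previous step)
scores = softmax(M @ M.T / np.sqrt(hidden), axis=-1)   # attention weights between angles
attended = scores @ M                                  # attention-weighted angle features

Wc = rng.normal(size=(hidden, num_classes))            # classifier head (assumed 3 polarities)
probs = softmax(attended @ Wc, axis=-1)                # per-angle emotion distribution
labels = probs.argmax(axis=-1)                         # one emotion class per entity angle

print(labels.shape)  # (3,)
```

Each row of `probs` sums to 1, so the argmax yields one emotion classification per predefined entity angle, matching the per-angle output the claims describe.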
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111043917.6A CN113762381B (en) | 2021-09-07 | 2021-09-07 | Emotion classification method, system, electronic equipment and medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113762381A (en) | 2021-12-07
CN113762381B (en) | 2023-12-19
Family
ID=78793416
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111043917.6A Active CN113762381B (en) | 2021-09-07 | 2021-09-07 | Emotion classification method, system, electronic equipment and medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113762381B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109213868A (en) * | 2018-11-21 | 2019-01-15 | 中国科学院自动化研究所 | Entity level sensibility classification method based on convolution attention mechanism network |
CN109948165A (en) * | 2019-04-24 | 2019-06-28 | 吉林大学 | Fine granularity feeling polarities prediction technique based on mixing attention network |
CN110704622A (en) * | 2019-09-27 | 2020-01-17 | 北京明略软件系统有限公司 | Text emotion classification method and device and electronic equipment |
CN112231447A (en) * | 2020-11-21 | 2021-01-15 | 杭州投知信息技术有限公司 | Method and system for extracting Chinese document events |
CN112232070A (en) * | 2020-10-20 | 2021-01-15 | 北京明略昭辉科技有限公司 | Natural language processing model construction method, system, electronic device and storage medium |
CN112256866A (en) * | 2020-09-25 | 2021-01-22 | 东北大学 | Text fine-grained emotion analysis method based on deep learning |
CN112732920A (en) * | 2021-01-15 | 2021-04-30 | 北京明略昭辉科技有限公司 | BERT-based multi-feature fusion entity emotion analysis method and system |
Also Published As
Publication number | Publication date |
---|---|
CN113762381A (en) | 2021-12-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6661790B2 (en) | Method, apparatus and device for identifying text type | |
JP6601470B2 (en) | NATURAL LANGUAGE GENERATION METHOD, NATURAL LANGUAGE GENERATION DEVICE, AND ELECTRONIC DEVICE | |
CN111159409B (en) | Text classification method, device, equipment and medium based on artificial intelligence | |
JP2019519019A5 (en) | ||
WO2020147409A1 (en) | Text classification method and apparatus, computer device, and storage medium | |
CN115982376B (en) | Method and device for training model based on text, multimode data and knowledge | |
CN111241813B (en) | Corpus expansion method, apparatus, device and medium | |
CN112749300B (en) | Method, apparatus, device, storage medium and program product for video classification | |
US20230206670A1 (en) | Semantic representation of text in document | |
CN111767714B (en) | Text smoothness determination method, device, equipment and medium | |
CN111950279A (en) | Entity relationship processing method, device, equipment and computer readable storage medium | |
CN112183102A (en) | Named entity identification method based on attention mechanism and graph attention network | |
CN113434683A (en) | Text classification method, device, medium and electronic equipment | |
CN110909768B (en) | Method and device for acquiring marked data | |
CN112232070A (en) | Natural language processing model construction method, system, electronic device and storage medium | |
CN112926308B (en) | Method, device, equipment, storage medium and program product for matching text | |
CN114048288A (en) | Fine-grained emotion analysis method and system, computer equipment and storage medium | |
CN113723077A (en) | Sentence vector generation method and device based on bidirectional characterization model and computer equipment | |
CN113569118A (en) | Self-media pushing method and device, computer equipment and storage medium | |
CN116561320A (en) | Method, device, equipment and medium for classifying automobile comments | |
CN113762381B (en) | Emotion classification method, system, electronic equipment and medium | |
CN117195886A (en) | Text data processing method, device, equipment and medium based on artificial intelligence | |
CN108021609B (en) | Text emotion classification method and device, computer equipment and storage medium | |
CN114091451A (en) | Text classification method, device, equipment and storage medium | |
CN113255334A (en) | Method, system, electronic device and storage medium for calculating word vector |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||