CN113076753A - Emotion analysis model training optimization method, system and storage medium - Google Patents


Info

Publication number
CN113076753A
CN113076753A (application CN202110236422.9A)
Authority
CN
China
Prior art keywords
text
model
matching degree
emotion
emotional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110236422.9A
Other languages
Chinese (zh)
Inventor
辛永欣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Yingxin Computer Technology Co Ltd
Original Assignee
Shandong Yingxin Computer Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Yingxin Computer Technology Co Ltd filed Critical Shandong Yingxin Computer Technology Co Ltd
Priority to CN202110236422.9A priority Critical patent/CN113076753A/en
Publication of CN113076753A publication Critical patent/CN113076753A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 40/00 Handling natural language data
    • G06F 40/30 Semantic analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Biophysics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses an emotion analysis model training optimization method comprising the following steps: first, acquiring a text and an emotion label and inputting them into the self-attention mechanism module of the model; second, performing feature fusion on the text and the emotion label through the self-attention mechanism module to obtain a fusion representation identifier; third, inputting the fusion representation identifier into the multilayer perceptron module of the model and performing a calculation to obtain the matching degree between the text and the emotion label, then optimizing the loss function based on the matching degree until the model reaches a convergence state, yielding an optimized model; fourth, performing an emotion analysis operation on the input text to be predicted through the optimized model. In this way, the invention realizes emotion analysis of text and improves analysis accuracy.

Figure 202110236422

Description

Emotion analysis model training optimization method, system and storage medium
Technical Field
The invention relates to the technical field of emotion analysis, in particular to an emotion analysis model training optimization method, system and storage medium.
Background
Sentiment analysis (emotion analysis) refers to the process of analyzing, processing, and extracting subjective text with emotional coloring using natural language processing and text mining technologies. At present, text sentiment analysis research spans many fields, including natural language processing, text mining, information retrieval, information extraction, machine learning, and ontology; it has attracted the attention of many scholars and research institutions and has remained one of the hot topics in natural language processing and text mining research in recent years.
Most existing emotion analysis solutions treat the task as multi-class classification: after the text to be analyzed is input, the sentence is represented as a vector by a BERT model, multi-class prediction is performed on that vector to output the probabilities of positive, neutral, and negative emotions, and the emotion category with the highest probability is taken as the output. The defect of this prior art is that only the input text information is used in the representation stage; the emotion labels serve only as the basis for probability calculation in the final output layer and lack a representation of their own, so the emotion analysis is not accurate enough.
Disclosure of Invention
The invention mainly solves the technical problem of providing an emotion analysis model training optimization method, system, and storage medium, which can solve the problem of inaccurate emotion analysis caused by the lack of a representation for the emotion labels.
In order to solve the technical problems, the invention adopts a technical scheme that: an emotion analysis model training optimization method is provided, wherein the model comprises a self-attention mechanism module and a multi-layer perceptron module, and the method comprises the following steps:
firstly, acquiring a text and an emotion label, and inputting the text and the emotion label into the self-attention mechanism module of the model;
secondly, performing feature fusion on the text and the emotion label through the self-attention mechanism module to obtain a fusion representation identifier;
inputting the fusion representation identification into the multilayer perceptron module and calculating to obtain the matching degree of the text and the emotion label; optimizing a loss function based on the matching degree to enable the model to reach a convergence state, so as to obtain an optimized model;
and fourthly, performing emotion analysis on the input text to be predicted through the optimization model.
As an improvement, the obtaining the text and the emotion label further comprises the following steps:
the text and emotion labels are concatenated using separators.
Further specifically, the text comprises at least one text and the emotion tag comprises at least one emotion tag.
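The separator-based concatenation above can be sketched as follows. This illustrative Python snippet builds one "[CLS] text [SEP] label [SEP]" input string per candidate emotion label; the "[CLS]"/"[SEP]" token strings follow the usual BERT convention, and the three-label set is an assumption taken from the embodiment. A real implementation would use a BERT tokenizer's sentence-pair encoding rather than raw string formatting.

```python
# Sketch of the input construction: each sample pairs the text with one
# candidate emotion label, joined by BERT-style separator tokens.
# The label set below is an assumption based on the embodiment.
LABELS = ["positive", "neutral", "negative"]

def build_samples(text: str, labels=LABELS):
    """Return one '[CLS] text [SEP] label [SEP]' string per candidate label."""
    return [f"[CLS] {text} [SEP] {label} [SEP]" for label in labels]

samples = build_samples("I like sunny days")
```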
As an improvement, the feature fusion is performed on the text and the emotion tag through the self-attention mechanism module to obtain a fusion representation identifier, further comprising the following steps:
performing feature fusion on the text through the self-attention mechanism module;
performing feature fusion on the emotion label through the self-attention mechanism module;
and performing feature fusion on the text subjected to feature fusion and the emotion label subjected to feature fusion through the self-attention mechanism module to obtain the fusion representation identifier.
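The three fusion steps above rely on self-attention letting every position attend over the combined text-and-label sequence. The following single-head, unscaled sketch in plain Python is illustrative only (the actual module is BERT's multi-head self-attention with learned query/key/value projections); it shows how each fused vector becomes a weighted mixture of text and label embeddings:

```python
import math

def self_attention(vectors):
    """Minimal single-head self-attention: each output mixes all inputs."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    fused = []
    for q in vectors:
        scores = [dot(q, k) for k in vectors]          # attention logits
        m = max(scores)
        weights = [math.exp(s - m) for s in scores]    # numerically stable softmax
        z = sum(weights)
        weights = [w / z for w in weights]
        # each fused vector is a convex combination of all positions
        fused.append([sum(w * v[i] for w, v in zip(weights, vectors))
                      for i in range(len(q))])
    return fused

# e.g. one "text" embedding and one "label" embedding (toy 2-d vectors):
fused = self_attention([[1.0, 0.0], [0.0, 1.0]])
```

After fusion, the first output vector carries some of the label embedding and vice versa, which is the mixing effect the method depends on.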
As an improvement, the optimizing the loss function based on the matching degree to make the model reach a convergence state further includes the following steps:
and optimizing a loss function by adopting a gradient back-transmission algorithm based on the matching degree to enable the model to reach a convergence state.
As an improvement, the emotion analysis is performed on the input text to be predicted through the optimization model, and the method further includes the following steps:
acquiring a text to be predicted and a plurality of emotion labels, and outputting the matching degree of the text to be predicted and each emotion label by the optimization model;
and arranging the matching degrees of the text to be predicted and each emotion label in a descending order, and outputting the emotion label corresponding to the first sorted matching degree as the emotion label of the text to be predicted.
The invention also provides an emotion analysis model training optimization system, wherein the model comprises a self-attention mechanism module and a multilayer perceptron module, and the emotion analysis model training optimization system comprises:
a feature fusion unit: used for performing feature fusion on the acquired text and emotion labels through the self-attention mechanism module to obtain the fusion representation identifier;
a model training unit: used for inputting the fusion representation identifier into the multilayer perceptron module and performing a calculation to obtain the matching degree between the text and the emotion label, and for optimizing the loss function with a gradient back-propagation algorithm based on the matching degree so that the model reaches a convergence state, yielding the optimized model;
an emotion analysis unit: used for acquiring a text to be predicted and a plurality of emotion labels, outputting through the optimized model the matching degree between the text to be predicted and each emotion label, arranging those matching degrees in descending order, and outputting the emotion label corresponding to the first-ranked matching degree.
The invention also provides a computer storage medium storing computer software instructions for the above emotion analysis model training optimization method, including a program for executing the emotion analysis model training optimization method.
The invention has the beneficial effects that:
1. according to the emotion analysis model training optimization method, the emotion labels and the text are input into the BERT model together, and feature fusion is performed, so that emotion analysis on the text is more accurate.
2. The emotion analysis model training optimization system provided by the invention adopts the feature fusion unit to enable the text information to comprise emotion label information, so that the input information is more sufficient, and the emotion analysis is more accurate.
3. The computer storage medium provided by the invention realizes emotion analysis of text by executing the emotion analysis model training optimization method.
Drawings
In order to more clearly illustrate the detailed description of the invention or the technical solutions in the prior art, the drawings needed in the detailed description or the prior art description are briefly introduced below; throughout the drawings, like elements or portions are generally identified by like reference numerals; in the drawings, elements or portions are not necessarily drawn to scale.
FIG. 1 is a flowchart of an emotion analysis model training optimization method according to embodiment 1 of the present invention;
FIG. 2 is a schematic view of feature fusion according to example 1 of the present invention;
FIG. 3 is a schematic diagram of an emotion analysis model training optimization system according to embodiment 2 of the present invention.
The parts in the drawings are numbered as follows:
1-feature fusion unit, 2-model training unit, 3-emotion analysis unit and 100-emotion analysis model training optimization system.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it is obvious that the described embodiments are some, not all embodiments of the present invention; all other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it is to be noted that, unless otherwise explicitly specified or limited, the term "connected" is to be interpreted broadly, e.g. as a fixed connection, a detachable connection, or an integral connection; can be mechanically or electrically connected; the two elements can be directly connected or indirectly connected through an intermediate medium, or the two elements can be communicated with each other, or the two elements can be in wireless connection or wired connection; the specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
In the description of the present invention, the following terms should be noted: MLP (Multi-Layer Perceptron) is the multilayer perceptron module, and BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based bidirectional encoder representation, a pre-trained language representation model.
Example 1
This embodiment 1 provides an emotion analysis model training optimization method, as shown in fig. 1, including the following steps:
In step S100, the text X and the emotion label L are connected using a separator and input into the BERT model; the BERT model comprises a self-attention mechanism module and a multilayer perceptron module. A text X and an emotion label L are combined into one sample; because the scores of three emotion labels L are used during optimization training, three samples are input and placed in the same group. In this embodiment, the input samples are arranged as follows:
sample 1: <'I like sunny days', 'positive'>;
sample 2: <'I like sunny days', 'negative'>;
sample 3: <'I like sunny days', 'neutral'>.
In step S200, as shown in fig. 2, the self-attention mechanism module of the BERT model fully fuses the features within the text X, within the emotion label L, and between the text X and the emotion label L to obtain the fusion representation identifier CLS, so that the text X information is fused with the emotion label L information and the emotion label L information is likewise fused with the text X information. Through the multilayer fused representation of the BERT model, the output features of each coding layer are fused, key inter-layer semantic features are extracted through convolutional layers, the influence of redundant information is reduced, and the information learned by each coding layer is fully utilized; finally, the fusion representation identifier CLS fuses the text X information with the emotion label L information.
the fused representation identifier CLS may be represented as:
H[CLS]=BERT(X,L)。
In step S300, the fusion representation identifier CLS is input into the multilayer perceptron module (MLP) of the BERT model and a calculation is performed to obtain the matching degree (Score) between the text X and the emotion label L; the matching degree calculation is expressed as:
Score=MLP(H[CLS]);
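The scoring step Score = MLP(H[CLS]) can be sketched as a one-hidden-layer perceptron mapping the fused CLS vector to a scalar matching degree. The sizes, hand-picked weights, and tanh activation below are illustrative assumptions, not the patent's actual architecture:

```python
import math

def mlp_score(h_cls, w1, b1, w2, b2):
    """One-hidden-layer perceptron: h_cls (list of floats) -> scalar score."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, h_cls)) + b)
              for row, b in zip(w1, b1)]
    return sum(w * h for w, h in zip(w2, hidden)) + b2

# Toy 2-d CLS vector and hand-picked weights, for illustration only:
score = mlp_score([1.0, 0.0],
                  w1=[[1.0, 0.0], [0.0, 1.0]], b1=[0.0, 0.0],
                  w2=[1.0, 1.0], b2=0.0)
```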
The loss function is then optimized with a gradient back-propagation algorithm according to the matching degree so that the BERT model reaches a convergence state, yielding the optimized BERT model; the loss function is expressed as:
(The loss function formula is provided only as an image in the original filing: Figure RE-GDA0003056157570000061.)
wherein Batch_size is the number of input samples, S_l is the matching degree between the text X and the correct emotion label L, and S_a and S_b are the matching degrees between the text X and the other two emotion labels L. Minimizing the Loss maximizes the gap between the matching degree of the text X with the correct emotion label L and its matching degrees with the wrong emotion labels L, thereby improving the BERT model's score for the correct emotion label L.
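Since the exact loss formula appears only as an image in the filing, the following is a hedged sketch of one loss consistent with the description: a softmax cross-entropy over the three group scores, minimized when the correct label's score S_l exceeds the wrong labels' scores S_a and S_b. This specific form is an assumption, not necessarily the patented formula:

```python
import math

def group_loss(s_l, s_a, s_b):
    """Cross-entropy of the correct label under a softmax over the 3 scores.
    Assumed form: decreases as s_l grows relative to s_a and s_b."""
    z = math.exp(s_l) + math.exp(s_a) + math.exp(s_b)
    return -math.log(math.exp(s_l) / z)

def batch_loss(groups):
    # groups: one (S_l, S_a, S_b) tuple per text in the batch (Batch_size groups)
    return sum(group_loss(*g) for g in groups) / len(groups)
```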
In step S400, the positive, neutral, and negative emotion labels L are each paired with the text X and input into the optimized BERT model to calculate the matching degree of each emotion label L with the text X; the emotion label L with the highest matching degree is then output as the emotion label L of the text X. In this embodiment, to predict the emotional tendency of 'I feel good on sunny days', there are three inputs:
input 1: <'I feel good on sunny days', 'positive'>;
input 2: <'I feel good on sunny days', 'neutral'>;
input 3: <'I feel good on sunny days', 'negative'>.
The three inputs are fused in the BERT model to obtain fusion representation identifiers CLS, which are input into the MLP (multilayer perceptron) to calculate the matching degrees, yielding [0.8, 0.2, 0.1]; input 1 has the highest matching degree, so the emotional tendency of 'I feel good on sunny days' is output as 'positive'.
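The prediction flow of this embodiment can be sketched as follows. Here `score_fn` stands in for the full BERT-plus-MLP pipeline, and the hard-coded scores mirror the [0.8, 0.2, 0.1] example above; both are assumptions made for illustration:

```python
def predict_emotion(text, labels, score_fn):
    """Score the text against every candidate label, return the best match."""
    scores = {label: score_fn(text, label) for label in labels}
    best = max(scores, key=scores.get)
    return best, scores

# Hypothetical matching degrees standing in for the optimized model's outputs:
demo_scores = {"positive": 0.8, "neutral": 0.2, "negative": 0.1}
best, scores = predict_emotion("I feel good on sunny days",
                               list(demo_scores),
                               lambda text, label: demo_scores[label])
```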
Example 2
The embodiment 2 provides an emotion analysis model training optimization system, wherein a BERT model comprises a self-attention mechanism module and a multilayer perceptron module; as shown in fig. 3, the emotion analysis model training optimization system 100 includes:
feature fusion unit 1: used for performing feature fusion on the acquired text and emotion labels through the self-attention mechanism module of the BERT model to obtain the fusion representation identifier, so that the text information is fused with the emotion label information and the emotion label information is fused with the text information; the fusion representation identifier thus fuses both the text information and the emotion label information;
model training unit 2: used for inputting the fusion representation identifier into the multilayer perceptron module (MLP) of the BERT model and performing a calculation to obtain the matching degree between the text and the emotion label, and for optimizing the loss function with a gradient back-propagation algorithm based on the matching degree so that the BERT model reaches a convergence state, yielding the optimized BERT model;
emotion analysis section 3: the method comprises the steps that a text to be predicted and a plurality of emotion labels are input into an optimized BERT model, and the optimized BERT model calculates the matching degree of the text to be predicted and each emotion label; and comparing the matching degree of each emotion label with the text to be predicted, outputting the emotion label corresponding to the highest matching degree as the emotion label of the text to be predicted, and realizing emotion analysis of the text to be predicted.
Example 3
This embodiment 3 provides a computer-readable storage medium, storing computer software instructions for implementing the emotion analysis model training optimization method described in embodiment 1 above, and comprising a program designed for executing the emotion analysis model training optimization method; specifically, the executable program may be built in the emotion analysis model training optimization system 100, so that the emotion analysis model training optimization system 100 may implement the emotion analysis model training optimization method of embodiment 1 by executing the built-in executable program.
Furthermore, the computer-readable storage medium provided by the present embodiments may take any combination of one or more readable storage media, where a readable storage medium includes an electronic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof.
The serial numbers of the embodiments disclosed in the above embodiments are merely for description and do not represent the merits of the embodiments.
The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes performed by the present specification and drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (8)

1. An emotion analysis model training optimization method, characterized in that the model comprises a self-attention mechanism module and a multilayer perceptron module, and the method comprises the following steps: acquiring a text and an emotion label, and inputting them into the self-attention mechanism module of the model; performing feature fusion on the text and the emotion label through the self-attention mechanism module to obtain a fusion representation identifier; inputting the fusion representation identifier into the multilayer perceptron module and performing a calculation to obtain the matching degree between the text and the emotion label; optimizing a loss function based on the matching degree so that the model reaches a convergence state, to obtain an optimized model; and performing an emotion analysis operation on an input text to be predicted through the optimized model.

2. The emotion analysis model training optimization method according to claim 1, characterized in that performing feature fusion on the text and the emotion label through the self-attention mechanism module to obtain the fusion representation identifier further comprises the following steps: performing feature fusion on the text through the self-attention mechanism module; performing feature fusion on the emotion label through the self-attention mechanism module; and performing feature fusion on the feature-fused text and the feature-fused emotion label through the self-attention mechanism module to obtain the fusion representation identifier.

3. The emotion analysis model training optimization method according to claim 1, characterized in that optimizing the loss function based on the matching degree so that the model reaches a convergence state further comprises the following step: optimizing the loss function with a gradient back-propagation algorithm based on the matching degree so that the model reaches a convergence state.

4. The emotion analysis model training optimization method according to claim 1, characterized in that performing the emotion analysis operation on the input text to be predicted through the optimized model further comprises the following steps: acquiring a text to be predicted and a plurality of emotion labels, the optimized model outputting the matching degree between the text to be predicted and each emotion label; and arranging the matching degrees between the text to be predicted and each emotion label in descending order, and outputting the emotion label corresponding to the first-ranked matching degree as the emotion label of the text to be predicted.

5. The emotion analysis model training optimization method according to claim 1, characterized in that acquiring the text and the emotion label further comprises the following step: connecting the text and the emotion label using a separator.

6. The emotion analysis model training optimization method according to claim 5, characterized in that the text comprises at least one text and the emotion label comprises at least one emotion label.

7. An emotion analysis model training optimization system, characterized in that the model comprises a self-attention mechanism module and a multilayer perceptron module, and the system comprises: a feature fusion unit, used for performing feature fusion on the acquired text and emotion labels through the self-attention mechanism module to obtain a fusion representation identifier; a model training unit, used for inputting the fusion representation identifier into the multilayer perceptron module and performing a calculation to obtain the matching degree between the text and the emotion label, and for optimizing a loss function with a gradient back-propagation algorithm based on the matching degree so that the model reaches a convergence state, to obtain an optimized model; and an emotion analysis unit, used for acquiring a text to be predicted and a plurality of emotion labels, outputting through the optimized model the matching degree between the text to be predicted and each emotion label, arranging those matching degrees in descending order, and outputting the emotion label corresponding to the first-ranked matching degree.

8. A computer storage medium, characterized in that it stores computer software instructions used for the emotion analysis model training optimization method according to any one of claims 1 to 6, comprising a program designed for executing the emotion analysis model training optimization method.
CN202110236422.9A 2021-03-03 2021-03-03 Emotion analysis model training optimization method, system and storage medium Pending CN113076753A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110236422.9A CN113076753A (en) 2021-03-03 2021-03-03 Emotion analysis model training optimization method, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110236422.9A CN113076753A (en) 2021-03-03 2021-03-03 Emotion analysis model training optimization method, system and storage medium

Publications (1)

Publication Number Publication Date
CN113076753A true CN113076753A (en) 2021-07-06

Family

ID=76609834

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110236422.9A Pending CN113076753A (en) 2021-03-03 2021-03-03 Emotion analysis model training optimization method, system and storage medium

Country Status (1)

Country Link
CN (1) CN113076753A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109492101A (en) * 2018-11-01 2019-03-19 山东大学 File classification method, system and medium based on label information and text feature
US20190163742A1 (en) * 2017-11-28 2019-05-30 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for generating information
CN111680154A (en) * 2020-04-13 2020-09-18 华东师范大学 An attribute-level sentiment analysis method for review text based on deep learning
CN112307757A (en) * 2020-10-28 2021-02-02 中国平安人寿保险股份有限公司 Emotion analysis method, device and equipment based on auxiliary task and storage medium


Similar Documents

Publication Publication Date Title
CN110210037B (en) Syndrome-oriented medical field category detection method
Mukhtar et al. Urdu sentiment analysis using supervised machine learning approach
CN107239444B (en) A kind of term vector training method and system merging part of speech and location information
WO2018028077A1 (en) Deep learning based method and device for chinese semantics analysis
CN110188202A (en) Training method, device and the terminal of semantic relation identification model
CN112862569B (en) Product appearance style evaluation method and system based on image and text multi-modal data
CN115017266B (en) A scene text retrieval model, method and computer device based on text detection and semantic matching
CN117151222B (en) Domain knowledge-guided emergency case entity attribute and relationship extraction method, electronic device and storage medium
CN109492105B (en) Text emotion classification method based on multi-feature ensemble learning
CN112784601B (en) Key information extraction method, device, electronic equipment and storage medium
CN113705238A (en) Method and model for analyzing aspect level emotion based on BERT and aspect feature positioning model
CN110750974A (en) Structured processing method and system for referee document
CN114528374B (en) A method and device for sentiment classification of movie reviews based on graph neural network
CN110263165A (en) A kind of user comment sentiment analysis method based on semi-supervised learning
CN114356990A (en) Base named entity recognition system and method based on transfer learning
CN116108840A (en) Text fine granularity emotion analysis method, system, medium and computing device
CN114756678A (en) Unknown intention text identification method and device
Illahi et al. Ensemble machine learning approach for stress detection in social media texts
Shabbir et al. Sentiment analysis from Urdu language-based text using deep learning techniques
CN112836056A (en) A text classification method based on network feature fusion
CN113076753A (en) Emotion analysis model training optimization method, system and storage medium
CN115440330B (en) A named entity recognition method for Chinese electronic medical records based on active learning
CN118093888A (en) Knowledge-graph-based motor equipment fault intelligent diagnosis method
CN111914084A (en) Deep learning-based emotion label text generation and evaluation system
CN115640374A (en) Sentence-level relationship extraction method and device based on deep feature fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210706)