CN108595601A - A kind of long text sentiment analysis method incorporating Attention mechanism - Google Patents

A kind of long text sentiment analysis method incorporating Attention mechanism Download PDF

Info

Publication number
CN108595601A
CN108595601A CN201810357279.7A
Authority
CN
China
Prior art keywords
hidden layer
hidden
text
weight matrix
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810357279.7A
Other languages
Chinese (zh)
Inventor
郑相涵
郑文妃
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fuzhou University
Original Assignee
Fuzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuzhou University filed Critical Fuzhou University
Priority to CN201810357279.7A priority Critical patent/CN108595601A/en
Publication of CN108595601A publication Critical patent/CN108595601A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/40Processing or translation of natural language
    • G06F40/58Use of machine translation, e.g. for multi-lingual retrieval, for server-side translation for client devices or for real-time translation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Machine Translation (AREA)

Abstract

The present invention relates to a long text sentiment analysis method incorporating an Attention mechanism: a text sentiment classification model (the Bi-Attention model) is built from a bidirectional gated recurrent neural network combined with an Attention mechanism. Attention lets the neural network focus on the important information in a text while ignoring or down-weighting secondary information, which reduces the complexity of processing long texts. In addition, a context vector is generated by the bidirectional gated recurrent units and used to update the memory state, so that the influence of both historical and future information on the semantics is fully taken into account.

Description

A long text sentiment analysis method incorporating an Attention mechanism
Technical field
The present invention relates to a long text sentiment analysis method incorporating an Attention mechanism.
Background technology
Text sentiment analysis (also known as opinion mining) uses methods from natural language processing, text mining, and computational linguistics to identify and extract subjective information from source material. Such subjective texts grow at an exponential rate every day, and automatically analyzing the sentiment they express by computer has become a current hot topic in academia. Most comment data in today's social networks is short text, whereas longer sequences or full documents may contain rich sentiment information as well as information irrelevant to the sentiment analysis at hand.
Short text comments in social networks are usually analyzed with unidirectional neural networks. Because a unidirectional network considers only history and the current input, it cannot model future information; and as text sequences grow longer, training a traditional unidirectional LSTM network cannot effectively solve the long-range dependency problem, so its ability to capture contextual information is limited. This motivated the bidirectional recurrent neural network (Bi-directional Recurrent Neural Network, Bi-RNN).
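The bidirectional idea can be sketched in a few lines: run any recurrence once over the sequence and once over its reversal, then concatenate the two hidden states at each position, so every word sees both its history and its future. A minimal NumPy illustration (the function name and the toy recurrence used below are illustrative, not the patent's network):

```python
import numpy as np

def bidirectional_encode(X, step_fwd, step_bwd, d):
    """Sketch of the Bi-RNN idea: one recurrence runs over history
    (left to right), a second over the future (right to left), and the
    two hidden states are concatenated at each position.
    step_fwd / step_bwd are any recurrence functions (h_prev, x_t) -> h_t."""
    T = len(X)
    h_f = np.zeros(d)
    h_b = np.zeros(d)
    fwd = []
    bwd = [None] * T
    for t in range(T):                     # forward: history -> present
        h_f = step_fwd(h_f, X[t])
        fwd.append(h_f)
    for t in reversed(range(T)):           # backward: future -> present
        h_b = step_bwd(h_b, X[t])
        bwd[t] = h_b
    return [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]
```

Each output vector has twice the hidden dimension; in the Bi-Attention model these concatenated states are what the attention layer pools over.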
The attention mechanism (Attention Mechanism) is inspired by the way human vision concentrates on specific parts of an image. It was first applied in the field of visual imaging and later in neural machine translation (Neural Machine Translation, NMT) within natural language processing. In translation, the attention mechanism is introduced into the Encoder-Decoder framework: the source and target languages are associated through a perceptron-style formula and aligned one by one, yielding a probability distribution. Beyond its outstanding results in machine translation, attention has also been applied successfully to image/video annotation, reading comprehension, and other tasks.
Summary of the invention
In view of this, the purpose of the present invention is to provide a long text sentiment analysis method incorporating an Attention mechanism, used to analyze the emotional polarity of long text data in social networks.
To achieve the above object, the present invention adopts the following technical scheme that:
A long text sentiment analysis method incorporating an Attention mechanism, characterized in that:
Step S1: Initialize the words in the text with word vectors, mapping the text to a set of word vectors; this word-vector set serves as the input of the network and is fed into the hidden layer;
Step S2: Introduce GRU units in the hidden layer to compute hidden states, add a reverse network, and model with bidirectional gated recurrent units so that each word vector in the set obtains the information of its context, yielding the current hidden state h_t of the hidden layer;
Step S3: Add an attention mechanism to the current hidden state h_t of the hidden layer, automatically determining the parts of the input text that need attention by weighting, and obtain the probability distribution of the sentence vector S;
Step S4: From the probability distribution of the sentence vector S, judge the probability distribution over sentiment categories with a fully connected layer and a softmax function; this distribution enables the model to better characterize long texts and capture their key information.
Further, step S2 specifically includes:
Modeling with bidirectional gated recurrent units, the update formulas of a GRU unit are as follows:
Calculation formula of the update gate z_t:
z_t = σ(W_z x_t + U_z h_{t-1})  (1)
where x_t is the word vector input at the current time step, σ is the logistic function, W_z is the weight matrix from the current hidden-layer input to the update gate z_t, U_z is the weight matrix from the previous hidden state to the update gate z_t, and h_{t-1} is the hidden state of the hidden layer at the previous time step;
Calculation formula of the reset gate r_t:
r_t = σ(W_r x_t + U_r h_{t-1})  (2)
where W_r is the weight matrix from the current hidden-layer input to the reset gate r_t and U_r is the weight matrix from the previous hidden state to the reset gate r_t; when the reset gate value approaches 0, previously stored historical information is ignored and does not influence subsequent output;
Candidate node state of the memory cell at the current time step:
h̃_t = tanh(W x_t + U (r_t ⊙ h_{t-1}))  (3)
where tanh is the hyperbolic tangent function, ⊙ denotes element-wise multiplication, and W, U are weight matrices to be trained;
The current hidden state h_t of the hidden layer is jointly determined by the update gate z_t, the reset gate r_t, and the candidate node state h̃_t:
h_t = z_t ⊙ h_{t-1} + (1 − z_t) ⊙ h̃_t  (4)
so that when the update gate is close to 1, the new hidden state depends almost entirely on the previous state.
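A single step of equations (1)-(4) can be sketched directly in NumPy. This is a minimal illustration under the conventions of the text (biases omitted, as in the formulas); the function name and shapes are illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x_t, h_prev, Wz, Uz, Wr, Ur, W, U):
    """One GRU step following equations (1)-(4); weight names mirror the
    symbols in the text and biases are omitted, as in the formulas."""
    z_t = sigmoid(Wz @ x_t + Uz @ h_prev)              # update gate, eq. (1)
    r_t = sigmoid(Wr @ x_t + Ur @ h_prev)              # reset gate, eq. (2)
    h_tilde = np.tanh(W @ x_t + U @ (r_t * h_prev))    # candidate state, eq. (3)
    return z_t * h_prev + (1.0 - z_t) * h_tilde        # eq. (4)
```

With the update gate near 1 the mixture in the last line keeps h_prev almost unchanged, matching the remark above.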
Further, step S3 specifically includes:
From the current hidden state h_t of the hidden layer, the hidden-layer representation is obtained:
μ_t = tanh(W_w h_t + b_w)  (5)
where W_w is the weight matrix of the hidden layer and b_w is the bias;
Each output of the previous layer is assigned a different weight α_t:
α_t = exp(μ_t^T μ_w) / Σ_t exp(μ_t^T μ_w)  (6)
where μ_w is the word-level context vector;
The current hidden states h_t are weighted and averaged with α_t to obtain the probability distribution of the sentence vector S:
S = Σ_t α_t h_t  (7)
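Equations (5)-(7) amount to a learned softmax pooling over the hidden states. A minimal NumPy sketch, with names following the symbols in the text and illustrative shapes:

```python
import numpy as np

def attention_pool(H, Ww, bw, mu_w):
    """Attention pooling over hidden states, equations (5)-(7).
    H is a (T, d) matrix of hidden states h_t; Ww and bw project them
    (eq. 5) and mu_w is the word-level context vector (eq. 6).
    Shapes and names here are illustrative."""
    mu = np.tanh(H @ Ww.T + bw)              # eq. (5)
    scores = mu @ mu_w                       # mu_t^T mu_w
    alpha = np.exp(scores - scores.max())    # eq. (6): softmax over t
    alpha = alpha / alpha.sum()
    s = alpha @ H                            # eq. (7): weighted average
    return s, alpha
```

Subtracting the maximum score before exponentiating is a standard numerical-stability step and leaves the softmax in eq. (6) unchanged.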
Further, step S2 may instead introduce LSTM units in the hidden layer to compute hidden states, adding a reverse network so that each word in the sentence obtains the information of its context.
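The LSTM alternative mentioned here replaces the GRU's two gates with separate input, forget, and output gates plus an explicit cell state. A minimal single-step sketch of a standard LSTM cell (weight names are illustrative and biases are omitted; this is not the patent's specific parameterization):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell(x_t, h_prev, c_prev, Wi, Ui, Wf, Uf, Wo, Uo, Wc, Uc):
    """One step of a standard LSTM cell, the alternative unit mentioned
    in the text. Weight names are illustrative and biases are omitted."""
    i_t = sigmoid(Wi @ x_t + Ui @ h_prev)      # input gate
    f_t = sigmoid(Wf @ x_t + Uf @ h_prev)      # forget gate
    o_t = sigmoid(Wo @ x_t + Uo @ h_prev)      # output gate
    c_tilde = np.tanh(Wc @ x_t + Uc @ h_prev)  # candidate cell state
    c_t = f_t * c_prev + i_t * c_tilde         # updated memory state
    h_t = o_t * np.tanh(c_t)                   # hidden state
    return h_t, c_t
```

Swapping this step in for the GRU step leaves the rest of the pipeline (bidirectional pass, attention, softmax) unchanged.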
Compared with the prior art, the present invention has the following beneficial effects:
On the basis of recurrent neural networks, the present invention introduces context modeling and an attention mechanism to obtain a fuller emotional representation from long texts. A text sentiment classification model (the Bi-Attention model) is built from a bidirectional gated recurrent neural network combined with an Attention mechanism. Attention lets the neural network focus on the important information in a text while ignoring or down-weighting secondary information, which reduces the complexity of processing long texts. In addition, a context vector is generated by the bidirectional gated recurrent units and used to update the memory state, so that the influence of both historical and future information on the semantics is fully taken into account.
Description of the drawings
Fig. 1 is a schematic diagram of the bidirectional GRU of the present invention
Fig. 2 is a structure diagram of the GRU of the present invention
Fig. 3 is a structure diagram of the LSTM of the present invention
Specific implementation
The present invention will be further described below with reference to the accompanying drawings and embodiments.
Referring to Figs. 1 and 2, the present invention provides a long text sentiment analysis method incorporating an Attention mechanism, characterized in that:
Step S1: Initialize the words in the text with word vectors, mapping the text to a set of word vectors; this set serves as the input of the network and is fed into the hidden layer;
Step S2: Introduce GRU units in the hidden layer to compute hidden states and add a reverse network, so that each word in the sentence obtains the information of its context; this specifically includes:
Modeling with bidirectional gated recurrent units, the update formulas of a GRU unit are as follows:
Calculation formula of the update gate z_t:
z_t = σ(W_z x_t + U_z h_{t-1})  (1)
where x_t is the word vector input at the current time step, σ is the logistic function, W_z is the weight matrix from the current hidden-layer input to the update gate z_t, U_z is the weight matrix from the previous hidden state to the update gate z_t, and h_{t-1} is the hidden state of the hidden layer at the previous time step;
Calculation formula of the reset gate r_t:
r_t = σ(W_r x_t + U_r h_{t-1})  (2)
where W_r is the weight matrix from the current hidden-layer input to the reset gate r_t and U_r is the weight matrix from the previous hidden state to the reset gate r_t; when the reset gate value approaches 0, previously stored historical information is ignored and does not influence subsequent output;
Candidate node state of the GRU memory cell at the current time step:
h̃_t = tanh(W x_t + U (r_t ⊙ h_{t-1}))  (3)
where tanh is the hyperbolic tangent function, ⊙ denotes element-wise multiplication, and W, U are weight matrices to be trained;
The current hidden state h_t of the hidden layer is jointly determined by the update gate z_t, the reset gate r_t, and the candidate node state h̃_t:
h_t = z_t ⊙ h_{t-1} + (1 − z_t) ⊙ h̃_t  (4)
so that when the update gate is close to 1, the new hidden state depends almost entirely on the previous state.
Step S3: Add an attention mechanism to the current hidden state h_t of the hidden layer, automatically determining the parts of the input text that need attention by weighting; each output of the previous layer is assigned a different weight α_t, and the weighted average yields the probability distribution. The calculation formulas are as follows:
μ_t = tanh(W_w h_t + b_w)  (5)
α_t = exp(μ_t^T μ_w) / Σ_t exp(μ_t^T μ_w)  (6)
s = Σ_t α_t h_t  (7)
where μ_t is the hidden-layer representation, W_w is the weight matrix of the hidden layer, h_t is the hidden state at time t, b_w is the bias, μ_w is the word-level context vector, and s is the sentence vector;
Step S4: Finally, the probability of each sentiment category is judged with a fully connected layer and a softmax function; this probability distribution enables the model to better characterize long texts and capture their key information.
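Steps S1-S4 can be strung together in one compact sketch of the data flow. The plain tanh recurrence below stands in for the full GRU step of equations (1)-(4), and all parameter names are illustrative; this is an illustration of the pipeline, not the patented implementation:

```python
import numpy as np

def classify(tokens, emb, params):
    """Data-flow sketch of steps S1-S4: word vectors -> bidirectional
    recurrence -> attention pooling -> fully connected layer + softmax.
    `emb` is a {token: vector} lookup and `params` holds illustrative
    weights; a plain tanh recurrence stands in for the full GRU step."""
    X = [emb[w] for w in tokens]                        # S1: word-vector set
    d = params["U"].shape[0]
    h_f, h_b = np.zeros(d), np.zeros(d)
    fwd, bwd = [], [None] * len(X)
    for t in range(len(X)):                             # S2: forward pass
        h_f = np.tanh(params["W"] @ X[t] + params["U"] @ h_f)
        fwd.append(h_f)
    for t in reversed(range(len(X))):                   # S2: backward pass
        h_b = np.tanh(params["W"] @ X[t] + params["U"] @ h_b)
        bwd[t] = h_b
    H = np.stack([np.concatenate([f, b]) for f, b in zip(fwd, bwd)])
    mu = np.tanh(H @ params["Ww"].T + params["bw"])     # S3: eq. (5)
    a = np.exp(mu @ params["uw"])                       # S3: eq. (6)
    a = a / a.sum()
    s = a @ H                                           # S3: eq. (7)
    logits = params["Wc"] @ s                           # S4: full connection
    p = np.exp(logits - logits.max())                   # S4: softmax
    return p / p.sum()
```

The returned vector is the probability distribution over sentiment categories produced in step S4.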
Referring to Fig. 3, in a further embodiment of the present invention, step S2 may instead introduce LSTM units in the hidden layer to compute hidden states, adding a reverse network so that each word in the sentence obtains the information of its context.
The foregoing are merely preferred embodiments of the present invention; all equivalent changes and modifications made within the scope of the patent claims of the present invention shall fall within the coverage of the present invention.

Claims (4)

1. A long text sentiment analysis method incorporating an Attention mechanism, characterized by:
Step S1: Initializing the words in the text with word vectors, mapping the text to a set of word vectors; taking this word-vector set as the input of the network and feeding it into the hidden layer;
Step S2: Introducing GRU units in the hidden layer to compute hidden states, adding a reverse network, and modeling with bidirectional gated recurrent units so that each word vector in the set obtains the information of its context, yielding the current hidden state h_t of the hidden layer;
Step S3: Adding an attention mechanism to the current hidden state h_t of the hidden layer, automatically determining the parts of the input text that need attention by weighting, and obtaining the probability distribution of the sentence vector S;
Step S4: From the probability distribution of the sentence vector S, judging the probability of each sentiment category with a fully connected layer and a softmax function; this probability distribution enables the model to better characterize long texts and capture their key information.
2. The long text sentiment analysis method incorporating an Attention mechanism according to claim 1, characterized in that step S2 specifically includes:
Modeling with bidirectional gated recurrent units, the update formulas of a GRU unit being as follows:
Calculation formula of the update gate z_t:
z_t = σ(W_z x_t + U_z h_{t-1})  (1)
where x_t is the word vector input at the current time step, σ is the logistic function, W_z is the weight matrix from the current hidden-layer input to the update gate z_t, U_z is the weight matrix from the previous hidden state to the update gate z_t, and h_{t-1} is the hidden state of the hidden layer at the previous time step;
Calculation formula of the reset gate r_t:
r_t = σ(W_r x_t + U_r h_{t-1})  (2)
where W_r is the weight matrix from the current hidden-layer input to the reset gate r_t and U_r is the weight matrix from the previous hidden state to the reset gate r_t; when the reset gate value approaches 0, previously stored historical information is ignored and does not influence subsequent output;
Candidate node state of the memory cell at the current time step:
h̃_t = tanh(W x_t + U (r_t ⊙ h_{t-1}))  (3)
where tanh is the hyperbolic tangent function, ⊙ denotes element-wise multiplication, and W, U are weight matrices to be trained;
The current hidden state h_t of the hidden layer being jointly determined by the update gate z_t, the reset gate r_t, and the candidate node state h̃_t:
h_t = z_t ⊙ h_{t-1} + (1 − z_t) ⊙ h̃_t  (4)
so that when the update gate is close to 1, the new hidden state depends almost entirely on the previous state.
3. The long text sentiment analysis method incorporating an Attention mechanism according to claim 1, characterized in that step S3 specifically includes:
From the current hidden state h_t of the hidden layer, obtaining the hidden-layer representation:
μ_t = tanh(W_w h_t + b_w)  (5)
where W_w is the weight matrix of the hidden layer and b_w is the bias;
Assigning a different weight α_t to each output of the previous layer:
α_t = exp(μ_t^T μ_w) / Σ_t exp(μ_t^T μ_w)  (6)
where μ_w is the word-level context vector;
Weighting and averaging the current hidden states h_t with α_t to obtain the probability distribution of the sentence vector S:
S = Σ_t α_t h_t  (7)
4. The long text sentiment analysis method incorporating an Attention mechanism according to claim 1, characterized in that step S2 may instead introduce LSTM units in the hidden layer to compute hidden states, adding a reverse network so that each word in the sentence obtains the information of its context.
CN201810357279.7A 2018-04-20 2018-04-20 A kind of long text sentiment analysis method incorporating Attention mechanism Pending CN108595601A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810357279.7A CN108595601A (en) 2018-04-20 2018-04-20 A kind of long text sentiment analysis method incorporating Attention mechanism


Publications (1)

Publication Number Publication Date
CN108595601A true CN108595601A (en) 2018-09-28

Family

ID=63613403

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810357279.7A Pending CN108595601A (en) 2018-04-20 2018-04-20 A kind of long text sentiment analysis method incorporating Attention mechanism

Country Status (1)

Country Link
CN (1) CN108595601A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107092596A (en) * 2017-04-24 2017-08-25 重庆邮电大学 Text emotion analysis method based on attention CNNs and CCR
CN107239446A (en) * 2017-05-27 2017-10-10 中国矿业大学 A kind of intelligence relationship extracting method based on neutral net Yu notice mechanism
WO2018057809A1 (en) * 2016-09-22 2018-03-29 Salesforce.Com, Inc. Pointer sentinel mixture architecture
CN107909421A (en) * 2017-09-29 2018-04-13 中国船舶重工集团公司第七0九研究所 A kind of implicit feedback of more GRU layers of neutral net based on user's space recommends method and system


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHENG Xiongfeng et al.: "Hierarchical BGRU Model Based on User and Product Attention Mechanisms", Computer Engineering and Applications *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109447129B (en) * 2018-09-29 2023-04-18 平安科技(深圳)有限公司 Multi-mode emotion recognition method and device and computer readable storage medium
CN109447129A (en) * 2018-09-29 2019-03-08 平安科技(深圳)有限公司 A kind of multi-mode Emotion identification method, apparatus and computer readable storage medium
CN111027313A (en) * 2018-10-08 2020-04-17 中国科学院沈阳计算技术研究所有限公司 BiGRU judgment result tendency analysis method based on attention mechanism
CN109508642A (en) * 2018-10-17 2019-03-22 杭州电子科技大学 Ship monitor video key frame extracting method based on two-way GRU and attention mechanism
CN109508642B (en) * 2018-10-17 2021-08-17 杭州电子科技大学 Ship monitoring video key frame extraction method based on bidirectional GRU and attention mechanism
CN109543180A (en) * 2018-11-08 2019-03-29 中山大学 A kind of text emotion analysis method based on attention mechanism
CN109543180B (en) * 2018-11-08 2020-12-04 中山大学 Text emotion analysis method based on attention mechanism
CN109726745A (en) * 2018-12-19 2019-05-07 北京理工大学 A kind of sensibility classification method based on target incorporating description knowledge
CN109726745B (en) * 2018-12-19 2020-10-09 北京理工大学 Target-based emotion classification method integrating description knowledge
WO2020147409A1 (en) * 2019-01-14 2020-07-23 平安科技(深圳)有限公司 Text classification method and apparatus, computer device, and storage medium
CN109902174B (en) * 2019-02-18 2023-06-20 山东科技大学 Emotion polarity detection method based on aspect-dependent memory network
CN109902174A (en) * 2019-02-18 2019-06-18 山东科技大学 A kind of feeling polarities detection method of the memory network relied on based on aspect
WO2020211611A1 (en) * 2019-04-17 2020-10-22 腾讯科技(深圳)有限公司 Method and device for generating hidden state in recurrent neural network for language processing
CN110134757B (en) * 2019-04-19 2020-04-07 杭州电子科技大学 Event argument role extraction method based on multi-head attention mechanism
CN110134757A (en) * 2019-04-19 2019-08-16 杭州电子科技大学 A kind of event argument roles abstracting method based on bull attention mechanism
CN110147446A (en) * 2019-04-19 2019-08-20 中国地质大学(武汉) A kind of word embedding grammar based on the double-deck attention mechanism, equipment and storage equipment
CN111353040A (en) * 2019-05-29 2020-06-30 北京工业大学 GRU-based attribute level emotion analysis method
CN110378335A (en) * 2019-06-17 2019-10-25 杭州电子科技大学 A kind of information analysis method neural network based and model
CN110223429A (en) * 2019-06-19 2019-09-10 上海应用技术大学 Voice access control system
CN110457480A (en) * 2019-08-16 2019-11-15 国网天津市电力公司 The construction method of fine granularity sentiment classification model based on interactive attention mechanism
WO2021057424A1 (en) * 2019-09-23 2021-04-01 腾讯科技(深圳)有限公司 Virtual image behavior control method and device based on text, and medium
US11714879B2 (en) 2019-09-23 2023-08-01 Tencent Technology (Shenzhen) Company Limited Method and device for behavior control of virtual image based on text, and medium
CN111368524A (en) * 2020-03-05 2020-07-03 昆明理工大学 Microblog viewpoint sentence recognition method based on self-attention bidirectional GRU and SVM
CN111671426A (en) * 2020-05-13 2020-09-18 北京航空航天大学 Human body respiration state monitoring system and method based on flexible sensing and deep learning
CN111671426B (en) * 2020-05-13 2022-07-12 北京航空航天大学 Human body respiration state monitoring system and method based on flexible sensing and deep learning
CN112782762A (en) * 2021-01-29 2021-05-11 东北大学 Earthquake magnitude determination method based on deep learning
CN117688974A (en) * 2024-02-01 2024-03-12 中国人民解放军总医院 Knowledge graph-based generation type large model modeling method, system and equipment
CN117688974B (en) * 2024-02-01 2024-04-26 中国人民解放军总医院 Knowledge graph-based generation type large model modeling method, system and equipment

Similar Documents

Publication Publication Date Title
CN108595601A (en) A kind of long text sentiment analysis method incorporating Attention mechanism
Wang et al. Learning visual relationship and context-aware attention for image captioning
Si et al. Skeleton-based action recognition with hierarchical spatial reasoning and temporal stack learning network
CN108984530A (en) A kind of detection method and detection system of network sensitive content
Ke et al. Leveraging structural context models and ranking score fusion for human interaction prediction
Zhang et al. Object semantics sentiment correlation analysis enhanced image sentiment classification
Wang et al. Discovering attractive segments in the user-generated video streams
WO2020238353A1 (en) Data processing method and apparatus, storage medium, and electronic apparatus
CN108427740B (en) Image emotion classification and retrieval algorithm based on depth metric learning
Yan et al. Cross-domain facial expression recognition based on transductive deep transfer learning
CN110210358A (en) A kind of video presentation generation method and device based on two-way timing diagram
Wang et al. Exploiting topic-based adversarial neural network for cross-domain keyphrase extraction
Xiao et al. Chinese sentiment analysis using bidirectional LSTM with word embedding
CN112733764A (en) Method for recognizing video emotion information based on multiple modes
Shen et al. Emotion analysis of ideological and political education using a GRU deep neural network
CN115934951A (en) Network hot topic user emotion prediction method
CN113244627B (en) Method and device for identifying plug-in, electronic equipment and storage medium
CN117313709B (en) Method for detecting generated text based on statistical information and pre-training language model
Zhu et al. NAGNet: A novel framework for real‐time students' sentiment analysis in the wisdom classroom
Yan et al. Image captioning based on a hierarchical attention mechanism and policy gradient optimization
Ke et al. Spatial, structural and temporal feature learning for human interaction prediction
Peng et al. Unsupervised visual–textual correlation learning with fine-grained semantic alignment
Jasmir et al. Feature Extraction for Improvement Text Classification of Spam YouTube Video Comment using Deep Learning
Li et al. Frame aggregation and multi-modal fusion framework for video-based person recognition
Li et al. [Retracted] Human Sports Action and Ideological and PoliticalEvaluation by Lightweight Deep Learning Model

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180928

RJ01 Rejection of invention patent application after publication