CN111144108B - Modeling method and device of emotion tendentiousness analysis model and electronic equipment - Google Patents

Modeling method and device of emotion tendentiousness analysis model and electronic equipment

Info

Publication number
CN111144108B
Authority
CN
China
Prior art keywords
emotion
training
analysis
model
feature vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911370735.2A
Other languages
Chinese (zh)
Other versions
CN111144108A (en)
Inventor
高参
刘昊
何伯磊
肖欣延
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201911370735.2A priority Critical patent/CN111144108B/en
Publication of CN111144108A publication Critical patent/CN111144108A/en
Application granted granted Critical
Publication of CN111144108B publication Critical patent/CN111144108B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Machine Translation (AREA)

Abstract

The application discloses a modeling method and device of an emotion tendentiousness analysis model. The emotion tendentiousness analysis model is used for classifying emotion analysis tasks; the modeling method comprises the following steps: acquiring emotion analysis data conforming to a target emotion analysis task scene; constructing a pre-training model based on emotion analysis data conforming to a target emotion analysis task scene; acquiring emotion analysis training sentences conforming to a target emotion analysis task scene; fine tuning the pre-training model according to emotion analysis training sentences to obtain an emotion tendentiousness analysis model; when the pre-training model is subjected to fine adjustment, different types of emotion feature word information are respectively trained to capture the influence of different types of emotion feature words on emotion classification. Therefore, the classification capability of the model on emotion analysis tasks can be enhanced, and the accuracy of emotion tendency analysis is improved.

Description

Modeling method and device of emotion tendentiousness analysis model and electronic equipment
Technical Field
The present disclosure relates to the field of natural language processing, and in particular, to a modeling method and apparatus for emotion tendentiousness analysis model, an electronic device, and a computer readable storage medium.
Background
Emotion tendentiousness classification mainly aims to automatically predict the emotion polarity of a text. Emotion tendentiousness analysis generally uses an emotion tendentiousness analysis model to analyze the text content and obtain the corresponding emotion tendentiousness category. Modeling the emotion tendentiousness analysis model is therefore the core technology involved.
In the related art, the emotion tendentiousness analysis model is mainly built with network models such as LSTM (Long Short-Term Memory) or CNN (Convolutional Neural Network). However, building the model with LSTM/CNN requires large-scale training data, and the trained model still shows a poor analysis effect and low accuracy of emotion tendentiousness analysis.
Disclosure of Invention
An object of the present application is to solve, at least to some extent, one of the technical problems described above.
Therefore, a first object of the present application is to provide a modeling method of emotion tendencies analysis model. The method can enhance the classification capability of the model on emotion analysis tasks and improve the accuracy of emotion tendency analysis.
A second object of the present application is to provide a modeling apparatus for an emotion tendentiousness analysis model.
A third object of the present application is to propose an electronic device.
A fourth object of the present application is to propose a computer readable storage medium.
In order to achieve the above object, an embodiment of the first aspect of the present application provides a modeling method for an emotion tendentiousness analysis model, where the emotion tendentiousness analysis model is used for classifying emotion analysis tasks, the modeling method includes: acquiring emotion analysis data conforming to a target emotion analysis task scene; constructing a pre-training model based on emotion analysis data conforming to the target emotion analysis task scene; acquiring emotion analysis training sentences conforming to the target emotion analysis task scene; fine tuning the pre-training model according to the emotion analysis training statement to obtain the emotion tendentiousness analysis model; when the pre-training model is subjected to fine adjustment, different types of emotion feature word information are respectively trained to capture the influence of different types of emotion feature words on emotion classification.
According to one embodiment of the application, based on the emotion analysis data conforming to the target emotion analysis task scene, a pre-training model is constructed, including: analyzing the emotion analysis data conforming to the target emotion analysis task scene to determine emotion feature words in the emotion analysis data;
Replacing emotion feature words in the emotion analysis data with masks; pre-training the emotion analysis data subjected to mask replacement by using a deep neural network to obtain the pre-training model; wherein the pre-training is for training a contribution relationship between the emotion feature words and the mask.
According to one embodiment of the present application, fine tuning is performed on the pre-training model according to the emotion analysis training statement, so as to obtain the emotion tendentiousness analysis model, including: analyzing the emotion analysis training sentences to determine emotion feature words in the emotion analysis training sentences; replacing emotion feature words in the emotion analysis training sentences with masks; inputting the emotion analysis training statement into the pre-training model to obtain a first semantic feature vector output by the pre-training model; inputting the emotion analysis training statement subjected to mask replacement into the pre-training model to obtain a second semantic feature vector output by the pre-training model; acquiring a fusion feature word, and representing the fusion feature word as a third semantic feature vector; training the pre-training model according to the first semantic feature vector, the second semantic feature vector and the third semantic feature vector to obtain the emotion tendentiousness analysis model.
According to one embodiment of the present application, the obtaining the fusion feature word includes: determining the type of the emotion feature words in the emotion analysis training statement; and acquiring fusion feature words of the same belonging type according to the belonging type of the emotion feature words in the emotion analysis training statement.
According to one embodiment of the present application, training the pre-training model according to the first semantic feature vector, the second semantic feature vector and the third semantic feature vector to obtain the emotion tendentiousness analysis model includes: generating a combined feature vector according to the second semantic feature vector and the third semantic feature vector; calculating a distance between the first semantic feature vector and the combined feature vector; judging whether the distance between the first semantic feature vector and the combined feature vector is larger than a preset threshold value or not; if the distance between the first semantic feature vector and the combined feature vector is greater than the preset threshold, adjusting model parameters of the pre-training model and continuing training; and ending training until the distance between the first semantic feature vector and the combined feature vector is smaller than or equal to the preset threshold value so as to obtain the emotion tendentiousness analysis model.
To achieve the above object, an embodiment of the second aspect of the present application provides a modeling apparatus for an emotion tendentiousness analysis model, where the emotion tendentiousness analysis model is used for classifying emotion analysis tasks, the modeling apparatus includes: the emotion analysis data acquisition module is used for acquiring emotion analysis data conforming to a target emotion analysis task scene; the pre-training model construction module is used for constructing a pre-training model based on the emotion analysis data conforming to the target emotion analysis task scene; the emotion analysis training statement acquisition module is used for acquiring emotion analysis training statements conforming to the target emotion analysis task scene; the model fine tuning module is used for fine tuning the pre-training model according to the emotion analysis training statement so as to obtain the emotion tendentiousness analysis model; when the pre-training model is subjected to fine adjustment, different types of emotion feature word information are respectively trained to capture the influence of different types of emotion feature words on emotion classification.
To achieve the above object, an electronic device according to an embodiment of a third aspect of the present application includes: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a modeling method of an emotion tendencies analysis model as described in the embodiments of the first aspect of the present application.
To achieve the above object, a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute a modeling method of an emotion tendentiousness analysis model according to an embodiment of the first aspect of the present application is provided in an embodiment of the fourth aspect of the present application.
One embodiment of the above application has the following advantages or benefits: the emotion analysis data which accords with the target emotion analysis task scene can be obtained, a pre-training model is built based on the emotion analysis data which accords with the target emotion analysis task scene, emotion analysis training sentences which accord with the target emotion analysis task scene are obtained, and then fine adjustment is carried out on the pre-training model according to the emotion analysis training sentences so as to obtain an emotion tendency analysis model; when the pre-training model is subjected to fine adjustment, different types of emotion feature word information are respectively trained to capture the influence of different types of emotion feature words on emotion classification. Modeling the model in a pre-training stage by utilizing an auxiliary data set of a specific task, and increasing the understanding capability of the model to the specific task through a large amount of data and longer training; in addition, through training different types of emotion feature words in the fine adjustment stage, language knowledge in the emotion field is enriched, classification capacity of the model on emotion analysis tasks is enhanced, and accuracy of emotion trend analysis is improved; in addition, the pre-training model is migrated to the emotion analysis training statement which accords with the target emotion analysis task scene for fine adjustment, so that the classification capability of the model on the emotion analysis task can be further enhanced.
Other effects of the above alternative will be described below in connection with specific embodiments.
Drawings
The drawings are for better understanding of the present solution and do not constitute a limitation of the present application. Wherein:
FIG. 1 is a flow chart of a method of modeling an emotion tendentiousness analysis model according to one embodiment of the present application;
FIG. 2 is an exemplary diagram of a pre-training model according to an embodiment of the present application;
FIG. 3 is a flow chart of a method of modeling an emotion tendentiousness analysis model according to another embodiment of the present application;
FIG. 4 is an exemplary diagram of a model fine tuning stage according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of a modeling apparatus of an emotion tendentiousness analysis model according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the structure of a model trim module according to one embodiment of the present application;
FIG. 7 is a block diagram of an electronic device for implementing a modeling method for an emotion tendencies analysis model in accordance with an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present application to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Modeling methods, apparatuses, electronic devices, and computer-readable storage media of emotion tendencies analysis models proposed according to embodiments of the present application are described below with reference to the accompanying drawings.
FIG. 1 is a flow chart of a method of modeling an emotion tendentiousness analysis model according to one embodiment of the present application. It should be noted that, the modeling method of the emotion tendentiousness analysis model according to the embodiments of the present application may be applied to a modeling apparatus of the emotion tendentiousness analysis model, and the modeling apparatus may be configured on an electronic device.
It should also be noted that, in the embodiments of the present application, the emotion tendentiousness analysis model may be applied to classify emotion analysis tasks. The emotion analysis task may include, but is not limited to, emotion tendentiousness analysis of review content, of comment text, of news stories, of trending microblog topics, and the like.
As shown in fig. 1, the modeling method of the emotion tendentiousness analysis model may include:
and step 101, acquiring emotion analysis data conforming to a target emotion analysis task scene.
In the application, the emotion analysis task scene to which the modeling method of the emotion tendentiousness analysis model is applied can be determined first, so that emotion analysis data conforming to that task scene can be acquired accordingly. In the embodiment of the application, the emotion analysis data may be obtained with a web crawler that crawls the corresponding data from the internet according to the applicable emotion analysis task scene; alternatively, a data set covering various emotion analysis task scenes may be prepared in advance, and the emotion analysis data conforming to the target emotion analysis task scene may be acquired from that data set, as illustrated by the sketch below.
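As a minimal illustration of this acquisition step (the patent does not prescribe any data format), the sketch below filters a pre-collected corpus by a task-scene label; the record fields, scene names, and function name are hypothetical, and a web-crawler pipeline could equally be used.

```python
# Hypothetical record layout: each item carries the raw text and a task-scene tag.
dataset = [
    {"scene": "product_review", "text": "The battery life is disappointing."},
    {"scene": "news", "text": "The new policy was widely welcomed."},
]

def acquire_emotion_analysis_data(records, target_scene):
    """Keep only the emotion analysis data whose task scene matches the target scene."""
    return [r["text"] for r in records if r["scene"] == target_scene]

review_data = acquire_emotion_analysis_data(dataset, "product_review")
```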
And 102, constructing a pre-training model based on emotion analysis data conforming to a target emotion analysis task scene.
Optionally, pre-training is performed with a large amount of emotion analysis data that matches the target emotion analysis task scene, so that the pre-training stage increases the model's understanding of the emotion data set. The auxiliary data set is incorporated in the pre-training stage to capture rich task-related language knowledge, and feature word information such as emotion words is masked in the pre-training stage to increase the ability of the text context to represent the emotion feature words.
In one embodiment of the application, emotion analysis data conforming to a target emotion analysis task scene is analyzed to determine emotion feature words in the emotion analysis data, the emotion feature words in the emotion analysis data are replaced by masks, and then the emotion analysis data subjected to mask replacement are pre-trained by using a deep neural network to obtain a pre-training model; wherein, the pre-training is used for training the contribution relation between the emotion feature words and the mask.
That is, the emotion analysis data conforming to the target emotion analysis task scene may be semantically analyzed to determine which word or words in the emotion analysis data are emotion feature words, or the emotion analysis data may be segmented with a word segmentation tool and the emotion feature words identified among the resulting segments. Then, the emotion feature words in the emotion analysis data can be replaced with masks, for example with the symbol [MASK], and the mask-replaced emotion analysis data is pre-trained with a deep neural network to obtain the pre-training model. In the embodiment of the present application, the deep neural network may be a BERT (Bidirectional Encoder Representations from Transformers) model, an ERNIE model, or another pre-training model used in the field of natural language processing.
For example, take one piece of emotion analysis data conforming to the target emotion analysis task scene, as shown in fig. 2, and assume it is "It's amazing what we can remember with a little prompting". The data may be analyzed to determine that its emotion feature word is "amazing"; at this time, a mask may be substituted for the emotion feature word so that the data becomes "It's [MASK] what we can remember with a little prompting", and the mask-replaced data is then input into a BERT model for pre-training, so as to train the contribution relationship between the emotion feature word "amazing" and the mask [MASK]. Therefore, masking feature word information such as emotion words in the pre-training stage can increase the ability of the text context to represent the emotion feature words.
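This step can be made concrete with a short sketch. The patent does not name a specific toolkit, so the following assumes the Hugging Face transformers implementation of BERT and a hand-written emotion lexicon; both are illustrative choices rather than part of the disclosed method.

```python
from transformers import BertTokenizerFast, BertForMaskedLM

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

sentence = "It's amazing what we can remember with a little prompting"
emotion_words = {"amazing"}                 # assumed emotion-feature-word lexicon

enc = tokenizer(sentence, return_tensors="pt")
input_ids = enc["input_ids"].clone()
labels = enc["input_ids"].clone()

# Replace every token that is an emotion feature word with [MASK];
# non-masked positions are excluded from the loss via the -100 label.
tokens = tokenizer.convert_ids_to_tokens(input_ids[0].tolist())
for i, tok in enumerate(tokens):
    if tok in emotion_words:
        input_ids[0, i] = tokenizer.mask_token_id
    else:
        labels[0, i] = -100

outputs = model(input_ids=input_ids,
                attention_mask=enc["attention_mask"],
                labels=labels)
outputs.loss.backward()                     # one masked-LM pre-training step
```

In practice this step would be repeated over the full scene-specific corpus with an optimizer, which is omitted here for brevity.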
And step 103, acquiring emotion analysis training sentences conforming to the target emotion analysis task scene.
It can be appreciated that after the pre-training model is constructed based on emotion analysis data conforming to the target emotion analysis task scene, the pre-training model is further subjected to fine tuning to obtain an emotion tendentiousness analysis model for the emotion analysis task. Therefore, before the pre-training model is fine-tuned, emotion analysis training sentences conforming to the target emotion analysis task scene are acquired. In the embodiment of the present application, the emotion analysis training statement is labeled data, that is, it is labeled which word or words in the emotion analysis training statement are emotion feature words.
Step 104, fine tuning is carried out on the pre-training model according to emotion analysis training sentences so as to obtain an emotion tendency analysis model; when the pre-training model is subjected to fine adjustment, different types of emotion feature word information are respectively trained to capture the influence of different types of emotion feature words on emotion classification.
In the embodiment of the application, the pre-training model can be transferred to the emotion analysis training sentences conforming to the target emotion analysis task scene for fine-tuning, and the fine-tuned model is then determined to be the emotion tendentiousness analysis model. Emotion tendentiousness analysis can be trained as a classification task in the fine-tuning stage; that is, different types of emotion feature word information can be trained separately in the fine-tuning stage so as to capture the influence of different types of emotion feature words on emotion classification.
In one embodiment of the present application, as shown in fig. 3, the specific implementation process of fine tuning the pre-training model according to the emotion analysis training statement to obtain the emotion tendencies analysis model may include the following steps:
step 301, analyzing the emotion analysis training sentences to determine emotion feature words in the emotion analysis training sentences.
Step 302, replacing emotion feature words in the emotion analysis training statement with masks.
Step 303, inputting the emotion analysis training statement into the pre-training model to obtain a first semantic feature vector output by the pre-training model.
And step 304, inputting the emotion analysis training statement subjected to mask replacement into the pre-training model to obtain a second semantic feature vector output by the pre-training model.
Step 305, obtaining the fusion feature words, and representing the fusion feature words as a third semantic feature vector.
In the embodiment of the present application, a fusion feature word may be understood as an emotion feature word. That is, in the fine-tuning stage, emotion feature words from the emotion domain can be fused in, and the influence of the fused emotion feature words on emotion classification can then be captured. After a fusion feature word is obtained, it can be represented as a corresponding semantic feature vector by a vector feature representation layer.
In order to enhance the classification capability of the model on emotion analysis tasks, emotion feature words of different types can be fused in during the fine-tuning stage, so that the influence of the different types of emotion feature words on emotion classification can be captured. Optionally, in an embodiment of the present application, the type of the emotion feature word in the emotion analysis training sentence may be determined, and a fusion feature word of the same type may be obtained according to that type. The types may include, but are not limited to, positive polarity, negative polarity, negation, enhancement, and the like.
That is, the type of the emotion feature word in the emotion analysis training sentence can be determined, and emotion feature words having the same type can be obtained according to that type. It should be noted that the emotion feature word in the emotion analysis training sentence may belong to one or more types; when it belongs to multiple types, an emotion feature word of the same type needs to be obtained for each type. For example, the role of the emotion feature word in the emotion analysis training sentence may be both positive and enhancing, in which case a positive emotion feature word and an enhancing emotion feature word both need to be obtained. Taking the emotion analysis training sentence "It's amazing what we can remember with a little prompting" as an example, the type of the emotion feature word "amazing" is both positive and enhancing; at this time, a positive emotion feature word, such as "admiration", and an enhancing emotion feature word, such as "rising", may be obtained respectively according to these types, as the sketch after this paragraph illustrates.
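A minimal sketch of this lookup is given below; the lexicon contents, type names, and function name are all hypothetical, since the patent does not specify how the type-annotated emotion vocabulary is stored.

```python
# Hypothetical type-annotated emotion lexicon.
EMOTION_LEXICON = {
    "positive": ["admiration", "wonderful", "delightful"],
    "negative": ["terrible", "disappointing"],
    "negation": ["not", "never"],
    "enhancement": ["extremely", "incredibly"],
}

# Assumed type annotations for emotion feature words found in training sentences.
WORD_TYPES = {"amazing": ["positive", "enhancement"]}

def get_fusion_feature_words(emotion_word):
    """For each type the emotion feature word belongs to, return fusion words of that same type."""
    return {t: EMOTION_LEXICON[t] for t in WORD_TYPES.get(emotion_word, [])}

print(get_fusion_feature_words("amazing"))
# {'positive': ['admiration', ...], 'enhancement': ['extremely', ...]}
```

Each returned fusion feature word is then represented as a third semantic feature vector and trained separately, one type at a time.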
Step 306, training the pre-training model according to the first semantic feature vector, the second semantic feature vector and the third semantic feature vector to obtain an emotion tendentiousness analysis model.
Optionally, training the pre-training model according to the first semantic feature vector, the second semantic feature vector and the third semantic feature vector, ending model training if the currently trained model meets the target requirement, and determining the trained model as the emotion tendentiousness analysis model.
In the embodiment of the application, a combined feature vector can be generated according to the second semantic feature vector and the third semantic feature vector, the distance between the first semantic feature vector and the combined feature vector is calculated, and whether that distance is larger than a preset threshold value is judged; if the distance between the first semantic feature vector and the combined feature vector is greater than the preset threshold, the model parameters of the pre-training model are adjusted and training continues; training ends when the distance between the first semantic feature vector and the combined feature vector is smaller than or equal to the preset threshold value, so as to obtain the emotion tendentiousness analysis model. The distance between the first semantic feature vector and the combined feature vector may be calculated as a KL divergence (relative entropy).
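A compact PyTorch sketch of this fine-tuning step is given below. It assumes the pre-training model is a plain BERT encoder (e.g. transformers.BertModel) whose [CLS] output serves as the sentence vector; the [CLS] pooling, the element-wise addition used to build the combined feature vector, the softmax normalization applied before the KL divergence, and the threshold value of 0.1 are assumptions made for illustration, since the patent only requires some combination of the vectors and a preset threshold.

```python
import torch
import torch.nn.functional as F

def fine_tune_step(model, optimizer, enc_original, enc_masked, fusion_vec, threshold=0.1):
    """One fine-tuning step: compare the first semantic feature vector with the
    combined (second + third) feature vector and update the model if they are
    still further apart than the preset threshold."""
    first_vec = model(**enc_original).last_hidden_state[:, 0]   # first semantic feature vector
    second_vec = model(**enc_masked).last_hidden_state[:, 0]    # second semantic feature vector
    combined_vec = second_vec + fusion_vec                      # combined feature vector (assumed fusion)

    # KL divergence (relative entropy) between the two representations.
    distance = F.kl_div(F.log_softmax(combined_vec, dim=-1),
                        F.softmax(first_vec, dim=-1),
                        reduction="batchmean")

    if distance > threshold:        # distance above the preset threshold: adjust parameters
        optimizer.zero_grad()
        distance.backward()
        optimizer.step()
        return False                # not yet converged, keep training
    return True                     # distance <= threshold: training can end
```

The same step is repeated once per fusion feature word type, so that, as in the example below, a positive-type word and an enhancement-type word each contribute their own combined feature vector.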
For example, as shown in fig. 4, take the emotion analysis training sentence "It's amazing what we can remember with a little prompting", whose emotion feature word is "amazing". The emotion feature word may be replaced with a mask so that the sentence becomes "It's [MASK] what we can remember with a little prompting". The original emotion analysis training sentence (It's amazing what we can remember with a little prompting) is input into the pre-trained BERT model to obtain the first semantic feature vector corresponding to the original sentence. The mask-replaced training sentence "It's [MASK] what we can remember with a little prompting" is input into the pre-trained BERT model to obtain the corresponding second semantic feature vector. The fusion feature words of the emotion feature word "amazing" are the emotion feature word "admiration" (which belongs to the positive emotion type) and the emotion feature word "surviving" (which belongs to the enhanced emotion type). Different emotion types can be trained separately; that is, for the emotion analysis training sentence "It's amazing what we can remember with a little prompting", the semantic feature vector of the fusion feature word "admiration" can be obtained and combined with the second semantic feature vector to obtain a combined feature vector, and model training is performed based on that combined feature vector and the first semantic feature vector; meanwhile, the semantic feature vector of the fusion feature word "surviving" also needs to be obtained and combined with the second semantic feature vector to obtain another combined feature vector, and model training is further performed based on that combined feature vector and the first semantic feature vector. By modeling different types of emotion feature words separately in this way, the influence of different types of emotion feature words on emotion classification can be captured. For example, positive emotion words are more likely to express the positive emotion of a text, negative emotion words are likewise more likely to express the negative emotion of a text, negation words are more likely to reverse the tendency of a text, and enhancement words are more likely to strengthen the tendency of a text.
According to the modeling method of the emotion tendentiousness analysis model, emotion analysis data conforming to a target emotion analysis task scene can be obtained, a pre-training model is built based on the emotion analysis data conforming to the target emotion analysis task scene, emotion analysis training sentences conforming to the target emotion analysis task scene are obtained, and then fine adjustment is carried out on the pre-training model according to the emotion analysis training sentences so as to obtain the emotion tendentiousness analysis model; when the pre-training model is subjected to fine adjustment, different types of emotion feature word information are respectively trained to capture the influence of different types of emotion feature words on emotion classification. Modeling the model in a pre-training stage by utilizing an auxiliary data set of a specific task, and increasing the understanding capability of the model to the specific task through a large amount of data and longer training; in addition, through training different types of emotion feature words in the fine adjustment stage, language knowledge in the emotion field is enriched, classification capacity of the model on emotion analysis tasks is enhanced, and accuracy of emotion trend analysis is improved; in addition, the pre-training model is migrated to the emotion analysis training statement which accords with the target emotion analysis task scene for fine adjustment, so that the classification capability of the model on the emotion analysis task can be further enhanced.
In correspondence with the modeling method of the emotion tendentiousness analysis model provided in the above embodiments, an embodiment of the present invention further provides a modeling apparatus of the emotion tendentiousness analysis model, and since the modeling apparatus of the emotion tendentiousness analysis model provided in the embodiment of the present invention corresponds to the modeling method of the emotion tendentiousness analysis model provided in the above embodiments, implementation of the foregoing modeling method of the emotion tendentiousness analysis model is also applicable to the modeling apparatus of the emotion tendentiousness analysis model provided in the present embodiment, and will not be described in detail in the present embodiment. Fig. 5 is a schematic structural diagram of a modeling apparatus of an emotion tendentiousness analysis model according to an embodiment of the present application. In the embodiment of the present application, the emotion tendentiousness analysis model is used for classifying emotion analysis tasks.
As shown in fig. 5, the modeling apparatus 500 of the emotion tendentiousness analysis model may include: the emotion analysis data acquisition module 510, the pre-training model construction module 520, the emotion analysis training statement acquisition module 530, and the model fine adjustment module 540.
Specifically, the emotion analysis data acquisition module 510 is configured to acquire emotion analysis data that conforms to a target emotion analysis task scene.
The pre-training model construction module 520 is configured to construct a pre-training model based on emotion analysis data that conforms to a target emotion analysis task scenario. As an example, the pre-training model construction module 520 is specifically configured to: analyzing emotion analysis data conforming to a target emotion analysis task scene to determine emotion feature words in the emotion analysis data; replacing emotion feature words in emotion analysis data with masks; pre-training the emotion analysis data subjected to mask replacement by using a deep neural network to obtain a pre-training model; wherein, the pre-training is used for training the contribution relation between the emotion feature words and the mask.
The emotion analysis training statement acquisition module 530 is configured to acquire emotion analysis training statements that conform to a target emotion analysis task scene.
The model fine tuning module 540 is configured to fine tune the pre-training model according to the emotion analysis training statement, so as to obtain an emotion tendentiousness analysis model; when the pre-training model is subjected to fine adjustment, different types of emotion feature word information are respectively trained to capture the influence of different types of emotion feature words on emotion classification.
In one embodiment of the present application, as shown in fig. 6, the model fine tuning module 540 may include: an emotion feature word determination unit 541, a replacement unit 542, a first semantic feature vector generation unit 543, a second semantic feature vector generation unit 544, a fusion feature word acquisition unit 545, a third semantic feature vector generation unit 546, and a training unit 547. The emotion feature word determining unit 541 is configured to analyze the emotion analysis training statement to determine an emotion feature word in the emotion analysis training statement; the replacing unit 542 is configured to replace emotion feature words in the emotion analysis training statement with masks; the first semantic feature vector generating unit 543 is configured to input an emotion analysis training sentence into the pre-training model, so as to obtain a first semantic feature vector output by the pre-training model; the second semantic feature vector generating unit 544 is configured to input the emotion analysis training statement after mask substitution to the pre-training model, so as to obtain a second semantic feature vector output by the pre-training model; the fusion feature word obtaining unit 545 is used for obtaining fusion feature words; the third semantic feature vector generating unit 546 is configured to represent the fused feature word as a third semantic feature vector; the training unit 547 is configured to train the pre-training model according to the first semantic feature vector, the second semantic feature vector, and the third semantic feature vector to obtain an emotion tendentiousness analysis model.
In the embodiment of the present application, the specific implementation process of the fusion feature word obtaining unit 545 obtaining the fusion feature word may be as follows: determining the type of emotion feature words in emotion analysis training sentences; and acquiring fusion feature words of the same belonging type according to the belonging type of the emotion feature words in the emotion analysis training statement.
In the embodiment of the present application, the training unit 547 trains the pre-training model according to the first semantic feature vector, the second semantic feature vector and the third semantic feature vector to obtain the emotion tendentiousness analysis model according to the following specific implementation process: generating a combined feature vector according to the second semantic feature vector and the third semantic feature vector; calculating a distance between the first semantic feature vector and the combined feature vector; judging whether the distance between the first semantic feature vector and the combined feature vector is larger than a preset threshold value or not; if the distance between the first semantic feature vector and the combined feature vector is greater than a preset threshold, adjusting model parameters of the pre-training model and continuing training; and ending training until the distance between the first semantic feature vector and the combined feature vector is smaller than or equal to a preset threshold value so as to obtain an emotion tendentiousness analysis model.
According to the modeling device of the emotion tendentiousness analysis model, emotion analysis data conforming to a target emotion analysis task scene can be obtained, a pre-training model is built based on the emotion analysis data conforming to the target emotion analysis task scene, emotion analysis training sentences conforming to the target emotion analysis task scene are obtained, and then fine adjustment is carried out on the pre-training model according to the emotion analysis training sentences so as to obtain the emotion tendentiousness analysis model; when the pre-training model is subjected to fine adjustment, different types of emotion feature word information are respectively trained to capture the influence of different types of emotion feature words on emotion classification. Modeling the model in a pre-training stage by utilizing an auxiliary data set of a specific task, and increasing the understanding capability of the model to the specific task through a large amount of data and longer training; in addition, through training different types of emotion feature words in the fine adjustment stage, language knowledge in the emotion field is enriched, classification capacity of the model on emotion analysis tasks is enhanced, and accuracy of emotion trend analysis is improved; in addition, the pre-training model is migrated to the emotion analysis training statement which accords with the target emotion analysis task scene for fine adjustment, so that the classification capability of the model on the emotion analysis task can be further enhanced.
According to embodiments of the present application, an electronic device and a readable storage medium are also provided.
As shown in fig. 7, a block diagram of an electronic device for implementing a modeling method of an emotion tendencies analysis model according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 7, the electronic device includes: one or more processors 701, memory 702, and interfaces for connecting the various components, including high-speed interfaces and low-speed interfaces. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executing within the electronic device, including instructions stored in or on memory to display graphical information of the GUI on an external input/output device, such as a display device coupled to the interface. In other embodiments, multiple processors and/or multiple buses may be used, if desired, along with multiple memories. Also, multiple electronic devices may be connected, each providing a portion of the necessary operations (e.g., as a server array, a set of blade servers, or a multiprocessor system). One processor 701 is illustrated in fig. 7.
Memory 702 is a non-transitory computer-readable storage medium provided herein. Wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform a modeling method of the emotion tendentiousness analysis model provided herein. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to execute a modeling method of an emotion tendentiousness analysis model provided by the present application.
Memory 702, which is a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as program instructions/modules (e.g., emotion analysis data acquisition module 510, pre-training model construction module 520, emotion analysis training statement acquisition module 530, and model fine adjustment module 540 shown in fig. 5) corresponding to a modeling method of an emotion tendencies analysis model in an embodiment of the present application. The processor 701 executes various functional applications of the server and data processing, that is, a modeling method implementing the emotion tendencies analysis model in the above-described method embodiment, by running a non-transitory software program, instructions, and modules stored in the memory 702.
Memory 702 may include a storage program area and a storage data area; the storage program area may store an operating system and at least one application program required for functionality, and the storage data area may store data created according to the use of the electronic device implementing the modeling method of the emotion tendentiousness analysis model, and the like. In addition, the memory 702 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, memory 702 optionally includes memory remotely located relative to processor 701, which may be connected via a network to the electronic device for implementing the modeling method of the emotion tendentiousness analysis model. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device for implementing the modeling method of the emotion tendentiousness analysis model may further include: an input device 703 and an output device 704. The processor 701, the memory 702, the input device 703 and the output device 704 may be connected by a bus or otherwise, in fig. 7 by way of example.
Input device 703 may receive input numeric or character information and generate key signal inputs related to user settings and function controls of the electronic device used to implement the modeling method of the emotion tendencies analysis model, such as a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointer stick, one or more mouse buttons, a track ball, a joystick, and the like. The output device 704 may include a display apparatus, auxiliary lighting devices (e.g., LEDs), and haptic feedback devices (e.g., vibration motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general-purpose programmable processor, and that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computing programs (also referred to as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, provided that the desired results of the technical solutions disclosed in the present application can be achieved, and are not limited herein.
The above embodiments do not limit the scope of the application. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present application are intended to be included within the scope of the present application.

Claims (10)

1. A modeling method of an emotion tendentiousness analysis model for classifying emotion analysis tasks, the modeling method comprising:
acquiring emotion analysis data conforming to a target emotion analysis task scene;
constructing a pre-training model based on emotion analysis data conforming to the target emotion analysis task scene;
acquiring emotion analysis training sentences conforming to the target emotion analysis task scene;
fine tuning the pre-training model according to the emotion analysis training statement to obtain the emotion tendentiousness analysis model; when the pre-training model is subjected to fine adjustment, the affective characteristic word information of different types is respectively trained to capture the influence of the affective characteristic words of different types on affective classification;
And fine tuning the pre-training model according to the emotion analysis training statement to obtain the emotion tendentiousness analysis model, wherein the fine tuning comprises the following steps:
analyzing the emotion analysis training sentences to determine emotion feature words in the emotion analysis training sentences;
replacing emotion feature words in the emotion analysis training sentences with masks;
inputting the emotion analysis training statement into the pre-training model to obtain a first semantic feature vector output by the pre-training model;
inputting the emotion analysis training statement subjected to mask replacement into the pre-training model to obtain a second semantic feature vector output by the pre-training model;
acquiring a fusion feature word, and representing the fusion feature word as a third semantic feature vector;
training the pre-training model according to the first semantic feature vector, the second semantic feature vector and the third semantic feature vector to obtain the emotion tendentiousness analysis model.
2. The method of claim 1, wherein constructing a pre-training model based on the emotion analysis data conforming to the target emotion analysis task scene comprises:
Analyzing the emotion analysis data conforming to the target emotion analysis task scene to determine emotion feature words in the emotion analysis data;
replacing emotion feature words in the emotion analysis data with masks;
pre-training the emotion analysis data subjected to mask replacement by using a deep neural network to obtain the pre-training model; wherein the pre-training is for training a contribution relationship between the emotion feature words and the mask.
3. The method of claim 1, wherein the obtaining the fusion feature word comprises:
determining the type of the emotion feature words in the emotion analysis training statement;
and acquiring fusion feature words of the same belonging type according to the belonging type of the emotion feature words in the emotion analysis training statement.
4. The method of claim 1, wherein training a pre-training model based on the first, second, and third semantic feature vectors to obtain the emotion tendencies analysis model comprises:
generating a combined feature vector according to the second semantic feature vector and the third semantic feature vector;
Calculating a distance between the first semantic feature vector and the combined feature vector;
judging whether the distance between the first semantic feature vector and the combined feature vector is larger than a preset threshold value or not;
if the distance between the first semantic feature vector and the combined feature vector is greater than the preset threshold, adjusting model parameters of the pre-training model and continuing training;
and ending training until the distance between the first semantic feature vector and the combined feature vector is smaller than or equal to the preset threshold value so as to obtain the emotion tendentiousness analysis model.
5. A modeling apparatus of an emotion tendentiousness analysis model for classifying emotion analysis tasks, the modeling apparatus comprising:
the emotion analysis data acquisition module is used for acquiring emotion analysis data conforming to a target emotion analysis task scene;
the pre-training model construction module is used for constructing a pre-training model based on the emotion analysis data conforming to the target emotion analysis task scene;
the emotion analysis training statement acquisition module is used for acquiring emotion analysis training statements conforming to the target emotion analysis task scene;
The model fine tuning module is used for fine tuning the pre-training model according to the emotion analysis training statement so as to obtain the emotion tendentiousness analysis model; when the pre-training model is subjected to fine adjustment, the affective characteristic word information of different types is respectively trained to capture the influence of the affective characteristic words of different types on affective classification;
the model fine tuning module comprises:
the emotion feature word determining unit is used for analyzing the emotion analysis training sentences to determine emotion feature words in the emotion analysis training sentences;
a replacing unit, configured to replace emotion feature words in the emotion analysis training statement with masks;
the first semantic feature vector generation unit is used for inputting the emotion analysis training statement into the pre-training model to obtain a first semantic feature vector output by the pre-training model;
the second semantic feature vector generation unit is used for inputting the emotion analysis training statement subjected to mask replacement into the pre-training model to obtain a second semantic feature vector output by the pre-training model;
the fusion feature word acquisition unit is used for acquiring fusion feature words;
The third semantic feature vector generation unit is used for representing the fusion feature words into third semantic feature vectors;
the training unit is used for training the pre-training model according to the first semantic feature vector, the second semantic feature vector and the third semantic feature vector so as to obtain the emotion tendentiousness analysis model.
6. The apparatus of claim 5, wherein the pre-training model building module is specifically configured to:
analyzing the emotion analysis data conforming to the target emotion analysis task scene to determine emotion feature words in the emotion analysis data;
replacing emotion feature words in the emotion analysis data with masks;
pre-training on the mask-replaced emotion analysis data by using a deep neural network to obtain the pre-training model; wherein the pre-training is used to learn the contribution relationship between the emotion feature words and the masks.
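As a rough illustration of the masking strategy in claim 6, the sketch below builds a masked-language-model style training example in which only emotion feature words become prediction targets, so that pre-training can learn how the surrounding context contributes to the masked emotion words. The lexicon and the example sentence are invented for this sketch and are not taken from the patent.

MASK = "[MASK]"

def build_pretraining_example(tokens, emotion_lexicon):
    """Mask only the emotion feature words so they become prediction targets."""
    inputs, labels = [], []
    for tok in tokens:
        if tok in emotion_lexicon:
            inputs.append(MASK)
            labels.append(tok)    # the model must recover the emotion feature word
        else:
            inputs.append(tok)
            labels.append(None)   # non-emotion tokens are not predicted
    return inputs, labels

inputs, labels = build_pretraining_example(
    ["service", "was", "really", "disappointing"],
    {"really", "disappointing"},
)
print(inputs)  # ['service', 'was', '[MASK]', '[MASK]']
print(labels)  # [None, None, 'really', 'disappointing']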
7. The apparatus according to claim 5, wherein the fusion feature word obtaining unit is specifically configured to:
determining the type to which the emotion feature words in the emotion analysis training statement belong;
and acquiring fusion feature words of the same type according to the type to which the emotion feature words in the emotion analysis training statement belong.
8. The apparatus of claim 5, wherein the training unit is specifically configured to:
generating a combined feature vector according to the second semantic feature vector and the third semantic feature vector;
calculating a distance between the first semantic feature vector and the combined feature vector;
determining whether the distance between the first semantic feature vector and the combined feature vector is greater than a preset threshold;
if the distance between the first semantic feature vector and the combined feature vector is greater than the preset threshold, adjusting model parameters of the pre-training model and continuing training;
and ending the training when the distance between the first semantic feature vector and the combined feature vector is less than or equal to the preset threshold, so as to obtain the emotion tendentiousness analysis model.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the modeling method of the emotion tendentiousness analysis model of any one of claims 1 to 4.
10. A non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute the modeling method of the emotion tendentiousness analysis model of any one of claims 1 to 4.
CN201911370735.2A 2019-12-26 2019-12-26 Modeling method and device of emotion tendentiousness analysis model and electronic equipment Active CN111144108B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911370735.2A CN111144108B (en) 2019-12-26 2019-12-26 Modeling method and device of emotion tendentiousness analysis model and electronic equipment

Publications (2)

Publication Number Publication Date
CN111144108A CN111144108A (en) 2020-05-12
CN111144108B CN111144108B (en) 2023-06-27

Family

ID=70520641

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911370735.2A Active CN111144108B (en) 2019-12-26 2019-12-26 Modeling method and device of emotion tendentiousness analysis model and electronic equipment

Country Status (1)

Country Link
CN (1) CN111144108B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111859908B (en) * 2020-06-30 2024-01-19 北京百度网讯科技有限公司 Emotion learning pre-training method and device, electronic equipment and readable storage medium
CN112001180A (en) 2020-07-14 2020-11-27 北京百度网讯科技有限公司 Multi-mode pre-training model acquisition method and device, electronic equipment and storage medium
CN112200664A (en) * 2020-10-29 2021-01-08 上海畅圣计算机科技有限公司 Repayment prediction method based on ERNIE model and DCNN model
CN112560088B (en) * 2020-12-11 2024-05-28 同盾控股有限公司 Knowledge federation-based data security exchange method, device and storage medium
CN112966106A (en) * 2021-03-05 2021-06-15 平安科技(深圳)有限公司 Text emotion recognition method, device and equipment and storage medium
CN113836297B (en) * 2021-07-23 2023-04-14 北京三快在线科技有限公司 Training method and device for text emotion analysis model
CN114357204B (en) * 2021-11-25 2024-03-26 腾讯科技(深圳)有限公司 Media information processing method and related equipment
CN114357989B (en) * 2022-01-10 2023-09-26 北京百度网讯科技有限公司 Video title generation method and device, electronic equipment and storage medium

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140049680A (en) * 2012-10-18 2014-04-28 한국항공대학교산학협력단 Sentiment classification system using rule-based multi agents
CN105354183A (en) * 2015-10-19 2016-02-24 Tcl集团股份有限公司 Analytic method, apparatus and system for internet comments of household electrical appliance products
CN105893344A (en) * 2016-03-28 2016-08-24 北京京东尚科信息技术有限公司 User semantic sentiment analysis-based response method and device
CN106502989A (en) * 2016-10-31 2017-03-15 东软集团股份有限公司 Sentiment analysis method and device
CN107193801A (en) * 2017-05-21 2017-09-22 北京工业大学 A short text feature optimization and sentiment analysis method based on a deep belief network
WO2019037391A1 (en) * 2017-08-24 2019-02-28 平安科技(深圳)有限公司 Method and apparatus for predicting customer purchase intention, and electronic device and medium
CN110083702A (en) * 2019-04-15 2019-08-02 中国科学院深圳先进技术研究院 An aspect-level text emotion conversion method based on multi-task learning
CN110232109A (en) * 2019-05-17 2019-09-13 深圳市兴海物联科技有限公司 An Internet public opinion analysis method and system
CN110377740A (en) * 2019-07-22 2019-10-25 腾讯科技(深圳)有限公司 Sentiment polarity analysis method, device, electronic equipment and storage medium
CN110532386A (en) * 2019-08-12 2019-12-03 新华三大数据技术有限公司 Text sentiment classification method, device, electronic equipment and storage medium
AU2019101147A4 (en) * 2019-09-30 2019-10-31 Han, Haoran MR A sentimental analysis system for film review based on deep learning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Yu L C et al. Using a contextual entropy model to expand emotion words and their intensity for the sentiment classification of stock market news. Knowledge-Based Systems. 2013, full text. *
Shao Liangshan et al. Research on sentiment classification of online reviews based on semantic rules and an RNN model. Journal of Chinese Information Processing. 2019, Vol. 33, full text. *

Also Published As

Publication number Publication date
CN111144108A (en) 2020-05-12

Similar Documents

Publication Publication Date Title
CN111144108B (en) Modeling method and device of emotion tendentiousness analysis model and electronic equipment
KR102577512B1 (en) Method and apparatus for extracting event from text, electronic device, and storage medium
KR102484617B1 (en) Method and apparatus for generating model for representing heterogeneous graph node, electronic device, storage medium and program
CN111737994B (en) Method, device, equipment and storage medium for obtaining word vector based on language model
CN111144115B (en) Pre-training language model acquisition method, device, electronic equipment and storage medium
CN111079442B (en) Vectorization representation method and device of document and computer equipment
JP2021190087A (en) Text recognition processing method, device, electronic apparatus, and storage medium
CN111967256B (en) Event relation generation method and device, electronic equipment and storage medium
CN111539227B (en) Method, apparatus, device and computer storage medium for training semantic representation model
CN112001180A (en) Multi-mode pre-training model acquisition method and device, electronic equipment and storage medium
CN111709234B (en) Training method and device for text processing model and electronic equipment
CN111832701B (en) Model distillation method, model distillation device, electronic equipment and storage medium
CN111259671B (en) Semantic description processing method, device and equipment for text entity
JP7149993B2 (en) Pre-training method, device and electronic device for sentiment analysis model
CN111859997B (en) Model training method and device in machine translation, electronic equipment and storage medium
CN111950292B (en) Training method of text error correction model, text error correction processing method and device
CN111950293B (en) Semantic representation model generation method and device, electronic equipment and storage medium
CN112001190A (en) Training method, device and equipment of natural language processing model and storage medium
CN111126061B (en) Antithetical couplet information generation method and device
CN112580822B (en) Countermeasure training method device for machine learning model, electronic equipment and medium
CN111709252A (en) Model improvement method and device based on pre-trained semantic model
CN111738015B (en) Article emotion polarity analysis method and device, electronic equipment and storage medium
CN113723278A (en) Training method and device of form information extraction model
US20220004867A1 (en) Optimizer learning method and apparatus, electronic device and readable storage medium
CN112580723B (en) Multi-model fusion method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant