CN112131391B - Power supply service client appeal text classification method based on capsule network


Info

Publication number
CN112131391B
CN112131391B (application CN202011332961.4A)
Authority
CN
China
Prior art keywords
text
word
vector
power supply
appeal
Prior art date
Legal status: Active
Application number
CN202011332961.4A
Other languages
Chinese (zh)
Other versions
CN112131391A (en)
Inventor
杨志新
周宇
王成现
潘留兴
洪昕
丁淙
Current Assignee
Jiangsu Electric Power Information Technology Co Ltd
Original Assignee
Jiangsu Electric Power Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Jiangsu Electric Power Information Technology Co Ltd filed Critical Jiangsu Electric Power Information Technology Co Ltd
Priority to CN202011332961.4A
Publication of CN112131391A
Application granted
Publication of CN112131391B
Active legal status
Anticipated expiration legal status

Classifications

    • G06F16/35: Clustering; Classification (information retrieval of unstructured textual data)
    • G06F40/279: Recognition of textual entities (natural language analysis)
    • G06F40/289: Phrasal analysis, e.g. finite state techniques or chunking
    • G06F40/30: Semantic analysis
    • G06N3/044: Recurrent networks, e.g. Hopfield networks (neural network architectures)
    • G06N3/045: Combinations of networks (neural network architectures)

Abstract

The invention discloses a capsule-network-based method for classifying power supply service client appeal texts, which comprises the following steps: preprocessing the power supply service client appeal text; generating word vectors for the appeal text to resolve polysemy; encoding character vectors and acquiring the dependency relationships between characters; fusing the character-based and word-vector-based appeal text features to obtain a fusion vector; obtaining the interdependence between word vectors with a self-attention mechanism; aggregating appeal text capsules with the EM-Routing dynamic routing algorithm to obtain the semantic relations between words; and finally calculating the positive-class and negative-class loss values of the appeal text with an interval loss function. The method improves the extraction of important text features and ultimately improves the accuracy of classifying power supply appeal texts in power customer service work orders.

Description

Power supply service client appeal text classification method based on capsule network
Technical Field
The invention relates to the technical field of electric power, in particular to power supply service client appeal text classification based on a capsule network.
Background
Power supply service quality is an important index of a power enterprise's service level, and with the rapid development of the internet, big data, and social platforms, improving power supply service quality has become imperative. The 95598 hotline is the service hotline of power enterprises; daily communication with power customers over this hotline produces a large amount of client appeal text data. The most effective way to improve 95598 customer service capability is to clearly understand the service demands and problems expressed in these client appeal texts, so as to provide accurate, proactive, and differentiated power supply services, raise customer satisfaction with their electricity service, enhance the image and brand value of power supply enterprises, and retain customers. However, the client appeal text data in power supply service work orders is large in volume and unstructured, which poses challenges to the development of refined power supply services.
In traditional text classification based on Convolutional Neural Networks (CNNs), modeling spatial text information produces too many training parameters, which reduces training efficiency; the pooling layer discards much valuable feature information, weakening feature understanding; and because a CNN is insensitive to spatial position, it cannot fully express word position information, grammatical structure, and semantic information in the text. Text classification based on capsule networks (CapsNets) is a relatively new topic that effectively remedies these shortcomings of CNN training. A capsule network can attend globally to the semantics and word order of the text and improves the feature expression of each word, which makes it well suited to analyzing unstructured power supply service client appeal texts. Compared with a Long Short-Term Memory (LSTM) network, it can break through the limitation of text distance (for example, long texts) to obtain contextual semantic and word-order information. Compared with a traditional self-attention mechanism and CNN pooling, the capsule network uses an Expectation-Maximization routing (EM-Routing) algorithm to overcome bottom-up, passive information aggregation, guide the active clustering of task text information, mine the word order and semantics of power supply service client appeal texts, and improve the classification accuracy of client appeal texts.
In summary, because power supply service client appeals are mainly textual, unstructured, large in volume, and polysemous, traditional neural-network-based classification of such texts has at least the following three problems: (1) the CNN pooling layer loses much valuable information and aggregates information passively, reducing feature understanding; (2) a CNN has weak spatial-relationship recognition and cannot fully express the word order, semantics, and grammatical structure of power supply service appeal texts; (3) an LSTM cannot obtain the contextual semantic and word-order information of long power supply service client appeal texts.
Disclosure of Invention
The invention aims to provide a capsule-network-based method for classifying power supply service client appeal texts that improves the extraction of important text features and the accuracy of classifying power supply appeal texts in 95598 power customer service work orders. It addresses the weak feature expression of CNN and LSTM networks, their inability to break through the limitation of text distance to obtain contextual semantic and word-order information, and the unstructured, voluminous, and polysemous nature of power supply service client appeal text data.
The invention is realized by the following technical scheme:
1) Power supply service client appeal text preprocessing: the appeal texts are acquired, stop words are removed, and word segmentation is performed. Client appeal content in the 95598 system is divided by text form into report, suggestion, commendation, complaint, inquiry, repair, past business, and feedback categories. One line of text represents one work order appeal and is given a text category label; all symbols and stop words other than letters, digits, and Chinese characters are deleted, and the jieba segmentation tool is used to segment each line of work order appeal text.
2) Power supply service client appeal text word vector generation: each line of preprocessed appeal text $T = (w_1, w_2, \ldots, w_{L_1})$ is input into a BERT pre-trained model to resolve polysemy and obtain semantics-based dynamic word vectors $X = (x_1, x_2, \ldots, x_{L_1})$, where $x_i \in \mathbb{R}^{D_1}$ denotes the $D_1$-dimensional vector corresponding to the $i$-th word, $w_i$ denotes the $i$-th word in the line of appeal text, and $L_1$ denotes the number of words in the line.
3) Character-vector-based encoding and dependency acquisition: unlike traditional word-vector-based encoding, a finer-grained, character-level encoding is used. Let the character set of each line of power supply service client appeal text be $C = (c_1, c_2, \ldots, c_{L_2})$, where $c_i$ is the $i$-th character of the work order line; the characters are mapped into a high-dimensional space to obtain character vectors $E = (e_1, e_2, \ldots, e_{L_2})$, where $e_i \in \mathbb{R}^{D_2}$ denotes the $D_2$-dimensional vector corresponding to the $i$-th character and $L_2$ denotes the number of characters in the line.
Because the characters in a sentence are relatively independent of one another, an N-gram convolution operation is adopted to obtain the interdependence between characters in the client appeal text and to generate phrase features; important semantic features are then obtained through max pooling and concatenated to obtain the character-based power supply service client appeal text feature vector.
4) Text feature fusion based on character and word vectors: the character-based and word-based power supply service client appeal text feature vectors are fused to obtain a fusion vector $g_i$, which denotes the logical vector representation corresponding to the $i$-th word.
5) Obtaining the correlation among word vectors: a self-attention mechanism is adopted to acquire the interrelationship between the words of each power supply service client appeal text.
6) Power supply client appeal text capsule aggregation: the EM-Routing dynamic routing algorithm is adopted to aggregate the word capsules into power supply service client appeal text capsules and obtain the semantic relations between words.
7) Calculating the loss value: an interval loss function is adopted to calculate, with margin $m$ between the positive class and the negative class of each line of power supply service client appeal text, the loss value $L_i$; the positive-class and negative-class loss values are added to obtain the total loss value $L$. The smaller the loss $L$, the higher the classification accuracy, which ultimately improves the classification accuracy of power supply service client appeal texts.
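To make the flow of steps 1) to 7) concrete, the following PyTorch skeleton chains placeholder layers that stand in for each step. The dimensions, the single attention head, and the linear head used in place of the capsule aggregation are assumptions for illustration, not the architecture claimed by the patent.

```python
import torch
import torch.nn as nn

class AppealTextClassifier(nn.Module):
    """End-to-end skeleton of steps 2) to 6) with placeholder layers (sketch).

    The dimensions (768/256) and the single-head attention are assumptions;
    the capsule aggregation of step 6) is stood in for by a linear head here.
    """
    def __init__(self, d_word=768, d_char=256, d_model=256, n_classes=8):
        super().__init__()
        self.char_conv = nn.Conv1d(d_char, d_model, kernel_size=3)    # step 3: N-gram convolution
        self.fuse = nn.Linear(d_word + d_model, d_model)               # step 4: char/word fusion
        self.attn = nn.MultiheadAttention(d_model, num_heads=1,
                                          batch_first=True)            # step 5: self-attention
        self.capsule_head = nn.Linear(d_model, n_classes)              # step 6: capsule stand-in

    def forward(self, word_vecs, char_vecs):
        # word_vecs: [B, L1, d_word] from BERT (step 2); char_vecs: [B, d_char, L2]
        p = torch.relu(self.char_conv(char_vecs)).max(dim=2).values    # character feature vector P
        p = p.unsqueeze(1).expand(-1, word_vecs.size(1), -1)
        g = self.fuse(torch.cat([word_vecs, p], dim=-1))                # fusion vectors g_i
        h, _ = self.attn(g, g, g)
        return torch.sigmoid(self.capsule_head(h.mean(dim=1)))          # class activations a_v

model = AppealTextClassifier()
a = model(torch.randn(2, 12, 768), torch.randn(2, 256, 40))
print(a.shape)  # torch.Size([2, 8]); step 7) applies the interval loss to these activations
```

In a full implementation the linear head would be replaced by the EM-Routing aggregation of step 6), whose quantities are sketched later in the detailed description.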
The advantages of the method are as follows: it overcomes the difficulty of obtaining the semantics of long texts; the character-vector-based encoding expresses the global semantics and word-order features of the appeal text at a finer granularity; each word is aggregated into a capsule, further capturing the semantic relations between words; this improves the extraction of important text features and ultimately improves the accuracy of appeal text classification in customer service work orders.
Drawings
Fig. 1 is a framework diagram of the capsule-network-based power supply service client appeal text classification method according to the present invention.
Fig. 2 shows the specific process of text classification implemented by the matrix capsule network according to the present invention.
Detailed Description
The method of the present invention is further described with reference to the accompanying drawings and the detailed description.
Fig. 1 is a framework diagram of the capsule-network-based power supply service client appeal text classification method according to the present invention. The method comprises: (1) preprocessing a large number of power supply service client appeal texts; (2) training to generate client appeal text word vectors; (3) character-vector-based encoding and acquisition of the dependencies between characters; (4) text feature fusion based on character and word vectors; (5) acquiring the interdependence between word vectors; (6) aggregating word capsules into text capsules with the EM-Routing algorithm; (7) calculating the positive-class and negative-class loss values. The result is improved classification accuracy for power supply client appeal texts. Fig. 2 shows the specific process of text classification implemented by the matrix capsule network according to the present invention. The specific implementation is as follows:
step 1, preprocessing power supply service client appeal text
All the client appeal contents of 95598 are divided into reports, suggestions, prawns, complaints, queries, repair reports, past services and reflection categories according to text forms, one line of text represents one power supply client appeal text and is added with a text category label, and a jieba word segmentation tool is used for segmenting the power supply client appeal text of each line.
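A minimal Python sketch of this preprocessing step, assuming each input line is a tab-separated category label and appeal text and using a small illustrative stop-word list; it is not the patent's reference implementation.

```python
import re
import jieba

# Illustrative stop-word list; in practice this would be loaded from a file.
STOP_WORDS = {"的", "了", "我", "在"}

def preprocess_line(line):
    """Split one work-order line into (category_label, token list).

    Assumes each line is '<label>\t<appeal text>'. Keeps only letters,
    digits and Chinese characters, then segments the text with jieba
    and drops stop words.
    """
    label, text = line.rstrip("\n").split("\t", 1)
    # Delete every symbol except letters, digits and Chinese characters.
    text = re.sub(r"[^0-9A-Za-z\u4e00-\u9fa5]", "", text)
    tokens = [w for w in jieba.lcut(text) if w not in STOP_WORDS]
    return label, tokens

if __name__ == "__main__":
    label, tokens = preprocess_line("投诉\t小区连续三天停电，请尽快处理！")
    print(label, tokens)
```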
Step 2, generating text word vectors for the power supply service client appeal
Each line of preprocessed work order appeal text $T = (w_1, w_2, \ldots, w_{L_1})$ is input into a BERT pre-trained model to obtain semantics-based dynamic word vectors:
$$X = (x_1, x_2, \ldots, x_{L_1}), \qquad x_i \in \mathbb{R}^{D_1}$$
where $x_i$ denotes the $D_1$-dimensional vector corresponding to the $i$-th word, $w_i$ denotes the $i$-th word in the line of power supply client appeal text, and $L_1$ denotes the number of words in the line.
Step 3, character-vector-based encoding and dependency acquisition
Unlike traditional word-vector-based encoding, a finer-grained, character-level encoding is used, and an N-gram convolution operation is adopted to obtain the interdependence between characters in the power supply client appeal text.
3.1 Character-vector-based encoding: let the character set of each line of power supply client appeal text be $C = (c_1, c_2, \ldots, c_{L_2})$, where $c_i$ is the $i$-th character in the line of power supply client appeal text; the characters are mapped into a high-dimensional space to obtain character vectors $E = (e_1, e_2, \ldots, e_{L_2})$, where $e_i \in \mathbb{R}^{D_2}$ denotes the $D_2$-dimensional vector corresponding to the $i$-th character and $L_2$ denotes the number of characters in the line.
3.2 Character-based text feature vector acquisition
3.2.1 Phrase feature generation: because the characters in a sentence are relatively independent, an N-gram convolution operation is used to obtain the interdependence between characters in the power supply client appeal text, i.e., the local features of the client appeal text, and to generate phrase features. Let $X_{i:i+k-1}$ denote the concatenation of the character vectors $e_i, e_{i+1}, \ldots, e_{i+k-1}$; the convolution process is
$$z_i = f\left(W^{E} \cdot X_{i:i+k-1} + b_1\right)$$
where $W^{E}$ denotes the convolution weight matrix, $k$ denotes the size of the convolution kernel, $b_1$ denotes a bias term, $z_i$ denotes the generated local phrase feature, $D_3$ denotes the number of generated phrases, and $f$ denotes the ReLU nonlinear activation function. Applying the convolution at different positions of the sentence generates the phrase features
$$Z = (z_1, z_2, \ldots, z_{L_2-k+1}).$$
3.2.2 Semantic feature extraction from phrase features: important semantic features are obtained through the max pooling operation
$$p_i = \max(z_i)$$
where $p_i$ denotes the feature value obtained by max pooling the features extracted by the $i$-th convolution kernel.
3.2.3 Feature value concatenation: the obtained feature values are concatenated to yield the appeal text feature vector generated from the character vectors
$$P = [p_1, p_2, \ldots, p_{D_3}]$$
where $D_3$ denotes the number of feature values and $P$ denotes the character-based power supply client appeal text feature vector.
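The character-level convolution and pooling of steps 3.1 to 3.2.3 can be sketched in PyTorch as follows; the embedding dimension, the number of kernels, and the kernel size are assumed values rather than ones fixed by the patent.

```python
import torch
import torch.nn as nn

class CharNGramEncoder(nn.Module):
    """Character-level N-gram convolution + max pooling (illustrative sketch).

    Hyper-parameters (D2, D3, kernel size k) are assumptions; the patent
    does not fix concrete values.
    """
    def __init__(self, vocab_size, d2=128, d3=256, k=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d2)        # c_i -> e_i in R^{D2}
        self.conv = nn.Conv1d(d2, d3, kernel_size=k)     # N-gram convolution
        self.act = nn.ReLU()                             # f = ReLU

    def forward(self, char_ids):
        # char_ids: [batch, L2] integer character indices
        e = self.embed(char_ids).transpose(1, 2)         # [batch, D2, L2]
        z = self.act(self.conv(e))                       # [batch, D3, L2-k+1]
        p, _ = z.max(dim=2)                              # max pooling over positions
        return p                                         # P in R^{D3}

encoder = CharNGramEncoder(vocab_size=6000)
chars = torch.randint(0, 6000, (2, 40))                  # two lines, 40 characters each
print(encoder(chars).shape)                              # torch.Size([2, 256])
```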
Step 4, appeal text feature fusion based on character and word vectors
The character-based and word-based power supply client appeal text feature vectors are fused to obtain the fusion vector $g_i$, which denotes the logical vector representation corresponding to the $i$-th word; that is, the word vector $x_i$, passed through a multilayer perceptron as $x_i = \mathrm{MLP}(x_i)$, is fused with the character-based feature vector $P$ to produce $g_i$.
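One plausible realization of this fusion is sketched below, under the assumption that the MLP-transformed word vector is concatenated with the line's character feature vector $P$ and then projected; the patent's exact fusion formula is not reproduced in the text, so this is illustrative only.

```python
import torch
import torch.nn as nn

class Fusion(nn.Module):
    """Fuse word vector x_i with character feature vector P into g_i (sketch)."""
    def __init__(self, d1=768, d3=256, d_out=256):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(d1, d_out), nn.ReLU())  # x_i = MLP(x_i)
        self.proj = nn.Linear(d_out + d3, d_out)                   # assumed fusion step

    def forward(self, x, p):
        # x: [L1, D1] word vectors for one line, p: [D3] character feature vector P
        x = self.mlp(x)                                   # [L1, d_out]
        p = p.unsqueeze(0).expand(x.size(0), -1)          # broadcast P to every word
        return self.proj(torch.cat([x, p], dim=-1))       # one g_i per word

g = Fusion()(torch.randn(12, 768), torch.randn(256))
print(g.shape)  # torch.Size([12, 256])
```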
Step 5, obtaining the interdependence between word vectors
A self-attention mechanism is used to obtain the interdependence between the words of each power supply client appeal text.
5.1 Obtaining the semantic vector $g_{ij}$ between words: first, the semantic information vector between the $i$-th word and the $j$-th word is computed as
$$\alpha_{ij} = \left(g_i W^{Q}\right)\left(g_j W^{K}\right)^{\top}$$
where $W^{Q}$ and $W^{K}$ denote information transformation matrices, $g_i W^{Q}$ denotes the Q-value (query) vector of the fusion vector $g_i$, and $g_j W^{K}$ denotes the K-value (key) vector of the fusion vector $g_j$.
5.2 Normalization: the collaborative weight of the semantic information vector is computed by the normalization operation
$$\hat{\alpha}_{ij} = \frac{\exp\left(\alpha_{ij}\right)}{\sum_{k_1=1}^{L_1}\exp\left(\alpha_{i k_1}\right)}$$
and, with $h_j W^{C}$ denoting the C-value vector of the $j$-th word vector $h_j$, the semantic information vector is finally obtained as
$$g_{ij} = \hat{\alpha}_{ij}\left(h_j W^{C}\right).$$
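A single-head self-attention sketch of step 5 follows; the scaling by the square root of the dimension and the dimension of 256 are conventional additions that the patent does not state.

```python
import math
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    """Single-head self-attention over the fusion vectors g_i (sketch)."""
    def __init__(self, d=256):
        super().__init__()
        self.wq = nn.Linear(d, d, bias=False)   # W^Q
        self.wk = nn.Linear(d, d, bias=False)   # W^K
        self.wc = nn.Linear(d, d, bias=False)   # W^C (value projection)

    def forward(self, g):
        # g: [L1, d] fusion vectors for one appeal line
        q, k, c = self.wq(g), self.wk(g), self.wc(g)
        alpha = q @ k.t() / math.sqrt(g.size(-1))          # alpha_ij
        weights = torch.softmax(alpha, dim=-1)              # normalized alpha_ij
        return weights.unsqueeze(-1) * c.unsqueeze(0)       # g_ij = alpha_ij * (h_j W^C)

attn = SelfAttention()
g_ij = attn(torch.randn(12, 256))
print(g_ij.shape)  # torch.Size([12, 12, 256]): one semantic vector per word pair
```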
Step 6, power supply service client appeal text capsule aggregation
The EM-Routing dynamic routing algorithm is adopted to aggregate the word capsules into power supply client appeal text capsules and obtain the semantic relations between words.
6.1 Calculating the probability density $P_{i|j}^{h}$: when the high-level capsule is $j$, the probability density of the bottom-level capsule $i$ in the $h$-th dimension is
$$P_{i|j}^{h} = \frac{1}{\sqrt{2\pi\left(\sigma_{j}^{h}\right)^{2}}}\exp\left(-\frac{\left(g_{ij}^{h}-\mu_{j}^{h}\right)^{2}}{2\left(\sigma_{j}^{h}\right)^{2}}\right)$$
where $g_{ij}^{h}$ denotes the value of $g_{ij}$ in dimension $h$, $\mu_{j}^{h}$ denotes the mean $\mu$ of high-level capsule $j$ in dimension $h$, and $\sigma_{j}^{h}$ denotes the variance $\sigma$ of high-level capsule $j$ in dimension $h$.
6.2 Logarithm of the probability density:
$$\ln P_{i|j}^{h} = -\frac{\left(g_{ij}^{h}-\mu_{j}^{h}\right)^{2}}{2\left(\sigma_{j}^{h}\right)^{2}} - \ln\sigma_{j}^{h} - \frac{1}{2}\ln 2\pi.$$
6.3 The vector values of each bottom-level capsule $i$ in dimension $h$ are aggregated into the vector value (cost) of high-level capsule $j$:
$$cost_{j}^{h} = \left(\beta_{u} + \ln\sigma_{j}^{h}\right)\sum_{i} r_{ij}$$
where $\sum_{i} r_{ij}$ denotes the total amount of data assigned to high-level capsule $j$.
6.4 The activation value of high-level capsule $j$ is calculated as
$$a_{j} = \mathrm{logistic}\left(\lambda\left(\beta_{a} - \sum_{h} cost_{j}^{h}\right)\right)$$
where $a_{j}$ also denotes the existence probability of high-level capsule $j$, and $\lambda$, $\beta_{a}$, $\beta_{u}$ are all hyperparameters.
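The quantities of steps 6.1 to 6.4 correspond to the M-step of EM-Routing. The sketch below computes them for one iteration from given routing assignments $r_{ij}$ and votes $g_{ij}$; the uniform initialization, the tensor sizes, and the hyper-parameter values are assumptions, and the E-step that updates $r_{ij}$ is omitted.

```python
import torch

def em_m_step(g, r, beta_u=1.0, beta_a=1.0, lam=1.0, eps=1e-8):
    """One M-step of EM-Routing (sketch).

    g: [I, J, H] votes g_ij from bottom capsules i to high-level capsules j
    r: [I, J]    routing assignments r_ij (non-negative)
    Returns the high-level capsule means, standard deviations and activations a_j.
    """
    r_sum = r.sum(dim=0) + eps                            # sum_i r_ij      -> [J]
    r_exp = r.unsqueeze(-1)                               # [I, J, 1]
    mu = (r_exp * g).sum(dim=0) / r_sum.unsqueeze(-1)     # mu_j^h          -> [J, H]
    var = (r_exp * (g - mu) ** 2).sum(dim=0) / r_sum.unsqueeze(-1) + eps
    sigma = var.sqrt()                                    # sigma_j^h
    cost = (beta_u + sigma.log()) * r_sum.unsqueeze(-1)   # cost_j^h        -> [J, H]
    a = torch.sigmoid(lam * (beta_a - cost.sum(dim=-1)))  # activation a_j  -> [J]
    return mu, sigma, a

votes = torch.randn(12, 8, 16)      # 12 word capsules, 8 text capsules, 16 dimensions
r = torch.full((12, 8), 1.0 / 8)    # uniform initial assignments
mu, sigma, a = em_m_step(votes, r)
print(a.shape)  # torch.Size([8])
```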
Step 7, calculating the loss value
An interval loss function with margin $n$ between the positive class and the negative class of each line of power supply client appeal text computes the loss value
$$L_{v} = \left(\max\left(0,\; n - \left(a_{t} - a_{v}\right)\right)\right)^{2}$$
where $a_{t}$ denotes the positive class and $a_{v}$ denotes a negative class; the loss values are then added to obtain the total loss value
$$L = \sum_{v \neq t} L_{v}.$$
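A sketch of this interval (margin) loss over the class-capsule activations $a_v$; the margin value 0.2 is an assumed example, since the patent leaves the margin $n$ as a parameter.

```python
import torch

def margin_loss(a, target, n=0.2):
    """Interval (margin) loss over capsule activations (sketch).

    a: [J] activation a_v of each class capsule; target: index t of the true class.
    The margin n = 0.2 is an assumption; the patent treats n as a parameter.
    """
    a_t = a[target]
    losses = torch.clamp(n - (a_t - a), min=0.0) ** 2     # L_v = max(0, n - (a_t - a_v))^2
    losses[target] = 0.0                                   # exclude v = t from the sum
    return losses.sum()                                    # L = sum over v != t of L_v

loss = margin_loss(torch.tensor([0.7, 0.2, 0.4]), target=0)
print(loss)  # tensor(0.)
```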

Claims (6)

1. A power supply service client appeal text classification method based on a capsule network, characterized by comprising the following steps:
1) power supply service client appeal text preprocessing: first acquiring the appeal text, adding a text category label, deleting stop words, and performing word segmentation;
2) power supply service client appeal text word vector generation: inputting each line of the preprocessed appeal text into a BERT pre-trained model to resolve polysemy;
3) character-vector-based encoding and dependency acquisition: adopting a finer-grained, character-based encoding of the appeal text and using an N-gram convolution operation to obtain the interdependence between characters in the appeal text;
4) appeal text feature fusion based on character and word vectors: fusing the character-based and word-based appeal text feature vectors to obtain a fusion vector;
5) obtaining the interdependence between word vectors: using a self-attention mechanism to obtain the interdependence between the words of each appeal text;
6) power supply service client appeal text capsule aggregation: using the EM-Routing dynamic routing algorithm to aggregate word capsules into appeal text capsules and obtain the semantic relations between words;
7) loss value calculation: using an interval loss function to calculate the positive-class and negative-class loss values of the appeal text;
the processing of the BERT pre-trained model in step 2) is as follows: each line of preprocessed power supply service client appeal text $T = (w_1, w_2, \ldots, w_{L_1})$ is input into the BERT pre-trained model; semantics-based dynamic word vectors $X = (x_1, x_2, \ldots, x_{L_1})$ are first obtained, where $x_i \in \mathbb{R}^{D_1}$ denotes the $D_1$-dimensional vector of the $i$-th word, $w_d$ denotes the $d$-th word in the line of power supply client appeal text, and $L_1$ denotes the number of words in the line of power supply service client appeal text;
the character-vector-based encoding in step 3) is as follows: the character set in each line of power supply client appeal text is $C = (c_1, c_2, \ldots, c_{L_2})$, where $c_s$ denotes the $s$-th character in the line of power supply client appeal text; the characters are mapped into a high-dimensional space to obtain character vectors $E = (e_1, e_2, \ldots, e_{L_2})$, where $e_o$ denotes the $D_2$-dimensional character vector corresponding to the $o$-th character and $L_2$ denotes the number of characters in the line of power supply client appeal text;
the dependency relationships between the characters in step 3) are acquired as follows:
3.1 phrase feature generation: in a sentence, because the characters are relatively independent, an N-gram convolution operation is first adopted to obtain the interdependence between characters in the power supply client appeal text, i.e., the local features of the client appeal text, and to generate phrase features; suppose $X_{i:j}$ represents the character vectors $x_i, x_{i+1}, \ldots, x_j$; the convolution process is
$$z_i = f\left(W^{E}_{i} \cdot X_{i:i+K-1} + b_i\right)$$
where $W^{E}_{i}$ represents a weight matrix, $b_i$ represents the bias term, $K$ represents the size of the convolution kernel, $z_i$ represents the generated local phrase feature, $D_3$ represents the number of generated phrases, and $f$ represents the ReLU nonlinear activation function; adopting the convolution operation at different positions of the sentence generates the phrase features $Z = (z_1, z_2, \ldots, z_{L_2-K+1})$, where $z_m$ represents the $m$-th feature phrase;
3.2 semantic feature extraction from phrase features: important semantic features are obtained through the max pooling operation
$$p_m = \max(z_m)$$
where $p_m$ represents the feature value obtained by the max pooling operation on the features extracted by the $m$-th convolution kernel;
3.3 feature value concatenation: the obtained feature values are concatenated to obtain the appeal text feature vector generated from the character vectors, $P = [p_1, p_2, \ldots, p_{D_3}]$, where $P$ represents the character-based power supply client appeal text feature vector.
2. The power supply service client appeal text classification method based on a capsule network according to claim 1, characterized in that the appeal text preprocessing in step 1) comprises: first dividing all client appeal content by text form into report, suggestion, commendation, complaint, inquiry, repair, past business, and feedback categories, with one line of text representing one client appeal text and being given a text category label; deleting all symbols and stop words other than letters, digits, and Chinese characters; and segmenting each line of appeal text with the jieba word segmentation tool.
3. The power supply service client appeal text classification method based on a capsule network according to claim 1, characterized in that the appeal text feature fusion based on character and word vectors in step 4) comprises: fusing the character-based and word-based power supply client appeal text feature vectors to obtain the fusion vector $g_i$, which denotes the logical vector representation corresponding to the $i$-th word, where $x_i = \mathrm{MLP}(x_i)$ and $g_i$ is obtained by fusing $x_i$ with the character-based feature vector $P$.
4. The power supply service client appeal text classification method based on a capsule network according to claim 1, characterized in that the interdependence between word vectors in step 5) is obtained as follows:
5.1 obtaining the semantic vector $g_{ij}$ between words: first, the semantic information vector between the $i$-th word and the $j$-th word is calculated as $\alpha_{ij} = \left(g_i W^{Q}\right)\left(g_j W^{K}\right)^{\top}$, where $W^{Q}$ and $W^{K}$ represent information transformation matrices, $g_i W^{Q}$ represents the Q-value vector of the fusion vector $g_i$, and $g_j W^{K}$ represents the K-value vector of the fusion vector $g_j$;
5.2 normalization calculation: the collaborative weight of the semantic information vector is computed by the normalization operation $\hat{\alpha}_{ij} = \exp(\alpha_{ij}) / \sum_{k_1=1}^{L_1}\exp(\alpha_{i k_1})$, where $k_1$ is a variable with $1 \le k_1 \le L_1$; with $h_j W^{C}$ representing the C-value vector of the $j$-th word vector $h_j$, the semantic information vector is finally obtained as $g_{ij} = \hat{\alpha}_{ij}\left(h_j W^{C}\right)$.
5. The power supply service client appeal text classification method based on a capsule network according to claim 1, characterized in that the power supply service client appeal text capsule aggregation in step 6) comprises:
6.1 calculating the probability density $P_{i|j}^{h}$: when the high-level capsule is $j$, the probability density of the bottom-level capsule $i$ in dimension $h$ is
$$P_{i|j}^{h} = \frac{1}{\sqrt{2\pi\left(\sigma_{j}^{h}\right)^{2}}}\exp\left(-\frac{\left(g_{ij}^{h}-\mu_{j}^{h}\right)^{2}}{2\left(\sigma_{j}^{h}\right)^{2}}\right)$$
where $g_{ij}^{h}$ denotes the value of $g_{ij}$ in the $h$-th dimension, $\mu_{j}^{h}$ represents the mean $\mu$ of the high-level capsule $j$ in the $h$-th dimension, and $\sigma_{j}^{h}$ represents the variance $\sigma$ of the high-level capsule $j$ in the $h$-th dimension;
6.2 the logarithm of the probability density is
$$\ln P_{i|j}^{h} = -\frac{\left(g_{ij}^{h}-\mu_{j}^{h}\right)^{2}}{2\left(\sigma_{j}^{h}\right)^{2}} - \ln\sigma_{j}^{h} - \frac{1}{2}\ln 2\pi;$$
6.3 the vector values of each bottom-level capsule $i$ in dimension $h$ are aggregated into the vector value of the top-level capsule $j$:
$$cost_{j}^{h} = \left(\beta_{u} + \ln\sigma_{j}^{h}\right)\sum_{i} r_{ij}$$
where $\sum_{i} r_{ij}$ represents the total amount of data assigned to the high-level capsule $j$;
6.4 calculating the activation value $a_j$ of the high-level capsule $j$:
$$a_{j} = \mathrm{logistic}\left(\lambda\left(\beta_{a} - \sum_{h} cost_{j}^{h}\right)\right)$$
where $a_j$ also indicates the existence probability of the high-level capsule $j$, and $\lambda$, $\beta_a$, $\beta_u$ are all hyperparameters.
6. The power supply service client appeal text classification method based on a capsule network according to claim 1, characterized in that the loss value in step 7) is calculated by using an interval loss function to compute, with margin $n$ between the positive class and the negative class of each line of power supply client appeal text, the loss value $L_v = \left(\max\left(0,\; n - \left(a_t - a_v\right)\right)\right)^2$, where $a_t$ denotes the positive class and $a_v$ denotes a negative class; the loss values are then added to obtain the total loss $L = \sum_{v \neq t} L_v$.
CN202011332961.4A (priority date 2020-11-25, filing date 2020-11-25): Power supply service client appeal text classification method based on capsule network; Active; CN112131391B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011332961.4A CN112131391B (en) 2020-11-25 2020-11-25 Power supply service client appeal text classification method based on capsule network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011332961.4A CN112131391B (en) 2020-11-25 2020-11-25 Power supply service client appeal text classification method based on capsule network

Publications (2)

Publication Number Publication Date
CN112131391A (en) 2020-12-25
CN112131391B (en) 2021-09-17

Family

ID=73852093

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011332961.4A Active CN112131391B (en) 2020-11-25 2020-11-25 Power supply service client appeal text classification method based on capsule network

Country Status (1)

Country Link
CN (1) CN112131391B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112883167A (en) * 2021-03-18 2021-06-01 江西师范大学 Text emotion classification model based on hierarchical self-power-generation capsule network
CN113158679B (en) * 2021-05-20 2023-07-04 广东工业大学 Marine industry entity identification method and device based on multi-feature superposition capsule network

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108595590A (en) * 2018-04-19 2018-09-28 中国科学院电子学研究所苏州研究院 A kind of Chinese Text Categorization based on fusion attention model
CN111078833A (en) * 2019-12-03 2020-04-28 哈尔滨工程大学 Text classification method based on neural network
CN111259157A (en) * 2020-02-20 2020-06-09 广东工业大学 Chinese text classification method based on hybrid bidirectional circulation capsule network model
CN111475622A (en) * 2020-04-08 2020-07-31 广东工业大学 Text classification method, device, terminal and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108319666B (en) * 2018-01-19 2021-09-28 国网浙江省电力有限公司营销服务中心 Power supply service assessment method based on multi-modal public opinion analysis

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108595590A (en) * 2018-04-19 2018-09-28 中国科学院电子学研究所苏州研究院 A kind of Chinese Text Categorization based on fusion attention model
CN111078833A (en) * 2019-12-03 2020-04-28 哈尔滨工程大学 Text classification method based on neural network
CN111259157A (en) * 2020-02-20 2020-06-09 广东工业大学 Chinese text classification method based on hybrid bidirectional circulation capsule network model
CN111475622A (en) * 2020-04-08 2020-07-31 广东工业大学 Text classification method, device, terminal and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Text classification model based on multi-head attention capsule network; 贾旭东 et al.; Journal of Tsinghua University (Science and Technology); 2020-05-31; Vol. 60, No. 5; pp. 415-420 *

Also Published As

Publication number Publication date
CN112131391A (en) 2020-12-25

Similar Documents

Publication Publication Date Title
CN109165294B (en) Short text classification method based on Bayesian classification
CN111783474B (en) Comment text viewpoint information processing method and device and storage medium
CN110990525A (en) Natural language processing-based public opinion information extraction and knowledge base generation method
CN110321563B (en) Text emotion analysis method based on hybrid supervision model
CN107729309A (en) A kind of method and device of the Chinese semantic analysis based on deep learning
CN110175227A (en) A kind of dialogue auxiliary system based on form a team study and level reasoning
CN112131391B (en) Power supply service client appeal text classification method based on capsule network
CN114330354B (en) Event extraction method and device based on vocabulary enhancement and storage medium
CN111259153B (en) Attribute-level emotion analysis method of complete attention mechanism
CN114036955B (en) Detection method for headword event argument of central word
CN113051914A (en) Enterprise hidden label extraction method and device based on multi-feature dynamic portrait
CN115098634A (en) Semantic dependency relationship fusion feature-based public opinion text sentiment analysis method
CN113360582B (en) Relation classification method and system based on BERT model fusion multi-entity information
CN113326374B (en) Short text emotion classification method and system based on feature enhancement
CN113239694B (en) Argument role identification method based on argument phrase
CN114722198A (en) Method, system and related device for determining product classification code
CN114239828A (en) Supply chain affair map construction method based on causal relationship
CN113159831A (en) Comment text sentiment analysis method based on improved capsule network
TW202034207A (en) Dialogue system using intention detection ensemble learning and method thereof
CN114911940A (en) Text emotion recognition method and device, electronic equipment and storage medium
CN114169418A (en) Label recommendation model training method and device, and label obtaining method and device
CN114020901A (en) Financial public opinion analysis method combining topic mining and emotion analysis
CN114595324A (en) Method, device, terminal and non-transitory storage medium for power grid service data domain division
CN113010680A (en) Electric power work order text classification method and device and terminal equipment
CN112100389A (en) Long text classification method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant