CN109902175A - Text classification method and classification system based on a neural network structure model - Google Patents

Text classification method and classification system based on a neural network structure model

Info

Publication number
CN109902175A
CN109902175A (application CN201910125342.9A)
Authority
CN
China
Prior art keywords
neural network
word
network structure
structure model
indicate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910125342.9A
Other languages
Chinese (zh)
Inventor
聂桂芝
杨攀攀
曾青霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SHANGHAI FERLY DIGITAL TECHNOLOGIES Co Ltd
Original Assignee
SHANGHAI FERLY DIGITAL TECHNOLOGIES Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SHANGHAI FERLY DIGITAL TECHNOLOGIES Co Ltd filed Critical SHANGHAI FERLY DIGITAL TECHNOLOGIES Co Ltd
Priority to CN201910125342.9A priority Critical patent/CN109902175A/en
Publication of CN109902175A publication Critical patent/CN109902175A/en
Pending legal-status Critical Current

Links

Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention belongs to the field of text classification in natural language processing and discloses a text classification method based on a neural network structure model. The classification method includes: collecting text data to be classified; processing the text data and representing the sentences in the text data with word vectors; constructing a neural network structure model; based on the model, constructing an encoder on RNN and CNN network architectures with an added attention mechanism; inputting the text data represented by word vectors into the encoder, which outputs a state vector containing contextual information; and classifying the text data with a classifier according to the state vector to obtain classification results. The classification method fully extracts text feature vectors, improves the accuracy of text classification, and improves the accuracy of intent recognition in specific QA application scenarios. The invention also provides a text classification system.

Description

Text classification method and classification system based on a neural network structure model
Technical field
The invention belongs to the field of text classification in natural language processing, and more particularly relates to a text classification method and classification system based on a neural network structure model.
Background art
With the explosive growth of network platforms such as the mobile Internet, social media, and new media, networks are filled with large amounts of text that lacks effective information organization but has research value. Text classification, as one of the key technologies of natural language processing, can effectively solve problems such as information clutter, and is widely used in tasks such as search engines, spam filtering, personalized news, and data sorting. Text classification therefore plays an important role in fields such as natural language processing and the intelligent organization and management of data.
Yu Bengong et al. (Yu Bengong, Zhang Lianbin, "Chinese short text classification research based on CP-CNN", Computer Application Research) proposed CP-CNN, a dual-input convolutional neural network model combining words and characters, which effectively improves the performance of short-text classification. Yang Z et al. (Yang Z, Yang D, Dyer C, et al. "Hierarchical attention networks for document classification", Proceedings of NAACL-HLT, 2016: 1480-1489) introduced an attention mechanism into the network structure for text classification and improved classification accuracy. Xia Congling et al. (Xia Congling, Qian Tao, Ji Donghong, "News text classification based on event convolution features", Computer Application Research, 2017, 34(4): 991-994) proposed a text classification method based on event convolution features. However, because of the particular structure of natural language, context-dependent discontinuous relationships exist in natural language, and the convolutional neural network models studied above suffer from problems such as convolution kernel sizes that are difficult to determine and excessively high text vector dimensions. Moreover, these models are still shallow compared with the outstanding network structures currently applied in the fields of image processing and speech recognition: a convolutional neural network (CNN, as shown in Fig. 1) is composed of multiple stacked network layers, and "shallow" here means that the number of CNN layers used for text classification is small compared with the number of CNN layers used in image processing and speech recognition. In addition, when extracting text features, a convolutional neural network does not comprehensively consider the forward and backward relationships within the text or the relationships across the entire text, cannot capture the contextual meaning of the text, and extracts semantic features incompletely, so the classification results are unsatisfactory.
Summary of the invention
To overcome the above drawbacks of the prior art, the invention proposes a text classification method and classification system based on a neural network structure model, which can be used for the task of intent classification in question answering (QA) or multi-turn dialogue systems. Since intent recognition plays a vital role in QA, and in order to improve the accuracy of matching user questions, the text classification method proposed by the invention builds, on the basis of RNN and CNN network architectures and with an added attention mechanism, the feature vector extraction network used for the final text classification; that is, text features are extracted with a bidirectional GRU combined with a CNN network structure, which can effectively improve the accuracy of text classification.
The invention proposes a text classification method based on a neural network structure model, comprising:
Step 1: collecting text data to be classified;
Step 2: processing the text data and representing the sentences in the text data with word vectors;
Step 3: constructing a neural network structure model; the model has three layers: the first layer is a word attention layer, the second layer is a sentence attention layer, and the third layer is a maxpooling layer;
Step 4: based on the neural network structure model, constructing an encoder on RNN and CNN network architectures with an added attention mechanism;
Step 5: inputting the text data represented by word vectors into the encoder, which outputs a state vector containing contextual information;
Step 6: classifying the text data with a classifier according to the state vector to obtain classification results.
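As a rough illustration of Step 2 (external to the patented embodiment), the sketch below maps a toy sentence to a sequence of word vectors through an embedding matrix. The vocabulary, dimensions, and random embedding values are hypothetical, chosen only for demonstration.

```python
import numpy as np

# Toy vocabulary and randomly initialized embedding matrix W_e.
# The words, sizes, and values are illustrative assumptions.
rng = np.random.default_rng(0)
vocab = {"the": 0, "food": 1, "was": 2, "great": 3}
W_e = rng.standard_normal((len(vocab), 6)) * 0.1   # 4 words x 6-dim vectors

def sentence_to_vectors(sentence):
    """Step 2: represent a sentence as a sequence of word vectors."""
    tokens = sentence.lower().split()
    return np.stack([W_e[vocab[t]] for t in tokens])

X = sentence_to_vectors("The food was great")
print(X.shape)   # (4, 6): one 6-dimensional vector per word
```

In practice the embedding matrix would be learned during training rather than fixed at random; the lookup itself is the same.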
In the text classification method based on the neural network structure model proposed by the invention, when the text data is input into the encoder, in the word attention layer:
the results obtained by feeding the input into a multilayer neural network are used as the hidden representation;
a word-based attention model is constructed and its weight matrix is initialized, and the importance of each word is calculated from the hidden representation;
according to the importance of each word, the importance of the sentence composed of the words is obtained through a weighted average over the multilayer neural network outputs.
In the text classification method based on the neural network structure model proposed by the invention, the multilayer neural network is a bidirectional recurrent neural network.
In the text classification method based on the neural network structure model proposed by the invention, the hidden representation of a word includes the hidden state of the forward input and the hidden state of the backward input, calculated with the following formulas:
x_it = W_e · w_it,  t ∈ [1, T]
→h_it = →GRU(x_it),  t ∈ [1, T]
←h_it = ←GRU(x_it),  t ∈ [T, 1]
where →h_it denotes the output of the forward-input hidden state for the t-th word of the i-th sentence; ←h_it denotes the output of the backward-input hidden state for the t-th word of the i-th sentence; w_it denotes the input at the t-th position of the i-th sentence; W_e denotes the initialized weight matrix; and x_it denotes the processed neural network input at the t-th position of the i-th sentence.
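The bidirectional hidden-state computation above can be sketched as follows. This is an illustrative NumPy implementation, not the patented embodiment: the GRU cell is hand-rolled, and all dimensions and random weights are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step: update gate z, reset gate r, candidate state."""
    z = sigmoid(Wz @ x + Uz @ h)
    r = sigmoid(Wr @ x + Ur @ h)
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))
    return (1 - z) * h + z * h_tilde

def make_params(d, k):
    # Six weight matrices for one GRU direction (illustrative random init).
    return tuple(rng.standard_normal(s) * 0.1 for s in [(k, d), (k, k)] * 3)

def bi_gru(X, pf, pb):
    """Forward and backward GRU over word vectors X (T x d); each position
    gets the concatenation [forward h, backward h], i.e. context from both sides."""
    T = X.shape[0]
    k = pf[1].shape[0]
    Hf, Hb = np.zeros((T, k)), np.zeros((T, k))
    hf = hb = np.zeros(k)
    for t in range(T):                 # forward pass, t = 1 .. T
        hf = gru_step(X[t], hf, *pf); Hf[t] = hf
    for t in reversed(range(T)):       # backward pass, t = T .. 1
        hb = gru_step(X[t], hb, *pb); Hb[t] = hb
    return np.concatenate([Hf, Hb], axis=1)

T, d, k = 5, 6, 4                      # 5 words, 6-dim x_it, hidden size 4
X = rng.standard_normal((T, d))        # stand-in for x_it = W_e · w_it
H = bi_gru(X, make_params(d, k), make_params(d, k))
print(H.shape)                         # (5, 8): T x 2k
```

Each row of H corresponds to one word and carries information from both directions around it, as the description requires.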
In the text classification method based on the neural network structure model proposed by the invention, the importance of the sentence composed of words is calculated with the following formulas:
u_it = tanh(W_w · h_it + b_w)
α_it = exp(u_itᵀ · u_w) / Σ_t exp(u_itᵀ · u_w)
s_i = Σ_t α_it · h_it
where u_w denotes the corresponding word-level context weight vector; s_i denotes the resulting representation of the i-th sentence; h (h_1, h_2, ..., h_L) denotes the hidden-state outputs; α_it denotes the weight of the corresponding attention model; b_w denotes the word-level bias; and W_w denotes the word-level weight matrix.
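The word-attention formulas above amount to a linear transform, a softmax over words, and a weighted sum. A minimal sketch, with hypothetical sizes and random weights standing in for trained parameters:

```python
import numpy as np

rng = np.random.default_rng(2)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

T, k2 = 5, 8                           # 5 words, bi-GRU output size 2k = 8
H = rng.standard_normal((T, k2))       # h_it: hidden states from the bi-GRU
W_w = rng.standard_normal((k2, k2)) * 0.1
b_w = np.zeros(k2)
u_w = rng.standard_normal(k2)          # word-level context vector, random init

U = np.tanh(H @ W_w.T + b_w)           # u_it = tanh(W_w h_it + b_w)
alpha = softmax(U @ u_w)               # attention weight of each word
s_i = alpha @ H                        # sentence vector: weighted sum of h_it

print(alpha.shape, s_i.shape)          # (5,) (8,)
```

The attention weights form a probability distribution over the T words, so the sentence vector s_i is dominated by the words judged most important.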
In the text classification method based on the neural network structure model proposed by the invention, in the sentence attention layer:
the results obtained by feeding the input into the multilayer neural network are used as the hidden representation;
a sentence-based attention model is constructed and its weight matrix is initialized, and the importance of each sentence is calculated from the hidden representation;
according to the importance of the sentences, the state vector of the text data is obtained through a weighted average over the multilayer neural network outputs.
In the text classification method based on the neural network structure model proposed by the invention, the state vector is calculated by the following formulas:
u_i = tanh(W_s · h_i + b_s)
α_i = exp(u_iᵀ · u_s) / Σ_i exp(u_iᵀ · u_s)
v = Σ_i α_i · h_i
where h_i denotes the hidden-state output of the bidirectional network for the i-th sentence; s_i denotes the input representation of the i-th sentence; u_s denotes the corresponding sentence-level context weight vector; α_i denotes the weight of the corresponding attention model; b_s denotes the sentence-level bias; and v denotes the final output state vector, which contains the contextual information.
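The sentence-level computation mirrors the word-level one, with sentence hidden states and a sentence-level context vector. A brief sketch under the same illustrative assumptions (random weights, made-up sizes):

```python
import numpy as np

rng = np.random.default_rng(3)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

L, k2 = 3, 8                           # document with 3 sentences
H = rng.standard_normal((L, k2))       # h_i: sentence-level bi-GRU states
W_s = rng.standard_normal((k2, k2)) * 0.1
b_s = np.zeros(k2)
u_s = rng.standard_normal(k2)          # sentence-level context vector

U = np.tanh(H @ W_s.T + b_s)           # u_i = tanh(W_s h_i + b_s)
alpha = softmax(U @ u_s)               # importance of each sentence
v = alpha @ H                          # final state vector with context info

print(v.shape)                         # (8,)
```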
In the text classification method based on the neural network structure model proposed by the invention, in the maxpooling layer, part of the features of the state vector are removed and the model parameters are reduced.
In the text classification method based on the neural network structure model proposed by the invention, the classifier classifies the entire text with a softmax classifier; the classification is p = softmax(W_c · v + b_c), with the loss function: L = -Σ_d log(p_dj), where j is the correct category of document d.
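The classification step above can be sketched as a softmax over class scores plus a negative log-likelihood loss for one document. The class count, state dimension, random weights, and chosen label are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(4)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

C, k2 = 5, 8                           # e.g. 5 rating classes, state dim 8
v = rng.standard_normal(k2)            # state vector from the encoder
W_c = rng.standard_normal((C, k2)) * 0.1
b_c = np.zeros(C)

p = softmax(W_c @ v + b_c)             # p = softmax(W_c v + b_c)
j = 2                                  # hypothetical correct category
loss = -np.log(p[j])                   # negative log-likelihood for one doc

print(round(float(p.sum()), 6))        # 1.0
```

Because softmax assigns strictly positive probability to every class, the per-document loss is always positive and shrinks as p[j] approaches 1.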
The invention also provides a text classification system based on a neural network structure model, comprising:
a corpus collection module for obtaining text data;
a construction module for constructing the neural network structure model;
an encoder, including a word encoder unit and a sentence encoder unit; and
a classifier for text classification.
Compared with the prior art, the invention has the following beneficial technical effects:
The text classification system and method proposed by the invention extract text features with a bidirectional GRU combined with a CNN network structure. By using a bidirectional GRU and CNN network with an added attention mechanism as the encoder, text feature vectors are fully extracted, which improves the accuracy of the final text classification and improves the accuracy of intent recognition in specific QA application scenarios.
Brief description of the drawings
Fig. 1 shows the classical prior-art model for text classification, a convolutional neural network.
Fig. 2 is a schematic diagram of the neural network structure model of Embodiment 1 of the invention.
Fig. 3 is a flowchart of a text classification method based on a neural network structure model in an embodiment.
Fig. 4 is a structural schematic diagram of a text classification system based on a neural network structure model in an embodiment.
Detailed description of the embodiments
The invention is described in further detail below in conjunction with specific embodiments and the accompanying drawings. Except where specifically mentioned below, the processes, conditions, experimental methods, and so on for implementing the invention follow the general principles and common general knowledge in the art, and the invention places no special restrictions on them.
The term "and/or" in the invention merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" can mean: A alone, both A and B, or B alone. In addition, the character "/" in the invention generally indicates an "or" relationship between the preceding and following objects.
Specifically, as shown in Fig. 3, the text classification method based on a neural network structure model described in this embodiment includes the following steps:
Step 1: collecting text data to be classified; the corpora corresponding to the text data are all public datasets, namely Yelp reviews 2013, 2014, and 2015.
Step 2: processing the text data and representing the sentences in the text data with word vectors;
Step 3: constructing the neural network structure model;
Step 4: based on the neural network structure model, constructing an encoder on RNN and CNN network architectures with an added attention mechanism;
Step 5: inputting the text data represented by word vectors into the encoder, which outputs a state vector containing contextual information;
Step 6: classifying the text data with a classifier according to the state vector to obtain classification results.
The neural network structure model has three layers: the first layer is the word attention layer, the second layer is the sentence attention layer, and the third layer is the maxpooling layer.
The first layer, the word attention layer, is used to obtain the important information at the word level within a sentence. The purpose of the attention mechanism is to find the words in a sentence that matter most to its meaning and contribute most to it. The results obtained by feeding the input into a single-layer perceptron (MLP) are used as the hidden representation. To measure the importance of a word, its similarity to a randomly initialized context vector is computed, and a normalized attention weight matrix, representing the weight of each word in the sentence, is then obtained through a softmax operation. The sentence vector can be regarded as the weighted sum of the word vectors composing the sentence.
" attention " mechanism of word level:
It is document classification task that invention, which is directed to task, that is, thinks that each document to be classified can be divided into multiple sentences Son.Therefore the first part of level " attention " model handles each subordinate sentence.RNN input two-way for first is every Each word w of wordit, calculation formula is as follows:
xit=Wewit,t∈[1,T]
Be not each word it is useful to classification task but for the word in a word, for example is doing text When this mood classification, it may will compare concern " fine ", " sentiment " these words.In order to make Recognition with Recurrent Neural Network also can be certainly It is dynamic that " attention " is placed on these vocabulary, the attention model based on word is devised, calculation formula is as follows:
uit=tanh (Wwhit+bw)
Firstly, converting by a linear layer to the output of two-way GRU network, then pass through softmax formula meter The importance for calculating each word is weighted and averaged to obtain the expression of each sentence finally by the output to two-way RNN.
The second layer, the sentence attention layer, is used to obtain the important information at the sentence level in the document. Similar to the word-level attention mechanism, a sentence-level context vector is introduced to measure the importance of a sentence within the whole text. Once the vector representation of the whole text is obtained, a fully connected softmax layer can finally be used for classification.
" attention " mechanism of sentence level
" attention " model of sentence level is similar with " attention " of word level.Its calculation formula is as follows:
ui=tanh (Wshi+bs)
The third layer, the maxpooling layer, is used to remove part of the features and reduce the model parameters, preventing model overfitting from harming prediction accuracy.
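For illustration only (not the patented implementation), 1-D max pooling can be sketched as keeping the largest value in each window of the feature vector; the input values below are made up:

```python
import numpy as np

def max_pool_1d(x, kernel_size, stride=None):
    """Non-overlapping 1-D max pooling: keep only the largest value in each
    window, discarding the remaining features and shrinking the vector."""
    stride = stride or kernel_size
    n = (len(x) - kernel_size) // stride + 1
    return np.array([x[i * stride : i * stride + kernel_size].max()
                     for i in range(n)])

x = np.array([0.1, 0.9, -0.3, 0.4, 0.7, 0.2])
print(max_pool_1d(x, 2))               # [0.9 0.4 0.7]
```

Note that with kernel_size = 1, the value used later in this embodiment, each window holds a single feature, so the vector passes through with its length unchanged.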
In this embodiment, the most common softmax classifier is used to classify the entire text: p = softmax(W_c · v + b_c); the loss function is: L = -Σ_d log(p_dj), where j is the correct category of document d.
In the above formulas, α (α_1, α_2, ..., α_L) denotes the weights of the corresponding attention model; h (h_1, h_2, ..., h_L) denotes the hidden-state outputs; w_it denotes the input at the t-th position of the i-th sentence; u denotes the corresponding context weight vector; →h_it denotes the forward-input hidden-state output for the t-th word of the i-th sentence; ←h_it denotes the backward-input hidden-state output for the t-th word of the i-th sentence; s_i denotes the representation of the i-th sentence; and v denotes the final output state vector, which contains the contextual information.
As shown in Fig. 4, the invention also provides a text classification system based on a neural network structure model, comprising: a corpus collection module for obtaining text data, a construction module for constructing the neural network structure model, an encoder (including a word encoder unit and a sentence encoder unit), and a classifier for text classification.
The neural network structure model construction module builds the three-layer model described above: the first layer is the word attention layer, the second layer is the sentence attention layer, and the third layer is the maxpooling layer; each layer is computed with the same formulas as in the method described above.
The text is converted into word-vector form through the network structure model;
(3) Encoder
The word-vector form of the text is input into the encoder;
The encoder is constructed on RNN and CNN network architectures with an added attention mechanism, and serves as the feature vector extraction network for text classification.
Word encoder unit:
A sentence is composed of a word sequence. The words composing the sentence are first converted into word vectors; then, with a bidirectional GRU network, the forward and backward contextual information can be combined to obtain the hidden layer output. For a given word, a new representation is obtained after it passes through the GRU network, containing the information from both surrounding directions.
Sentence encoder unit:
After the sentence vector representations are obtained, the document vector is obtained with a similar method: for a given sentence, the corresponding sentence representation is obtained. The representation obtained in this way can contain the contextual information of both directions.
(4) Classifier
Used to classify the text.
In this embodiment, the classifier is a softmax classifier.
In this embodiment, the kernel_size of the maxpooling layer is 1.
The Yelp reviews data for 2013, 2014, and 2015 are tested separately. In each dataset, 80% of the data is used as the training set, 10% as the validation set, and the remaining 10% as the test set.
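The 80/10/10 split can be sketched as follows; the stand-in documents and fixed seed are illustrative assumptions, not the patent's actual preprocessing:

```python
import random

def split_80_10_10(examples, seed=0):
    """Shuffle and split a dataset into 80% train / 10% validation / 10% test,
    mirroring the evaluation protocol described for the Yelp reviews sets."""
    data = list(examples)
    random.Random(seed).shuffle(data)
    n = len(data)
    n_train, n_dev = int(n * 0.8), int(n * 0.1)
    return (data[:n_train],
            data[n_train:n_train + n_dev],
            data[n_train + n_dev:])

docs = [f"review_{i}" for i in range(100)]      # stand-in documents
train, dev, test = split_80_10_10(docs)
print(len(train), len(dev), len(test))          # 80 10 10
```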
Table 1. Description of the datasets
The comparison of experimental results is shown in Table 2:
Table 2. Comparison of experimental results
From the experimental results in Table 2, the BiGRU-CNN model achieves the best results on all three Yelp reviews datasets (2013, 2014, and 2015), and this improvement is not limited by dataset size. On a relatively small dataset such as Yelp 2013, the BiGRU-CNN model of the invention outperforms the best-performing baseline model by 3.1%. Likewise, on the large datasets Yelp 2014 and Yelp 2015, the model of the invention is better than the previous best models by 3.0% and 1.1%, respectively.
Purely in terms of the structured representation of the text, HN-ATT clearly improves on the performance of models such as CNN-word, Conv-GRNN, and LSTM-GRNN. The model of the invention, which combines BiGRU-CNN with the attention mechanism, even surpasses the hierarchical model HN-ATT.
The protected content of the invention is not limited to the above embodiments. Without departing from the spirit and scope of the invention, variations and advantages conceivable to those skilled in the art are all included in the invention, and the scope of protection is defined by the appended claims.

Claims (10)

1. A text classification method based on a neural network structure model, characterized by comprising:
Step 1: collecting text data to be classified;
Step 2: processing the text data and representing the sentences in the text data with word vectors;
Step 3: constructing a neural network structure model; the model has three layers: the first layer is a word attention layer, the second layer is a sentence attention layer, and the third layer is a maxpooling layer;
Step 4: based on the neural network structure model, constructing an encoder on RNN and CNN network architectures with an added attention mechanism;
Step 5: inputting the text data represented by word vectors into the encoder, which outputs a state vector containing contextual information;
Step 6: classifying the text data with a classifier according to the state vector to obtain classification results.
2. The text classification method based on a neural network structure model according to claim 1, characterized in that, when the text data is input into the encoder, in the word attention layer:
the results obtained by feeding the input into a multilayer neural network are used as the hidden representation;
a word-based attention model is constructed and its weight matrix is initialized, and the importance of each word is calculated from the hidden representation;
according to the importance of each word, the importance of the sentence composed of the words is obtained through a weighted average over the multilayer neural network outputs.
3. The text classification method based on a neural network structure model according to claim 2, characterized in that the multilayer neural network is a bidirectional GRU network.
4. The text classification method based on a neural network structure model according to claim 3, characterized in that the hidden representation of a word includes the hidden state of the forward input and the hidden state of the backward input, calculated with the following formulas:
x_it = W_e · w_it,  t ∈ [1, T]
→h_it = →GRU(x_it),  t ∈ [1, T]
←h_it = ←GRU(x_it),  t ∈ [T, 1]
where →h_it denotes the output of the forward-input hidden state for the t-th word of the i-th sentence; ←h_it denotes the output of the backward-input hidden state for the t-th word of the i-th sentence; w_it denotes the input at the t-th position of the i-th sentence; W_e denotes the initialized weight matrix; and x_it denotes the processed neural network input at the t-th position of the i-th sentence.
5. The text classification method based on a neural network structure model according to claim 3, characterized in that the importance of the sentence composed of words is calculated with the following formulas:
u_it = tanh(W_w · h_it + b_w)
α_it = exp(u_itᵀ · u_w) / Σ_t exp(u_itᵀ · u_w)
s_i = Σ_t α_it · h_it
where u_w denotes the corresponding word-level context weight vector; s_i denotes the resulting representation of the i-th sentence; h (h_1, h_2, ..., h_L) denotes the hidden-state outputs; α_it denotes the weight of the corresponding attention model; b_w denotes the word-level bias; and W_w denotes the word-level weight matrix.
6. The text classification method based on a neural network structure model according to any one of claims 2-5, characterized in that, in the sentence attention layer:
the results obtained by feeding the input into the multilayer neural network are used as the hidden representation;
a sentence-based attention model is constructed and its weight matrix is initialized, and the importance of each sentence is calculated from the hidden representation;
according to the importance of the sentences, the state vector of the text data is obtained through a weighted average over the multilayer neural network outputs.
7. The text classification method based on a neural network structure model according to claim 6, characterized in that the state vector is calculated by the following formulas:
u_i = tanh(W_s · h_i + b_s)
α_i = exp(u_iᵀ · u_s) / Σ_i exp(u_iᵀ · u_s)
v = Σ_i α_i · h_i
where h_i denotes the hidden-state output of the bidirectional network for the i-th sentence; s_i denotes the input representation of the i-th sentence; u_s denotes the corresponding sentence-level context weight vector; α_i denotes the weight of the corresponding attention model; b_s denotes the sentence-level bias; and v denotes the final output state vector, which contains the contextual information.
8. The text classification method based on a neural network structure model according to claim 2, characterized in that, in the maxpooling layer, part of the features of the state vector are removed and the model parameters are reduced.
9. The text classification method based on a neural network structure model according to claim 1, characterized in that the classifier classifies the entire text with a softmax classifier; the classification is p = softmax(W_c · v + b_c), with the loss function: L = -Σ_d log(p_dj), where j is the correct category of document d.
10. A text classification system based on a neural network structure model, characterized by comprising:
a corpus collection module for obtaining text data;
a construction module for constructing the neural network structure model;
an encoder, including a word encoder unit and a sentence encoder unit; and
a classifier for text classification.
CN201910125342.9A 2019-02-20 2019-02-20 Text classification method and classification system based on a neural network structure model Pending CN109902175A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910125342.9A CN109902175A (en) 2019-02-20 2019-02-20 A kind of file classification method and categorizing system based on neural network structure model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910125342.9A CN109902175A (en) 2019-02-20 2019-02-20 A kind of file classification method and categorizing system based on neural network structure model

Publications (1)

Publication Number Publication Date
CN109902175A true CN109902175A (en) 2019-06-18

Family

ID=66945156

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910125342.9A Pending CN109902175A (en) 2019-02-20 2019-02-20 A kind of file classification method and categorizing system based on neural network structure model

Country Status (1)

Country Link
CN (1) CN109902175A (en)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110322962A (en) * 2019-07-03 2019-10-11 重庆邮电大学 Method, system and computer device for automatically generating diagnostic results
CN110377741A (en) * 2019-07-22 2019-10-25 成都深度智慧人工智能科技有限公司 Text classification method, intelligent terminal and computer-readable storage medium
CN110472236A (en) * 2019-07-23 2019-11-19 浙江大学城市学院 Bidirectional GRU text readability assessment method based on attention mechanism
CN110472052A (en) * 2019-07-31 2019-11-19 西安理工大学 Chinese social platform sentiment analysis method based on deep learning
CN111639152A (en) * 2019-08-29 2020-09-08 上海卓繁信息技术股份有限公司 Intention recognition method
CN111639152B (en) * 2019-08-29 2021-04-13 上海卓繁信息技术股份有限公司 Intention recognition method
CN110543567A (en) * 2019-09-06 2019-12-06 上海海事大学 Chinese text emotion classification method based on A-GCNN network and ACELM algorithm
CN110717038A (en) * 2019-09-17 2020-01-21 腾讯科技(深圳)有限公司 Object classification method and device
CN110704715A (en) * 2019-10-18 2020-01-17 南京航空航天大学 Cyberbullying detection method and system
CN110704715B (en) * 2019-10-18 2022-05-17 南京航空航天大学 Cyberbullying detection method and system
CN110825845A (en) * 2019-10-23 2020-02-21 中南大学 Hierarchical text classification method based on character and self-attention mechanism and Chinese text classification method
CN110825845B (en) * 2019-10-23 2022-09-23 中南大学 Hierarchical text classification method based on character and self-attention mechanism and Chinese text classification method
CN110879838A (en) * 2019-10-29 2020-03-13 中科能效(北京)科技有限公司 Open domain question-answering system
CN110879838B (en) * 2019-10-29 2023-07-14 中科能效(北京)科技有限公司 Open domain question-answering system
CN110929033A (en) * 2019-11-26 2020-03-27 深圳市信联征信有限公司 Long text classification method and device, computer equipment and storage medium
CN111159396B (en) * 2019-12-04 2022-04-22 中国电子科技集团公司第三十研究所 Method for establishing a hierarchical text data classification model oriented to data sharing and exchange
CN111159396A (en) * 2019-12-04 2020-05-15 中国电子科技集团公司第三十研究所 Method for establishing a hierarchical text data classification model oriented to data sharing and exchange
CN111597339A (en) * 2020-05-22 2020-08-28 北京慧闻科技(集团)有限公司 Document-level multi-round conversation intention classification method, device, equipment and storage medium
CN111883115A (en) * 2020-06-17 2020-11-03 马上消费金融股份有限公司 Voice flow quality inspection method and device
CN113961698A (en) * 2020-07-15 2022-01-21 上海乐言信息科技有限公司 Intention classification method, system, terminal and medium based on neural network model
CN112183086A (en) * 2020-09-23 2021-01-05 北京先声智能科技有限公司 English pronunciation continuous reading mark model based on sense group labeling
WO2021169364A1 (en) * 2020-09-23 2021-09-02 平安科技(深圳)有限公司 Semantic emotion analysis method and apparatus, device, and storage medium
CN112287105A (en) * 2020-09-30 2021-01-29 昆明理工大学 Method for analyzing correlation of law-related news by fusing bidirectional mutual attention between title and body text
CN112287105B (en) * 2020-09-30 2023-09-12 昆明理工大学 Method for analyzing correlation of law-related news by fusing bidirectional mutual attention between title and body text
CN112230990A (en) * 2020-11-10 2021-01-15 北京邮电大学 Program code duplication checking method based on hierarchical attention neural network
CN112420028A (en) * 2020-12-03 2021-02-26 上海欣方智能系统有限公司 System and method for performing semantic recognition on voice signal
CN112420028B (en) * 2020-12-03 2024-03-19 上海欣方智能系统有限公司 System and method for carrying out semantic recognition on voice signals
CN112562809A (en) * 2020-12-15 2021-03-26 贵州小宝健康科技有限公司 Method and system for auxiliary diagnosis based on electronic medical record text
CN112559750B (en) * 2020-12-21 2024-05-28 珠海格力电器股份有限公司 Text data classification method, device, nonvolatile storage medium and processor
CN112559750A (en) * 2020-12-21 2021-03-26 珠海格力电器股份有限公司 Text data classification method and device, nonvolatile storage medium and processor
CN112925908A (en) * 2021-02-19 2021-06-08 东北林业大学 Text classification method and system based on graph attention network
CN112905796A (en) * 2021-03-16 2021-06-04 山东亿云信息技术有限公司 Text emotion classification method and system based on re-attention mechanism
CN112905796B (en) * 2021-03-16 2023-04-18 山东亿云信息技术有限公司 Text emotion classification method and system based on re-attention mechanism
CN113204971A (en) * 2021-03-26 2021-08-03 南京邮电大学 Scene-adaptive attention multi-intent recognition method based on deep learning
CN113204971B (en) * 2021-03-26 2024-01-26 南京邮电大学 Scene-adaptive attention multi-intent recognition method based on deep learning
CN113076127B (en) * 2021-04-25 2023-08-29 南京大学 Method, system, electronic device and medium for extracting question and answer content in programming environment
WO2022226714A1 (en) * 2021-04-25 2022-11-03 南京大学 Method and system for extracting question and answer content in programming environment, electronic device, and medium
CN113076127A (en) * 2021-04-25 2021-07-06 南京大学 Method, system, electronic device and medium for extracting question and answer content in programming environment
CN113377933B (en) * 2021-04-27 2023-05-30 中国联合网络通信集团有限公司 Intention classification method and device for multi-round dialogue
CN113377933A (en) * 2021-04-27 2021-09-10 中国联合网络通信集团有限公司 Intention classification method and device for multi-turn conversation
KR102501730B1 (en) 2021-05-07 2023-02-21 연세대학교 산학협력단 Method and device for classifying building defect using multi task channel attention
KR20220151777A (en) * 2021-05-07 2022-11-15 연세대학교 산학협력단 Method and device for classifying building defect using multi task channel attention
CN113553052B (en) * 2021-06-09 2022-07-08 麒麟软件有限公司 Method for automatically recognizing security-related code submissions using an Attention-coded representation
CN113553052A (en) * 2021-06-09 2021-10-26 麒麟软件有限公司 Method for automatically recognizing security-related code submissions using an Attention-coded representation
CN114579740A (en) * 2022-01-20 2022-06-03 马上消费金融股份有限公司 Text classification method and device, electronic equipment and storage medium
CN114579740B (en) * 2022-01-20 2023-12-05 马上消费金融股份有限公司 Text classification method, device, electronic equipment and storage medium
CN117729545A (en) * 2024-02-18 2024-03-19 北京中科网芯科技有限公司 5G network communication control method
CN117729545B (en) * 2024-02-18 2024-05-03 北京中科网芯科技有限公司 5G network communication control method

Similar Documents

Publication Publication Date Title
CN109902175A (en) A kind of file classification method and categorizing system based on neural network structure model
Giachanou et al. Multimodal multi-image fake news detection
Mane et al. A survey on supervised convolutional neural network and its major applications
Haddad et al. Arabic offensive language detection with attention-based deep neural networks
CN107608999A (en) Question classification method suitable for automatic question-answering systems
Gao et al. Convolutional neural network based sentiment analysis using Adaboost combination
CN110287323B (en) Target-oriented emotion classification method
CN112257449B (en) Named entity recognition method and device, computer equipment and storage medium
CN103886108B (en) Feature selection and weight computation method for unbalanced text sets
Rahman et al. Personality detection from text using convolutional neural network
CN110705247B (en) Text similarity calculation method based on x2-C
CN113254655B (en) Text classification method, electronic device and computer storage medium
CN113220890A (en) Deep learning method combining news headlines and news long text contents based on pre-training
CN113704396A (en) Short text classification method, device, equipment and storage medium
Al-Tai et al. Deep learning for fake news detection: Literature review
CN111435375A (en) Threat information automatic labeling method based on FastText
DE202023102803U1 (en) System for emotion detection and mood analysis through machine learning
CN113779282B (en) Fine-grained cross-media retrieval method based on self-attention and generation countermeasure network
Pandey et al. Ensem_SLDR: classification of cybercrime using ensemble learning technique
Kang et al. A short texts matching method using shallow features and deep features
Devi et al. Dive in Deep Learning: Computer Vision, Natural Language Processing, and Signal Processing
CN115033689B (en) Prototypical network Euclidean distance calculation method based on few-shot text classification
Vikas et al. User Gender Classification Based on Twitter Profile Using Machine Learning
CN116318845A (en) DGA domain name detection method under unbalanced proportion condition of positive and negative samples
Zhang et al. Job opportunity finding by text classification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination