CN111414454A - Law recommendation processing method based on bert model and law knowledge - Google Patents
- Publication number
- CN111414454A CN111414454A CN202010180118.2A CN202010180118A CN111414454A CN 111414454 A CN111414454 A CN 111414454A CN 202010180118 A CN202010180118 A CN 202010180118A CN 111414454 A CN111414454 A CN 111414454A
- Authority
- CN
- China
- Prior art keywords
- knowledge
- semantic representation
- case description
- keywords
- representation vector
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/33—Querying
- G06F16/3331—Query processing
- G06F16/334—Query execution
- G06F16/3344—Query execution using natural language analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9535—Search customisation based on user profiles and personalisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/18—Legal services; Handling legal documents
Abstract
The invention relates to a legal provision recommendation processing method based on the BERT model and legal provision knowledge, and belongs to the technical field of data processing. The method extracts keywords from legal provision knowledge in the judicial field; performs semantic representation of the case description text and the legal provision knowledge keywords; fuses, based on an attention mechanism, the case description text semantic representation vector with the legal provision knowledge keyword semantic representation vector to obtain a case description feature vector fused with the legal provision knowledge keywords; and applies a linear transformation and softmax to the fused case description feature vector, finally realizing legal provision recommendation. The invention fuses legal provision knowledge with case descriptions to realize knowledge-driven intelligent legal provision recommendation.
Description
Technical Field
The invention relates to a legal provision recommendation processing method based on the BERT model and legal provision knowledge, and belongs to the technical field of data processing.
Background
Legal provision recommendation based on associating case description content with legal provisions selects and recommends relevant legal provisions according to the description of a case, combined with legal provision knowledge in the judicial field.
Most existing legal provision recommendation algorithms are data-driven. Early legal judgment and provision prediction was realized with statistical methods; with the continued development of machine learning algorithms, legal provision recommendation was realized as a text classification task, for example predicting judgment outcomes with an SVM (support vector machine), which can also serve as a preliminary legal provision classification method. In recent years, with the continued development of deep learning, methods based on deep neural networks have advanced considerably in the field of legal judgment prediction.
Legal provision recommendation, however, is not a simple data-driven prediction process; it has strong judicial-domain characteristics. During case review, a judge usually analyzes the case with legal knowledge as the criterion and selects the corresponding legal provisions as the basis for determining the offense and the charge. For example, when reviewing criminal cases such as 'robbery' and 'seizure', or 'fraud' and 'extortion', the judge usually pays attention to the vocabulary shown in Table 1.
TABLE 1 Analysis of the legal provision vocabulary attended to during a judge's case review
Therefore, during a trial the judge usually needs to repeatedly re-read information such as the case description against the core words of the legal provisions. Because suitable legal provisions cannot be recommended to the judge automatically during case review, the burden of case review increases and judicial efficiency decreases. The invention therefore provides a legal provision recommendation processing method based on the BERT model and legal provision knowledge.
Disclosure of Invention
The invention provides a legal provision recommendation processing method based on the BERT model and legal provision knowledge, which fuses legal provision knowledge with case descriptions to realize knowledge-driven intelligent legal provision recommendation.
The technical scheme of the invention is as follows:
extracting keywords from legal provision knowledge in the judicial field;
performing semantic representation of the case description text and the legal provision knowledge keywords;
fusing, based on an attention mechanism, the case description text semantic representation vector with the legal provision knowledge keyword semantic representation vector to obtain a case description feature vector fused with the legal provision knowledge keywords;
applying a linear transformation and softmax to the fused case description feature vector, finally realizing legal provision recommendation.
Further, keywords are extracted from the legal provision knowledge in the judicial field with a TextRank-based method to obtain the legal provision knowledge keywords.
Further, the method comprises:
performing semantic representation of the case description text with a BERT model followed by a bidirectional LSTM to obtain the case description text semantic representation vector;
performing semantic representation of the legal provision knowledge keywords with a BERT model followed by a bidirectional LSTM to obtain the legal provision knowledge keyword semantic representation vector.
Further, performing semantic representation of the case description text and the legal provision knowledge keywords comprises:
Let the case description text be X = [x1, …, xN] and the legal provision knowledge keyword set be Y = [y1, …, ym], where N is the length of the case description text and m is the size of the keyword set. A BERT pre-training model represents each text, yielding the semantic representation vectors:
F_X = BERT(X)
F_Y = BERT(Y)
where F_X and F_Y are the BERT-based case description text semantic representation vector and the legal provision knowledge keyword semantic representation vector, respectively.
Meanwhile, to improve the continuous representation of text sequence information, a bidirectional LSTM layer is added after the BERT module, further strengthening the contextual feature information of the text feature vectors:
F_X1 = BiLSTM(F_X)
F_Y1 = BiLSTM(F_Y)
where F_X1 and F_Y1 are the final case description text semantic representation vector and the final legal provision knowledge keyword semantic representation vector, respectively.
Further, obtaining the case description feature vector fused with the legal provision knowledge keywords comprises:
performing attention calculation between the case description text semantic representation vector F_X and the legal provision knowledge keyword semantic representation vector F_Y to obtain the fused feature attention;
normalizing the fused feature attention, and finally performing an attention-weighted summation of the case description text semantic representation vector F_X to obtain the case description feature vector fused with the legal provision knowledge keywords.
Specifically:
Step 4.1, based on the attention mechanism, perform attention calculation between the case description text semantic representation vector F_X and the legal provision knowledge keyword semantic representation vector F_Y; the fused attention is computed as:
f(F_X(i), F_Y) = F_X(i) · F_Y^T
where F_X(i) is the semantic representation vector of the i-th token of the case description text;
Step 4.2, normalize the fused feature attention:
a_i = softmax[f(F_X(i), F_Y)]
where a_i is the attention vector between the legal provision knowledge keywords and the i-th token of the case description text;
Step 4.3, perform an attention-weighted summation of F_X to obtain the case description feature vector F_XY fused with the legal provision knowledge keywords:
F_XY = Σ_i a_i F_X(i).
the invention has the beneficial effects that: the method combines the Bert model with strong text representation and text understanding capability, refers the Bert model to the law statement recommendation task and deduces the law statement recommendation task to obtain better effect; the invention adopts the bert model to realize accurate representation of legal provision knowledge and case description, and improves the effect of legal provision recommendation.
Drawings
FIG. 1 is an architecture diagram of the legal provision recommendation method fusing the BERT model with legal provision knowledge according to the invention.
Detailed Description
Example 1: as shown in FIG. 1, the legal provision recommendation processing method based on the BERT model and legal provision knowledge comprises:
Step 1, acquiring case description texts, and selecting a training set and a test set from them;
The invention obtains case description texts from the public 'Fa-Yan Cup' (CAIL) legal AI challenge dataset, selecting a training set of 600,000 case descriptions and a test set of 100,000 case descriptions;
In the invention, when recommending legal provisions for a criminal case, for example, the case belongs to criminal law, and criminal-law knowledge from the judicial field is correspondingly used to construct the database;
Step 2, extracting keywords from the legal provision knowledge in the judicial field;
Step 3, performing semantic representation of the case description texts and the legal provision knowledge keywords;
Step 4, fusing, based on an attention mechanism, the case description text semantic representation vector with the legal provision knowledge keyword semantic representation vector to obtain a case description feature vector fused with the legal provision knowledge keywords;
Step 5, applying a linear transformation and softmax to the fused case description feature vector, finally realizing legal provision recommendation.
Further, keywords are extracted from the legal provision knowledge in the judicial field with a TextRank-based method to obtain the legal provision knowledge keywords.
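The TextRank extraction above can be sketched as a minimal pure-Python implementation over a token sequence, assuming simple window-based co-occurrence; a real system would run it over segmented Chinese statute text with stop-word filtering:

```python
from collections import defaultdict

def textrank_keywords(tokens, window=2, d=0.85, iters=50, topk=5):
    """Rank tokens by TextRank: build an undirected co-occurrence graph
    over a sliding window, then run PageRank-style power iteration."""
    graph = defaultdict(set)
    for i in range(len(tokens)):
        for j in range(i + 1, min(i + window + 1, len(tokens))):
            if tokens[i] != tokens[j]:
                graph[tokens[i]].add(tokens[j])
                graph[tokens[j]].add(tokens[i])
    scores = {w: 1.0 for w in graph}
    for _ in range(iters):
        scores = {w: (1 - d) + d * sum(scores[u] / len(graph[u])
                                       for u in graph[w])
                  for w in graph}
    return [w for w, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:topk]]

# Toy statute-like text: the two central words dominate the graph.
tokens = "rob force property rob violence property rob threat property".split()
keywords = textrank_keywords(tokens, topk=2)   # -> rob, property (order may vary)
```

The window size, damping factor d and iteration count are the usual TextRank defaults, not values taken from the patent.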
Further, the method comprises:
performing semantic representation of the case description text with a BERT model followed by a bidirectional LSTM to obtain the case description text semantic representation vector;
performing semantic representation of the legal provision knowledge keywords with a BERT model followed by a bidirectional LSTM to obtain the legal provision knowledge keyword semantic representation vector.
Further, performing semantic representation of the case description text and the legal provision knowledge keywords comprises:
Let the case description text be X = [x1, …, xN] and the legal provision knowledge keyword set be Y = [y1, …, ym], where N is the length of the case description text and m is the size of the keyword set. A BERT pre-training model represents each text, yielding the semantic representation vectors:
F_X = BERT(X)
F_Y = BERT(Y)
where F_X and F_Y are the BERT-based case description text semantic representation vector and the legal provision knowledge keyword semantic representation vector, respectively.
Meanwhile, to improve the continuous representation of text sequence information, a bidirectional LSTM layer is added after the BERT module, further strengthening the contextual feature information of the text feature vectors:
F_X1 = BiLSTM(F_X)
F_Y1 = BiLSTM(F_Y)
where F_X1 and F_Y1 are the final case description text semantic representation vector and the final legal provision knowledge keyword semantic representation vector, respectively.
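A minimal PyTorch sketch of the BERT → BiLSTM stack described above, with the BERT outputs stubbed by random tensors; a real implementation would obtain F_X and F_Y from a pretrained model such as `transformers.BertModel`, and the hidden size of 256 is an illustrative choice, not a value from the patent:

```python
import torch
import torch.nn as nn

class BiLSTMEncoder(nn.Module):
    """Bidirectional LSTM layer added after the BERT module to strengthen
    the contextual representation of the token sequence."""
    def __init__(self, bert_dim=768, hidden=256):
        super().__init__()
        self.bilstm = nn.LSTM(bert_dim, hidden,
                              batch_first=True, bidirectional=True)

    def forward(self, bert_output):          # (batch, seq_len, bert_dim)
        ctx, _ = self.bilstm(bert_output)    # (batch, seq_len, 2 * hidden)
        return ctx

# Stand-ins for F_X = BERT(X) and F_Y = BERT(Y).
f_x = torch.randn(1, 12, 768)   # case description text, N = 12 tokens
f_y = torch.randn(1, 5, 768)    # legal provision knowledge keywords, m = 5

encoder = BiLSTMEncoder()
f_x1 = encoder(f_x)             # final case description representation F_X1
f_y1 = encoder(f_y)             # final keyword representation F_Y1
```

Whether the two texts share one BiLSTM or use separate ones is not specified in the patent; a shared encoder is assumed here for brevity.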
Further, step 4 comprises:
performing attention calculation between the case description text semantic representation vector F_X and the legal provision knowledge keyword semantic representation vector F_Y to obtain the fused feature attention;
normalizing the fused feature attention, and finally performing an attention-weighted summation of the case description text semantic representation vector F_X to obtain the case description feature vector fused with the legal provision knowledge keywords.
Specifically:
Step 4.1, based on the attention mechanism, perform attention calculation between the case description text semantic representation vector F_X and the legal provision knowledge keyword semantic representation vector F_Y; the fused attention is computed as:
f(F_X(i), F_Y) = F_X(i) · F_Y^T
where F_X(i) is the semantic representation vector of the i-th token of the case description text;
Step 4.2, normalize the fused feature attention:
a_i = softmax[f(F_X(i), F_Y)]
where a_i is the attention vector between the legal provision knowledge keywords and the i-th token of the case description text;
Step 4.3, perform an attention-weighted summation of F_X to obtain the case description feature vector F_XY fused with the legal provision knowledge keywords:
F_XY = Σ_i a_i F_X(i)
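Steps 4.1–4.3 can be sketched in NumPy. The softmax axis is not fully specified in the text; the reading assumed here is: score each case-text token against all keyword vectors (4.1), pool those scores and normalise them over the tokens (4.2), then take the attention-weighted sum of the token vectors (4.3):

```python
import numpy as np

def fuse_with_knowledge(f_x, f_y):
    """f_x: (N, d) case-text token vectors F_X; f_y: (m, d) keyword
    vectors F_Y. Returns the fused feature vector F_XY of shape (d,)."""
    scores = f_x @ f_y.T                # step 4.1: f(F_X(i), F_Y) = F_X(i) * F_Y^T
    token_scores = scores.sum(axis=1)   # pool each token's affinity to the keywords
    a = np.exp(token_scores - token_scores.max())
    a /= a.sum()                        # step 4.2: a_i = softmax[f(F_X(i), F_Y)]
    return a @ f_x                      # step 4.3: F_XY = sum_i a_i * F_X(i)

rng = np.random.default_rng(0)
f_x = rng.normal(size=(12, 8))          # N = 12 tokens, d = 8
f_y = rng.normal(size=(5, 8))           # m = 5 keywords
f_xy = fuse_with_knowledge(f_x, f_y)    # fused case description feature vector
```

Because the weights a_i are non-negative and sum to one, F_XY is a convex combination of the token vectors, biased toward tokens that match the statute keywords.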
to more intuitively illustrate the process of law-recommended practice, a relevant example analysis about fraud is given, such as case fact description fact: aiming at a certain fraud case, the description of the case is included; legal provisions for fraud; fraud and guilt: the behavior of deceiving public and private properties with large amount by using fictional facts or a method of hiding true figures with illegal purposes.
Aiming at the case description, the keyword of the case description such as 'fraud public and private property' and 'large amount' of the fraud law is fused with the case description based on the attention mechanism, so that the attention weight of the case description text with the law knowledge can be pertinently improved, the targeted feature extraction is further realized, and the accurate case recommendation is finally realized.
To demonstrate the effect of the invention, it is compared with conventional legal provision recommendation methods, as shown in Tables 2, 3 and 4:

TABLE 2 Comparison of the patented method with conventional legal provision recommendation methods

TABLE 3 Ablation experiment on the legal provision knowledge keywords

| Data | Model | F1 |
|---|---|---|
| 'Fa-Yan Cup' (CAIL) 2018 test set | With legal provision knowledge fused | 0.92 |
| 'Fa-Yan Cup' (CAIL) 2018 test set | Without legal provision knowledge | 0.88 |

TABLE 4 Comparison of the BERT model with the Word2vec model

| Data | Model | F1 |
|---|---|---|
| 'Fa-Yan Cup' (CAIL) 2018 test set | Word2vec | 0.89 |
| 'Fa-Yan Cup' (CAIL) 2018 test set | BERT | 0.92 |

According to the experimental results, with models trained on the same training set, the precision (P), recall (R) and F1 value of the proposed method on the given test set are clearly superior to those of conventional data-driven legal provision recommendation methods, and the proposed knowledge-driven method improves on the baseline model. Unlike conventional data-driven methods, fusing the legal provision knowledge keywords with the BERT-based representation further improves the effect of the legal provision recommendation model.
While the present invention has been described in detail with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, and various changes can be made without departing from the spirit of the present invention within the knowledge of those skilled in the art.
Claims (6)
1. A legal provision recommendation processing method based on the BERT model and legal provision knowledge, characterized by comprising the following steps:
extracting keywords from legal provision knowledge in the judicial field;
performing semantic representation of the case description text and the legal provision knowledge keywords;
fusing, based on an attention mechanism, the case description text semantic representation vector with the legal provision knowledge keyword semantic representation vector to obtain a case description feature vector fused with the legal provision knowledge keywords;
applying a linear transformation and softmax to the fused case description feature vector, finally realizing legal provision recommendation.
2. The legal provision recommendation processing method based on the BERT model and legal provision knowledge of claim 1, wherein keywords are extracted from the legal provision knowledge in the judicial field with a TextRank-based method to obtain the legal provision knowledge keywords.
3. The legal provision recommendation processing method based on the BERT model and legal provision knowledge of claim 1, comprising:
performing semantic representation of the case description text with a BERT model followed by a bidirectional LSTM to obtain the case description text semantic representation vector;
performing semantic representation of the legal provision knowledge keywords with a BERT model followed by a bidirectional LSTM to obtain the legal provision knowledge keyword semantic representation vector.
4. The method of claim 1, wherein performing semantic representation of the case description text and the legal provision knowledge keywords comprises:
letting the case description text be X = [x1, …, xN] and the legal provision knowledge keyword set be Y = [y1, …, ym], where N is the length of the case description text and m is the size of the keyword set; representing each text with a BERT pre-training model to obtain its semantic representation vector; and, to improve the continuous representation of text sequence information, adding a bidirectional LSTM layer after the BERT module to further strengthen the contextual feature information of the text feature vectors.
5. The legal provision recommendation processing method based on the BERT model and legal provision knowledge of claim 1, wherein obtaining the case description feature vector fused with the legal provision knowledge keywords comprises:
performing attention calculation between the case description text semantic representation vector F_X and the legal provision knowledge keyword semantic representation vector F_Y to obtain the fused feature attention;
normalizing the fused feature attention, and finally performing an attention-weighted summation of the case description text semantic representation vector F_X to obtain the case description feature vector fused with the legal provision knowledge keywords.
6. The legal provision recommendation processing method based on the BERT model and legal provision knowledge of claim 1, wherein obtaining the case description feature vector fused with the legal provision knowledge keywords specifically comprises:
Step 4.1, based on the attention mechanism, performing attention calculation between the case description text semantic representation vector F_X and the legal provision knowledge keyword semantic representation vector F_Y, where the fused attention is computed as:
f(F_X(i), F_Y) = F_X(i) · F_Y^T
where F_X(i) is the semantic representation vector of the i-th token of the case description text;
Step 4.2, normalizing the fused feature attention:
a_i = softmax[f(F_X(i), F_Y)]
where a_i is the attention vector between the legal provision knowledge keywords and the i-th token of the case description text;
Step 4.3, performing an attention-weighted summation of F_X to obtain the case description feature vector F_XY fused with the legal provision knowledge keywords:
F_XY = Σ_i a_i F_X(i).
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010180118.2A CN111414454B (en) | 2020-03-16 | 2020-03-16 | Law recommendation processing method based on bert model and law knowledge |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010180118.2A CN111414454B (en) | 2020-03-16 | 2020-03-16 | Law recommendation processing method based on bert model and law knowledge |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN111414454A | 2020-07-14 |
| CN111414454B | 2022-07-19 |
Family
ID=71493034
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010180118.2A Active CN111414454B (en) | 2020-03-16 | 2020-03-16 | Law recommendation processing method based on bert model and law knowledge |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111414454B (en) |
Cited By (2)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112614024A * | 2020-12-30 | 2021-04-06 | 成都数之联科技有限公司 | Case-fact-based intelligent legal provision recommendation method, system, device and medium |
| CN113377944A * | 2020-12-02 | 2021-09-10 | 中国司法大数据研究院有限公司 | Multi-task-based case feature extraction and legal provision recommendation method and device |
Citations (6)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080126078A1 * | 2003-04-29 | 2008-05-29 | Telstra Corporation Limited | A System and Process for Grammatical Inference |
| CN110334210A * | 2019-05-30 | 2019-10-15 | 哈尔滨理工大学 | Chinese sentiment analysis method fusing BERT with LSTM and CNN |
| CN110413785A * | 2019-07-25 | 2019-11-05 | 淮阴工学院 | Automatic document classification method based on BERT and feature fusion |
| CN110717334A * | 2019-09-10 | 2020-01-21 | 上海理工大学 | Text emotion analysis method based on the BERT model and dual-channel attention |
| CN110750635A * | 2019-10-21 | 2020-02-04 | 南京大学 | Legal provision recommendation method based on a joint deep learning model |
| CN110782008A * | 2019-10-16 | 2020-02-11 | 北京百分点信息科技有限公司 | Training method, prediction method and device for a deep learning model |
Non-Patent Citations (2)

| Title |
|---|
| XIAOYU DONG et al.: "Chinese NER by Span-Level Self-Attention", 2019 15th International Conference on Computational Intelligence and Security (CIS) |
| YANG Bin: "Intelligent judicial research based on BERT word vectors and Attention-CNN", China Master's Theses Full-text Database |
Also Published As
| Publication number | Publication date |
|---|---|
| CN111414454B (en) | 2022-07-19 |
Similar Documents

| Publication | Title |
|---|---|
| CN111275085B | Online short video multi-modal emotion recognition method based on attention fusion |
| CN108363804B | Local model weighted fusion Top-N movie recommendation method based on user clustering |
| CN109241255A | An intent recognition method based on deep learning |
| CN106469560B | Voice emotion recognition method based on unsupervised domain adaptation |
| Dai et al. | Learning discriminative features from spectrograms using center loss for speech emotion recognition |
| CN111966917A | Event detection and summarization method based on a pre-training language model |
| CN109271537A | A text-to-image generation method and system based on distillation learning |
| CN111414454B | Law recommendation processing method based on bert model and law knowledge |
| CN110909529B | User emotion analysis and prejudgment system for a company image promotion system |
| CN110992988B | Speech emotion recognition method and device based on domain adversarial training |
| Lian et al. | Unsupervised representation learning with future observation prediction for speech emotion recognition |
| CN110135694A | Product risk assessment method, device, computer equipment and storage medium |
| CN113032601A | Zero-shot sketch retrieval method based on discriminant improvement |
| Cao et al. | Deep multi-view learning to rank |
| CN112417132A | New intent recognition method that screens negative samples using predicate-object information |
| CN113239159A | Cross-modal retrieval method for videos and texts based on a relational inference network |
| Wu et al. | Investigations on classification methods for loan application based on machine learning |
| CN107193916A | Personalized varied-query recommendation method and system |
| CN116524960A | Speech emotion recognition system based on mixed-entropy downsampling and an integrated classifier |
| CN113076425B | Event-related viewpoint sentence classification method for microblog comments |
| Fursov et al. | Sequence embeddings help to identify fraudulent cases in healthcare insurance |
| TWI761090B | Dialogue data processing system and method thereof and computer readable medium |
| Li et al. | Research on the model of UBI car insurance rate rating based on a CNN-softmax algorithm |
| CN112463965A | Method and system for semantic understanding of text |
| Zhao et al. | Upgraded attention-based local feature learning block for speech emotion recognition |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |