CN110968689A - Training method for a criminal name and law article prediction model, and criminal name and law article prediction method - Google Patents

Training method for a criminal name and law article prediction model, and criminal name and law article prediction method

Info

Publication number
CN110968689A
Authority
CN
China
Prior art keywords
case description
criminal name
law
name
training
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811160557.6A
Other languages
Chinese (zh)
Inventor
张广鹏 (Zhang Guangpeng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Gridsum Technology Co Ltd
Original Assignee
Beijing Gridsum Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Gridsum Technology Co Ltd
Priority to CN201811160557.6A
Publication of CN110968689A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/18Legal services; Handling legal documents

Abstract

The application discloses a training method for a criminal name and law article prediction model, which comprises the following steps: acquiring sample data, wherein the sample data comprises a case description, a criminal name label corresponding to the case description and a corresponding law article label; and training an initial neural network model with the sample data to obtain a neural network model satisfying a training end condition as the criminal name and law article prediction model. Because the criminal name label and the law article label corresponding to the same case description are highly correlated, the association between the criminal name and the law article can be learned when training with such sample data, so the criminal name and law article prediction model obtained by this training method achieves higher classification accuracy on both the criminal name classification task and the law article classification task. The application also discloses a criminal name and law article prediction method, a training apparatus, a prediction apparatus, a processor and a storage medium.

Description

Training method for a criminal name and law article prediction model, and criminal name and law article prediction method
Technical Field
The application relates to the field of data processing, and in particular to a training method for a criminal name and law article prediction model, a criminal name and law article prediction method, and a corresponding apparatus, processor and storage medium.
Background
With the advent of the big data era and the rise of artificial intelligence, it has become possible to use machine learning algorithms to mine patterns in massive judicial document data and thereby predict criminal names and law articles automatically. One way to do this is to use a classical deep learning model to classify, separately, the criminal name and the law article corresponding to a judicial document, thereby realizing automatic prediction of both.
However, when a classical deep learning model is applied to the judicial field in this way, the accuracy of criminal name and law article classification is not high, so the accuracy of criminal name and law article prediction is not high either, and such models are difficult to apply widely. There is therefore a need for an automatic criminal name and law article prediction method with higher prediction accuracy.
Disclosure of Invention
In view of the above, the present application provides a training method for a criminal name and law article prediction model, which trains a model with sample data consisting of case descriptions and the corresponding criminal name labels and law article labels, so that criminal names and law articles can be predicted simultaneously with high accuracy. Correspondingly, the present application also provides a criminal name and law article prediction method, a training apparatus for the criminal name and law article prediction model, a criminal name and law article prediction apparatus, and a corresponding processor and storage medium.
In a first aspect, the present application provides a training method for a criminal name and law article prediction model, the method comprising:
acquiring sample data; the sample data includes: a case description, a criminal name label corresponding to the case description, and a corresponding law article label;
training an initial neural network model with the sample data to obtain a neural network model satisfying a training end condition as the criminal name and law article prediction model; the criminal name and law article prediction model takes the case description as input and the criminal name label and law article label as output.
Optionally, the initial neural network model includes a feature extraction layer, the feature extraction layer includes a plurality of parallel convolutional layers, and the convolution kernel sizes of the parallel convolutional layers are smaller than a preset size.
Optionally, the plurality of parallel convolutional layers include a first convolutional layer, a second convolutional layer, a third convolutional layer and a fourth convolutional layer, whose convolution kernel sizes are 1×1, 3×3, 5×5 and 1×1, respectively.
Optionally, the second convolutional layer is further connected to a fifth convolutional layer and the third convolutional layer is further connected to a sixth convolutional layer; the output of the fifth convolutional layer serves as the input of the second convolutional layer, the output of the sixth convolutional layer serves as the input of the third convolutional layer, and the convolution kernels of the fifth and sixth convolutional layers are both 1×1.
Optionally, the initial neural network model includes a plurality of feature extraction layers, and the plurality of feature extraction layers are cascaded through a fully connected layer.
Optionally, the method further includes:
acquiring a judicial document, and determining a case description section and a conclusion section of the judicial document;
generating a case description according to the case description section, and generating a criminal name label and a law article label corresponding to the case description according to the conclusion section;
and generating sample data according to the case description, the criminal name label corresponding to the case description and the law article label.
Optionally, generating the case description according to the case description section includes:
if the text length of the case description section is smaller than a preset length, padding characters at the end of the case description section to generate the case description, the text length of the case description being equal to the preset length;
if the text length of the case description section is greater than the preset length, truncating text from the beginning and the end of the case description section to obtain a first text and a second text respectively, and generating the case description from the first text and the second text; the sum of the text length of the first text and the text length of the second text is equal to the preset length.
A second aspect of the present application provides a criminal name and law article prediction method, the method comprising:
acquiring a case description;
inputting the case description into the criminal name and law article prediction model, and obtaining the criminal name label and law article label output by the criminal name and law article prediction model as the prediction result;
wherein the criminal name and law article prediction model is trained by the training method for the criminal name and law article prediction model according to the first aspect of the present application.
A third aspect of the present application provides a training apparatus for a criminal name and law article prediction model, the apparatus comprising:
an acquisition unit configured to acquire sample data; the sample data includes: a case description, a criminal name label corresponding to the case description, and a corresponding law article label;
a training unit configured to train an initial neural network model with the sample data to obtain a neural network model satisfying a training end condition as the criminal name and law article prediction model; the criminal name and law article prediction model takes the case description as input and the criminal name label and law article label as output.
Optionally, the initial neural network model includes a feature extraction layer, the feature extraction layer includes a plurality of parallel convolutional layers, and the convolution kernel sizes of the parallel convolutional layers are smaller than a preset size.
Optionally, the plurality of parallel convolutional layers include a first convolutional layer, a second convolutional layer, a third convolutional layer and a fourth convolutional layer, whose convolution kernel sizes are 1×1, 3×3, 5×5 and 1×1, respectively.
Optionally, the second convolutional layer is further connected to a fifth convolutional layer and the third convolutional layer is further connected to a sixth convolutional layer; the output of the fifth convolutional layer serves as the input of the second convolutional layer, the output of the sixth convolutional layer serves as the input of the third convolutional layer, and the convolution kernels of the fifth and sixth convolutional layers are both 1×1.
Optionally, the initial neural network model includes a plurality of feature extraction layers, and the plurality of feature extraction layers are cascaded through a fully connected layer.
Optionally, the apparatus further comprises:
a determining unit configured to acquire a judicial document and determine a case description section and a conclusion section of the judicial document;
a first generation unit configured to generate a case description according to the case description section and to generate a criminal name label and a law article label corresponding to the case description according to the conclusion section;
and a second generation unit configured to generate sample data according to the case description, the criminal name label corresponding to the case description and the law article label.
Optionally, the first generation unit is specifically configured to:
if the text length of the case description section is smaller than a preset length, pad characters at the end of the case description section to generate the case description, the text length of the case description being equal to the preset length;
if the text length of the case description section is greater than the preset length, truncate text from the beginning and the end of the case description section to obtain a first text and a second text respectively, and generate the case description from the first text and the second text; the sum of the text length of the first text and the text length of the second text is equal to the preset length.
A fourth aspect of the present application provides a criminal name and law article prediction apparatus, the apparatus comprising:
an acquisition unit configured to acquire a case description;
and a prediction unit configured to input the case description into the criminal name and law article prediction model and obtain the criminal name label and law article label output by the criminal name and law article prediction model as the prediction result.
A fifth aspect of the present application provides a processor configured to run a program, wherein the program, when running, executes the training method for the criminal name and law article prediction model according to the first aspect of the present application or the criminal name and law article prediction method according to the second aspect of the present application.
A sixth aspect of the present application provides a storage medium comprising a stored program, wherein the program, when running, controls a device on which the storage medium resides to execute the training method for the criminal name and law article prediction model according to the first aspect of the present application or the criminal name and law article prediction method according to the second aspect of the present application.
It can be seen from the above technical solutions that the embodiments of the present application have the following advantages:
The embodiments of the present application provide a criminal name and law article prediction method based on a criminal name and law article prediction model. The model is obtained by training a neural network with sample data consisting of case descriptions, the criminal name labels corresponding to the case descriptions and the corresponding law article labels. Because the criminal name label and the law article label corresponding to the same case description are highly correlated, the association between criminal names and law articles can be learned when training with such sample data. The criminal name and law article prediction model trained in this way therefore achieves higher classification accuracy on both the criminal name classification task and the law article classification task, and correspondingly higher prediction accuracy when used to predict criminal names and law articles.
Drawings
In order to explain the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a scene architecture diagram of a training method for a criminal name and law article prediction model in an embodiment of the present application;
FIG. 2 is a flowchart of a training method for a criminal name and law article prediction model in an embodiment of the present application;
FIG. 3 is a schematic structural diagram of an initial neural network model in an embodiment of the present application;
FIG. 4 is a schematic structural diagram of an initial neural network model in an embodiment of the present application;
FIG. 5 is a schematic structural diagram of an initial neural network model in an embodiment of the present application;
FIG. 6 is a scene architecture diagram of a criminal name and law article prediction method in an embodiment of the present application;
FIG. 7 is a flowchart of a criminal name and law article prediction method in an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a training apparatus for a criminal name and law article prediction model in an embodiment of the present application;
FIG. 9 is a schematic structural diagram of a criminal name and law article prediction apparatus in an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only a part of the embodiments of the present application rather than all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Aiming at the technical problem that a classical deep learning model applied to the judicial field yields low criminal name and law article classification accuracy, so that criminal name prediction and law article prediction are difficult to apply widely, the present application provides a criminal name and law article prediction method based on a criminal name and law article prediction model. The model is obtained by training a neural network with sample data consisting of case descriptions, the criminal name labels corresponding to the case descriptions and the corresponding law article labels. Because the criminal name label and the law article label corresponding to the same case description are highly correlated, the association between criminal names and law articles can be learned when training with such sample data. The criminal name and law article prediction model trained in this way therefore achieves higher classification accuracy on both the criminal name classification task and the law article classification task, and correspondingly higher prediction accuracy when predicting criminal names and law articles.
To facilitate understanding of the technical solution of the present application, the training method for the criminal name and law article prediction model provided by the embodiments of the present application is introduced first.
The training method for the criminal name and law article prediction model provided by the embodiments of the present application can be applied to a data processing device, which includes any device capable of processing text data. As a specific example, the data processing device can be a server, including a stand-alone server or a server cluster formed by a plurality of servers. In a specific implementation, the training method can be stored in the server in the form of an application program, and when the server runs the application program, the training method for the criminal name and law article prediction model provided by the embodiments of the present application is executed. The application program may be a stand-alone application program, or a functional module or plug-in integrated into another application program. It should be noted that, in other possible implementations of the embodiments of the present application, the data processing device may also be a terminal device.
Next, an application scenario of the training method for the criminal name and law article prediction model is described from the perspective of a server. Referring to fig. 1, which shows a scene architecture diagram of the training method, the application scenario includes a training server 10 and a storage server 20.
The storage server 20 may crawl data such as judicial documents from the Internet, extract the case description, the criminal name label corresponding to the case description and the corresponding law article label from the data, and generate sample data from the case description, the criminal name label and the law article label. The training server 10 then acquires the sample data from the storage server 20 and trains an initial neural network model with the sample data to obtain a neural network model satisfying a training end condition as the criminal name and law article prediction model. The case description serves as the input of the criminal name and law article prediction model, and the criminal name label and law article label serve as its output, so that after a case description is obtained, the criminal name and law article corresponding to the case description can be predicted by the model.
To make the technical solution of the embodiments of the present application easier to understand, the training method for the criminal name and law article prediction model is described below with reference to the accompanying drawings. The embodiment is described from the perspective of a server, which does not limit the technical solution of the present application. Referring to fig. 2, which shows a flowchart of the training method, the method includes the following steps:
s201: and acquiring sample data.
The sample data includes: a case description, a criminal name label corresponding to the case description and a corresponding law article label. In a specific implementation, the server can acquire the sample data from a sample database, or crawl judicial documents from the Internet in real time to obtain the sample data. Judicial documents are the special documents formed and used by judicial organs for investigation, examination, approval, notarization and the like in each step of handling various cases, and may include judgments, rulings and the like.
Taking a judgment document as an example, because a judgment document has a specific writing format, including a case description section describing the facts and a conclusion section stating the conclusion, the case description and the corresponding criminal name label and law article label can be extracted from the judgment document to generate sample data. After the sample data is generated, it can be stored in the sample database, so that it can be acquired directly from the sample database during subsequent training.
The process of generating sample data is explained in detail below. Specifically, the server can obtain judicial documents from websites related to the judicial field, such as a judgment document website, determine the case description section and the conclusion section of each judicial document using text parsing techniques and the like, then generate the case description from the case description section and generate the criminal name label and law article label corresponding to the case description from the conclusion section, and finally generate sample data from the case description, the criminal name label corresponding to the case description and the law article label.
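The patent does not specify concrete parsing rules. As a rough, non-authoritative sketch of the idea, the following Python snippet splits a judgment text into a fact section and a conclusion section by matching marker phrases and derives labels from the conclusion; the marker phrases and regular expressions are illustrative assumptions rather than part of the claimed method.

```python
import re

# Illustrative marker phrases; real judgment documents would need more robust parsing rules.
FACT_MARKERS = ["经审理查明", "公诉机关指控"]     # assumed starts of the fact (case description) section
CONCLUSION_MARKERS = ["本院认为", "判决如下"]      # assumed starts of the conclusion section

def split_judgment(text: str):
    """Split a judgment text into (case description section, conclusion section)."""
    fact_pos = min((text.find(m) for m in FACT_MARKERS if m in text), default=0)
    concl_pos = min((text.find(m) for m in CONCLUSION_MARKERS if m in text), default=len(text))
    return text[fact_pos:concl_pos], text[concl_pos:]

def extract_labels(conclusion: str):
    """Pull a criminal name and cited law articles out of the conclusion section (illustrative regexes)."""
    crime = re.search(r"犯(.{1,20}?罪)", conclusion)
    articles = re.findall(r"《中华人民共和国刑法》第([零一二三四五六七八九十百千0-9]+)条", conclusion)
    return (crime.group(1) if crime else None), articles

def build_sample(text: str):
    description_section, conclusion = split_judgment(text)
    crime_label, article_labels = extract_labels(conclusion)
    return {"case_description_section": description_section,
            "crime_label": crime_label,
            "article_labels": article_labels}
```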
To reduce the difficulty and improve the efficiency of model training, the case description can be processed so that the text length of the case description in every sample equals a preset length. In a specific implementation, the server may count the text length of the case description section and process the section accordingly: if the text length of the case description section is smaller than the preset length, characters are padded at the end of the case description section to generate the case description, whose text length then equals the preset length; if the text length of the case description section is greater than the preset length, text is truncated from the beginning and the end of the case description section to obtain a first text and a second text, and the case description is generated from the first text and the second text, the sum of whose text lengths equals the preset length.
The preset length may be set according to an empirical value; as an example, the preset length may be 600 characters, and in a specific implementation the server converts the case description section into an equal-length segment of 600 characters to form the case description. Specifically, if the text length of the case description section is less than 600 characters, PAD marks representing padding characters are appended after the case description section; padding stops when the text length reaches 600 characters, and the padded case description section is used as the case description. If the text length of the case description section is greater than 600 characters, the first 550 characters at the beginning of the case description section can be taken as the first text, the last 50 characters at the end as the second text, and the first text and the second text are spliced to form a case description with a text length of 600 characters.
It should be noted that, as long as the sum of the text length of the first text and the text length of the second text equals the preset length, the two lengths may be set according to actual requirements and are not limited to the 550 and 50 characters of the above example.
It can be understood that by truncating the case description section in this way, most of the information irrelevant to criminal name and law article prediction can be filtered out, which avoids interference with model training, reduces the training difficulty and improves the accuracy of the model.
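As a minimal sketch of the length normalization described above (assuming Python, the 600/550/50 character split of the example, and a literal placeholder character for the PAD mark, which the patent does not fix):

```python
PRESET_LEN = 600   # preset length from the example above
HEAD_LEN = 550     # characters kept from the beginning of the case description section
TAIL_LEN = 50      # characters kept from the end
PAD = "□"          # assumed padding character; the patent only says a PAD mark is filled in

def normalize_case_description(section: str) -> str:
    """Pad or truncate a case description section to exactly PRESET_LEN characters."""
    if len(section) < PRESET_LEN:
        # shorter than the preset length: pad at the end
        return section + PAD * (PRESET_LEN - len(section))
    # longer than the preset length: keep the head and the tail and splice them
    return section[:HEAD_LEN] + section[-TAIL_LEN:]
```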
S202: Train the initial neural network model with the sample data to obtain a neural network model satisfying the training end condition as the criminal name and law article prediction model.
After obtaining the sample data, the server can train the initial neural network model with it. Specifically, after sample data is fed into the initial neural network model, the model predicts the criminal name and law article corresponding to the case description; the predicted criminal name is then compared with the criminal name label and the predicted law article with the law article label, and the parameters of the initial neural network model are updated according to the comparison results, thereby training the initial neural network model.
When the neural network model satisfies the training end condition, it can be used as the criminal name and law article prediction model, which takes the case description as input and the criminal name label and law article label as output and is used to predict criminal names and law articles. As a specific example of the present application, the training end condition may be that the objective function of the model is in a convergence state; thus, when the objective function of the neural network model converges, training stops, and the neural network model satisfying the training end condition can be used as the criminal name and law article prediction model.
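The patent does not name a training framework, loss function or optimizer. The following sketch assumes PyTorch, a model with two classification heads (one over criminal names, one over law articles), cross-entropy losses summed into a single objective, and Adam as the optimizer; all of these are assumptions made only for illustration.

```python
import torch
import torch.nn as nn

def train_step(model, optimizer, criterion, descriptions, crime_labels, article_labels):
    """One parameter update: predict both labels, compare with the ground truth, back-propagate."""
    optimizer.zero_grad()
    crime_logits, article_logits = model(descriptions)   # the assumed model returns two sets of logits
    loss = criterion(crime_logits, crime_labels) + criterion(article_logits, article_labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage sketch (model is any module returning (crime_logits, article_logits)):
# criterion = nn.CrossEntropyLoss()
# optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# loss = train_step(model, optimizer, criterion, batch, crime_labels, article_labels)
```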
In the embodiments of the present application, the criminal name and law article prediction model processes text, so the initial neural network model used for training can be a text-oriented convolutional neural network model (Text CNN) or a text-oriented recurrent neural network model (Text RNN). Training a Text CNN or Text RNN model to perform the criminal name classification task and the law article classification task simultaneously can improve the accuracy of the model on both classification tasks.
It can be seen that the embodiments of the present application provide a training method for a criminal name and law article prediction model, in which an initial neural network model is trained with sample data consisting of case descriptions, the criminal name labels corresponding to the case descriptions and the corresponding law article labels. Because the criminal name label and the law article label corresponding to the same case description are highly correlated, the association between criminal names and law articles can be learned when training with such sample data, so the criminal name and law article prediction model trained by this method achieves high classification accuracy on both the criminal name classification task and the law article classification task and has broad application prospects.
A Text CNN network can extract local features from its input, reduce the number of weights and lower the complexity of the network model. In the image processing field, a traditional convolution connects each output feature at a given position to all input features, which introduces a certain redundancy; the Inception module was therefore proposed in the industry to approximate a sparse CNN model, improving the characterization ability of the model and thus the classification accuracy. Applying this idea to the judicial field can further improve the accuracy of criminal name and law article prediction.
In a specific implementation, a large convolution kernel of the feature extraction layer in the initial neural network model can be replaced by a plurality of parallel small convolution kernels, which markedly reduces the number of model parameters, makes the model sparse, avoids overfitting and improves classification accuracy. In some possible implementations, the initial neural network model includes a feature extraction layer, the feature extraction layer includes a plurality of parallel convolutional layers, and the convolution kernel sizes of the parallel convolutional layers are smaller than a preset size. The preset size may be set according to an empirical value; for example, the preset size may be 7, and the convolution kernels smaller than the preset size may be 1 by 1 (denoted 1×1), 3×3 or 5×5.
Fig. 3 shows a schematic structural diagram of the initial neural network model. Referring to fig. 3, the initial neural network model includes an input layer, an encoding layer, a feature extraction layer, a fully connected layer, a concatenation layer and an output layer, where the feature extraction layer includes a plurality of parallel convolutional layers, specifically a first convolutional layer, a second convolutional layer, a third convolutional layer and a fourth convolutional layer; the convolution kernel of the first convolutional layer is 1×1, that of the second convolutional layer is 3×3, that of the third convolutional layer is 5×5, and that of the fourth convolutional layer is 1×1.
After the initial neural network model is built, sample data is fed into it through the input layer. The input layer can preprocess the sample data; specifically, it can segment the case description in the sample data into words and pass the segmentation result to the encoding layer, which encodes it into word vectors. The word vectors are then fed into the feature extraction layer, whose parallel convolutional layers each convolve the word vectors to extract features. The features extracted by the feature extraction layer are spliced in the fully connected layer, and the spliced features are passed to the output layer, which maps them through a classifier to realize criminal name classification and law article classification.
In the above embodiment, the feature maps produced by the convolutions of the second and third convolutional layers are relatively thick. To reduce the feature map thickness, a convolutional layer with a 1×1 kernel can be added before each of the second and third convolutional layers. Specifically, referring to fig. 4, the second convolutional layer is further connected to a fifth convolutional layer and the third convolutional layer to a sixth convolutional layer; the output of the fifth convolutional layer serves as the input of the second convolutional layer, the output of the sixth convolutional layer serves as the input of the third convolutional layer, and the convolution kernels of the fifth and sixth convolutional layers are both 1×1.
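A hedged sketch of one such feature extraction layer, assuming PyTorch, 1-D convolutions over the word-vector sequence, and illustrative channel widths (the patent specifies only the kernel sizes):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureExtractionLayer(nn.Module):
    """Parallel 1x1 / 3x3 / 5x5 / 1x1 branches over a word-vector sequence, with 1x1
    bottleneck convolutions (the fifth and sixth convolutional layers) placed before the
    3x3 and 5x5 branches to reduce the feature map thickness."""

    def __init__(self, in_channels: int, branch_channels: int = 64, bottleneck_channels: int = 32):
        super().__init__()
        self.branch1 = nn.Conv1d(in_channels, branch_channels, kernel_size=1)                     # first conv layer
        self.reduce2 = nn.Conv1d(in_channels, bottleneck_channels, kernel_size=1)                 # fifth conv layer
        self.branch2 = nn.Conv1d(bottleneck_channels, branch_channels, kernel_size=3, padding=1)  # second conv layer
        self.reduce3 = nn.Conv1d(in_channels, bottleneck_channels, kernel_size=1)                 # sixth conv layer
        self.branch3 = nn.Conv1d(bottleneck_channels, branch_channels, kernel_size=5, padding=2)  # third conv layer
        self.branch4 = nn.Conv1d(in_channels, branch_channels, kernel_size=1)                     # fourth conv layer

    def forward(self, x):  # x: (batch, in_channels, sequence_length)
        b1 = F.relu(self.branch1(x))
        b2 = F.relu(self.branch2(F.relu(self.reduce2(x))))
        b3 = F.relu(self.branch3(F.relu(self.reduce3(x))))
        b4 = F.relu(self.branch4(x))
        return torch.cat([b1, b2, b3, b4], dim=1)  # concatenate the branches along the channel dimension
```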
To increase the depth of the model and improve its characterization ability, the initial neural network model may include a plurality of feature extraction layers cascaded through a fully connected layer. Referring in particular to fig. 5, the neural network model there includes two feature extraction layers of the kind shown in fig. 4, cascaded through a fully connected layer. This yields a structure deeper than a traditional Text CNN, and a model trained with this structure achieves higher accuracy when applied to the criminal name prediction task and the law article prediction task in the judicial field.
It should be noted that fig. 5 takes two feature extraction layers as an example, which does not limit the technical solution of the present application; in other possible implementations of the embodiments of the present application, the number of feature extraction layers may also be three or more.
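Continuing the PyTorch sketch above (and reusing its FeatureExtractionLayer), the following is a rough assembly of the structure of fig. 5: an embedding (encoding) layer, two feature extraction layers cascaded through a fully connected layer, and an output layer with one head for criminal names and one for law articles. The vocabulary size, dimensions, pooling step and label-set sizes are all assumptions, not values given in the patent.

```python
class CrimeArticlePredictor(nn.Module):
    """Illustrative assembly: encoding layer -> feature extraction layer -> fully connected
    cascade -> feature extraction layer -> pooling -> two classification heads."""

    def __init__(self, vocab_size: int, embed_dim: int = 128,
                 num_crimes: int = 200, num_articles: int = 300, hidden_dim: int = 256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)     # encoding layer (word vectors)
        self.extract1 = FeatureExtractionLayer(embed_dim)        # 4 branches x 64 channels -> 256 channels
        self.cascade = nn.Linear(256, 256)                       # fully connected cascade between the two layers
        self.extract2 = FeatureExtractionLayer(256)
        self.pool = nn.AdaptiveMaxPool1d(1)                      # pool over the sequence dimension
        self.fc = nn.Linear(256, hidden_dim)
        self.crime_head = nn.Linear(hidden_dim, num_crimes)      # criminal name classifier
        self.article_head = nn.Linear(hidden_dim, num_articles)  # law article classifier

    def forward(self, token_ids):                                # token_ids: (batch, sequence_length)
        x = self.embedding(token_ids).transpose(1, 2)            # -> (batch, embed_dim, seq_len)
        x = self.extract1(x)                                     # -> (batch, 256, seq_len)
        x = self.cascade(x.transpose(1, 2)).transpose(1, 2)      # fully connected layer applied per position
        x = self.extract2(x)
        x = self.pool(x).squeeze(-1)                             # -> (batch, 256)
        h = F.relu(self.fc(x))
        return self.crime_head(h), self.article_head(h)          # criminal name logits, law article logits
```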
The above mainly introduces the structure of the initial neural network model. After the initial neural network model is constructed, the server can set a learning rate and train the model based on it. Specifically, the initial learning rate may be set to 0.001, and from the second round of training, i.e. the second epoch, the learning rate is decayed by a factor of 0.85 every 10000 steps. An early stopping parameter is set to 5, meaning that if the results do not improve over five rounds of training, the training end condition is met and training stops.
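A hedged sketch of this schedule, again assuming PyTorch and reusing the train_step sketch above; the choice of Adam and of validation accuracy as the early-stopping metric are assumptions.

```python
import torch
import torch.nn as nn

def train(model, train_loader, evaluate, num_epochs: int = 100):
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=0.001)   # initial learning rate 0.001
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10000, gamma=0.85)

    best_metric, rounds_without_improvement = float("-inf"), 0
    for epoch in range(num_epochs):
        for descriptions, crime_labels, article_labels in train_loader:
            train_step(model, optimizer, criterion, descriptions, crime_labels, article_labels)
            if epoch >= 1:              # decay only starts from the second epoch
                scheduler.step()        # multiply the learning rate by 0.85 every 10000 steps

        metric = evaluate(model)        # e.g. validation accuracy (not specified in the patent)
        if metric > best_metric:
            best_metric, rounds_without_improvement = metric, 0
        else:
            rounds_without_improvement += 1
        if rounds_without_improvement >= 5:   # early stopping patience of 5 rounds
            break
```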
It should further be noted that the sample data obtained in the embodiments of the present application can be used for both training and testing. Part of the generated sample data can be stored in a training set and another part in a test set; the initial neural network model is trained with the training data, i.e. the sample data in the training set, and after training yields the criminal name and law article prediction model, indicators such as the accuracy of the model are tested with the test data, i.e. the sample data in the test set. As a specific example of the present application, 170 thousand samples may be used for model training and another 20 thousand samples for model testing.
Based on the specific implementations of the training method for the criminal name and law article prediction model described above, the embodiments of the present application further provide a criminal name and law article prediction method based on the criminal name and law article prediction model. The method can be applied to a server or to a terminal device; it is described below from the perspective of a server. An application scenario of the criminal name and law article prediction method of the embodiments of the present application is described next.
Referring to the scene architecture diagram of the criminal name and law article prediction method shown in fig. 6, the application scenario includes a prediction server 100 and a terminal device 200. A judicial officer such as a judge uploads a case description to the prediction server 100 through the terminal device 200; after obtaining the case description, the prediction server 100 inputs it into a built-in criminal name and law article prediction model and obtains the criminal name label and law article label output by the model as the prediction result. The prediction result may assist a judge in making decisions or adjudicating.
To make the technical solution of the present application clearer and easier to understand, the criminal name and law article prediction method provided by the embodiments of the present application is described in detail below with reference to the drawings. Referring to fig. 7, which shows a flowchart of the criminal name and law article prediction method, the method includes the following steps:
s701: and acquiring case description.
In a specific implementation, the server may obtain the case description by receiving the case description uploaded by a user from a terminal device. The user can be a judicial officer or the like, who uploads the case description to the server through the terminal device when hearing a case; the server can predict the criminal name and law article corresponding to the current case description based on the judgment results of historical cases, and the prediction result of the server then serves as a reference for the case.
In a specific implementation, the server may truncate the content uploaded by the terminal device so that the text length of the case description equals the preset length. For example, when the case description content uploaded by the terminal device is longer than 600 characters, the first 550 characters and the last 50 characters may be taken and spliced to generate the case description. As another example, when the case description content uploaded by the terminal device is shorter than 600 characters, characters may be padded at the end to generate a case description whose text length equals 600 characters.
S702: Input the case description into the criminal name and law article prediction model, and obtain the criminal name label and law article label output by the model as the prediction result.
The criminal name and law article prediction model is obtained by the training method for the criminal name and law article prediction model provided by the embodiments of the present application. After acquiring the case description, the server inputs it into the built-in criminal name and law article prediction model. The model, trained on historical data, characterizes the associations among case descriptions, criminal names and law articles, and based on these associations it can output the criminal name label and law article label corresponding to the current case description. The server thus obtains the criminal name label and law article label output by the criminal name and law article prediction model as the prediction result.
It should be noted that when the case description is input into the criminal name and law article prediction model, the model may first preprocess it: for example, a word segmentation tool is used to segment the case description into words, the segmentation result is encoded into word vectors by a word2vec module in the encoding layer, the word vectors are fed into the feature extraction layer to extract features, the extracted features are spliced in the fully connected layer, and the spliced features are then mapped by the output layer, realizing criminal name classification and law article classification.
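The patent names a word segmentation tool and a word2vec module without specifying libraries. As a rough sketch only, the following assumes jieba for segmentation and the gensim 4.x Word2Vec API for the encoding; both library choices and all sizes are assumptions.

```python
import jieba
from gensim.models import Word2Vec   # gensim 4.x API

def build_word2vec(case_descriptions, vector_size: int = 128) -> Word2Vec:
    """Train a word2vec model over segmented case descriptions (corpus and sizes are illustrative)."""
    corpus = [jieba.lcut(doc) for doc in case_descriptions]
    return Word2Vec(sentences=corpus, vector_size=vector_size, window=5, min_count=2)

def encode(w2v: Word2Vec, case_description: str, max_len: int = 600):
    """Segment one case description and map it to a sequence of word vectors
    that can be fed into the feature extraction layer of the prediction model."""
    tokens = jieba.lcut(case_description)[:max_len]
    return [w2v.wv[t] for t in tokens if t in w2v.wv]
```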
It can be seen from the above that the embodiments of the present application provide a criminal name and law article prediction method based on a criminal name and law article prediction model. The model is obtained by training a neural network with sample data consisting of case descriptions, the criminal name labels corresponding to the case descriptions and the corresponding law article labels. Because the criminal name label and the law article label corresponding to the same case description are highly correlated, the association between criminal names and law articles can be learned when training with such sample data, so the criminal name and law article prediction model obtained by this training method achieves higher classification accuracy on both the criminal name classification task and the law article classification task, and higher prediction accuracy when used to predict criminal names and law articles.
Based on the specific implementations of the training method for the criminal name and law article prediction model and of the criminal name and law article prediction method described above, the embodiments of the present application further provide a training apparatus for the criminal name and law article prediction model and a criminal name and law article prediction apparatus. The apparatuses provided by the embodiments of the present application are introduced below from the perspective of functional modularization.
Referring to fig. 8, which shows a schematic structural diagram of the training apparatus for the criminal name and law article prediction model, the apparatus includes:
an obtaining unit 810 configured to obtain sample data; the sample data includes: a case description, a criminal name label corresponding to the case description, and a corresponding law article label;
a training unit 820 configured to train an initial neural network model with the sample data to obtain a neural network model satisfying a training end condition as the criminal name and law article prediction model; the criminal name and law article prediction model takes the case description as input and the criminal name label and law article label as output.
Optionally, the initial neural network model includes a feature extraction layer, the feature extraction layer includes a plurality of parallel convolutional layers, and the convolution kernel sizes of the parallel convolutional layers are smaller than a preset size.
Optionally, the plurality of parallel convolutional layers include a first convolutional layer, a second convolutional layer, a third convolutional layer and a fourth convolutional layer, whose convolution kernel sizes are 1×1, 3×3, 5×5 and 1×1, respectively.
Optionally, the second convolutional layer is further connected to a fifth convolutional layer and the third convolutional layer is further connected to a sixth convolutional layer; the output of the fifth convolutional layer serves as the input of the second convolutional layer, the output of the sixth convolutional layer serves as the input of the third convolutional layer, and the convolution kernels of the fifth and sixth convolutional layers are both 1×1.
Optionally, the initial neural network model includes a plurality of feature extraction layers, and the plurality of feature extraction layers are cascaded through a fully connected layer.
Optionally, the apparatus further comprises:
a determining unit configured to acquire a judicial document and determine a case description section and a conclusion section of the judicial document;
a first generation unit configured to generate a case description according to the case description section and to generate a criminal name label and a law article label corresponding to the case description according to the conclusion section;
and a second generation unit configured to generate sample data according to the case description, the criminal name label corresponding to the case description and the law article label.
Optionally, the first generation unit is specifically configured to:
if the text length of the case description section is smaller than a preset length, pad characters at the end of the case description section to generate the case description, the text length of the case description being equal to the preset length;
if the text length of the case description section is greater than the preset length, truncate text from the beginning and the end of the case description section to obtain a first text and a second text respectively, and generate the case description from the first text and the second text; the sum of the text length of the first text and the text length of the second text is equal to the preset length.
Thus, the embodiments of the present application provide a training apparatus for a criminal name and law article prediction model. The apparatus trains an initial neural network model with sample data consisting of case descriptions, the criminal name labels corresponding to the case descriptions and the corresponding law article labels. Because the criminal name label and the law article label corresponding to the same case description are highly correlated, the apparatus can learn the association between criminal names and law articles when training with such sample data, so the criminal name and law article prediction model it trains achieves high classification accuracy on both the criminal name classification task and the law article classification task and has broad application prospects.
Next, referring to the schematic structural diagram of the criminal name and law article prediction apparatus shown in fig. 9, the apparatus includes:
an obtaining unit 910 configured to obtain a case description;
and a prediction unit 920 configured to input the case description into the criminal name and law article prediction model and obtain the criminal name label and law article label output by the criminal name and law article prediction model as the prediction result.
It can be seen from the above that the embodiments of the present application provide a criminal name and law article prediction apparatus based on a criminal name and law article prediction model. The model is obtained by training a neural network with sample data consisting of case descriptions, the criminal name labels corresponding to the case descriptions and the corresponding law article labels. Because the criminal name label and the law article label corresponding to the same case description are highly correlated, the association between criminal names and law articles can be learned when training with such sample data, so the apparatus achieves high classification accuracy on both the criminal name classification task and the law article classification task and high prediction accuracy when predicting criminal names and law articles based on the criminal name and law article prediction model.
The training apparatus for the criminal name and law article prediction model includes a processor and a memory; the obtaining unit, training unit, determining unit, first generation unit, second generation unit and the like described above are stored in the memory as program units, and the processor executes the program units stored in the memory to realize the corresponding functions.
The criminal name and law article prediction apparatus includes a processor and a memory; the obtaining unit, prediction unit and the like described above are stored in the memory as program units, and the processor executes the program units stored in the memory to realize the corresponding functions.
The processor contains a kernel, and the kernel calls the corresponding program units from the memory. One or more kernels can be set, and the training of the criminal name and law article prediction model or the criminal name and law article prediction is realized by adjusting kernel parameters.
The memory may include volatile memory in a computer readable medium, such as Random Access Memory (RAM), and/or nonvolatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
An embodiment of the present application provides a storage medium on which a program is stored; when executed by a processor, the program implements the training method for the criminal name and law article prediction model or the criminal name and law article prediction method.
An embodiment of the present application provides a processor configured to run a program; when the program runs, the training method for the criminal name and law article prediction model or the criminal name and law article prediction method is executed.
An embodiment of the present application provides a device comprising a processor, a memory, and a program stored on the memory and runnable on the processor; the processor implements the following steps when executing the program:
acquiring sample data; the sample data includes: a case description, a criminal name label corresponding to the case description, and a corresponding law article label;
training an initial neural network model with the sample data to obtain a neural network model satisfying a training end condition as the criminal name and law article prediction model; the criminal name and law article prediction model takes the case description as input and the criminal name label and law article label as output.
Optionally, the processor is further configured to execute the steps of any implementation of the training method for the criminal name and law article prediction model in the embodiments of the present application.
An embodiment of the present application provides another device comprising a processor, a memory, and a program stored on the memory and runnable on the processor; the processor implements the following steps when executing the program:
acquiring a case description;
inputting the case description into the criminal name and law article prediction model, and obtaining the criminal name label and law article label output by the criminal name and law article prediction model as the prediction result;
wherein the criminal name and law article prediction model is trained by the training method for the criminal name and law article prediction model.
The device herein may be a server, a PC, a PAD, a mobile phone, etc.
The present application further provides a computer program product which, when executed on a data processing device, is adapted to execute a program initializing the steps of the training method for the criminal name and law article prediction model or of any implementation of the criminal name and law article prediction method provided in the embodiments of the present application.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (12)

1. A training method of a criminal name and law article prediction model, characterized by comprising the following steps:
acquiring sample data; wherein the sample data comprises: a case description, a criminal name label corresponding to the case description and a corresponding law article label;
training an initial neural network model using the sample data to obtain a neural network model that meets a training end condition as the criminal name and law article prediction model; wherein the criminal name and law article prediction model takes the case description as input and takes a criminal name label and a law article label as output.
2. The method of claim 1, wherein the initial neural network model comprises a feature extraction layer, wherein the feature extraction layer comprises a plurality of parallel convolutional layers, and wherein the sizes of convolutional kernels of the plurality of parallel convolutional layers are smaller than a preset size.
3. The method of claim 2, wherein the plurality of parallel convolutional layers comprises a first convolutional layer, a second convolutional layer, a third convolutional layer, and a fourth convolutional layer, and the convolution kernel sizes of the first, second, third, and fourth convolutional layers are 1×1, 3×3, 5×5, and 1×1, respectively.
4. The method of claim 3, wherein a fifth convolutional layer is further connected to the second convolutional layer and a sixth convolutional layer is further connected to the third convolutional layer, wherein the output of the fifth convolutional layer is provided as the input to the second convolutional layer, the output of the sixth convolutional layer is provided as the input to the third convolutional layer, and the convolution kernels of the fifth and sixth convolutional layers are both 1×1.
5. The method of any one of claims 2 to 4, wherein the initial neural network model comprises a plurality of feature extraction layers, and the plurality of feature extraction layers are cascaded through a fully connected layer.
6. The method of claim 1, further comprising:
acquiring a judicial document, and determining a case description section and a conclusion section of the judicial document;
generating the case description according to the case description section, and generating the criminal name label and the law article label corresponding to the case description according to the conclusion section;
and generating the sample data according to the case description, the criminal name label corresponding to the case description and the law article label.
7. The method of claim 6, wherein generating the case description according to the case description section comprises:
if the text length of the case description section is less than a preset length, padding characters at the tail of the case description section to generate the case description, wherein the text length of the case description is equal to the preset length;
if the text length of the case description section is greater than the preset length, extracting text from the head and the tail of the case description section respectively to obtain a first text and a second text, and generating the case description according to the first text and the second text; wherein the sum of the text length of the first text and the text length of the second text is equal to the preset length.
8. A criminal name and law article prediction method, characterized by comprising the following steps:
acquiring case description;
inputting the case description into the criminal name and law article prediction model, and acquiring the criminal name label and the law article label output by the criminal name and law article prediction model as the prediction result;
the criminal name and law article prediction model is obtained by training through the training method of the criminal name and law article prediction model according to any one of claims 1 to 7.
9. A training apparatus for a criminal name and law article prediction model, characterized in that the apparatus comprises:
an acquisition unit configured to acquire sample data; wherein the sample data comprises: a case description, a criminal name label corresponding to the case description and a corresponding law article label;
and a training unit configured to train an initial neural network model using the sample data to obtain a neural network model that meets a training end condition as the criminal name and law article prediction model; wherein the criminal name and law article prediction model takes the case description as input and takes a criminal name label and a law article label as output.
10. A criminal name and law article prediction device, characterized in that the device comprises:
an acquisition unit configured to acquire a case description;
and a prediction unit configured to input the case description into the criminal name and law article prediction model and to acquire the criminal name label and the law article label output by the criminal name and law article prediction model as the prediction result.
11. A processor, characterized in that the processor is configured to run a program, wherein the program, when running, executes the training method of the criminal name and law article prediction model according to any one of claims 1 to 7 or the criminal name and law article prediction method according to claim 8.
12. A storage medium comprising a stored program, characterized in that, when the program runs, a device on which the storage medium is located is controlled to execute the training method of the criminal name and law article prediction model according to any one of claims 1 to 7 or the criminal name and law article prediction method according to claim 8.
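As a minimal sketch of the ideas in claims 2 to 7, the code below assumes a PyTorch implementation with four parallel convolution branches (1×1, 3×3 and 5×5, the latter two preceded by 1×1 bottlenecks, per claims 3 and 4) and a fixed-length padding/truncation of the case description (claim 7). Channel widths, the padding character, and the half/half head-tail split are illustrative assumptions, not values taken from the patent.

```python
# Illustrative sketch only; layer widths and text-handling details are assumptions.
import torch
import torch.nn as nn

def pad_or_truncate(text: str, preset_length: int, pad_char: str = " ") -> str:
    """Claim 7: pad short case descriptions at the tail; for long ones keep text
    from the head and the tail so the total length equals the preset length."""
    if len(text) < preset_length:
        return text + pad_char * (preset_length - len(text))
    head_len = preset_length // 2                  # assumed half/half split
    tail_len = preset_length - head_len
    return text[:head_len] + text[-tail_len:]

class FeatureExtractionLayer(nn.Module):
    """Claims 2-4: four parallel convolution branches whose kernels are all
    smaller than a preset size; 1x1 bottlenecks feed the 3x3 and 5x5 branches."""
    def __init__(self, in_ch: int, out_ch: int = 64):
        super().__init__()
        self.branch1 = nn.Conv2d(in_ch, out_ch, kernel_size=1)   # first layer: 1x1
        self.branch2 = nn.Sequential(                             # fifth (1x1) -> second (3x3)
            nn.Conv2d(in_ch, out_ch, kernel_size=1),
            nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        )
        self.branch3 = nn.Sequential(                             # sixth (1x1) -> third (5x5)
            nn.Conv2d(in_ch, out_ch, kernel_size=1),
            nn.Conv2d(out_ch, out_ch, kernel_size=5, padding=2),
        )
        self.branch4 = nn.Conv2d(in_ch, out_ch, kernel_size=1)   # fourth layer: 1x1

    def forward(self, x):
        # Concatenate the four parallel branch outputs along the channel dimension.
        return torch.cat(
            [self.branch1(x), self.branch2(x), self.branch3(x), self.branch4(x)], dim=1
        )
```

In a full model along the lines of claims 1 and 5, the concatenated branch outputs would typically be pooled, passed through one or more fully connected layers (cascading several feature extraction layers), and fed into two softmax classification heads, one producing the criminal name label and one producing the law article label.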
CN201811160557.6A 2018-09-30 2018-09-30 Training method of criminal name and law bar prediction model and criminal name and law bar prediction method Pending CN110968689A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811160557.6A CN110968689A (en) 2018-09-30 2018-09-30 Training method of criminal name and law bar prediction model and criminal name and law bar prediction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811160557.6A CN110968689A (en) 2018-09-30 2018-09-30 Training method of criminal name and law bar prediction model and criminal name and law bar prediction method

Publications (1)

Publication Number Publication Date
CN110968689A true CN110968689A (en) 2020-04-07

Family

ID=70029141

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811160557.6A Pending CN110968689A (en) 2018-09-30 2018-09-30 Training method of criminal name and law bar prediction model and criminal name and law bar prediction method

Country Status (1)

Country Link
CN (1) CN110968689A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111815485A (en) * 2020-06-12 2020-10-23 中国司法大数据研究院有限公司 Sentencing prediction method and device based on deep learning BERT model
CN112101559A (en) * 2020-09-04 2020-12-18 中国航天科工集团第二研究院 Case and criminal name inference method based on machine learning
CN112308453A (en) * 2020-11-19 2021-02-02 上海优扬新媒信息技术有限公司 Risk identification model training method, user risk identification method and related device
CN112966072A (en) * 2021-03-11 2021-06-15 暨南大学 Case prediction method and device, electronic device and storage medium
CN113221560A (en) * 2021-05-31 2021-08-06 平安科技(深圳)有限公司 Personality trait and emotion prediction method, personality trait and emotion prediction device, computer device, and medium
CN113515631A (en) * 2021-06-18 2021-10-19 深圳大学 Method, device, terminal equipment and storage medium for predicting criminal name
CN113515631B (en) * 2021-06-18 2024-05-17 深圳大学 Method, device, terminal equipment and storage medium for predicting crime name

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120215791A1 (en) * 2011-02-22 2012-08-23 Malik Hassan H Entity fingerprints
TW201501067A (en) * 2013-06-27 2015-01-01 Ci-Xing Chen Sentencing method and system
US20150172243A1 (en) * 2013-12-16 2015-06-18 Whistler Technologies, Inc. Compliance mechanism for messaging
CN106815263A (en) * 2015-12-01 2017-06-09 北京国双科技有限公司 The searching method and device of legal provision
CN106970987A (en) * 2017-03-29 2017-07-21 陈�峰 A kind of data analysing method and device
CN107122451A (en) * 2017-04-26 2017-09-01 北京科技大学 A kind of legal documents case by grader method for auto constructing
CN107358558A (en) * 2017-06-08 2017-11-17 上海市高级人民法院 Criminal case intelligently handle a case method by auxiliary, system and has its storage medium and terminal device
CN107818138A (en) * 2017-09-28 2018-03-20 银江股份有限公司 A kind of case legal regulation recommends method and system
CN108009284A (en) * 2017-12-22 2018-05-08 重庆邮电大学 Using the Law Text sorting technique of semi-supervised convolutional neural networks
CN108133436A (en) * 2017-11-23 2018-06-08 科大讯飞股份有限公司 Automatic method and system of deciding a case
CN108416440A (en) * 2018-03-20 2018-08-17 上海未来伙伴机器人有限公司 A kind of training method of neural network, object identification method and device
CN108563703A (en) * 2018-03-26 2018-09-21 北京北大英华科技有限公司 A kind of determination method of charge, device and computer equipment, storage medium

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120215791A1 (en) * 2011-02-22 2012-08-23 Malik Hassan H Entity fingerprints
TW201501067A (en) * 2013-06-27 2015-01-01 Ci-Xing Chen Sentencing method and system
US20150172243A1 (en) * 2013-12-16 2015-06-18 Whistler Technologies, Inc. Compliance mechanism for messaging
CN106815263A (en) * 2015-12-01 2017-06-09 北京国双科技有限公司 The searching method and device of legal provision
CN106970987A (en) * 2017-03-29 2017-07-21 陈�峰 A kind of data analysing method and device
CN107122451A (en) * 2017-04-26 2017-09-01 北京科技大学 A kind of legal documents case by grader method for auto constructing
CN107358558A (en) * 2017-06-08 2017-11-17 上海市高级人民法院 Criminal case intelligently handle a case method by auxiliary, system and has its storage medium and terminal device
CN107818138A (en) * 2017-09-28 2018-03-20 银江股份有限公司 A kind of case legal regulation recommends method and system
CN108133436A (en) * 2017-11-23 2018-06-08 科大讯飞股份有限公司 Automatic method and system of deciding a case
CN108009284A (en) * 2017-12-22 2018-05-08 重庆邮电大学 Using the Law Text sorting technique of semi-supervised convolutional neural networks
CN108416440A (en) * 2018-03-20 2018-08-17 上海未来伙伴机器人有限公司 A kind of training method of neural network, object identification method and device
CN108563703A (en) * 2018-03-26 2018-09-21 北京北大英华科技有限公司 A kind of determination method of charge, device and computer equipment, storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
方清: "Natural Scene Text Detection and Recognition Based on Deep Learning" (基于深度学习的自然场景文本检测与识别) *
邓文超: "Research on Judicial Intelligence Based on Deep Learning" (基于深度学习的司法智能研究) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111815485A (en) * 2020-06-12 2020-10-23 中国司法大数据研究院有限公司 Sentencing prediction method and device based on deep learning BERT model
CN112101559A (en) * 2020-09-04 2020-12-18 中国航天科工集团第二研究院 Case and criminal name inference method based on machine learning
CN112101559B (en) * 2020-09-04 2023-08-04 中国航天科工集团第二研究院 Case crime name deducing method based on machine learning
CN112308453A (en) * 2020-11-19 2021-02-02 上海优扬新媒信息技术有限公司 Risk identification model training method, user risk identification method and related device
CN112308453B (en) * 2020-11-19 2023-04-28 度小满科技(北京)有限公司 Risk identification model training method, user risk identification method and related devices
CN112966072A (en) * 2021-03-11 2021-06-15 暨南大学 Case prediction method and device, electronic device and storage medium
CN113221560A (en) * 2021-05-31 2021-08-06 平安科技(深圳)有限公司 Personality trait and emotion prediction method, personality trait and emotion prediction device, computer device, and medium
CN113515631A (en) * 2021-06-18 2021-10-19 深圳大学 Method, device, terminal equipment and storage medium for predicting criminal name
CN113515631B (en) * 2021-06-18 2024-05-17 深圳大学 Method, device, terminal equipment and storage medium for predicting crime name

Similar Documents

Publication Publication Date Title
CN110968689A (en) Training method of criminal name and law bar prediction model and criminal name and law bar prediction method
CN110598620B (en) Deep neural network model-based recommendation method and device
CN113095346A (en) Data labeling method and data labeling device
US11914963B2 (en) Systems and methods for determining and using semantic relatedness to classify segments of text
CN107392311B (en) Method and device for segmenting sequence
CN110245227B (en) Training method and device for text classification fusion classifier
CN114387567B (en) Video data processing method and device, electronic equipment and storage medium
CN112052451A (en) Webshell detection method and device
Shen et al. A joint model for multimodal document quality assessment
CN113222022A (en) Webpage classification identification method and device
CN117409419A (en) Image detection method, device and storage medium
CN110968664A (en) Document retrieval method, device, equipment and medium
CN117132763A (en) Power image anomaly detection method, device, computer equipment and storage medium
CN111651981B (en) Data auditing method, device and equipment
CN114372532A (en) Method, device, equipment, medium and product for determining label marking quality
CN113449816A (en) Website classification model training method, website classification method, device, equipment and medium
CN114254588B (en) Data tag processing method and device
CN113257227B (en) Speech recognition model performance detection method, device, equipment and storage medium
CN117173530B (en) Target abnormality detection method and device
CN117056836B (en) Program classification model training and program category identification method and device
CN115392341A (en) Information auditing method and device and storage medium
Ying et al. Unsafe behaviour detection with the improved YOLOv5 model
CN115359402A (en) Video labeling method and device, equipment, medium and product thereof
CN116112763A (en) Method and system for automatically generating short video content labels
CN117808816A (en) Image anomaly detection method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200407