CN109446514A - Method, apparatus and computer device for constructing a news entity recognition model - Google Patents

Method, apparatus and computer device for constructing a news entity recognition model Download PDF

Info

Publication number
CN109446514A
Authority
CN
China
Prior art keywords
model
label
Chinese character
news
random field
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811089168.9A
Other languages
Chinese (zh)
Inventor
黄萍
汪伟
肖京
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN201811089168.9A priority Critical patent/CN109446514A/en
Publication of CN109446514A publication Critical patent/CN109446514A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00: Handling natural language data
    • G06F40/20: Natural language analysis
    • G06F40/279: Recognition of textual entities
    • G06F40/289: Phrasal analysis, e.g. finite state techniques or chunking
    • G06F40/295: Named entity recognition
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods

Abstract

This application relates to a method, apparatus, computer device and storage medium for constructing a news entity recognition model based on transfer learning. The method includes: constructing a named entity recognition model; extracting the neural network parameters of the second neural network model in a pre-trained part-of-speech tagging model, and initializing the first neural network model of the named entity recognition model according to the neural network parameters; obtaining news corpus training samples in which each first Chinese character is annotated with a corresponding label; converting the first Chinese characters into first word vectors, and inputting the first word vectors into the first neural network model to obtain first feature vectors of the Chinese characters; and performing supervised training of the target conditional random field model using the first feature vectors corresponding to the first Chinese characters and the corresponding labels, to obtain the news entity recognition model. The method improves the recognition performance of the news entity recognition model.

Description

Method, apparatus and computer device for constructing a news entity recognition model
Technical field
This application relates to the field of machine learning, and in particular to a method, apparatus, computer device and storage medium for constructing a news entity recognition model.
Background art
At present, the mainstream approach to named entity recognition combines a neural network model with a conventional machine-learning model, the conditional random field (CRF). The neural network component in this approach extracts features automatically, without manual feature engineering, but determining the parameters of such a complex model typically requires training on a large amount of labeled data. In news text corpora, however, corpus data in which company names are marked or annotated is comparatively scarce and insufficient to train a complex model, so conventional news entity recognition performs poorly at identifying company names in news text.
Summary of the invention
In view of this, it is necessary to provide a method, apparatus, computer device and storage medium for constructing a news entity recognition model, addressing the technical problem that conventional news entity recognition identifies company names in news text corpora unsatisfactorily.
A method for constructing a news entity recognition model, the method comprising:
constructing a named entity recognition model, the named entity recognition model comprising a first neural network model and a target conditional random field model;
extracting the neural network parameters of a second neural network model in a pre-trained part-of-speech tagging model, and initializing the first neural network model according to the neural network parameters;
obtaining news corpus training samples, in which each first Chinese character is annotated with a corresponding label;
converting the first Chinese characters into first word vectors, and inputting the first word vectors into the first neural network model to obtain first feature vectors of the Chinese characters;
performing supervised training of the target conditional random field model using the first feature vectors corresponding to the first Chinese characters and the corresponding labels, to obtain the news entity recognition model.
In one embodiment, the step of performing supervised training of the target conditional random field model using the first feature vectors corresponding to the first Chinese characters and the corresponding labels to obtain the news entity recognition model comprises:
inputting the first feature vectors into the target conditional random field model to obtain a predicted label sequence for the news corpus training sample;
determining a reference label sequence for the news corpus training sample from the labels corresponding to the first Chinese characters, and computing the cross entropy between the predicted label sequence and the reference label sequence;
adjusting the parameters of the target conditional random field layer so as to minimize the cross entropy.
In one embodiment, before the step of extracting the neural network parameters of the second neural network model in the pre-trained part-of-speech tagging model, the method comprises:
constructing a part-of-speech tagging model, the part-of-speech tagging model comprising a second neural network model and a source conditional random field model;
obtaining part-of-speech tagging training samples, in which each second Chinese character is annotated with a corresponding part-of-speech label;
converting the second Chinese characters into second word vectors, and inputting the second word vectors into the second neural network model to obtain second feature vectors of the second Chinese characters;
inputting the second feature vectors into the source conditional random field model to obtain predicted part-of-speech labels for the second Chinese characters;
adjusting the parameters of the second neural network model and the source conditional random field model by backpropagation and gradient descent, according to the predicted part-of-speech labels of the second Chinese characters and the corresponding part-of-speech labels.
In one embodiment, the first neural network model comprises a forward recurrent neural network hidden layer and a backward recurrent neural network hidden layer;
the step of inputting the first word vectors into the first neural network model to obtain the first feature vectors of the Chinese characters comprises:
inputting the first word vectors into the forward recurrent neural network hidden layer to obtain a forward hidden state sequence;
inputting the first word vectors into the backward recurrent neural network hidden layer to obtain a backward hidden state sequence;
concatenating the forward hidden state sequence and the backward hidden state sequence to generate the first feature vectors of the first Chinese characters.
In one embodiment, after the step of obtaining the news entity recognition model, the method further comprises:
obtaining a news corpus test sample, in which each third Chinese character is annotated with a corresponding label;
inputting the word vectors of the third Chinese characters into the news entity recognition model to obtain a predicted label sequence for the news corpus test sample;
computing the error rate of company-name recognition from the predicted label sequence of the news corpus test sample and the labels of the third Chinese characters;
if the error rate exceeds a preset threshold, adjusting the parameters of the target conditional random field model in the news entity recognition model.
An apparatus for constructing a news entity recognition model, the apparatus comprising:
a model construction module, configured to construct a named entity recognition model, the named entity recognition model comprising a first neural network model and a target conditional random field model;
a neural network parameter acquisition module, configured to extract the neural network parameters of a second neural network model in a pre-trained part-of-speech tagging model, and to initialize the first neural network model according to the neural network parameters;
a training sample acquisition module, configured to obtain news corpus training samples, in which each first Chinese character is annotated with a corresponding label;
a feature vector acquisition module, configured to convert the first Chinese characters into first word vectors and input the first word vectors into the first neural network model to obtain first feature vectors of the Chinese characters;
a model training module, configured to perform supervised training of the target conditional random field model using the first feature vectors corresponding to the first Chinese characters and the corresponding labels, to obtain the news entity recognition model.
In one embodiment, the model training module is configured to input the first feature vectors into the target conditional random field model to obtain a predicted label sequence for the news corpus training sample; to determine a reference label sequence for the news corpus training sample from the labels corresponding to the first Chinese characters, and compute the cross entropy between the predicted label sequence and the reference label sequence; and to adjust the parameters of the target conditional random field layer so as to minimize the cross entropy.
In one embodiment, the apparatus for constructing a news entity recognition model further includes a source-task acquisition module;
the source-task acquisition module is configured to construct a part-of-speech tagging model comprising a second neural network model and a source conditional random field model; to obtain part-of-speech tagging training samples in which each second Chinese character is annotated with a corresponding part-of-speech label; to convert the second Chinese characters into second word vectors and input the second word vectors into the second neural network model to obtain second feature vectors of the second Chinese characters; to input the second feature vectors into the source conditional random field model to obtain predicted part-of-speech labels for the second Chinese characters; and to adjust the parameters of the second neural network model and the source conditional random field model by backpropagation and gradient descent, according to the predicted part-of-speech labels of the second Chinese characters and the corresponding part-of-speech labels.
A computer device, comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, implements the following steps:
constructing a named entity recognition model, the named entity recognition model comprising a first neural network model and a target conditional random field model;
extracting the neural network parameters of a second neural network model in a pre-trained part-of-speech tagging model, and initializing the first neural network model according to the neural network parameters;
obtaining news corpus training samples, in which each first Chinese character is annotated with a corresponding label;
converting the first Chinese characters into first word vectors, and inputting the first word vectors into the first neural network model to obtain first feature vectors of the Chinese characters;
performing supervised training of the target conditional random field model using the first feature vectors corresponding to the first Chinese characters and the corresponding labels, to obtain the news entity recognition model.
A computer-readable storage medium, storing a computer program which, when executed by a processor, implements the following steps:
constructing a named entity recognition model, the named entity recognition model comprising a first neural network model and a target conditional random field model;
extracting the neural network parameters of a second neural network model in a pre-trained part-of-speech tagging model, and initializing the first neural network model according to the neural network parameters;
obtaining news corpus training samples, in which each first Chinese character is annotated with a corresponding label;
converting the first Chinese characters into first word vectors, and inputting the first word vectors into the first neural network model to obtain first feature vectors of the Chinese characters;
performing supervised training of the target conditional random field model using the first feature vectors corresponding to the first Chinese characters and the corresponding labels, to obtain the news entity recognition model.
With the above method, apparatus, computer device and storage medium for constructing a news entity recognition model, a pre-trained part-of-speech tagging model serves as the source task, and its neural network model is transferred into the named entity recognition model as a reusable feature extractor. The output of the neural network model is the textual feature representation of the Chinese characters of the input corpus, so the named entity recognition model only needs to train the target conditional random field layer. Even when labeled or annotated corpus data for company names is scarce, the trained named entity recognition model achieves higher accuracy in identifying company names in news text, improving the recognition performance of the news entity recognition model.
Brief description of the drawings
Fig. 1 is a schematic diagram of the internal structure of a computer device in one embodiment of the invention;
Fig. 2 is a flow chart of a method for constructing a news entity recognition model in one embodiment of the invention;
Fig. 3 is a structural block diagram of an apparatus for constructing a news entity recognition model in one embodiment of the invention;
Fig. 4 is a structural block diagram of an apparatus for constructing a news entity recognition model in another embodiment of the invention;
Fig. 5 is a structural block diagram of an apparatus for constructing a news entity recognition model in yet another embodiment of the invention.
Detailed description of the embodiments
In order to make the objects, technical solutions and advantages of the application clearer, the application is further described below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein serve only to explain the application and are not intended to limit it.
Fig. 1 is the schematic diagram of internal structure of computer equipment in one embodiment.The computer equipment can be server, As shown in Figure 1, the computer equipment includes processor, memory, network interface and the database connected by system bus.Its In, the processor of the computer equipment is for providing calculating and control ability.The memory of the computer equipment includes non-volatile Property storage medium, built-in storage.The non-volatile memory medium is stored with operating system, computer program and database.This is interior Memory provides environment for the operation of operating system and computer program in non-volatile memory medium.The computer equipment Database is for storing the data such as news corpus training sample, neural network parameter.The network interface of the computer equipment is used for It is communicated with external terminal by network connection.To realize a kind of news property identification when the computer program is executed by processor Model method.
Those skilled in the art will understand that the structure shown in Fig. 1 is merely a block diagram of the parts relevant to the solution of the application and does not limit the computer device to which the solution is applied; a specific computer device may include more or fewer components than shown, combine certain components, or arrange the components differently.
In one embodiment, as shown in Fig. 2, a method for constructing a news entity recognition model is provided. Taking its application to the server in Fig. 1 as an example, the method includes the following steps:
Step S210: construct a named entity recognition model, the named entity recognition model comprising a first neural network model and a target conditional random field model.
In this step, the server constructs the named entity recognition model, which is built by combining a neural network model with a conditional random field model.
Step S220: extract the neural network parameters of the second neural network model in a pre-trained part-of-speech tagging model, and initialize the first neural network model according to the neural network parameters.
In this step, the part-of-speech tagging model is also built by combining a neural network model with a conditional random field model, and has been pre-trained on Chinese text. Using the part-of-speech tagging model as the source task, the server initializes the neural network model of the named entity recognition model with the neural network parameters of the part-of-speech tagging model.
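The parameter transfer in step S220 can be sketched in plain Python. This is a minimal illustration, not the disclosed implementation: the dictionary-of-lists representation, the layer names and the toy weight values are all hypothetical stand-ins for real model parameters.

```python
def transfer_parameters(source_params, target_params, shared_layers):
    """Copy the named source-task layers into the target model's parameter
    dictionary, leaving target-only layers (e.g. the CRF) at their fresh
    initial values."""
    for name in shared_layers:
        target_params[name] = [row[:] for row in source_params[name]]
    return target_params

# Hypothetical parameter dictionaries: the recurrent layers are shared
# across tasks, the CRF transition weights belong to the target task only.
pos_model = {"rnn_fwd": [[0.1, 0.2]], "rnn_bwd": [[0.3, 0.4]], "crf": [[0.9]]}
ner_model = {"rnn_fwd": [[0.0, 0.0]], "rnn_bwd": [[0.0, 0.0]], "crf": [[0.0]]}

ner_model = transfer_parameters(pos_model, ner_model, ["rnn_fwd", "rnn_bwd"])
print(ner_model["rnn_fwd"])  # [[0.1, 0.2]] -- copied from the source task
print(ner_model["crf"])      # [[0.0]] -- left for target-task training
```

Only the CRF layer remains to be trained afterwards, which mirrors the patent's point that the neural network part needs no retraining.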
Step S230: obtain news corpus training samples, in which each first Chinese character is annotated with a corresponding label.
In this step, the server obtains news corpora whose Chinese characters carry labels as training samples, where the Chinese characters of company names in the news corpus training samples are marked out as named entities by the labels. Specifically, the label of each Chinese character in a news corpus training sample may be appended after that character, joined by an underscore. The labeling rule may follow the BIOES scheme: the B (Begin) label marks a beginning character; the I (Intermediate) label marks a middle character; the E (End) label marks an ending character; the S (Single) label marks a single-character entity; and the O (Other) label marks unrelated characters. That is, the Chinese characters of a company name in a news corpus training sample are marked with B, I and E labels, and the Chinese characters other than company names are marked with O labels.
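The BIOES labeling rule described above can be sketched as follows; this is a minimal illustration in plain Python, and the sample sentence and entity span are hypothetical:

```python
def bioes_tags(sentence_len, entity_spans):
    """Produce BIOES labels for a character sequence given
    (start, end) spans (end exclusive) of entity mentions."""
    tags = ["O"] * sentence_len
    for start, end in entity_spans:
        if end - start == 1:
            tags[start] = "S"          # single-character entity
        else:
            tags[start] = "B"          # beginning character
            tags[end - 1] = "E"        # ending character
            for i in range(start + 1, end - 1):
                tags[i] = "I"          # intermediate characters
    return tags

# Hypothetical sample: a company name occupying characters 0-3.
chars = list("平安科技发布新闻")
tags = bioes_tags(len(chars), [(0, 4)])
print(["_".join(p) for p in zip(chars, tags)])
# ['平_B', '安_I', '科_I', '技_E', '发_O', '布_O', '新_O', '闻_O']
```

The underscore joining of character and label matches the annotation format the description mentions.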
Step S240: convert the first Chinese characters into first word vectors, and input the first word vectors into the first neural network model to obtain first feature vectors of the Chinese characters.
In this step, the server converts the Chinese characters in the news corpus training samples into word vectors and inputs the word vectors into the neural network model of the named entity recognition model, so as to obtain a feature vector for each Chinese character in the news corpus training samples. Specifically, the server may use a general-purpose word2vec model to convert the first Chinese characters in the news corpus training samples into the corresponding word vectors.
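At inference time, a trained word2vec model behaves like a lookup table from characters to dense vectors. The sketch below illustrates only that lookup step in plain Python; the table entries are hypothetical two-dimensional toy vectors, whereas real word2vec embeddings would have hundreds of dimensions:

```python
# Hypothetical character -> vector table, standing in for a trained
# word2vec model's embedding matrix.
embedding = {
    "平": [0.2, -0.1],
    "安": [0.4, 0.3],
}
UNK = [0.0, 0.0]  # fallback vector for characters unseen during training

def chars_to_vectors(chars):
    """Map each Chinese character to its word vector."""
    return [embedding.get(c, UNK) for c in chars]

print(chars_to_vectors(list("平安他")))
# [[0.2, -0.1], [0.4, 0.3], [0.0, 0.0]]
```

The unknown-character fallback is an assumption of this sketch, not something the disclosure specifies.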
Step S250: perform supervised training of the target conditional random field model using the first feature vectors corresponding to the first Chinese characters and the corresponding labels, to obtain the news entity recognition model.
In this step, the server performs supervised training of the conditional random field model of the named entity recognition model using the first feature vectors corresponding to the first Chinese characters and the corresponding labels, and the trained named entity recognition model serves as the final news entity recognition model. Only the parameters of the conditional random field model of the named entity recognition model need to be adjusted; the neural network model of the named entity recognition model does not need retraining.
In the above method for constructing a news entity recognition model, a pre-trained part-of-speech tagging model serves as the source task, and its neural network model is transferred into the named entity recognition model as a reusable feature extractor. The output of the neural network model is the textual feature representation of the Chinese characters of the input corpus, so the named entity recognition model only needs to train the target conditional random field layer. Even when labeled or annotated corpus data for company names is scarce, the trained named entity recognition model achieves higher accuracy in identifying company names in news text, improving the recognition performance of the news entity recognition model.
In one embodiment, the step of performing supervised training of the target conditional random field model using the first feature vectors corresponding to the first Chinese characters and the corresponding labels to obtain the news entity recognition model comprises: inputting the first feature vectors into the target conditional random field model to obtain a predicted label sequence for the news corpus training sample; determining a reference label sequence for the news corpus training sample from the labels corresponding to the first Chinese characters, and computing the cross entropy between the predicted label sequence and the reference label sequence; and adjusting the parameters of the target conditional random field layer so as to minimize the cross entropy.
In this embodiment, the server inputs the feature vectors of the Chinese characters in the news corpus training sample into the target conditional random field model, and treats the reference label sequence of the news corpus training sample as the ground-truth output of the target conditional random field model. Taking minimization of the cross entropy between the predicted label sequence and the reference label sequence as the objective, the server updates the parameters of the target conditional random field layer, improving the consistency between the predicted label sequence output by the target conditional random field model and the true label sequence, and thereby the accuracy of the named entity recognition model in identifying company names in news text. In addition, using cross entropy as the loss function effectively avoids the slowdown of learning that can occur during gradient descent.
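The cross-entropy objective can be illustrated with a minimal sketch, assuming the model emits a per-character probability distribution over the tag set; the distributions below are hypothetical:

```python
import math

def sequence_cross_entropy(pred_probs, gold_tags, tagset):
    """Average per-character cross entropy between the model's predicted
    tag distributions and the one-hot reference tags: the negative log
    probability assigned to the correct tag, averaged over the sequence."""
    total = 0.0
    for probs, gold in zip(pred_probs, gold_tags):
        total += -math.log(probs[tagset.index(gold)])
    return total / len(gold_tags)

tagset = ["B", "I", "E", "O"]
# Hypothetical per-character distributions over the tag set:
pred = [[0.70, 0.10, 0.10, 0.10],
        [0.10, 0.60, 0.20, 0.10],
        [0.05, 0.15, 0.70, 0.10]]
gold = ["B", "I", "E"]
loss = sequence_cross_entropy(pred, gold, tagset)
print(round(loss, 4))  # 0.4081
```

Training adjusts the CRF layer's parameters to drive this quantity toward zero; a perfect prediction (probability 1.0 on every correct tag) yields a loss of exactly zero.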
In one embodiment, after the step of obtaining the news entity recognition model, the method further comprises: obtaining a news corpus test sample, in which each third Chinese character is annotated with a corresponding label; inputting the word vectors of the third Chinese characters into the news entity recognition model to obtain a predicted label sequence for the news corpus test sample; computing the error rate of company-name recognition from the predicted label sequence of the news corpus test sample and the labels of the third Chinese characters; and, if the error rate exceeds a preset threshold, adjusting the parameters of the target conditional random field model in the news entity recognition model.
The above is the testing process for the target conditional random field model in the news entity recognition model. The character vectors of the Chinese characters in the news corpus test sample are input into the first neural network of the news entity recognition model to obtain the feature vectors of those characters; the feature vectors are then input into the target conditional random field model of the news entity recognition model to obtain the predicted label sequence of the news corpus test sample. The predicted label sequence is compared with the label sequence of the Chinese characters in the news corpus test sample to compute the error rate of the news entity recognition model's company-name recognition. If the error rate exceeds a preset threshold, that is, if the accuracy falls short of the required level, the parameters of the target conditional random field model in the news entity recognition model are adjusted again, guaranteeing the accuracy of the label sequence and thus of company-name recognition.
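The error-rate check of the testing process can be sketched as a per-character comparison of the two label sequences; the sequences and the 10% threshold below are hypothetical:

```python
def label_error_rate(predicted, reference):
    """Fraction of characters whose predicted label differs
    from the reference label."""
    assert len(predicted) == len(reference)
    wrong = sum(p != r for p, r in zip(predicted, reference))
    return wrong / len(reference)

# Hypothetical predicted and reference label sequences for a test sample:
pred = ["B", "I", "E", "O", "O", "S"]
gold = ["B", "I", "E", "O", "B", "O"]
rate = label_error_rate(pred, gold)
print(rate)        # 0.3333333333333333 -- 2 of 6 characters wrong
print(rate > 0.1)  # True: above the (assumed) threshold, so retune the CRF
```

Other error metrics (e.g. entity-level precision and recall) are possible; the patent only speaks of an error rate against a preset threshold.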
In one embodiment, the first neural network model comprises a forward recurrent neural network hidden layer and a backward recurrent neural network hidden layer; the step of inputting the first word vectors into the first neural network model to obtain the first feature vectors of the Chinese characters comprises: inputting the first word vectors into the forward recurrent neural network hidden layer to obtain a forward hidden state sequence; inputting the first word vectors into the backward recurrent neural network hidden layer to obtain a backward hidden state sequence; and concatenating the forward hidden state sequence and the backward hidden state sequence to generate the first feature vectors of the first Chinese characters.
In this embodiment, the server inputs the word vectors of the Chinese characters in the news corpus training sample into the neural network model of the named entity recognition model. The forward recurrent neural network hidden layer computes the forward hidden state of the current word vector from the hidden state of the preceding word vector; the backward recurrent neural network hidden layer computes the backward hidden state of the current word vector from the hidden state of the following word vector. The forward hidden state sequence is then concatenated with the backward hidden state sequence to obtain the feature vectors of the Chinese characters. Because each feature vector encodes the dependency between a Chinese character and the characters before and after it, it provides richer linguistic and semantic features for the subsequent entity recognition on news text, effectively reducing the company-name recognition task's dependence on domain-specific labeled data.
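The forward/backward recurrence and concatenation can be illustrated with a toy one-dimensional sketch; the running-sum update rule below is a hypothetical stand-in for a learned RNN cell:

```python
def simple_rnn(vectors, reverse=False):
    """A toy one-dimensional recurrence: each hidden state is the running
    sum of the inputs seen so far (stands in for a real RNN cell, which
    would apply a learned nonlinear transition)."""
    seq = list(reversed(vectors)) if reverse else vectors
    h, states = 0.0, []
    for x in seq:
        h = h + x
        states.append(h)
    return list(reversed(states)) if reverse else states

def bidirectional_features(vectors):
    fwd = simple_rnn(vectors)                # left-to-right context
    bwd = simple_rnn(vectors, reverse=True)  # right-to-left context
    # concatenate the forward and backward hidden states per position
    return [(f, b) for f, b in zip(fwd, bwd)]

print(bidirectional_features([1.0, 2.0, 3.0]))
# [(1.0, 6.0), (3.0, 5.0), (6.0, 3.0)]
```

Each position's pair combines everything to its left with everything to its right, which is exactly the dependency on preceding and following characters that the embodiment describes.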
In one embodiment, before the step of extracting the neural network parameters of the second neural network model in the pre-trained part-of-speech tagging model, the method comprises: constructing a part-of-speech tagging model comprising a second neural network model and a source conditional random field model; obtaining part-of-speech tagging training samples in which each second Chinese character is annotated with a corresponding part-of-speech label; converting the second Chinese characters into second word vectors and inputting the second word vectors into the second neural network model to obtain second feature vectors of the second Chinese characters; inputting the second feature vectors into the source conditional random field model to obtain predicted part-of-speech labels for the second Chinese characters; and adjusting the parameters of the second neural network model and the source conditional random field model by backpropagation and gradient descent, according to the predicted part-of-speech labels of the second Chinese characters and the corresponding part-of-speech labels.
This embodiment is the training process of the part-of-speech tagging model. The server constructs the part-of-speech tagging model from a neural network model and a source conditional random field model, and obtains the training samples of the part-of-speech tagging model, in which each Chinese character is annotated with a corresponding part-of-speech label such as verb, noun or adjective. The server converts each Chinese character of the training samples into a word vector, inputs the word vector into the neural network model of the part-of-speech tagging model to obtain the feature vector of the character, and inputs the feature vector into the source conditional random field model to tag each Chinese character with a part of speech; it then adjusts the parameters of the source conditional random field model and the neural network model according to the part-of-speech tagging result of each Chinese character and its original part-of-speech label.
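The gradient-descent update used to train the source task reduces, per parameter, to stepping against the gradient of the loss. The sketch below shows that single mechanism on a hypothetical one-parameter quadratic loss rather than the actual model:

```python
def gradient_descent_step(param, grad, lr=0.1):
    """One parameter update: move against the gradient of the loss,
    scaled by the learning rate."""
    return param - lr * grad

# Minimizing a hypothetical loss L(w) = (w - 2)^2, whose gradient is 2*(w - 2).
# Backpropagation would supply such gradients for every model parameter.
w = 0.0
for _ in range(100):
    w = gradient_descent_step(w, 2 * (w - 2))
print(round(w, 4))  # 2.0 -- converges to the minimizer of the loss
```

In the real model the same update is applied to every weight of the second neural network model and the source conditional random field model, with gradients obtained by backpropagation.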
It should be understood that although the steps in the flow chart of Fig. 2 are shown sequentially, following the arrows, these steps are not necessarily executed in the order indicated. Unless expressly stated herein, there is no strict order restriction on the execution of these steps, and they may be executed in other orders. Moreover, at least some of the steps in Fig. 2 may include multiple sub-steps or stages, which need not be completed at the same moment but may be executed at different moments; nor must these sub-steps or stages be executed sequentially, as they may be executed in turn or alternately with other steps, or with sub-steps or stages of other steps.
In one embodiment, as shown in Fig. 3, an apparatus for constructing a news entity recognition model is provided, comprising: a model construction module 310, a neural network parameter acquisition module 320, a training sample acquisition module 330, a feature vector acquisition module 340 and a model training module 350, wherein:
the model construction module 310 is configured to construct a named entity recognition model, the named entity recognition model comprising a first neural network model and a target conditional random field model;
the neural network parameter acquisition module 320 is configured to extract the neural network parameters of a second neural network model in a pre-trained part-of-speech tagging model, and to initialize the first neural network model according to the neural network parameters;
the training sample acquisition module 330 is configured to obtain news corpus training samples, in which each first Chinese character is annotated with a corresponding label;
the feature vector acquisition module 340 is configured to convert the first Chinese characters into first word vectors and input the first word vectors into the first neural network model to obtain first feature vectors of the Chinese characters;
the model training module 350 is configured to perform supervised training of the target conditional random field model using the first feature vectors corresponding to the first Chinese characters and the corresponding labels, to obtain the news entity recognition model.
In one embodiment, the model training module 350 is configured to input the first feature vectors into the target conditional random field model to obtain the predicted label sequence of the news corpus training sample; to determine the preset label sequence of the news corpus training sample according to the labels of the first Chinese characters, and to compute the cross entropy between the predicted label sequence and the preset label sequence; and to adjust the parameters of the target conditional random field model so as to minimize the cross entropy.
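The cross-entropy criterion described above can be sketched as the mean negative log-probability assigned to the gold label at each character position. This is a minimal illustration under assumed BIO-style label names, not the patent's actual loss code:

```python
import math

# Per-character cross-entropy between predicted label distributions and the
# preset (gold) label sequence of a training sentence. Label names follow the
# common BIO convention as an assumption.

def sequence_cross_entropy(pred_dists, gold_labels, labels):
    """Mean negative log-probability of the gold label at each position."""
    total = 0.0
    for dist, gold in zip(pred_dists, gold_labels):
        total += -math.log(dist[labels.index(gold)])
    return total / len(gold_labels)

labels = ["B-ORG", "I-ORG", "O"]
pred = [[0.7, 0.2, 0.1],   # distribution for the 1st Chinese character
        [0.1, 0.8, 0.1],   # 2nd character
        [0.2, 0.1, 0.7]]   # 3rd character
gold = ["B-ORG", "I-ORG", "O"]
loss = sequence_cross_entropy(pred, gold, labels)
print(round(loss, 4))
```

Training then adjusts the conditional-random-field parameters so that this quantity decreases; as the predicted distributions concentrate on the gold labels, the loss approaches zero.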
In one embodiment, as shown in Fig. 4, an apparatus for constructing a news entity recognition model is provided, the apparatus further comprising a source task acquisition module 360. The source task acquisition module 360 is configured to construct a part-of-speech tagging model, the part-of-speech tagging model comprising a second neural network model and a source conditional random field model; to obtain part-of-speech tagging training samples, in which the second Chinese characters are annotated with corresponding part-of-speech labels; to convert the second Chinese characters into second character vectors and input them into the second neural network model, obtaining the second feature vectors of the second Chinese characters; to input the second feature vectors into the source conditional random field model, obtaining the predicted part-of-speech labels of the second Chinese characters; and to adjust the parameters of the second neural network model and the source conditional random field model by back-propagation and gradient descent according to the predicted part-of-speech labels of the second Chinese characters and the corresponding annotated part-of-speech labels.
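The "back-propagation and gradient descent" adjustment above boils down to repeatedly moving each parameter a small step against its gradient. The toy loss and learning rate below are illustrative assumptions, used only to make the update rule concrete:

```python
# One gradient-descent update: each parameter moves a small step against its
# gradient. The quadratic loss and learning rate here are illustrative.

def gradient_descent_step(params, grads, lr=0.1):
    return [p - lr * g for p, g in zip(params, grads)]

# Minimizing loss = (w - 3)^2, whose gradient is 2 * (w - 3).
w = [0.0]
for _ in range(50):
    grads = [2 * (wi - 3.0) for wi in w]
    w = gradient_descent_step(w, grads)
print(round(w[0], 3))  # converges toward the minimum at w = 3
```

In the actual models the gradients come from back-propagating the tagging loss through the conditional random field and the recurrent layers, but the per-parameter update has this same form.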
In one embodiment, the first neural network model comprises a forward recurrent neural network hidden layer and a backward recurrent neural network hidden layer. The feature vector acquisition module 340 is configured to input the first character vectors into the forward recurrent neural network hidden layer to obtain the forward hidden state sequence; to input the first character vectors into the backward recurrent neural network hidden layer to obtain the backward hidden state sequence; and to merge the forward hidden state sequence and the backward hidden state sequence to generate the first feature vectors of the first Chinese characters.
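The merge described above concatenates, position by position, the hidden state produced by the left-to-right pass with the one produced by the right-to-left pass. The tiny stand-in "recurrence" below (a running sum) is purely illustrative, chosen to make the data flow visible:

```python
# Sketch of the bidirectional merge: forward and backward hidden-state
# sequences are concatenated per character. The toy recurrence (a running
# sum) is a stand-in for a real recurrent layer.

def run_rnn(vectors):
    """Toy recurrence: hidden state = running sum of the inputs so far."""
    hidden, state = [], 0.0
    for v in vectors:
        state += v
        hidden.append(state)
    return hidden

def bidirectional_features(char_vectors):
    forward = run_rnn(char_vectors)                     # left-to-right pass
    backward = run_rnn(char_vectors[::-1])[::-1]        # right-to-left, realigned
    return [[f, b] for f, b in zip(forward, backward)]  # concatenation

feats = bidirectional_features([1.0, 2.0, 3.0])
print(feats)
```

Note the backward pass runs over the reversed sequence and is then reversed again, so that position i of the result pairs the forward context up to i with the backward context from i onward.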
In one embodiment, as shown in Fig. 5, an apparatus for constructing a news entity recognition model is provided, the apparatus further comprising a model testing module 370, configured to obtain a news corpus test sample, in which the third Chinese characters are annotated with corresponding labels; to input the character vectors of the third Chinese characters into the news entity recognition model, obtaining the predicted label sequence of the news corpus test sample; to calculate the error rate of the company name recognition result according to the predicted label sequence of the news corpus test sample and the labels of the third Chinese characters; and, if the error rate is greater than a preset threshold, to adjust the parameters of the target conditional random field model in the news entity recognition model.
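The evaluation step above can be pictured as comparing the predicted label sequence against the annotated one and flagging the model for further tuning when the mismatch rate exceeds the preset threshold. The labels and the 0.2 threshold below are illustrative assumptions:

```python
# Compare the model's predicted label sequence on a test sample against the
# annotated labels, compute the error rate, and flag the model for further
# parameter tuning when the rate exceeds a preset threshold (assumed 0.2).

def error_rate(predicted, gold):
    wrong = sum(1 for p, g in zip(predicted, gold) if p != g)
    return wrong / len(gold)

predicted = ["B-ORG", "I-ORG", "O", "O",     "B-ORG"]
gold      = ["B-ORG", "I-ORG", "O", "B-ORG", "I-ORG"]
rate = error_rate(predicted, gold)
needs_tuning = rate > 0.2   # preset threshold (assumed value)
print(rate, needs_tuning)
```

Here two of the five characters are mislabeled, so the error rate of 0.4 exceeds the assumed threshold and the conditional-random-field parameters would be adjusted further.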
For the specific limitations of the apparatus for constructing a news entity recognition model, reference may be made to the limitations of the method for constructing a news entity recognition model described above, which are not repeated here. Each module in the above apparatus may be implemented in whole or in part by software, by hardware, or by a combination thereof. The modules may be embedded in hardware form in, or independent of, the processor of a computer device, or may be stored in software form in the memory of the computer device, so that the processor can invoke them and execute the operations corresponding to each module.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory storing a computer program. When executing the computer program, the processor performs the following steps:
constructing a named entity recognition model, the named entity recognition model comprising a first neural network model and a target conditional random field model;
extracting the neural network parameters of the second neural network model in a pre-trained part-of-speech tagging model, and initializing the first neural network model according to the neural network parameters;
obtaining news corpus training samples, in which the first Chinese characters are annotated with corresponding labels;
converting the first Chinese characters into first character vectors, and inputting the first character vectors into the first neural network model to obtain the first feature vectors of the Chinese characters;
training the target conditional random field model using the first feature vectors of the first Chinese characters and the corresponding labels, to obtain the news entity recognition model.
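The first processing step above, converting each Chinese character into a character vector, is typically an embedding-table lookup. The sketch below is an illustrative assumption: the table is tiny and randomly initialized, whereas a real system would use trained embeddings of much higher dimension:

```python
import random

# Map each Chinese character to a vector via an embedding table before it
# enters the neural network. The table here is random and tiny, purely for
# illustration.

def build_embeddings(vocabulary, dim=4, seed=0):
    rng = random.Random(seed)
    return {ch: [rng.uniform(-1, 1) for _ in range(dim)] for ch in vocabulary}

def to_vectors(sentence, table):
    return [table[ch] for ch in sentence]

table = build_embeddings("平安科技", dim=4)
vectors = to_vectors("平安", table)
print(len(vectors), len(vectors[0]))  # one 4-dimensional vector per character
```

Each character in the input sentence thus becomes a fixed-length numeric vector, which is the form the recurrent hidden layers consume.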
In one embodiment, when the processor executes the computer program to implement the step of training the target conditional random field model using the first feature vectors of the first Chinese characters and the corresponding labels to obtain the news entity recognition model, the following steps are specifically implemented: inputting the first feature vectors into the target conditional random field model to obtain the predicted label sequence of the news corpus training sample; determining the preset label sequence of the news corpus training sample according to the labels of the first Chinese characters, and computing the cross entropy between the predicted label sequence and the preset label sequence; and adjusting the parameters of the target conditional random field model so as to minimize the cross entropy.
In one embodiment, when executing the computer program, the processor further performs the following steps: constructing a part-of-speech tagging model, the part-of-speech tagging model comprising a second neural network model and a source conditional random field model; obtaining part-of-speech tagging training samples, in which the second Chinese characters are annotated with corresponding part-of-speech labels; converting the second Chinese characters into second character vectors, and inputting them into the second neural network model to obtain the second feature vectors of the second Chinese characters; inputting the second feature vectors into the source conditional random field model to obtain the predicted part-of-speech labels of the second Chinese characters; and adjusting the parameters of the second neural network model and the source conditional random field model by back-propagation and gradient descent according to the predicted part-of-speech labels of the second Chinese characters and the corresponding annotated part-of-speech labels.
In one embodiment, the first neural network model comprises a forward recurrent neural network hidden layer and a backward recurrent neural network hidden layer. When the processor executes the computer program to implement the step of inputting the first character vectors into the first neural network model to obtain the first feature vectors of the Chinese characters, the following steps are specifically implemented: inputting the first character vectors into the forward recurrent neural network hidden layer to obtain the forward hidden state sequence; inputting the first character vectors into the backward recurrent neural network hidden layer to obtain the backward hidden state sequence; and merging the forward hidden state sequence and the backward hidden state sequence to generate the first feature vectors of the first Chinese characters.
In one embodiment, when executing the computer program, the processor further performs the following steps: obtaining a news corpus test sample, in which the third Chinese characters are annotated with corresponding labels; inputting the character vectors of the third Chinese characters into the news entity recognition model to obtain the predicted label sequence of the news corpus test sample; calculating the error rate of the company name recognition result according to the predicted label sequence of the news corpus test sample and the labels of the third Chinese characters; and, if the error rate is greater than a preset threshold, adjusting the parameters of the target conditional random field model in the news entity recognition model.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored. When the computer program is executed by a processor, the following steps are performed:
constructing a named entity recognition model, the named entity recognition model comprising a first neural network model and a target conditional random field model;
extracting the neural network parameters of the second neural network model in a pre-trained part-of-speech tagging model, and initializing the first neural network model according to the neural network parameters;
obtaining news corpus training samples, in which the first Chinese characters are annotated with corresponding labels;
converting the first Chinese characters into first character vectors, and inputting the first character vectors into the first neural network model to obtain the first feature vectors of the Chinese characters;
training the target conditional random field model using the first feature vectors of the first Chinese characters and the corresponding labels, to obtain the news entity recognition model.
In one embodiment, when the computer program is executed by the processor to implement the step of training the target conditional random field model using the first feature vectors of the first Chinese characters and the corresponding labels to obtain the news entity recognition model, the following steps are specifically implemented: inputting the first feature vectors into the target conditional random field model to obtain the predicted label sequence of the news corpus training sample; determining the preset label sequence of the news corpus training sample according to the labels of the first Chinese characters, and computing the cross entropy between the predicted label sequence and the preset label sequence; and adjusting the parameters of the target conditional random field model so as to minimize the cross entropy.
In one embodiment, when the computer program is executed by the processor, the following steps are further performed: constructing a part-of-speech tagging model, the part-of-speech tagging model comprising a second neural network model and a source conditional random field model; obtaining part-of-speech tagging training samples, in which the second Chinese characters are annotated with corresponding part-of-speech labels; converting the second Chinese characters into second character vectors, and inputting them into the second neural network model to obtain the second feature vectors of the second Chinese characters; inputting the second feature vectors into the source conditional random field model to obtain the predicted part-of-speech labels of the second Chinese characters; and adjusting the parameters of the second neural network model and the source conditional random field model by back-propagation and gradient descent according to the predicted part-of-speech labels of the second Chinese characters and the corresponding annotated part-of-speech labels.
In one embodiment, the first neural network model comprises a forward recurrent neural network hidden layer and a backward recurrent neural network hidden layer. When the computer program is executed by the processor to implement the step of inputting the first character vectors into the first neural network model to obtain the first feature vectors of the Chinese characters, the following steps are specifically implemented: inputting the first character vectors into the forward recurrent neural network hidden layer to obtain the forward hidden state sequence; inputting the first character vectors into the backward recurrent neural network hidden layer to obtain the backward hidden state sequence; and merging the forward hidden state sequence and the backward hidden state sequence to generate the first feature vectors of the first Chinese characters.
In one embodiment, when the computer program is executed by the processor, the following steps are further performed: obtaining a news corpus test sample, in which the third Chinese characters are annotated with corresponding labels; inputting the character vectors of the third Chinese characters into the news entity recognition model to obtain the predicted label sequence of the news corpus test sample; calculating the error rate of the company name recognition result according to the predicted label sequence of the news corpus test sample and the labels of the third Chinese characters; and, if the error rate is greater than a preset threshold, adjusting the parameters of the target conditional random field model in the news entity recognition model.
A person of ordinary skill in the art will appreciate that all or part of the processes in the above method embodiments may be completed by a computer program instructing the relevant hardware. The computer program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the above methods. Any reference to memory, storage, a database, or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features contains no contradiction, it should be considered to be within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be pointed out that, for a person of ordinary skill in the art, various modifications and improvements can be made without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent application shall be subject to the appended claims.

Claims (10)

1. A method for constructing a news entity recognition model, the method comprising:
constructing a named entity recognition model, the named entity recognition model comprising a first neural network model and a target conditional random field model;
extracting the neural network parameters of a second neural network model in a pre-trained part-of-speech tagging model, and initializing the first neural network model according to the neural network parameters;
obtaining news corpus training samples, wherein first Chinese characters in the news corpus training samples are annotated with corresponding labels;
converting the first Chinese characters into first character vectors, and inputting the first character vectors into the first neural network model to obtain first feature vectors of the Chinese characters;
training the target conditional random field model using the first feature vectors of the first Chinese characters and the corresponding labels, to obtain the news entity recognition model.
2. The method according to claim 1, wherein the step of training the target conditional random field model using the first feature vectors of the first Chinese characters and the corresponding labels to obtain the news entity recognition model comprises:
inputting the first feature vectors into the target conditional random field model to obtain a predicted label sequence of the news corpus training sample;
determining a preset label sequence of the news corpus training sample according to the labels of the first Chinese characters, and computing the cross entropy between the predicted label sequence and the preset label sequence;
adjusting the parameters of the target conditional random field model so as to minimize the cross entropy.
3. The method according to claim 1, wherein, before the step of extracting the neural network parameters of the second neural network model in the pre-trained part-of-speech tagging model, the method comprises:
constructing the part-of-speech tagging model, the part-of-speech tagging model comprising the second neural network model and a source conditional random field model;
obtaining part-of-speech tagging training samples, wherein second Chinese characters in the part-of-speech tagging training samples are annotated with corresponding part-of-speech labels;
converting the second Chinese characters into second character vectors, and inputting the second character vectors into the second neural network model to obtain second feature vectors of the second Chinese characters;
inputting the second feature vectors into the source conditional random field model to obtain predicted part-of-speech labels of the second Chinese characters;
adjusting the parameters of the second neural network model and the source conditional random field model by back-propagation and gradient descent according to the predicted part-of-speech labels of the second Chinese characters and the corresponding annotated part-of-speech labels.
4. The method according to claim 1, wherein the first neural network model comprises a forward recurrent neural network hidden layer and a backward recurrent neural network hidden layer;
the step of inputting the first character vectors into the first neural network model to obtain the first feature vectors of the Chinese characters comprises:
inputting the first character vectors into the forward recurrent neural network hidden layer to obtain a forward hidden state sequence;
inputting the first character vectors into the backward recurrent neural network hidden layer to obtain a backward hidden state sequence;
merging the forward hidden state sequence and the backward hidden state sequence to generate the first feature vectors of the first Chinese characters.
5. The method according to claim 1, wherein, after the step of obtaining the news entity recognition model, the method further comprises:
obtaining a news corpus test sample, wherein third Chinese characters in the news corpus test sample are annotated with corresponding labels;
inputting the character vectors of the third Chinese characters into the news entity recognition model to obtain a predicted label sequence of the news corpus test sample;
calculating the error rate of the company name recognition result according to the predicted label sequence of the news corpus test sample and the labels of the third Chinese characters;
if the error rate is greater than a preset threshold, adjusting the parameters of the target conditional random field model in the news entity recognition model.
6. An apparatus for constructing a news entity recognition model, wherein the apparatus comprises:
a model construction module, configured to construct a named entity recognition model, the named entity recognition model comprising a first neural network model and a target conditional random field model;
a neural network parameter acquisition module, configured to extract the neural network parameters of a second neural network model in a pre-trained part-of-speech tagging model, and to initialize the first neural network model according to the neural network parameters;
a training sample acquisition module, configured to obtain news corpus training samples, wherein first Chinese characters in the news corpus training samples are annotated with corresponding labels;
a feature vector acquisition module, configured to convert the first Chinese characters into first character vectors, and to input the first character vectors into the first neural network model to obtain first feature vectors of the Chinese characters;
a model training module, configured to train the target conditional random field model using the first feature vectors of the first Chinese characters and the corresponding labels, to obtain the news entity recognition model.
7. The apparatus according to claim 6, wherein the model training module is configured to input the first feature vectors into the target conditional random field model to obtain a predicted label sequence of the news corpus training sample; to determine a preset label sequence of the news corpus training sample according to the labels of the first Chinese characters, and to compute the cross entropy between the predicted label sequence and the preset label sequence; and to adjust the parameters of the target conditional random field model so as to minimize the cross entropy.
8. The apparatus according to claim 6, further comprising a source task acquisition module;
the source task acquisition module being configured to construct a part-of-speech tagging model, the part-of-speech tagging model comprising the second neural network model and a source conditional random field model; to obtain part-of-speech tagging training samples, wherein second Chinese characters in the part-of-speech tagging training samples are annotated with corresponding part-of-speech labels; to convert the second Chinese characters into second character vectors, and to input the second character vectors into the second neural network model to obtain second feature vectors of the second Chinese characters; to input the second feature vectors into the source conditional random field model to obtain predicted part-of-speech labels of the second Chinese characters; and to adjust the parameters of the second neural network model and the source conditional random field model by back-propagation and gradient descent according to the predicted part-of-speech labels of the second Chinese characters and the corresponding annotated part-of-speech labels.
9. A computer device, comprising a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 5.
10. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 5.
CN201811089168.9A 2018-09-18 2018-09-18 Construction method, device and the computer equipment of news property identification model Pending CN109446514A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811089168.9A CN109446514A (en) 2018-09-18 2018-09-18 Construction method, device and the computer equipment of news property identification model


Publications (1)

Publication Number Publication Date
CN109446514A true CN109446514A (en) 2019-03-08

Family

ID=65532977

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811089168.9A Pending CN109446514A (en) 2018-09-18 2018-09-18 Construction method, device and the computer equipment of news property identification model

Country Status (1)

Country Link
CN (1) CN109446514A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160171974A1 (en) * 2014-12-15 2016-06-16 Baidu Usa Llc Systems and methods for speech transcription
CN106557462A (en) * 2016-11-02 2017-04-05 数库(上海)科技有限公司 Name entity recognition method and system
CN107203511A (en) * 2017-05-27 2017-09-26 中国矿业大学 A kind of network text name entity recognition method based on neutral net probability disambiguation
CN107622050A (en) * 2017-09-14 2018-01-23 武汉烽火普天信息技术有限公司 Text sequence labeling system and method based on Bi LSTM and CRF
CN107797992A (en) * 2017-11-10 2018-03-13 北京百分点信息科技有限公司 Name entity recognition method and device


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JOHN M. GIORGI ET AL.: "Transfer learning for biomedical named entity recognition with neural networks", Bioinformatics, vol. 34, no. 23, 1 June 2018 (2018-06-01), pages 4087-4094 *
ZHILIN YANG ET AL.: "Transfer Learning for Sequence Tagging with Hierarchical Recurrent Networks", arXiv, 18 March 2017 (2017-03-18), pages 1-10 *
冯蕴天; 张宏军; 郝文宁; 陈刚: "Named Entity Recognition Based on Deep Belief Networks", Computer Science, no. 04, 15 April 2016 (2016-04-15), pages 224-230 *

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109992782A (en) * 2019-04-02 2019-07-09 深圳市华云中盛科技有限公司 Legal documents name entity recognition method, device and computer equipment
CN109992782B (en) * 2019-04-02 2023-07-07 深圳市华云中盛科技股份有限公司 Legal document named entity identification method and device and computer equipment
CN111859948A (en) * 2019-04-28 2020-10-30 北京嘀嘀无限科技发展有限公司 Language identification, language model training and character prediction method and device
CN111950277A (en) * 2019-04-30 2020-11-17 中移(苏州)软件技术有限公司 Business situation entity determining method, device and storage medium
CN110348012A (en) * 2019-07-01 2019-10-18 北京明略软件系统有限公司 Determine method, apparatus, storage medium and the electronic device of target character
CN110348012B (en) * 2019-07-01 2022-12-09 北京明略软件系统有限公司 Method, device, storage medium and electronic device for determining target character
CN110516251B (en) * 2019-08-29 2023-11-03 秒针信息技术有限公司 Method, device, equipment and medium for constructing electronic commerce entity identification model
CN110516251A (en) * 2019-08-29 2019-11-29 秒针信息技术有限公司 A kind of construction method, construction device, equipment and the medium of electric business entity recognition model
CN110598213A (en) * 2019-09-06 2019-12-20 腾讯科技(深圳)有限公司 Keyword extraction method, device, equipment and storage medium
CN110532570A (en) * 2019-09-10 2019-12-03 杭州橙鹰数据技术有限公司 A kind of method and apparatus of method and apparatus and model training that naming Entity recognition
CN110717331B (en) * 2019-10-21 2023-10-24 北京爱医博通信息技术有限公司 Chinese named entity recognition method, device and equipment based on neural network and storage medium
CN110717331A (en) * 2019-10-21 2020-01-21 北京爱医博通信息技术有限公司 Neural network-based Chinese named entity recognition method, device, equipment and storage medium
CN110866402B (en) * 2019-11-18 2023-11-28 北京香侬慧语科技有限责任公司 Named entity identification method and device, storage medium and electronic equipment
CN110866402A (en) * 2019-11-18 2020-03-06 北京香侬慧语科技有限责任公司 Named entity identification method and device, storage medium and electronic equipment
CN111159200B (en) * 2019-12-31 2023-10-17 华中科技大学鄂州工业技术研究院 Data storage method and device based on deep learning
CN111159200A (en) * 2019-12-31 2020-05-15 华中科技大学鄂州工业技术研究院 Data storage method and device based on deep learning
CN111291566B (en) * 2020-01-21 2023-04-28 北京明略软件系统有限公司 Event main body recognition method, device and storage medium
CN111291566A (en) * 2020-01-21 2020-06-16 北京明略软件系统有限公司 Event subject identification method and device and storage medium
CN111368544B (en) * 2020-02-28 2023-09-19 中国工商银行股份有限公司 Named entity identification method and device
CN111368544A (en) * 2020-02-28 2020-07-03 中国工商银行股份有限公司 Named entity identification method and device
CN111444723A (en) * 2020-03-06 2020-07-24 深圳追一科技有限公司 Information extraction model training method and device, computer equipment and storage medium
CN113449113A (en) * 2020-03-27 2021-09-28 京东数字科技控股有限公司 Knowledge graph construction method and device, electronic equipment and storage medium
CN111583911A (en) * 2020-04-30 2020-08-25 深圳市优必选科技股份有限公司 Speech recognition method, device, terminal and medium based on label smoothing
CN111583911B (en) * 2020-04-30 2023-04-14 深圳市优必选科技股份有限公司 Speech recognition method, device, terminal and medium based on label smoothing
CN111753506B (en) * 2020-05-15 2023-12-08 北京捷通华声科技股份有限公司 Text replacement method and device
CN111753506A (en) * 2020-05-15 2020-10-09 北京捷通华声科技股份有限公司 Text replacement method and device
CN111832291B (en) * 2020-06-02 2024-01-09 北京百度网讯科技有限公司 Entity recognition model generation method and device, electronic equipment and storage medium
CN111832291A (en) * 2020-06-02 2020-10-27 北京百度网讯科技有限公司 Entity recognition model generation method and device, electronic equipment and storage medium
CN111651995A (en) * 2020-06-07 2020-09-11 上海建科工程咨询有限公司 Automatic accident information extraction method and system based on a deep recurrent neural network
CN111885000A (en) * 2020-06-22 2020-11-03 网宿科技股份有限公司 Network attack detection method, system and device based on graph neural network
CN111885000B (en) * 2020-06-22 2022-06-21 网宿科技股份有限公司 Network attack detection method, system and device based on graph neural network
CN111737560A (en) * 2020-07-20 2020-10-02 平安国际智慧城市科技股份有限公司 Content search method, field prediction model training method, device and storage medium
WO2021121198A1 (en) * 2020-09-08 2021-06-24 平安科技(深圳)有限公司 Semantic similarity-based entity relation extraction method and apparatus, device and medium
CN112101041A (en) * 2020-09-08 2020-12-18 平安科技(深圳)有限公司 Entity relationship extraction method, device, equipment and medium based on semantic similarity
CN112699683A (en) * 2020-12-31 2021-04-23 大唐融合通信股份有限公司 Named entity identification method and device fusing neural network and rule
CN113836928A (en) * 2021-09-28 2021-12-24 平安科技(深圳)有限公司 Text entity generation method, device, equipment and storage medium
CN113836928B (en) * 2021-09-28 2024-02-27 平安科技(深圳)有限公司 Text entity generation method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN109446514A (en) Construction method, device and the computer equipment of news property identification model
CN109522393A (en) Intelligent answer method, apparatus, computer equipment and storage medium
CN110704633A (en) Named entity recognition method and device, computer equipment and storage medium
CN109145315A (en) Text interpretation method, device, storage medium and computer equipment
CN110162627A (en) Data increment method, apparatus, computer equipment and storage medium
CN109977234A (en) Knowledge graph completion method based on topic keyword filtering
KR20190085098A (en) Keyword extraction method, computer device, and storage medium
CN109271646A (en) Text interpretation method, device, readable storage medium and computer equipment
CN108628974A (en) Public opinion information classification method, device, computer equipment and storage medium
CN110399484A (en) Sentiment analysis method, device, computer equipment and storage medium for long text
CN111462751B (en) Method, apparatus, computer device and storage medium for decoding voice data
CN109815331A (en) Construction method, device and computer equipment for a text sentiment classification model
CN110334179A (en) Question and answer processing method, device, computer equipment and storage medium
CN111428448B (en) Text generation method, device, computer equipment and readable storage medium
CN112633423B (en) Training method of text recognition model, text recognition method, device and equipment
CN110968725B (en) Image content description information generation method, electronic device and storage medium
Kamal et al. Textmage: The automated bangla caption generator based on deep learning
CN109977394A (en) Text model training method, text analyzing method, apparatus, equipment and medium
CN112016271A (en) Language style conversion model training method, text processing method and device
CN109885830A (en) Sentence interpretation method, device and computer equipment
CN114528394B (en) Text triple extraction method and device based on mask language model
CN109992770A (en) Laotian named entity recognition method based on a combined neural network
CN115098722B (en) Text and image matching method and device, electronic equipment and storage medium
CN112905763B (en) Session system development method, device, computer equipment and storage medium
CN114638229A (en) Entity identification method, device, medium and equipment of record data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination