CN108460012A - Named entity recognition method based on GRU-CRF - Google Patents
Named entity recognition method based on GRU-CRF
- Publication number: CN108460012A (application CN201810102699.0A)
- Authority
- CN
- China
- Prior art keywords
- gru
- crf
- entity recognition
- name entity
- word
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/279—Recognition of textual entities
- G06F40/289—Phrasal analysis, e.g. finite state techniques or chunking
- G06F40/295—Named entity recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/279—Recognition of textual entities
- G06F40/289—Phrasal analysis, e.g. finite state techniques or chunking
Abstract
The invention discloses a named entity recognition method based on GRU-CRF, belonging to the field of natural language processing. To further improve the performance of named entity recognition, the invention combines a GRU network with a CRF: sentence features are extracted by the GRU network, and the CRF performs the final entity labelling to complete named entity recognition. The GRU has few parameters and trains quickly, which reduces the time needed to train on large-scale data, while the CRF can exploit labels already assigned when labelling each position, giving good entity-labelling performance. By applying GRU networks to the field of named entity recognition, the invention reduces the number of internal network parameters while maintaining labelling quality, improves training efficiency, has good application prospects, and can be widely applied to entity recognition in many fields.
Description
Technical field
The present invention relates to natural language processing, and in particular to a named entity recognition method based on GRU-CRF.
Background art
With the rapid development of Internet technology, people have become accustomed to acquiring large amounts of knowledge from the network; research on named entity recognition methods has therefore received wide attention as a means of improving how people obtain and discover new knowledge from the network. Named entity recognition is a basic task in the field of natural language processing and a research hotspot in the field: from early dictionary- and rule-based methods, to conventional machine learning methods, to deep-learning-based methods in recent years, recognition performance has improved steadily. The conditional random field (Conditional Random Field, CRF) is an algorithm frequently used for named entity recognition in natural language processing in recent years: manually defined feature functions serve as feature templates, and for a given position in a sentence, different feature templates can be combined into a new feature template. Sentences are then labelled using these feature templates, but the CRF alone has limitations for named entity recognition and its overall performance is not ideal. Recurrent neural networks (Recurrent Neural Networks, RNN) have achieved wide use in many natural language processing tasks, but RNNs are prone to the vanishing-gradient problem during training, so gradients cannot propagate across long sequences and the RNN cannot capture long-range dependencies. The long short-term memory network (Long Short-Term Memory, LSTM) mitigates the loss of long-range information and performs well, but its relatively complex structure requires long training times; a neural network model was therefore urgently needed that both solves the vanishing-gradient problem and has a shorter training time. In 2014, Kyunghyun Cho and other scholars proposed the gated recurrent unit (Gated Recurrent Unit, GRU), which merges the LSTM's forget gate and input gate into a single update gate; it retains the advantages of the LSTM algorithm while simplifying it and greatly reducing network training time. However, there are at present no related patents that combine a GRU with a CRF for named entity recognition.
Summary of the invention
To address the problems of the above named entity recognition methods, the present invention combines a GRU neural network with a conditional random field (CRF) and proposes a named entity recognition method based on GRU-CRF.
A named entity recognition method based on GRU-CRF comprises the following steps:
Step (1): divide the corpus into a training set and a test set;
Step (2): preprocess the training set;
Step (3): represent each word in the training-set and test-set sentences as a one-hot vector;
Step (4): feed the word vectors generated from the training set into the GRU network for feature extraction;
Step (5): perform sequence labelling with the CRF;
Step (6): train the model;
Step (7): test the model.
In the corpus preprocessing process, the training set is first segmented with word-segmentation software, each word after segmentation is then given a BIO label, and finally word2vec is trained on the labelled training set.
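As a toy illustration of the BIO labelling step described above, the sketch below tags each character of a segmented sentence. The `bio_tag` helper and its tiny lexicon are hypothetical examples, not part of the patent:

```python
# Toy sketch of character-level BIO labelling after word segmentation.
# `entities` is a hypothetical lexicon mapping a word to its entity type
# (PER / LOC / ORG); words not in it are tagged O character by character.
def bio_tag(segmented_words, entities):
    tags = []
    for word in segmented_words:
        if word in entities:
            etype = entities[word]
            # first character of the entity word gets B-, the rest get I-
            tags.append("B-" + etype)
            tags.extend(["I-" + etype] * (len(word) - 1))
        else:
            # characters outside any named entity are tagged O
            tags.extend(["O"] * len(word))
    return tags
```

For example, the segmented sentence `["北京", "欢迎", "你"]` with `"北京"` known as a place name yields `["B-LOC", "I-LOC", "O", "O", "O"]`.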
The calculations performed by the GRU network during feature extraction are as follows.
The update-gate formula of the GRU is:
z_t = σ(U_z x_t + W_z s_{t-1})
where z_t is the feature vector output by the update gate of the GRU, σ is the sigmoid function, U_z and W_z are training parameters, x_t is the word vector at time t, s_{t-1} is the GRU output feature vector corresponding to x_{t-1}, and s_t is the GRU output feature vector corresponding to x_t.
The reset-gate formula of the GRU is:
r_t = σ(U_r x_t + W_r s_{t-1})
where r_t is the feature vector output by the reset gate of the GRU, and U_r and W_r are training parameters.
The hidden-state equation is:
h_t = tanh(U_h x_t + W_h (s_{t-1} * r_t))
where h_t is the hidden-state vector at time t, and U_h and W_h are training parameters.
The output vector corresponding to word vector x_t is:
s_t = (1 - z_t) * h_t + z_t * s_{t-1}
The GRU extracts the sentence-feature matrix M = (S_1, S_2, …, S_n), where n is the number of words in the sentence.
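The gate equations above can be sketched directly in NumPy. This is a minimal single-step illustration with untrained parameters, not the patent's implementation; the function names and the parameter packing are assumptions for the example:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x_t, s_prev, Uz, Wz, Ur, Wr, Uh, Wh):
    """One GRU time step, following the update/reset/hidden equations above."""
    z_t = sigmoid(Uz @ x_t + Wz @ s_prev)            # update gate z_t
    r_t = sigmoid(Ur @ x_t + Wr @ s_prev)            # reset gate r_t
    h_t = np.tanh(Uh @ x_t + Wh @ (s_prev * r_t))    # hidden state h_t
    return (1 - z_t) * h_t + z_t * s_prev            # output s_t

def extract_features(sentence_vectors, params, hidden_dim):
    """Run the GRU over a sentence; the stacked states S_1..S_n form M."""
    s = np.zeros(hidden_dim)
    states = []
    for x_t in sentence_vectors:
        s = gru_step(x_t, s, *params)
        states.append(s)
    return np.stack(states)  # the sentence-feature matrix M, shape (n, hidden_dim)
```

With all-zero parameters the gates both output 0.5 and the candidate state is 0, so one step simply halves the previous state, which is a quick sanity check of the formulas.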
The sequence-labelling computation of the CRF is as follows.
Let x = (x_1, x_2, …, x_n) denote a sentence of n words, and let y = (y_1, y_2, …, y_n) denote a label sequence for that sentence. The score of the sequence is defined as:
score(x, y) = Σ_{i=1}^{n} M_{i, y_i} + Σ_{i=1}^{n-1} A_{y_i, y_{i+1}}
where M is the feature-vector matrix obtained from the GRU network and A is the transition matrix of the CRF. Exponentiating and normalizing score(x, y) yields the probability p(y | x) of the label sequence:
p(y | x) = exp(score(x, y)) / Σ_{y'} exp(score(x, y'))
where y' = (y_1', y_2', …, y_n') ranges over all possible label sequences.
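A minimal sketch of this score and its normalization, with the emission and transition sums written out explicitly. The brute-force enumeration of all label sequences is for illustration only (it is exponential in sentence length) and is not how a CRF is trained in practice:

```python
import itertools
import math

def crf_score(M, A, y):
    """score(x, y): emission terms M[i][y_i] plus transition terms A[y_i][y_{i+1}]."""
    emit = sum(M[i][y[i]] for i in range(len(y)))
    trans = sum(A[y[i]][y[i + 1]] for i in range(len(y) - 1))
    return emit + trans

def sequence_prob(M, A, y, num_tags):
    """p(y|x) = exp(score(x, y)) / sum over all y' of exp(score(x, y'))."""
    n = len(M)
    z = sum(math.exp(crf_score(M, A, list(yp)))
            for yp in itertools.product(range(num_tags), repeat=n))
    return math.exp(crf_score(M, A, y)) / z
```

By construction the probabilities of all possible label sequences sum to 1.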
During model training, maximum conditional likelihood estimation is used: the parameters that maximize the log-likelihood are selected. For a training sample (x, y), the log-likelihood is:
log p(y | x) = score(x, y) - log Σ_{y'} exp(score(x, y'))
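The normalizer in the log-likelihood need not be computed by enumerating all sequences: for a linear-chain CRF the forward algorithm gives log Σ_{y'} exp(score(x, y')) in O(n·k²) time for k tags. A sketch under that standard formulation (function names are assumptions):

```python
import numpy as np

def log_partition(M, A):
    """Forward algorithm: log-sum-exp of score(x, y') over all label sequences y'."""
    n, k = M.shape
    alpha = M[0].copy()                    # log-scores of length-1 prefixes
    for i in range(1, n):
        # scores[p, q] = alpha[p] + A[p, q] + M[i, q]; logsumexp over previous tag p
        scores = alpha[:, None] + A + M[i][None, :]
        m = scores.max(axis=0)
        alpha = m + np.log(np.exp(scores - m).sum(axis=0))
    m = alpha.max()
    return m + np.log(np.exp(alpha - m).sum())

def log_likelihood(M, A, y):
    """log p(y|x) = score(x, y) - log_partition, as in the formula above."""
    score = sum(M[i, y[i]] for i in range(len(y)))
    score += sum(A[y[i], y[i + 1]] for i in range(len(y) - 1))
    return score - log_partition(M, A)
```

The max-subtraction inside the loop is the usual log-sum-exp trick to keep the computation numerically stable.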
During model testing, the dynamic-programming Viterbi algorithm is used to solve:
y* = argmax_{y'} score(x, y')
where y* is the optimal path, i.e. the label sequence output by the CRF.
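The Viterbi decoding step can be sketched as follows: keep, for each position and tag, the best score of any path ending there, then backtrack. This is a generic linear-chain decoder written for illustration, not the patent's code:

```python
import numpy as np

def viterbi(M, A):
    """Return the tag sequence maximizing emission (M) plus transition (A) score."""
    n, k = M.shape
    delta = M[0].copy()                   # best score of a path ending at each tag
    back = np.zeros((n, k), dtype=int)    # argmax predecessors for backtracking
    for i in range(1, n):
        # scores[p, q]: best path ending in tag p at i-1, extended to tag q at i
        scores = delta[:, None] + A + M[i][None, :]
        back[i] = scores.argmax(axis=0)
        delta = scores.max(axis=0)
    path = [int(delta.argmax())]
    for i in range(n - 1, 0, -1):         # follow back-pointers from the end
        path.append(int(back[i, path[-1]]))
    return path[::-1]
```

Each step optimizes locally over the previous tag only, yet the backtracked path is globally optimal, which is why this replaces the exhaustive search mentioned in the advantageous effects.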
Advantageous effects:
1. In the application stage the present invention does not need to segment the text, i.e. no additional word-segmentation software is required; this removes the prior art's dependence on segmentation software and improves the independence of text processing.
2. By using a GRU neural network, the present invention reduces the network's training time and improves learning efficiency while retaining the advantages of the LSTM neural network.
3. By combining the GRU with the CRF, the present invention makes full use of the respective strengths of deep learning and probabilistic statistics.
4. The present invention uses the dynamic-programming Viterbi algorithm, which is simpler and more efficient than exhaustive search.
Description of the drawings
Fig. 1 is the overall flow chart of named entity recognition with the GRU-CRF model;
Fig. 2 is a schematic diagram of the GRU neural network structure;
Fig. 3 is the overall flow chart of named entity recognition in the embodiment.
Specific embodiments
Exemplary embodiments of the present invention are described below with reference to the accompanying drawings. It should be understood that the specific embodiments described here serve only to explain the embodiments of the invention, not to limit them. It should further be noted that, for ease of description, the drawings show only the parts relevant to the embodiments of the invention rather than the entire structure, and that some components in the drawings are omitted, enlarged, or reduced and do not represent the size of the actual product.
The overall flow of named entity recognition in this embodiment is shown in Fig. 1. Part of the news material of the 1998 People's Daily (《人民日报》) is used as the corpus. The corpus is divided into a training set and a test set, and the training set is preprocessed. In the preprocessing, the training set is first segmented with jieba, then word2vec is trained on the segmented training set, and finally each word is placed on its own line and labelled with the BIO tag set: B-PER marks the first character of a person name and I-PER a non-initial character of a person name; B-LOC marks the first character of a place name and I-LOC a non-initial character of a place name; B-ORG marks the first character of an organization name and I-ORG a non-initial character of an organization name; O marks a character that does not belong to any named entity. Each sentence of the labelled training set is represented in one-hot form, and the trained word2vec model then maps each one-hot vector in the sentence to a low-dimensional dense word vector; 100 dimensions are chosen here, though 300 dimensions could of course also be used. The generated word-vector matrix serves as the input of the GRU neural network, whose main structure is shown in Fig. 2: z_t is the feature vector output by the update gate of the GRU, σ is the sigmoid function, x_t is the word vector at time t, s_t is the GRU output feature vector corresponding to x_t, s_{t-1} is that corresponding to x_{t-1}, r_t is the feature vector output by the reset gate, and h_t is the hidden-state vector at time t. The output vector corresponding to word vector x_t is s_t = (1 - z_t) * h_t + z_t * s_{t-1}, where * denotes element-wise multiplication, i.e. the elements of the two vectors are multiplied one by one to give a new vector. Finally, the GRU network extracts the sentence-feature matrix M = (S_1, S_2, …, S_n), where n is the number of words in the sentence. Before the input reaches the GRU network, a dropout layer can first be applied to mitigate overfitting, after which the GRU network extracts the sentence features. The experimental environment of the embodiment runs under Python; the main parameters are set to dropout = 0.5, batch_size = 64, a learning rate of 0.001, and 100 training epochs. The feature-vector matrix M extracted by the GRU is used as the input of the conditional random field (CRF) layer. The CRF is represented in matrix form: x = (x_1, x_2, …, x_n) denotes a sentence of n words and y = (y_1, y_2, …, y_n) denotes a label sequence for that sentence, and the score of the sequence is defined as:
score(x, y) = Σ_{i=1}^{n} M_{i, y_i} + Σ_{i=1}^{n-1} A_{y_i, y_{i+1}}
where M is the feature-vector matrix obtained from the GRU network and A is the transition matrix of the CRF. Exponentiating and normalizing score(x, y) yields the probability p(y | x) of the label sequence:
p(y | x) = exp(score(x, y)) / Σ_{y'} exp(score(x, y'))
where y' = (y_1', y_2', …, y_n') ranges over all possible label sequences.
During model training, the probability is solved using maximum likelihood estimation via the log-likelihood, and the model is trained using the label sequence y and its probability p(y | x).
The labelling process during model testing is shown in Fig. 3. The test set is represented in one-hot form, the trained word2vec model vectorizes each word of the test set, and the resulting word-vector matrix is input into the model. The CRF layer then performs sentence-level sequence labelling. During model testing, the dynamic-programming Viterbi algorithm is used, where the local optimization at each step achieves a globally optimal result; the final output is:
y* = argmax_{y'} score(x, y')
where y* is the optimal path, i.e. the label sequence output by the CRF. The final labels are:
PER: person name, LOC: place name, ORG: organization name
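Reading the recognized entities back out of the BIO labels above can be sketched as follows; the helper name and the example sentence are hypothetical illustrations of the decoding step, not part of the patent:

```python
def extract_entities(chars, tags):
    """Merge a B-X tag and its following I-X tags into (entity_text, type) pairs."""
    entities, current, current_type = [], "", None
    for ch, tag in zip(chars, tags):
        if tag.startswith("B-"):
            if current:
                entities.append((current, current_type))
            current, current_type = ch, tag[2:]   # start a new entity
        elif tag.startswith("I-") and current_type == tag[2:]:
            current += ch                         # continue the current entity
        else:                                     # O, or an I- that does not continue
            if current:
                entities.append((current, current_type))
            current, current_type = "", None
    if current:
        entities.append((current, current_type))
    return entities
```

For example, the characters of "张三在北京" tagged `B-PER I-PER O B-LOC I-LOC` decode to the person name "张三" and the place name "北京".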
Finally, it is noted that, although the present invention has been described in terms of a limited number of embodiments, those skilled in the art, benefiting from the above description, will understand that other embodiments can be envisaged within the scope of the invention thus described. It should also be noted that the language used in this specification has been chosen mainly for readability and instructional purposes, not to delimit or restrict the subject matter of the invention. Therefore, many modifications and variations will be apparent to those skilled in the art without departing from the scope and spirit of the appended claims. With respect to the scope of the invention, the present disclosure is illustrative rather than restrictive, and the scope of the invention is defined by the appended claims.
Claims (6)
1. A named entity recognition method based on GRU-CRF, characterized by comprising the following steps:
Step (1): divide the corpus into a training set and a test set;
Step (2): preprocess the training set;
Step (3): represent each word in the training-set and test-set sentences as a one-hot vector;
Step (4): feed the word vectors generated from the training set into the GRU network for feature extraction;
Step (5): perform sequence labelling with the CRF;
Step (6): train the model;
Step (7): test the model.
2. The named entity recognition method based on GRU-CRF according to claim 1, characterized in that, in the corpus preprocessing process, the training set is first segmented with word-segmentation software, each word after segmentation is then given a BIO label, and finally word2vec is trained on the labelled training set.
3. The named entity recognition method based on GRU-CRF according to claim 1, characterized in that the calculations performed by the GRU network during feature extraction are as follows:
The update-gate formula of the GRU is:
z_t = σ(U_z x_t + W_z s_{t-1})
where z_t is the feature vector output by the update gate of the GRU, σ is the sigmoid function, U_z and W_z are training parameters, x_t is the word vector at time t, s_{t-1} is the GRU output feature vector corresponding to x_{t-1}, and s_t is the GRU output feature vector corresponding to x_t.
The reset-gate formula of the GRU is:
r_t = σ(U_r x_t + W_r s_{t-1})
where r_t is the feature vector output by the reset gate of the GRU, and U_r and W_r are training parameters.
The hidden-state equation is:
h_t = tanh(U_h x_t + W_h (s_{t-1} * r_t))
where h_t is the hidden-state vector at time t, and U_h and W_h are training parameters.
The output vector corresponding to word vector x_t is:
s_t = (1 - z_t) * h_t + z_t * s_{t-1}
The GRU extracts the sentence-feature matrix M = (S_1, S_2, …, S_n), where n is the number of words in the sentence.
4. The named entity recognition method based on GRU-CRF according to claim 1, characterized in that the sequence-labelling computation of the CRF is:
Let x = (x_1, x_2, …, x_n) denote a sentence of n words, and let y = (y_1, y_2, …, y_n) denote a label sequence for that sentence; the score of the sequence is defined as:
score(x, y) = Σ_{i=1}^{n} M_{i, y_i} + Σ_{i=1}^{n-1} A_{y_i, y_{i+1}}
where M is the feature-vector matrix obtained from the GRU network and A is the transition matrix of the CRF; exponentiating and normalizing score(x, y) yields the probability p(y | x) of the label sequence:
p(y | x) = exp(score(x, y)) / Σ_{y'} exp(score(x, y'))
where y' = (y_1', y_2', …, y_n') ranges over all possible label sequences.
5. The named entity recognition method based on GRU-CRF according to claim 1 or 4, characterized in that, during model training, maximum conditional likelihood estimation is used and the parameters that maximize the log-likelihood are selected; for a training sample (x, y), the log-likelihood is:
log p(y | x) = score(x, y) - log Σ_{y'} exp(score(x, y'))
6. The named entity recognition method based on GRU-CRF according to claim 1 or 4, characterized in that, during model testing, the dynamic-programming Viterbi algorithm is used to solve:
y* = argmax_{y'} score(x, y')
where y* is the optimal path, i.e. the label sequence output by the CRF.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810102699.0A CN108460012A (en) | 2018-02-01 | 2018-02-01 | A kind of name entity recognition method based on GRU-CRF |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108460012A true CN108460012A (en) | 2018-08-28 |
Family
ID=63238650
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810102699.0A Pending CN108460012A (en) | 2018-02-01 | 2018-02-01 | A kind of name entity recognition method based on GRU-CRF |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108460012A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106776711A (en) * | 2016-11-14 | 2017-05-31 | 浙江大学 | A kind of Chinese medical knowledge mapping construction method based on deep learning |
CN106980608A (en) * | 2017-03-16 | 2017-07-25 | 四川大学 | A kind of Chinese electronic health record participle and name entity recognition method and system |
WO2017130434A1 (en) * | 2016-01-28 | 2017-08-03 | 楽天株式会社 | Computer system, method, and program for transferring named entity recognition model for multiple languages |
CN107644014A (en) * | 2017-09-25 | 2018-01-30 | 南京安链数据科技有限公司 | A kind of name entity recognition method based on two-way LSTM and CRF |
Non-Patent Citations (2)
Title |
---|
MOURAD GRIDACH et al.: "Arabic Named Entity Recognition: A Bidirectional GRU-CRF Approach", https://www.researchgate.net/publication/328165330 * |
LI Baiwei (李佰蔚): "Research on a Chinese Named Entity Recognition Method Based on GRU-CRF" (基于GRU-CRF的中文命名实体识别方法研究), China Master's Theses Full-text Database, Information Science and Technology * |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109741732A (en) * | 2018-08-30 | 2019-05-10 | 京东方科技集团股份有限公司 | Name entity recognition method, name entity recognition device, equipment and medium |
CN109741732B (en) * | 2018-08-30 | 2022-06-21 | 京东方科技集团股份有限公司 | Named entity recognition method, named entity recognition device, equipment and medium |
CN109299457A (en) * | 2018-09-06 | 2019-02-01 | 北京奇艺世纪科技有限公司 | A kind of opining mining method, device and equipment |
CN109284361A (en) * | 2018-09-29 | 2019-01-29 | 深圳追科技有限公司 | A kind of entity abstracting method and system based on deep learning |
CN111382569A (en) * | 2018-12-27 | 2020-07-07 | 深圳市优必选科技有限公司 | Method and device for recognizing entities in dialogue corpus and computer equipment |
CN111382569B (en) * | 2018-12-27 | 2024-05-03 | 深圳市优必选科技有限公司 | Method and device for identifying entity in dialogue corpus and computer equipment |
WO2020133039A1 (en) * | 2018-12-27 | 2020-07-02 | 深圳市优必选科技有限公司 | Entity identification method and apparatus in dialogue corpus, and computer device |
CN109871535A (en) * | 2019-01-16 | 2019-06-11 | 四川大学 | A kind of French name entity recognition method based on deep neural network |
CN109871535B (en) * | 2019-01-16 | 2020-01-10 | 四川大学 | French named entity recognition method based on deep neural network |
CN110222343A (en) * | 2019-06-13 | 2019-09-10 | 电子科技大学 | A kind of Chinese medicine plant resource name entity recognition method |
CN110298043B (en) * | 2019-07-03 | 2023-04-07 | 吉林大学 | Vehicle named entity identification method and system |
CN110298043A (en) * | 2019-07-03 | 2019-10-01 | 吉林大学 | A kind of vehicle name entity recognition method and system |
CN110717331A (en) * | 2019-10-21 | 2020-01-21 | 北京爱医博通信息技术有限公司 | Neural network-based Chinese named entity recognition method, device, equipment and storage medium |
CN110717331B (en) * | 2019-10-21 | 2023-10-24 | 北京爱医博通信息技术有限公司 | Chinese named entity recognition method, device and equipment based on neural network and storage medium |
CN112256828A (en) * | 2020-10-20 | 2021-01-22 | 平安科技(深圳)有限公司 | Medical entity relationship extraction method and device, computer equipment and readable storage medium |
CN112256828B (en) * | 2020-10-20 | 2023-08-08 | 平安科技(深圳)有限公司 | Medical entity relation extraction method, device, computer equipment and readable storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108460012A (en) | A kind of name entity recognition method based on GRU-CRF | |
CN112115995B (en) | Image multi-label classification method based on semi-supervised learning | |
CN109657239B (en) | Chinese named entity recognition method based on attention mechanism and language model learning | |
Baradel et al. | Glimpse clouds: Human activity recognition from unstructured feature points | |
Cihan Camgoz et al. | Subunets: End-to-end hand shape and continuous sign language recognition | |
Creswell et al. | Generative adversarial networks: An overview | |
CN110969020B (en) | CNN and attention mechanism-based Chinese named entity identification method, system and medium | |
CN107943784B (en) | Relationship extraction method based on generation of countermeasure network | |
CN110196980A (en) | A kind of field migration based on convolutional network in Chinese word segmentation task | |
CN111243699A (en) | Chinese electronic medical record entity extraction method based on word information fusion | |
CN111581970B (en) | Text recognition method, device and storage medium for network context | |
CN110263174B (en) | Topic category analysis method based on focus attention | |
CN111476024A (en) | Text word segmentation method and device and model training method | |
Jiang et al. | Boosting facial expression recognition by a semi-supervised progressive teacher | |
Sahu et al. | Dynamic routing using inter capsule routing protocol between capsules | |
Sun et al. | Study on medical image report generation based on improved encoding-decoding method | |
CN110298046B (en) | Translation model training method, text translation method and related device | |
Sun et al. | Inter-cluster and intra-cluster joint optimization for unsupervised cross-domain person re-identification | |
CN113220865B (en) | Text similar vocabulary retrieval method, system, medium and electronic equipment | |
CN114925205A (en) | GCN-GRU text classification method based on comparative learning | |
CN110674642A (en) | Semantic relation extraction method for noisy sparse text | |
Pan et al. | Teach machine to learn: hand-drawn multi-symbol sketch recognition in one-shot | |
Tian et al. | Text classification model based on BERT-capsule with integrated deep learning | |
CN113111654B (en) | Word segmentation method based on word segmentation tool common information and partial supervised learning | |
Wang et al. | Deep convolutional neural network based hidden markov model for offline handwritten Chinese text recognition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 20180828 |