CN109800799A - Online active learning method for unlabeled imbalanced data streams - Google Patents

Online active learning method for unlabeled imbalanced data streams

Info

Publication number
CN109800799A
Authority
CN
China
Prior art keywords
sample
label
linear classifier
asymmetric
data stream
Prior art date
Legal status
Pending
Application number
CN201910001840.2A
Other languages
Chinese (zh)
Inventor
吴庆耀
张一帆
谭明奎
Current Assignee
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date
Filing date
Publication date
Application filed by South China University of Technology (SCUT)
Publication of CN109800799A
Priority to PCT/CN2019/114167 (published as WO2020140597A1)

Classifications

    • G: Physics
    • G06: Computing; Calculating or Counting
    • G06F: Electric Digital Data Processing
    • G06F18/00: Pattern recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The present invention provides an online active learning method for unlabeled imbalanced data streams, comprising: the unlabeled data stream is fed into a linear classifier in sequence for prediction, where the class distribution of the stream is highly imbalanced, i.e., positive-class samples are rare; following the proposed asymmetric query strategy, the linear classifier dynamically determines, for the imbalanced data, which samples need to be labeled; following the proposed asymmetric update strategy, the linear classifier is updated with the wrongly predicted labeled data, and the second-order information of the samples is used to improve learning efficiency. The online active learning method for unlabeled imbalanced data streams of the present invention exploits the second-order information of the samples and proposes new asymmetric strategies; these strategies jointly consider sample labeling and model updating, can better address the class-imbalance problem of the samples, and improve the classification performance of active learning models on streaming data.

Description

Online active learning method for unlabeled imbalanced data streams
Technical field
The present invention relates to the technical fields of online learning and semi-supervised learning, and in particular to an online active learning method for unlabeled imbalanced data streams.
Background art
In recent years, artificial intelligence and related industries have been developing rapidly and have become a focus of attention for academia, industry, and governments around the world. The State Council of China has issued the "New Generation Artificial Intelligence Development Plan", highlighting the national strategic importance of artificial intelligence research and industry. In the internet industry, online learning techniques have developed quickly and made considerable progress in several application areas. However, existing online learning techniques still face many challenges. First, raw streaming data are unlabeled, and labeling them is often expensive. How to select the most informative data for labeling under a limited labeling budget, and to train a well-performing learner, is a major problem for online learning and its industrial applications. Second, in many real task scenarios the class distribution of the data is imbalanced, i.e., positive-class data are far fewer than negative-class data. How to handle this class-imbalance problem is another key issue that remains to be solved for industrial applications.
Summary of the invention
In view of this, to solve the above problems in the prior art, the present invention provides an online active learning method for unlabeled imbalanced data streams. For imbalanced data, an asymmetric query strategy is proposed to dynamically determine which samples need to be labeled; to update the model effectively, the method further proposes an asymmetric update strategy that uses the second-order information of the samples to update the model efficiently. The method is thus well suited to the sparse labels, class imbalance, and streaming data encountered in practical classification applications.
To achieve the above object, the technical solution of the present invention is as follows.
An online active learning method for unlabeled imbalanced data streams, comprising the following steps:
Step 1: the unlabeled data stream is fed into a linear classifier in sequence for prediction, where the class distribution of the stream is highly imbalanced; the positive class is usually set as the rare class;
Step 2: following the proposed asymmetric query strategy, the linear classifier sequentially decides, for the unlabeled imbalanced data, which samples need to be labeled;
Step 3: following the proposed asymmetric update strategy, the linear classifier is updated with the wrongly predicted labeled data, and the second-order information of the samples is used to improve learning efficiency.
Further, in Step 1, the unlabeled data stream can be written as {x_t}, t = 1, ..., T, where x_t ∈ R^d is a sample with d features and T is the total number of unlabeled samples. The labeling budget is B, the labels are y_t ∈ {−1, +1}, and the number of positive samples (y_t = +1) is far smaller than the number of negative samples (y_t = −1). The linear classifier is used as follows:
Step 11: the linear classifier is represented by a weight vector w that follows a multivariate Gaussian distribution w ~ N(μ, Σ), where μ is the mean of the linear classifier w and Σ is its variance;
Step 12: the class prediction of the linear classifier is ŷ_t = sign(μᵀx_t), where sign(·) returns +1 when μᵀx_t ≥ 0 and −1 otherwise;
Step 13: the prediction result of the linear classifier is: if ŷ_t = y_t, the classification is correct; otherwise, it is wrong.
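For illustration only (this sketch is not part of the original disclosure), a minimal Python implementation of such a Gaussian-distributed linear classifier and its prediction step might look as follows; the class name, the initial values of μ and Σ, and the use of the mean for prediction are assumptions consistent with the description above.

import numpy as np

class GaussianLinearClassifier:
    """Linear classifier with weights w ~ N(mu, Sigma); the mean mu is used for prediction."""

    def __init__(self, d):
        self.mu = np.zeros(d)      # mean of the weight vector w
        self.sigma = np.eye(d)     # variance (covariance) of w, initialised to the identity

    def margin(self, x):
        # prediction margin p_t = mu^T x_t
        return float(self.mu @ x)

    def predict(self, x):
        # y_hat_t = sign(mu^T x_t): +1 if the margin is non-negative, -1 otherwise
        return 1 if self.margin(x) >= 0 else -1

A prediction on a sample x_t is counted as correct when predict(x_t) equals the true label y_t, which is only revealed if the label of that sample is queried.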
Further, the asymmetric query strategy of Step 2 proceeds as follows:
Step 21: based on the second-order information of the sample, Σ (i.e., the variance of the linear classifier), compute the confidence of the linear classifier on the current sample;
Step 22: based on the confidence, compute the asymmetric query parameter of the current sample;
Step 23: based on the asymmetric query parameter, perform Bernoulli sampling to obtain a sampled value;
Step 24: if the sampled value is 1, decide that the label of the sample needs to be queried; otherwise, it does not.
Further, the asymmetric update strategy of Step 3 proceeds as follows:
Step 31: obtain the wrongly predicted labeled data;
Step 32: based on the wrongly predicted labeled data, compute the asymmetric loss value of the data;
Step 33: based on the asymmetric loss value and the optimisation strategy, update the variance Σ of the linear classifier, where γ denotes the regularisation coefficient;
Step 34: based on the asymmetric loss value and the optimisation strategy, update the mean μ of the linear classifier:
μ_{t+1} = μ_t − η Σ_{t+1} g_t
where η is the learning rate of the linear classifier and g_t is the gradient of the asymmetric loss l_t, obtained by differentiating the loss function.
Further, the confidence c_t is computed using the learning rate η of the linear classifier, the regularisation coefficient γ, and ρ_max = max(1, ρ), where ρ is the misclassification cost of the positive class; in addition, the second-order quantity x_tᵀΣ_t x_t represents the model's confidence about, i.e., its familiarity with, the current sample, and is used to compute the confidence c_t.
Based on the confidence c_t, the asymmetric query parameter of the current sample is computed as:
q_t = |p_t| + c_t
where p_t = μ_tᵀx_t is the prediction margin of the linear classifier on the current sample, and its absolute value |p_t| represents the distance of the prediction from the classification plane.
Based on the asymmetric query parameter q_t, Bernoulli sampling is performed to obtain a sampled value; different sampling factors are set for samples of different predicted classes, with δ_+ the sampling factor for a positive prediction (p_t ≥ 0) and δ_− the sampling factor for a negative prediction (p_t < 0); Bernoulli sampling with the resulting sampling probability yields the sampled value Z_t.
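As an illustrative sketch only, the asymmetric query strategy might be implemented as follows in Python; since the exact confidence and sampling-probability formulas are not reproduced in this text, the confidence term proportional to x_tᵀΣ_t x_t and the sampling probability of the form δ/(δ + q_t) are assumptions, and should_query is a hypothetical helper built on the classifier sketch above.

import numpy as np

def should_query(clf, x, eta, gamma, rho, delta_pos, delta_neg, rng=None):
    """Decide whether to query the label of sample x (asymmetric query strategy, assumed forms)."""
    rng = rng or np.random.default_rng()
    p = clf.margin(x)                              # prediction margin p_t = mu^T x_t
    rho_max = max(1.0, rho)                        # misclassification-cost bound rho_max = max(1, rho)
    # assumed confidence built from the second-order quantity x^T Sigma x
    c = eta * rho_max * float(x @ clf.sigma @ x) / (2.0 * gamma)
    q = abs(p) + c                                 # asymmetric query parameter q_t = |p_t| + c_t
    delta = delta_pos if p >= 0 else delta_neg     # class-dependent sampling factor delta_+ / delta_-
    prob = delta / (delta + q)                     # assumed form of the Bernoulli sampling probability
    return bool(rng.random() < prob), p            # sampled value Z_t and the margin p_t

Under this form, a small margin or a high second-order uncertainty raises the query probability, and choosing δ_+ ≠ δ_− lets the two predicted classes be queried at different rates.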
Further, the asymmetric loss value is computed using ρ, the misclassification weight of the positive class, and an indicator function that equals 1 when its condition holds and 0 otherwise.
Based on the asymmetric loss value l_t and the optimisation strategy, the variance Σ and the mean μ of the linear classifier are updated by the formulas of Step 33 and Step 34.
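For illustration, one cost-sensitive (asymmetric) hinge loss consistent with this description, in which mistakes on the positive class are weighted by ρ through an indicator, is sketched below; the exact loss formula is not reproduced in this text, so this particular form is an assumption.

def asymmetric_loss(p, y, rho):
    """Assumed cost-sensitive hinge loss: l_t = (rho if y = +1 else 1) * max(0, 1 - y * p)."""
    weight = rho if y == +1 else 1.0       # indicator-based class weight
    return weight * max(0.0, 1.0 - y * p)  # hinge on the margin p_t = mu^T x_t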
Compared with the prior art, the online active learning method for unlabeled imbalanced data streams of the present invention has the following advantages and technical effects:
The present invention exploits the second-order information of the samples and proposes new asymmetric strategies; these strategies jointly consider sample labeling and model updating, can better address the class-imbalance problem of the samples, and improve the classification performance of active learning models on streaming data.
Brief description of the drawings
Fig. 1 is a flow diagram of the online active learning method for unlabeled imbalanced data streams in the embodiment.
Fig. 2 is a flow diagram of the asymmetric query strategy in the embodiment.
Fig. 3 is a flow diagram of the asymmetric update strategy in the embodiment.
Fig. 4 shows the evaluation results of the online active learning method in the embodiment.
Specific embodiment
Specific implementations of the invention are further described below with reference to the drawings and specific embodiments. It should be noted that the described embodiments are only some of the embodiments of the present invention, not all of them.
As shown in Fig. 1, the online active learning method for unlabeled imbalanced data streams of this embodiment comprises the following steps:
Step 1: the unlabeled data stream is fed into a linear classifier in sequence for prediction, where the class distribution of the stream is highly imbalanced; the positive class is usually set as the rare class;
Step 2: following the proposed asymmetric query strategy, the linear classifier sequentially decides, for the unlabeled imbalanced data, which samples need to be labeled;
Step 3: following the proposed asymmetric update strategy, the linear classifier is updated with the wrongly predicted labeled data, and the second-order information of the samples is used to improve learning efficiency.
In Step 1, the unlabeled data stream can be written as {x_t}, t = 1, ..., T, where x_t ∈ R^d is a sample with d features and T is the total number of unlabeled samples. The labeling budget is B, the labels are y_t ∈ {−1, +1}, and the number of positive samples (y_t = +1) is far smaller than the number of negative samples (y_t = −1). The linear classifier is used as follows:
Step 11: the linear classifier is represented by a weight vector w that follows a multivariate Gaussian distribution w ~ N(μ, Σ), where μ is the mean of the linear classifier w and Σ is its variance;
Step 12: the class prediction of the linear classifier is ŷ_t = sign(μᵀx_t), where sign(·) returns +1 when μᵀx_t ≥ 0 and −1 otherwise;
Step 13: the prediction result of the linear classifier is: if ŷ_t = y_t, the classification is correct; otherwise, it is wrong.
As shown in Fig. 2, which is the flow diagram of the asymmetric query strategy of the invention, the asymmetric query strategy of Step 2 proceeds as follows:
Step 21: based on the second-order information of the sample, Σ (i.e., the variance of the linear classifier), compute the confidence c_t of the linear classifier on the current sample; the confidence is computed using the learning rate η of the linear classifier, the regularisation coefficient γ, and ρ_max = max(1, ρ), where ρ is the misclassification cost of the positive class; in addition, the second-order quantity x_tᵀΣ_t x_t represents the model's confidence about, i.e., its familiarity with, the current sample, and is used to compute c_t.
Step 22: based on the confidence c_t, compute the asymmetric query parameter of the current sample:
q_t = |p_t| + c_t
where p_t = μ_tᵀx_t is the prediction margin of the linear classifier on the current sample, and its absolute value |p_t| represents the distance of the prediction from the classification plane.
Step 23: based on the asymmetric query parameter q_t, perform Bernoulli sampling to obtain a sampled value; different sampling factors are set for samples of different predicted classes, with δ_+ the sampling factor for a positive prediction (p_t ≥ 0) and δ_− the sampling factor for a negative prediction (p_t < 0); Bernoulli sampling with the resulting sampling probability yields the sampled value Z_t.
Step 24: if the sampled value Z_t is 1, decide that the label of the sample needs to be queried, and spend budget to obtain it; if Z_t is 0, decide that the label does not need to be queried.
As shown in Fig. 3, which is the flow diagram of the asymmetric update strategy of the invention, the asymmetric update strategy of Step 3 proceeds as follows:
Step 31: obtain the wrongly predicted labeled data.
Step 32: based on the wrongly predicted labeled data, compute the asymmetric loss value, where ρ is the misclassification weight of the positive class and the indicator function equals 1 when its condition holds and 0 otherwise; with this cost-sensitive loss function, the linear classifier can be updated asymmetrically.
Step 33: based on the asymmetric loss value l_t and the optimisation strategy, update the variance Σ of the linear classifier, where γ denotes the regularisation coefficient.
Step 34: based on the asymmetric loss value l_t and the optimisation strategy, update the mean μ of the linear classifier:
μ_{t+1} = μ_t − η Σ_{t+1} g_t
where η is the learning rate of the linear classifier and g_t is the gradient of the asymmetric loss l_t, obtained by differentiating the loss function.
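As an illustrative sketch of Steps 31 to 34 (the covariance update formula is not reproduced in this text, so a standard AROW-style regularised second-order update is assumed here, together with the assumed asymmetric hinge loss above), the asymmetric update might look as follows.

import numpy as np

def asymmetric_update(clf, x, y, eta, gamma, rho):
    """Update mu and Sigma on a wrongly predicted labeled sample (sketch, assumed forms)."""
    p = clf.margin(x)
    weight = rho if y == +1 else 1.0                 # asymmetric class weight
    if weight * max(0.0, 1.0 - y * p) <= 0.0:        # assumed asymmetric hinge loss l_t
        return
    # assumed AROW-style covariance update:
    # Sigma_{t+1} = Sigma_t - Sigma_t x x^T Sigma_t / (gamma + x^T Sigma_t x)
    sx = clf.sigma @ x
    clf.sigma = clf.sigma - np.outer(sx, sx) / (gamma + float(x @ sx))
    # gradient of the assumed loss w.r.t. mu when the hinge is active: g_t = -weight * y * x
    g = -weight * y * x
    # mean update as in Step 34: mu_{t+1} = mu_t - eta * Sigma_{t+1} g_t
    clf.mu = clf.mu - eta * (clf.sigma @ g)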
Fig. 4 shows the performance of the online active learning method for unlabeled imbalanced data streams on the network-security data set w8a. In Fig. 4 the method is named OA3 and OA3_diag, where OA3_diag is a simple variant of the method that is not described in detail here. The other compared methods, such as PAA, OAAL, CSOAL, and SOAL, are classical approaches to this problem and serve as experimental baselines for the proposed method.
The w8a data set is a classical open-source data set for detecting whether a web page is abnormal. It contains 64,700 samples with 300 features. The number of normal pages far exceeds the number of abnormal pages, so the data are imbalanced, with an imbalance ratio of 1:32.5. In this example, abnormal pages are taken as the positive class (the minority class) and normal pages as the negative class (the majority class).
In the experiment, all training samples arrive sequentially and without labels. For the web page arriving at each moment, the proposed active learning method decides whether it needs to be labeled according to Step 2. If so, budget is spent as the labeling cost to obtain the label, and the model is updated according to Step 3.
The detailed experimental results are shown in Fig. 4; the proposed online active learning method for unlabeled imbalanced data streams achieves the best performance.
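A hedged end-to-end sketch of the streaming protocol described in this example (sequential, unlabeled arrival; query decision per Step 2; update per Step 3 under a labeling budget B) is given below; it reuses the illustrative helpers sketched earlier, and all names and the data-feeding code are assumptions.

def run_stream(clf, stream, budget, eta, gamma, rho, delta_pos, delta_neg):
    """Online active learning loop over a stream of (x_t, y_t) pairs whose labels start hidden (sketch)."""
    spent = 0
    for x, y_true in stream:                               # samples arrive one at a time, unlabeled
        y_pred = clf.predict(x)                            # Step 1: predict with the current model
        query, _ = should_query(clf, x, eta, gamma, rho,
                                delta_pos, delta_neg)      # Step 2: decide whether to ask for the label
        if query and spent < budget:
            spent += 1                                     # pay the labeling cost to obtain y_t
            if y_pred != y_true:                           # Step 3: update only on wrong predictions
                asymmetric_update(clf, x, y_true, eta, gamma, rho)
    return clf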
The online active learning method for unlabeled imbalanced data streams of this embodiment proposes an asymmetric query strategy for imbalanced data that dynamically determines which samples need to be labeled; to update the model effectively, the method further proposes an asymmetric update strategy and uses the second-order information of the samples to update the model efficiently; it is thus well suited to the sparse labels, class imbalance, and streaming data encountered in practical classification applications.

Claims (8)

1. An online active learning method for unlabeled imbalanced data streams, characterized by comprising the following steps:
Step 1: obtain an unlabeled data stream and feed it into a linear classifier in sequence for prediction, where the class distribution of the stream is highly imbalanced and the positive class is set as the rare class;
Step 2: following the proposed asymmetric query strategy, the linear classifier sequentially decides, for the unlabeled imbalanced data, which samples need to be labeled;
Step 3: following the proposed asymmetric update strategy, the linear classifier is updated with the wrongly predicted labeled data, and the second-order information of the samples is used to improve learning efficiency.
2. The online active learning method for unlabeled imbalanced data streams according to claim 1, characterized in that, in Step 1, the unlabeled data stream is written as {x_t}, t = 1, ..., T, where x_t ∈ R^d is a sample with d features and T is the total number of unlabeled samples; the labeling budget is B, the labels are y_t ∈ {−1, +1}, and the number of positive samples (y_t = +1) is far smaller than the number of negative samples (y_t = −1); the linear classifier is used as follows:
Step 11: the linear classifier is represented by a weight vector w that follows a multivariate Gaussian distribution w ~ N(μ, Σ), where μ is the mean of the linear classifier w and Σ is its variance;
Step 12: the class prediction of the linear classifier is ŷ_t = sign(μᵀx_t), where sign(·) returns +1 when μᵀx_t ≥ 0 and −1 otherwise;
Step 13: the prediction result of the linear classifier is: if ŷ_t = y_t, the classification is correct; otherwise, it is wrong.
3. The online active learning method for unlabeled imbalanced data streams according to claim 1, characterized in that the asymmetric query strategy of Step 2 proceeds as follows:
Step 21: based on the second-order information of the sample, Σ, i.e., the variance of the linear classifier, compute the confidence of the linear classifier on the current sample;
Step 22: based on the confidence, compute the asymmetric query parameter of the current sample;
Step 23: based on the asymmetric query parameter, perform Bernoulli sampling to obtain a sampled value;
Step 24: if the sampled value is 1, decide that the label of the sample needs to be queried; otherwise, it does not.
4. The online active learning method for unlabeled imbalanced data streams according to claim 1, characterized in that the asymmetric update strategy of Step 3 proceeds as follows:
Step 31: obtain the wrongly predicted labeled data;
Step 32: based on the wrongly predicted labeled data, compute the asymmetric loss value of the data;
Step 33: based on the asymmetric loss value and the optimisation strategy, update the variance Σ of the linear classifier;
Step 34: based on the asymmetric loss value and the optimisation strategy, update the mean μ of the linear classifier.
5. The online active learning method for unlabeled imbalanced data streams according to claim 3, characterized in that the confidence c_t is computed using the learning rate η of the linear classifier, the regularisation coefficient γ, and ρ_max = max(1, ρ), where ρ is the misclassification cost of the positive class; in addition, the second-order quantity x_tᵀΣ_t x_t represents the model's confidence about, i.e., its familiarity with, the current sample, and is used to compute c_t;
based on the confidence c_t, the asymmetric query parameter of the current sample is computed as:
q_t = |p_t| + c_t
where p_t = μ_tᵀx_t is the prediction margin of the linear classifier on the current sample, and its absolute value |p_t| represents the distance of the prediction from the classification plane;
based on the asymmetric query parameter q_t, Bernoulli sampling is performed to obtain a sampled value; different sampling factors are set for samples of different predicted classes, with δ_+ the sampling factor for a positive prediction (p_t ≥ 0) and δ_− the sampling factor for a negative prediction (p_t < 0); Bernoulli sampling with the resulting sampling probability yields the sampled value Z_t.
6. The online active learning method for unlabeled imbalanced data streams according to claim 4, characterized in that the asymmetric loss value is computed using ρ, the misclassification weight of the positive class, and an indicator function that equals 1 when its condition holds and 0 otherwise.
7. The online active learning method for unlabeled imbalanced data streams according to claim 4, characterized in that, in Step 33, the variance Σ of the linear classifier is updated based on the asymmetric loss value l_t and the optimisation strategy, where γ denotes the regularisation coefficient.
8. The online active learning method for unlabeled imbalanced data streams according to claim 4, characterized in that, in Step 34, the mean μ of the linear classifier is updated based on the asymmetric loss value l_t and the optimisation strategy by:
μ_{t+1} = μ_t − η Σ_{t+1} g_t
where η is the learning rate of the linear classifier and g_t is the gradient of the asymmetric loss l_t, obtained by differentiating the loss function.
CN201910001840.2A 2018-12-31 2019-01-02 Online active learning method for unlabeled imbalanced data streams Pending CN109800799A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/114167 WO2020140597A1 (en) 2018-12-31 2019-10-29 Online active learning method applicable to unlabeled unbalanced data stream

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811652531 2018-12-31
CN2018116525313 2018-12-31

Publications (1)

Publication Number Publication Date
CN109800799A (en) 2019-05-24

Family

ID=66558426




Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113095423B (en) * 2021-04-21 2024-05-28 南京大学 Stream data classification method based on online anti-deduction learning and realization device thereof
CN113537630A (en) * 2021-08-04 2021-10-22 支付宝(杭州)信息技术有限公司 Training method and device of business prediction model

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150235160A1 (en) * 2014-02-20 2015-08-20 Xerox Corporation Generating gold questions for crowdsourcing
CN106056130A (en) * 2016-05-18 2016-10-26 Tianjin University Combined downsampling linear discrimination classification method for unbalanced data sets
CN109101993A (en) * 2018-07-05 2018-12-28 Hangzhou Dianzi University Classification method for online imbalanced streaming data
CN109800799A (en) * 2018-12-31 2019-05-24 South China University of Technology Online active learning method for unlabeled imbalanced data streams

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020140597A1 (en) * 2018-12-31 2020-07-09 华南理工大学 Online active learning method applicable to unlabeled unbalanced data stream
CN110647117A (en) * 2019-09-06 2020-01-03 青岛科技大学 Chemical process fault identification method and system
CN110647117B (en) * 2019-09-06 2020-12-18 青岛科技大学 Chemical process fault identification method and system
CN111882063A (en) * 2020-08-03 2020-11-03 清华大学 Data annotation request method, device, equipment and storage medium suitable for low budget
CN113360512A (en) * 2021-06-21 2021-09-07 特赞(上海)信息科技有限公司 Model updating method and device based on user feedback and storage medium
CN113360512B (en) * 2021-06-21 2023-10-27 特赞(上海)信息科技有限公司 Image processing model updating method and device based on user feedback and storage medium

Also Published As

Publication number Publication date
WO2020140597A1 (en) 2020-07-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190524