CN110866560A - Symmetric low-rank representation subspace clustering method based on structural constraint - Google Patents

Symmetric low-rank representation subspace clustering method based on structural constraint

Info

Publication number
CN110866560A
Authority
CN
China
Prior art keywords
matrix
data
clustering
representation
structural constraint
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911118690.XA
Other languages
Chinese (zh)
Inventor
陶洋
鲍灵浪
胡昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University of Post and Telecommunications
Original Assignee
Chongqing University of Post and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University of Post and Telecommunications filed Critical Chongqing University of Post and Telecommunications
Priority to CN201911118690.XA priority Critical patent/CN110866560A/en
Publication of CN110866560A publication Critical patent/CN110866560A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213 - Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2135 - Feature extraction based on approximation criteria, e.g. principal component analysis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/23 - Clustering techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 - Classification, e.g. identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention provides a symmetric low-rank representation subspace clustering method based on structural constraint, which comprises the following steps. S1: acquiring a data matrix X' of the original images; S2: carrying out PCA (principal component analysis) dimensionality reduction on the data matrix X' obtained in step S1 to obtain X; S3: constructing an information error-correction matrix R for X using angle-based weights; S4: inputting the data matrix X obtained in step S2 and the error-correction matrix R obtained in step S3 into a structure-constrained symmetric low-rank representation model and optimizing it to output a representation matrix Z; S5: obtaining a weight matrix L for spectral clustering from the angle information of the principal directions of the representation matrix Z output in S4; S6: using the weight matrix L obtained in S5 for spectral clustering to obtain the clustering result. Extensive experimental results on two data sets show that, compared with several recent methods, the method can better reveal the structure of complex subspaces and delivers state-of-the-art clustering performance.

Description

Symmetric low-rank representation subspace clustering method based on structural constraint
Technical Field
The invention belongs to the field of subspace clustering, and particularly relates to a subspace clustering method based on structural constraint and symmetric low-rank representation.
Background
Clustering algorithms for large-scale, high-dimensional data have recently become one of the hot topics, and one of the difficulties, in the field of cluster analysis. Owing to the sparsity of high-dimensional data, traditional clustering algorithms cannot obtain satisfactory results when clustering such data. Subspace clustering algorithms were introduced to address this limitation of traditional clustering on high-dimensional data and form a new branch of clustering algorithms. Subspace clustering methods, which reveal the latent subspace structure of high-dimensional data by segmenting the data into the subspaces from which they are drawn, have been widely used in many computer vision and machine learning applications, such as saliency detection, motion segmentation, face clustering, and image segmentation. In fact, high-dimensional data have a remarkable property: they are not structureless, but have been shown to lie in a union of multiple low-dimensional subspaces. Based on this observation, subspace clustering methods have been studied for clustering high-dimensional data, and combining a representation-based method with a spectral clustering algorithm is the most representative approach.
The most representative methods to date are low-rank representation (LRR), proposed by Liu et al., and sparse subspace clustering (SSC), proposed by Elhamifar et al.; on the basis of these two methods, various constraints have been added by researchers to achieve better performance. Ni et al. proposed a positive semidefinite constrained low-rank representation (LRR-PSD), and Vidal et al. proposed low-rank subspace clustering (LRSC), a non-convex model that decomposes the corrupted data matrix into the sum of a clean, self-expressive dictionary plus noise and/or gross error matrices. Chen et al. proposed low-rank representation with symmetric constraint (LRRSC), which extends the original low-rank representation algorithm by integrating a symmetry constraint into the low-rank property of the high-dimensional data representation. However, these methods consider only low rank or only sparsity, which causes the representation coefficients to be either too sparse or too low-rank.
The invention provides a symmetric low-rank representation subspace clustering method based on structural constraint, which applies both the symmetric constraint and the structural constraint to the low-rank representation coefficients. It can capture both the global and the local structure of the data while keeping the representation matrix symmetric and consistent. The model balances the inter-class sparsity and the intra-class aggregation of the coefficient matrix well and can better reveal the subspace structure of the data.
Disclosure of Invention
The subspace clustering method based on structure-constrained symmetric low-rank representation applies the structural constraint and the symmetric constraint simultaneously to the solution of the low-rank representation model. It constructs an information error-correction matrix R using angle-based weights, performs optimization with the alternating direction method (ADM) to obtain the optimal representation coefficient matrix Z, computes a weight matrix L from the principal-direction information of the representation coefficients, and finally uses L for clustering to obtain the clustering result.
The technical scheme of the method is as follows:
step S1, acquiring a data matrix X' of the original image:
given a data matrix X' = [x_1, x_2, …, x_n] ∈ R^(d×n) drawn from a union of k subspaces {S_i, i = 1, …, k}, where each subspace S_i contains n_i data samples, the task is to cluster each data point x_i into the subspace S_i from which it is drawn. In practice, preprocessing is performed by resizing each 192 × 168 face image of the original data to 48 × 42 pixels, i.e., the original dimension of the data matrix is 48 × 42 = 2016.
Step S2, using PCA to reduce the dimensionality of the data matrix X' obtained in step S1 to obtain the data matrix X: the data are reduced in dimension using PCA to obtain the raw pixels (2016 dimensions) and 500-, 300-, and 100-dimensional data, respectively.
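For illustration only, a minimal Python sketch of steps S1 and S2 is given below; the file handling, the use of Pillow for resizing, and the use of scikit-learn PCA are assumptions not stated in the original text.

```python
import numpy as np
from PIL import Image
from sklearn.decomposition import PCA

def build_data_matrix(image_paths, size=(42, 48)):
    """Step S1 (assumed file layout): resize each 192x168 face image to
    48x42 pixels and stack the vectorized images as columns of X'
    (d x n, here d = 48 * 42 = 2016)."""
    cols = []
    for path in image_paths:
        img = Image.open(path).convert("L").resize(size)  # PIL size is (width, height)
        cols.append(np.asarray(img, dtype=np.float64).ravel())
    return np.stack(cols, axis=1)  # X' in R^(d x n)

def reduce_dimension(X_prime, dim=500):
    """Step S2: PCA-reduce the d x n data matrix X' to dim x n."""
    pca = PCA(n_components=dim)
    # scikit-learn expects samples as rows, so transpose in and out
    return pca.fit_transform(X_prime.T).T
```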
Step S3, constructing the information error-correction matrix R of the data matrix X obtained in step S2 using angle-based weights:
The weight between data points can be determined by the angle between them. If the angle between two data points is small, they are likely to belong to the same class, and it can be inferred that the weight of samples within a class should be small. The data are therefore normalized and the absolute values of the inner products between the normalized data points are calculated. The ideal R is then given by the formula rendered as an image in the original publication, in which the normalized versions of x_i and x_j appear and σ is set empirically to the mean of the elements.
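A minimal sketch of this construction follows. Since the exact formula is only shown as an image in the original publication, the exponential weighting used below, with σ set to the mean of the angle terms, is an assumption consistent with the surrounding description.

```python
import numpy as np

def angle_based_weights(X):
    """Step S3 (assumed form): after normalizing each column of X to unit
    length, a small angle (large |x_i^T x_j|) yields a small weight.
    Here R_ij = 1 - exp(-(1 - |x_i^T x_j|) / sigma), with sigma the mean
    of the (1 - |x_i^T x_j|) terms; the precise formula in the filing
    may differ."""
    Xn = X / (np.linalg.norm(X, axis=0, keepdims=True) + 1e-12)
    cos_abs = np.abs(Xn.T @ Xn)        # |cosine| of the angle between columns
    theta = 1.0 - cos_abs              # small when two points are close in angle
    sigma = theta.mean()               # empirically set to the mean of the elements
    R = 1.0 - np.exp(-theta / sigma)
    np.fill_diagonal(R, 0.0)
    return R
```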
S4, inputting the data matrix X obtained in step S2 and the error-correction matrix R obtained in step S3 into the constructed structure-constrained symmetric low-rank model, and optimizing it to output the representation matrix Z;
in order to impose both the symmetric constraint and the structural constraint on the low-rank representation, a natural idea is to constrain the structure of the solutions of the LRR model by adding the term Σ_{i,j} R_ij|Z_ij| to the objective function and requiring Z_ij = Z_ji. At the same time, in order to make the obtained Z more robust to noise and to avoid the NP-hard rank minimization problem, the structure-constrained symmetric low-rank representation model is:
min_{Z,E} ||Z||_* + λ||E||_{2,1} + β Σ_{i,j} R_ij|Z_ij|
s.t. X = AZ + E, Z = Z^T,
where ||·||_* denotes the nuclear norm, ||·||_{2,1} the l_{2,1} norm, A the dictionary matrix, and λ and β penalty parameters.
By introducing two auxiliary variables J and L and applying the ADM method, the model is optimized alternately; the augmented Lagrangian function of the above formula (rendered as an image in the original publication) is obtained,
where Y1, Y2 and Y3 are Lagrange multipliers and μ is a penalty factor. The augmented Lagrangian function is optimized by updating one variable at a time while keeping the other variables fixed.
Optimization step 1: update the J matrix (the update formula is rendered as an image in the original publication);
Optimization step 2: update the L matrix (formula rendered as an image in the original publication);
Optimization step 3: update the Z matrix (formula rendered as an image in the original publication);
Optimization step 4: update the E matrix (formula rendered as an image in the original publication);
Optimization step 5: after J, L, Z and E have been updated in sequence, the multipliers Y1, Y2, Y3 and the penalty factor μ are updated by the formula rendered as an image in the original publication.
The above steps are iterated until convergence, at which point the updating stops and the output representation matrix Z is obtained.
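Because the five update formulas are given only as images in the original publication, the sketch below shows one plausible realization of the alternating direction scheme under assumed choices: dictionary A = X, constraint splitting Z = J and Z = L, symmetry enforced by symmetrizing Z, standard proximal updates, and the μ schedule from the embodiment. It is an illustrative reconstruction, not the filed algorithm.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def l21_prox(M, tau):
    """Column-wise shrinkage: proximal operator of tau * ||.||_{2,1}."""
    norms = np.linalg.norm(M, axis=0)
    return M * np.maximum(1.0 - tau / (norms + 1e-12), 0.0)

def scslr_adm(X, R, lam=2.5, beta=0.03, mu=0.1, rho=1.01,
              mu_max=1e5, tol=1e-6, max_iter=500):
    """Schematic ADM solver for the assumed splitting
        min ||J||_* + lam*||E||_{2,1} + beta*sum_ij R_ij*|L_ij|
        s.t. X = X Z + E, Z = J, Z = L, Z symmetric."""
    d, n = X.shape
    Z = np.zeros((n, n)); J = np.zeros((n, n)); L = np.zeros((n, n))
    E = np.zeros((d, n))
    Y1 = np.zeros((d, n)); Y2 = np.zeros((n, n)); Y3 = np.zeros((n, n))
    XtX, I = X.T @ X, np.eye(n)
    for _ in range(max_iter):
        J = svt(Z + Y2 / mu, 1.0 / mu)                       # nuclear-norm step
        V = Z + Y3 / mu                                      # weighted l1 step
        L = np.sign(V) * np.maximum(np.abs(V) - beta * R / mu, 0.0)
        rhs = X.T @ (X - E + Y1 / mu) + (J - Y2 / mu) + (L - Y3 / mu)
        Z = np.linalg.solve(XtX + 2.0 * I, rhs)              # least-squares step
        Z = 0.5 * (Z + Z.T)                                  # enforce Z = Z^T
        E = l21_prox(X - X @ Z + Y1 / mu, lam / mu)          # l_{2,1} step
        h1, h2, h3 = X - X @ Z - E, Z - J, Z - L             # constraint residuals
        Y1 += mu * h1; Y2 += mu * h2; Y3 += mu * h3          # multiplier updates
        mu = min(rho * mu, mu_max)                           # penalty schedule
        if max(np.abs(h1).max(), np.abs(h2).max(), np.abs(h3).max()) < tol:
            break
    return Z, E
```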
Step S5: obtaining the weight matrix L for spectral clustering from the angle information of the principal directions of the representation matrix Z output in step S4. Specifically, let the SVD of Z be U*Σ*(V*)^T, where U* and V* are orthogonal bases of Z. Each column of U* is weighted by (Σ*)^{1/2} and each row of V* is weighted by (Σ*)^{1/2}; defining M = U*(Σ*)^{1/2} and N = (Σ*)^{1/2}(V*)^T gives Z = MN. Finally, the affinity graph matrix L is defined from the angle information of all row vectors of M (or, equivalently, all column vectors of N), using the formula rendered as an image in the original publication, where m_i, m_j denote rows of M and n_i, n_j denote columns of N.
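The affinity formula itself is an image in the original filing; the sketch below assumes a power-of-cosine affinity on the rows of M with exponent 2α, in line with the description of using angle information of the principal directions. The exponent and the rank truncation are assumptions.

```python
import numpy as np

def affinity_from_representation(Z, alpha=2, rank_tol=1e-8):
    """Step S5 (assumed form): build the spectral-clustering weight matrix
    from the angles between the row vectors of M = U* (Sigma*)^{1/2}."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    k = int((s > rank_tol).sum())               # keep the effective rank of Z
    M = U[:, :k] * np.sqrt(s[:k])               # M = U* (Sigma*)^{1/2}
    Mn = M / (np.linalg.norm(M, axis=1, keepdims=True) + 1e-12)
    W = np.abs(Mn @ Mn.T) ** (2 * alpha)        # angle-based affinity (assumed exponent)
    return W
```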
Step S6: compared with the prior art, the method has the following technical effects that the weight matrix L obtained in the S5 is used for spectral clustering Ncluts to obtain clustering results: a symmetrical low-rank representation model with structural constraint is constructed, the representation coefficient Z with more block diagonal property is obtained by simultaneously applying the symmetrical constraint and the structural constraint to the low-rank representation, the natural structure of the subspace can be disclosed better, and the method has more favorable advantages no matter from global symmetrical low-rank property or weighted local linearity. The invention proves that the image clustering has more superior performance in the aspect of subspace clustering.
Drawings
FIG. 1 is a flow chart of the operation of a subspace clustering method based on a symmetric low-rank representation of structural constraints according to the present invention.
FIG. 2 shows example data from the Yale and Hopkins data sets.
FIG. 3 shows the clustering accuracy of SCSLR on the Yale and Hopkins data sets.
FIG. 4 shows the convergence curves of the SCSLR optimization algorithm on the Yale and Hopkins data sets.
Table 1 performance of each algorithm on the Yale data set.
Table 2 performance of each algorithm on the Hopkins dataset.
Detailed Description
The flow of the structure-constrained symmetric low-rank representation subspace clustering method is shown in FIG. 1: first, the data are resized so that the experiments run faster; second, PCA is used for dimensionality reduction; third, the required information error-correction matrix R is computed; fourth, the main model of the invention is built and optimized alternately with ADM; fifth, the angle information of the principal directions of the representation matrix Z is used to obtain the weight matrix L for spectral clustering; finally, the obtained clustering results are compared and analyzed. A schematic end-to-end sketch follows.
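The hypothetical driver below simply strings together the helper functions sketched earlier in this description; the data path, reduced dimension, parameter values, and cluster count are placeholders, not the settings of the original experiments.

```python
import glob

# Placeholder end-to-end run (all names refer to the sketches above)
image_paths = sorted(glob.glob("yale/*.pgm"))      # placeholder data path
X_prime = build_data_matrix(image_paths)           # S1: resize + vectorize
X = reduce_dimension(X_prime, dim=500)             # S2: PCA
R = angle_based_weights(X)                         # S3: angle-based weights
Z, E = scslr_adm(X, R, lam=2.5, beta=0.03)         # S4: ADM optimization
W = affinity_from_representation(Z, alpha=2)       # S5: affinity matrix
labels = cluster_from_affinity(W, n_clusters=10)   # S6: spectral clustering
```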
The invention is further illustrated by the following example of an embodiment, which is intended only for a better understanding of the subject matter of the invention and is not intended to limit the scope of the invention. The specific technical steps are as follows:
step S1, this embodiment adopts the face and motion data set given by Yale and Hopkins, the Yale data set includes 10 persons, 64 pieces of photo data with different information for each person, and 640 pieces of face data, and fig. 2(top) is the photo data of a single person in the Yale data set. We adjusted the pixels of each person in the Yale dataset from 192 x 168 to 48 x 42 pixels.
The Hopkins dataset contains 155 video sequences, each matched to a low-dimensional subspace, and containing 39 to 550 data vectors extracted from two or three motions, fig. 2(bottom) being some frames of tracked feature points for the two video sequences.
In this embodiment, experiments are performed on the two data sets, each run once, and the accuracy and its standard deviation are calculated for each clustering.
Step S2, after the samples of the two data sets are normalized, PCA reduces the face data set to 2016, 500, 300, and 100 dimensions, and reduces the Hopkins data to 4n dimensions, where n is the number of subspaces.
Step S3, in this embodiment, after the two data sets have been reduced in dimension, the information error-correction matrix R is calculated from the normalized data matrix.
Step S4, the data matrix X and the error-correction matrix R are input into the constructed model; the dimensions used for the Yale data set are 2016, 500, 300 and 100, respectively, and the dimension used for the Hopkins data set is 4n, where n is the number of subspaces.
The penalty parameters λ and β are determined.
According to the augmented Lagrangian of the model, this embodiment initializes the parameters; the parameter list used in this embodiment is:
Parameter: J, L, Z, E, λ, β, Y1, Y2, Y3, μmax, ρ, μ
Value: 0, 0, 0, 0, 2.5, 0.03, 0, 0, 0, 10^5, 1.01, 0.1
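For reference, the table values can be read as the following initialization; collecting them into one dictionary, and the key names used, are illustrative assumptions.

```python
# Initial values from the parameter table above (grouping is assumed)
init_params = {
    "J": 0, "L": 0, "Z": 0, "E": 0,           # matrices initialized to zero
    "Y1": 0, "Y2": 0, "Y3": 0,                # Lagrange multipliers start at zero
    "lambda": 2.5, "beta": 0.03,              # regularization weights
    "mu": 0.1, "rho": 1.01, "mu_max": 1e5,    # ADM penalty schedule
}
```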
Training of the model then starts. Using the alternating direction method, the J, L, Z and E matrices are updated and optimized in sequence, and then the multipliers Y1, Y2, Y3 and the penalty factor μ are updated by the formula rendered as an image in the original publication.
The loop is repeated until convergence, yielding the Z and E matrices.
The SCSLR model is now set up; it has two inputs (the data matrix X and the error-correction matrix R) and two outputs (the representation matrix Z and the error matrix E).
Step S5, this embodiment determines the penalty parameter α and constructs the affinity matrix L using the representation matrix Z output by the model.
Step S6, in this embodiment, L is input into the spectral clustering (Ncuts) algorithm to compute the clustering result. Experiments are performed on the two data sets, and the clustering error changes as the parameters are adjusted; FIG. 3 shows how the clustering error varies with the parameters for the two data sets. As can be seen from FIG. 3, the clustering error of the model is very low, indicating very good clustering accuracy.
In this example, SCSLR is compared with low-rank- or sparsity-based algorithms; the clustering error is obtained after running the experiment once, and the experimental results are shown in Tables 1 and 2.
From the results above it can be seen that, in terms of clustering error rate, the SCSLR algorithm is much lower than the conventional low-rank and sparse algorithms, and in most cases far lower than the latest symmetric low-rank algorithm.
We use an optimization algorithm based on the alternating direction method to solve the SCSLR model and examine its convergence on the different data sets. Convergence is shown by plotting the objective function value against the number of iterations; the curve generally decreases as the number of iterations increases. The convergence curves are shown in FIG. 4, indicating that the ADM-based algorithm converges well.
Therefore, the SCSLR algorithm is superior to the conventional low-rank- or sparsity-based methods in terms of accuracy and stability, and also achieves better clustering performance.
Table 1: representation of algorithms in Yale data set
Figure BDA0002274802930000041
Table 2: performance of algorithms on the Hopkins dataset
(table rendered as an image in the original publication)

Claims (7)

1. A subspace clustering method based on structure-constrained symmetric low-rank representation, characterized by comprising the following steps:
s1: acquiring a data matrix X' of an original image;
s2: carrying out PCA (principal component analysis) dimensionality reduction on the data matrix X' obtained in the step S1 to obtain X;
s3: constructing an information error correction matrix R of the data matrix X obtained in the step S2 by using the angle decision weight;
s4: inputting the data matrix X obtained in the step S2 and the error correction matrix R obtained in the step S3 into a constructed symmetrical low-rank model based on structural constraint, and optimizing an output representation matrix Z;
s5: obtaining a weight matrix L for spectral clustering using the angle information indicating the principal direction of the matrix Z output at S4;
s6: the weight matrix L obtained in S5 is used for spectral clustering to obtain a clustering result.
2. The subspace clustering method based on structure-constrained symmetric low-rank representation according to claim 1, wherein in step S1: the acquired data matrix of the original images is X' = [x_1, x_2, …, x_n] ∈ R^(d×n), drawn from k subspaces {S_i, i = 1, …, k}, where each subspace S_i contains n_i data samples; the task is to cluster each data point x_i into the subspace S_i from which it is drawn.
3. The subspace clustering method based on structure-constrained symmetric low-rank representation according to claim 1, wherein in step S2: PCA dimensionality reduction is carried out on the data matrix X' obtained in step S1 to obtain the data matrix X.
4. The subspace clustering method based on structure-constrained symmetric low-rank representation according to claim 1, wherein in step S3: the information error-correction matrix R is constructed using angle-based weights:
the weight between data points is determined by the angle between them; if the angle between two data points is small, they are likely to belong to the same class, and the weight of samples within a class should be small; the data are therefore normalized and the absolute values of the inner products between the normalized data points are calculated, and the ideal R is given by the formula rendered as an image in the original publication, in which the normalized versions of x_i and x_j appear and σ is set empirically to the mean of the elements.
5. The subspace clustering method based on structure-constrained symmetric low-rank representation according to claim 1, wherein in step S4: the structure-constrained symmetric low-rank representation model is constructed:
in order to impose both the symmetric constraint and the structural constraint on the low-rank representation, the structure of the solutions of the LRR model is constrained by adding the term Σ_{i,j} R_ij|Z_ij| to the objective function and requiring Z_ij = Z_ji; at the same time, in order to make the obtained Z more robust to noise and to avoid the NP-hard rank minimization problem, the structure-constrained symmetric low-rank representation model is:
min_{Z,E} ||Z||_* + λ||E||_{2,1} + β Σ_{i,j} R_ij|Z_ij|
s.t. X = AZ + E, Z = Z^T;
the constructed model applies the symmetric constraint and the structural constraint to the low-rank representation simultaneously to obtain a representation coefficient matrix Z with a stronger block-diagonal structure, and the solution is obtained by introducing two auxiliary variables J and L and using the ADM method to update and optimize the representation matrix Z and the error matrix E.
6. The subspace clustering method based on structure-constrained symmetric low-rank representation according to claim 1, wherein in step S5: the weight matrix L for spectral clustering is obtained from the angle information of the principal directions of the representation matrix Z output in S4; specifically, with the SVD Z = U*Σ*(V*)^T, define M = U*(Σ*)^{1/2} and N = (Σ*)^{1/2}(V*)^T, and the affinity graph matrix L is defined using the angle information of all row vectors of matrix M or all column vectors of matrix N.
7. The subspace clustering method based on structure-constrained symmetric low-rank representation according to claim 1, wherein in step S6: the weight matrix L obtained in S5 is used for spectral clustering to obtain the clustering result.
CN201911118690.XA 2019-11-15 2019-11-15 Symmetric low-rank representation subspace clustering method based on structural constraint Pending CN110866560A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911118690.XA CN110866560A (en) 2019-11-15 2019-11-15 Symmetric low-rank representation subspace clustering method based on structural constraint

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911118690.XA CN110866560A (en) 2019-11-15 2019-11-15 Symmetric low-rank representation subspace clustering method based on structural constraint

Publications (1)

Publication Number Publication Date
CN110866560A true CN110866560A (en) 2020-03-06

Family

ID=69653799

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911118690.XA Pending CN110866560A (en) 2019-11-15 2019-11-15 Symmetric low-rank representation subspace clustering method based on structural constraint

Country Status (1)

Country Link
CN (1) CN110866560A (en)


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107292258A (en) * 2017-06-14 2017-10-24 南京理工大学 High spectrum image low-rank representation clustering method with filtering is modulated based on bilateral weighted

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
KEWEI TANG et al.: "Structure-Constrained Low-Rank Representation", IEEE Transactions on Neural Networks and Learning Systems, 27 February 2014 (2014-02-27), pages 2167-2179 *
LIU Jie et al.: "Structure graph regularized low-rank subspace clustering", Computer Engineering and Applications, no. 18, 15 September 2018 (2018-09-15), pages 1-7 *
ZHANG Yan: "An improved structured LRR algorithm", Modern Computer (Professional Edition), no. 22, 5 August 2018 (2018-08-05), pages 8-13 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111930934A (en) * 2020-06-05 2020-11-13 江苏理工学院 Clustering method based on dual local consistency constraint sparse concept decomposition
CN111930934B (en) * 2020-06-05 2023-12-26 江苏理工学院 Clustering method based on constraint sparse concept decomposition of dual local agreement

Similar Documents

Publication Publication Date Title
CN109522956B (en) Low-rank discriminant feature subspace learning method
Wang et al. Tensor low-rank constraint and $ l_0 $ total variation for hyperspectral image mixed noise removal
Lin et al. Hyperspectral image denoising via matrix factorization and deep prior regularization
Wang et al. Reweighted low-rank matrix analysis with structural smoothness for image denoising
CN109190511B (en) Hyperspectral classification method based on local and structural constraint low-rank representation
CN106326843B (en) A kind of face identification method
CN108182449A (en) A kind of hyperspectral image classification method
CN109993208B (en) Clustering processing method for noisy images
CN107316309B (en) Hyperspectral image saliency target detection method based on matrix decomposition
CN108520495B (en) Hyperspectral image super-resolution reconstruction method based on clustering manifold prior
CN105550649A (en) Extremely low resolution human face recognition method and system based on unity coupling local constraint expression
CN109376787A (en) Manifold learning network and computer visual image collection classification method based on it
CN114419406A (en) Image change detection method, training method, device and computer equipment
CN116152544A (en) Hyperspectral image classification method based on residual enhancement spatial spectrum fusion hypergraph neural network
CN113838104B (en) Registration method based on multispectral and multimodal image consistency enhancement network
CN109447147B (en) Image clustering method based on depth matrix decomposition of double-image sparsity
CN108121964B (en) Matrix-based joint sparse local preserving projection face recognition method
CN110866560A (en) Symmetric low-rank representation subspace clustering method based on structural constraint
CN113221992A (en) Based on L2,1Large-scale data rapid clustering method of norm
CN111062888B (en) Hyperspectral image denoising method based on multi-target low-rank sparsity and spatial-spectral total variation
CN112784747A (en) Multi-scale eigen decomposition method for hyperspectral remote sensing image
CN105184320B (en) The image classification method of non-negative sparse coding based on structural similarity
CN108428226A (en) A kind of distorted image quality evaluating method based on ICA rarefaction representations and SOM
CN112417234B (en) Data clustering method and device and computer readable storage medium
CN111738298B (en) MNIST handwriting digital data classification method based on deep-wide variable multi-core learning

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200306