CN109858533A - An RBF-MLP hybrid-structure neural network classifier with a prepended kernel - Google Patents
An RBF-MLP hybrid-structure neural network classifier with a prepended kernel
- Publication number: CN109858533A (application CN201910049409.5A)
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
The invention discloses an RBF-MLP hybrid-structure neural network classifier with a prepended kernel, comprising an input layer and an output layer; an RBF hidden layer, composed of a group of Gaussian kernel functions with different parameters; and an MLP hidden layer, connecting each node of the RBF hidden layer to the output layer. The invention combines the good stability of RBF networks with the strong generalization ability of MLP networks: after the RBF kernel mapping, the separability of the original samples is improved, and using the mapped samples as the input of the subsequently connected MLP network improves the convergence rate of the MLP neural network, reduces the risk of falling into a local minimum, and lessens the dependence on empirical selection of the MLP hidden-node parameters. The training sample space can therefore be learned more effectively, and the classification performance exceeds that of a single RBF network or MLP network.
Description
Technical field
The invention belongs to the field of pattern recognition, and in particular relates to an RBF-MLP hybrid-structure neural network classifier with a prepended kernel.
Background art
In RBF neural networks, it is generally assumed that, by optimizing the selection of the hidden-layer kernel parameters, the nonlinear problem of the original sample space can largely be converted into a linear one, so that the remaining task is to find an optimal hyperplane for the pattern recognition problem after the nonlinear kernel mapping. In practice, however, for more complex nonlinear problems it is usually difficult to map the nonlinear problem of the original sample space directly into a linear one; that is, linear separability of the classifier in the kernel space generally cannot be guaranteed. This limits the linear classification algorithms applied at the output of such optimized RBF networks.
Summary of the invention
To address the above problems of the prior art, the object of the present invention is to provide an RBF-MLP hybrid-structure neural network classifier with a prepended kernel.
The technical scheme adopted by the invention is as follows:
An RBF-MLP hybrid-structure neural network classifier with a prepended kernel comprises an input layer and an output layer; it further comprises an RBF hidden layer, composed of a group of Gaussian kernel functions with different parameters, and an MLP hidden layer, connecting each node of the RBF hidden layer to the output layer.
Further, the input layer is composed of t source nodes, where t is the dimension of the input vector x, i.e. x ∈ R^t.
Further, the computation of the RBF hidden layer is expressed by the following formula:
φ_i(x) = exp(−‖x − μ_i‖² / (2σ_i²)), i = 1, 2, …, K,
where K is the number of nodes in the RBF hidden layer, μ_i is the center of the i-th Gaussian radial basis function, and σ_i is the kernel width of the corresponding i-th Gaussian radial basis function.
Further, the MLP hidden layer connects each node of the RBF hidden layer to the output layer. The output h_i(x) of the RBF hidden layer after dual polarization serves as the input of the first MLP hidden layer; the induced local field at node i of the l-th MLP hidden layer is then expressed as
v_i^(l) = Σ_j w_ij^(l) y_j^(l−1),
where w_ij^(l) is the weight pointing from node j of layer l−1 to node i of layer l, and y_j^(l−1) is the output signal of node j in the preceding layer l−1.
Further, for the output layer, let the depth of the MLP network be L, where the depth equals the total number of layers of the MLP network, i.e. its input layer, hidden layers and output layer counted together; at the k-th node of the output layer, the network output o_k is computed.
Further, the MLP hidden layer may be a single layer.
Further, there may be two MLP hidden layers.
The invention has the following benefits:
The classifier of the invention combines the strengths of RBF and MLP networks while effectively suppressing the deficiencies of a single MLP or RBF network. On one hand, the network structure of the invention replaces the linear classifier from the hidden layer to the output layer of the original RBF network with a nonlinear classifier, realized by a nonlinear MLP neural network, which effectively reduces the dependence on the choice of the original RBF kernel parameters. On the other hand, the network model of the invention combines the good stability of RBF networks with the strong generalization ability of MLP networks: after the RBF kernel mapping, the separability of the original samples is improved, and feeding the mapped samples into the subsequently connected MLP network improves the convergence rate of the MLP neural network, reduces the risk of falling into a local minimum, and lessens the dependence on empirical selection of the MLP hidden-node parameters. The training sample space can therefore be learned more effectively, and the classification performance exceeds that of a single RBF network or MLP network.
Detailed description of the invention
Fig. 1 is the RBF-MLP mixed structure neural network classifier mapping effect diagram of prepositus nucleus of the present invention.
Fig. 2 is the RBF-MLP mixed structure neural network real network illustraton of model that the present invention has single MLP hidden layer.
Fig. 3 is that there are two the RBF-MLP mixed structure neural network real network illustratons of model of MLP hidden layer for present invention tool.
Specific embodiments
In order to understand the technical solution and advantages of the present invention more clearly, the invention is further elaborated below with reference to the accompanying drawings and specific embodiments.
For an RBF network, the main factors determining network performance are the selection and optimization of the hidden-node kernel parameters and the adjustment of the output weights. Existing RBF output-layer weights are generally obtained with linear classification algorithms, including recursive least squares (RLS), least mean squares (LMS) and orthogonal least squares (OLS). For more complex nonlinear problems, linear separability of the samples in the kernel space often cannot be guaranteed, which increases the dependence on the choice of the hidden-node kernel parameters.
Based on the above problems, the present invention replaces the linear classifier originally connecting the RBF hidden layer to the output layer with a nonlinear classifier, realized by an MLP network. Merging the two different network structures in this way forms a network model with a complementary structure, which can improve the convergence rate of the MLP network, reduce the reliance on the choice of RBF hidden-node kernel parameters, and improve the generalization performance of the network.
In the kernel-prepended RBF-MLP hybrid-structure neural network classifier proposed by the present invention, the RBF part realizes a localized approximation of the training sample space: it maps the samples of different regions of the sample space to the neighborhoods of the vertices of a unit hypercube, whose dimension equals the number of hidden-node kernels of the RBF part of the RBF-MLP hybrid network. For convenience, it is assumed here that the input samples covered by different RBF hidden nodes lie near the centers of the corresponding hidden nodes, and that different hidden nodes do not overlap. For different classification problems, different mapping effects on the sample space can be achieved by changing the number of prepended RBF kernels.
In an actual classification problem, let the number of hidden-node kernels of the RBF part be K. For an arbitrary input vector x ∈ R^t, after the sample is mapped by the localization of some RBF kernel in the proposed hybrid-structure classifier, the mapping relation can be expressed as f: R^t → (0, 1]^K. In this way, outliers of the original sample space are confined to a bounded region, and the geometric shape of the sample distribution in the sample space can be used as a mapping feature to form new feature vectors, so that the separability of the original sample space is improved. The nonlinear MLP network can then complete an effective classification of the feature-space samples after the RBF kernel mapping, which reduces, to a certain extent, the dependence on the kernel mapping of the original RBF network; even if the mapping of the original space has some deviation, the nonlinear MLP network can compensate for it to a certain degree. The kernel-prepended RBF-MLP hybrid-structure neural network classifier of the present invention therefore effectively combines the local nonlinear mapping ability of the RBF hidden nodes with the global generalization ability of the MLP hidden nodes, and remedies the deficiencies of single-structure RBF and MLP neural networks.
To further illustrate the superiority of the proposed kernel-prepended RBF-MLP hybrid-structure neural network, we analyze it theoretically; the main purpose is to show that, under certain conditions, the separability of the original sample space is improved after the RBF kernel mapping.
For simplicity of analysis, each sample is taken directly as the center of an RBF hidden node to complete the high-dimensional nonlinear mapping of the original samples. Let K be the number of RBF hidden nodes, and let the training set X = {x_k}, k = 1, …, M, consist of two pattern classes, where x_k ∈ R^t, X_1 and X_2 are the two class sets, X = X_1 ∪ X_2, and M = M_1 + M_2. Let Z_B and Z_T be the between-class scatter matrix and the total scatter matrix of the training set, and let Z̃_B and Z̃_T be the between-class and total scatter matrices of the training samples after the RBF kernel mapping; μ_1, μ_2 and μ_0 are the means of the first-class training samples, the second-class training samples and all training samples, respectively.
In this case, we state the following theorem:
If K = M, then when the condition relating the scatter ratio of the original training set to the squared RBF outputs is satisfied, we have tr(Z̃_B)/tr(Z̃_T) ≥ tr(Z_B)/tr(Z_T); here φ_k is the output of sample x_k after mapping by the k-th RBF hidden node.
Proof: According to the definitions of the between-class scatter matrix and the total scatter matrix, and setting λ_k = M_k/M, the expressions for Z_B and Z_T are obtained.
Let the corresponding tilde quantities denote, respectively, the first-class training samples, the second-class training samples and all training samples after the RBF network mapping. By a derivation similar to the above, the mapped scatter matrices Z̃_B and Z̃_T are obtained.
To simplify the problem and facilitate the theoretical analysis, for an arbitrary sample x_k let φ̃_k be the output of x_k after the RBF network mapping, and assume that only one RBF hidden node is active in each mapping; the corresponding expressions for the mapped scatter traces then follow. Combining the resulting relations (1.1) and (1.2), when the condition of the theorem is satisfied, we obtain tr(Z̃_B)/tr(Z̃_T) ≥ tr(Z_B)/tr(Z_T).
The theorem shows that, when K = M and the original samples pass through the RBF network, once the ratio of the total scatter to the between-class scatter of the original sample set is not less than the ratio of the sum of squares of all samples after the RBF kernel mapping to the sum of squares of the class sample means, the separability of the original samples is improved; this means that each φ_k can be driven to a suitable value by adjusting the RBF kernel parameters of the mapping. Once the separability is improved, the nonlinear MLP network can complete the effective classification of the feature-space samples after the RBF kernel mapping, thereby improving the classification performance of the network.
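A numerical illustration of the separability measure used by the theorem (the XOR-style data, the K = M center placement and the kernel width 0.5 are illustrative choices, and `scatter_ratio` is a hypothetical helper name): the ratio tr(Z_B)/tr(Z_T) is computed before and after the RBF kernel mapping.

```python
import numpy as np

def scatter_ratio(X, labels):
    """Separability measure tr(Z_B) / tr(Z_T): trace of the between-class
    scatter matrix over the trace of the total scatter matrix."""
    mu0 = X.mean(axis=0)
    Z_T = sum(np.outer(x - mu0, x - mu0) for x in X) / len(X)
    Z_B = np.zeros_like(Z_T)
    for c in np.unique(labels):
        Xc = X[labels == c]
        lam = len(Xc) / len(X)              # class prior lambda_k = M_k / M
        mu_c = Xc.mean(axis=0)
        Z_B += lam * np.outer(mu_c - mu0, mu_c - mu0)
    return np.trace(Z_B) / np.trace(Z_T)

# XOR-style data: the two class means coincide, so the raw ratio is 0.
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
y = np.array([0, 0, 1, 1])

# Gaussian RBF mapping with each sample as a center (K = M) and width 0.5.
Phi = np.array([np.exp(-np.sum((X - x) ** 2, axis=1) / (2 * 0.5 ** 2)) for x in X])

r_raw, r_mapped = scatter_ratio(X, y), scatter_ratio(Phi, y)
```

After the mapping the ratio becomes strictly positive, i.e. the separability of these linearly inseparable original samples improves, consistent with the theorem.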
The kernel-prepended RBF-MLP hybrid-structure neural network classifier of the present invention is mainly composed of four parts:
1. Input layer. This layer is composed of t source nodes, where t is the dimension of the input vector x, i.e. x ∈ R^t.
2. RBF hidden layer. This layer is composed of a group of Gaussian kernel functions with different parameters. Let the number of nodes in the hidden layer be K and the number of training samples be M, with K < M. The computation of a hidden-layer unit can be expressed by the formula
φ_i(x) = exp(−‖x − μ_i‖² / (2σ_i²)), i = 1, 2, …, K,   (1.3)
where μ_i is the center of the i-th Gaussian radial basis function and σ_i is its kernel width.
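A minimal sketch of the hidden-layer computation of formula (1.3) (function and parameter names are illustrative):

```python
import numpy as np

def rbf_hidden_layer(x, centers, widths):
    """Gaussian RBF hidden layer of Eq. (1.3):
    phi_i(x) = exp(-||x - mu_i||^2 / (2 * sigma_i^2)).
    `centers` is a (K, t) array of centers mu_i and `widths` a (K,)
    array of kernel widths sigma_i."""
    sq_dist = np.sum((centers - x) ** 2, axis=1)    # ||x - mu_i||^2 per node
    return np.exp(-sq_dist / (2.0 * widths ** 2))   # outputs lie in (0, 1]
```

A sample placed exactly at a center produces an output of 1 at that node, and the outputs decay toward 0 with distance, realizing the localized mapping f: R^t → (0, 1]^K described above.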
3. MLP hidden layer. This layer connects each node of the RBF hidden layer to the output layer. The output h_i(x) of the RBF hidden layer after dual polarization serves as the input of the first MLP hidden layer; the induced local field at node i of the l-th MLP hidden layer is then expressed as
v_i^(l) = Σ_j w_ij^(l) y_j^(l−1),   (1.4)
where w_ij^(l) is the weight pointing from node j of layer l−1 to node i of layer l, and y_j^(l−1) is the output signal of node j in the preceding layer l−1.
If the hyperbolic tangent function is selected as the sigmoid of the MLP hidden layers, the output signal of node i in layer l is
y_i^(l) = a·tanh(b·v_i^(l)),   (1.5)
where a and b are constants.
If node i is in the first MLP hidden layer, i.e. l = 1, then
v_i^(1) = Σ_j w_ij^(1) h_j(x),   (1.6)
where h_i(x) is the dual-polarized output of φ_i(x), which can be expressed as
h_i(x) = 2φ_i(x) − 1.   (1.7)
4. Output layer. Let the depth of the MLP network be L, where the depth equals the total number of layers of the MLP network, i.e. its input layer, hidden layers and output layer counted together; for example, if l = 1 then L = 3, and if l = 2 then L = 4. At the k-th node of the output layer, the network output o_k is computed according to formula (1.8).
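The layer-wise forward computation of formulas (1.4)–(1.6) can be sketched as follows (the weight-list representation and the default values of a and b are assumptions; a = 1.7159, b = 2/3 is a common choice for the scaled tanh sigmoid):

```python
import numpy as np

def mlp_forward(h, weight_mats, a=1.7159, b=2.0 / 3.0):
    """Forward pass of the MLP part. For each layer l:
    v_i = sum_j w_ij * y_j   (induced local field, Eq. (1.4))
    y_i = a * tanh(b * v_i)  (sigmoid output signal, Eq. (1.5)).
    `h` is the dual-polarized RBF output, which feeds layer l = 1
    (Eq. (1.6)); `weight_mats` is a list of (n_out, n_in) matrices."""
    y = h
    for W in weight_mats:
        v = W @ y               # induced local field of every node in the layer
        y = a * np.tanh(b * v)  # output signal passed to the next layer
    return y
```

With L layers of weights the final `y` is the output-layer signal o_k at each output node k.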
In MLP networks, the weight parameters are generally updated with a back-propagation algorithm based on gradient descent. To prevent the network from falling into a local minimum, the input variables are usually preprocessed so that their mean over the entire training set is close to 0, or small compared with the standard deviation. In the hybrid RBF-MLP network model proposed by the present invention, however, after the kernel mapping of the RBF hidden nodes the output value of each hidden node lies between 0 and 1, so the overall mean is a number greater than 0. The dual-polarization adjustment of formula (1.7) therefore brings the mean over the entire training set close to 0, ensuring the validity of the MLP network input after the RBF kernel mapping.
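The centering effect of formula (1.7) can be shown with a short sketch (the helper name is illustrative):

```python
import numpy as np

def dual_polarize(phi):
    """Eq. (1.7): h_i(x) = 2 * phi_i(x) - 1, mapping the RBF outputs
    from (0, 1] into (-1, 1] so the MLP inputs are closer to zero mean."""
    return 2.0 * phi - 1.0
```

For example, an RBF output of 0.5 is mapped to 0, and an output of 1 (a sample at the kernel center) to 1.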
Once the kernel-prepended RBF-MLP network model is set up, the remaining task is to establish an optimization learning algorithm for it. The learning algorithm of the proposed network model is designed on the basis of the two-stage learning tasks of the RBF network model. Among the various RBF learning algorithms, the "K-means, RLS" algorithm is a common one: K-means clustering trains the hidden-node kernel parameters of the RBF hidden layer, and the RLS algorithm optimizes the weights from the hidden layer to the output layer. In the kernel-prepended RBF-MLP hybrid-structure classifier proposed by the present invention, the linear classifier originally connecting the RBF hidden layer to the output layer is replaced with a nonlinear MLP network: the RBF hidden-node kernel functions first perform a localized kernel mapping of the original sample space, and the MLP network then performs the nonlinear classification of the mapped samples. Following this idea, the specific implementation steps of the learning algorithm of the proposed classifier are given below:
1. Initialization. This mainly includes setting the number of RBF hidden nodes, setting the number of MLP layers and hidden nodes, and randomly initializing the weight parameters of each MLP layer.
2. Perform K-means clustering on the training samples {x_m}. The cluster centers are chosen in a self-organizing manner:
(1) Randomly select K different samples from the training set as the initial centers, denoted μ̂_1(0), …, μ̂_K(0).
(2) Randomly select a training sample x_m from the training data as input.
(3) Find the cluster center nearest to the input sample and assign the sample to that cluster, i.e. compute k(x_m) = arg min_k ‖x_m − μ̂_k(n)‖ and assign x_m to class k.
(4) Update the winning cluster center: μ̂_k(n+1) = μ̂_k(n) + η[x_m − μ̂_k(n)], where η is the learning rate and n the iteration step.
(5) Check whether the algorithm has converged; if not, go to step (2) and continue iterating.
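Steps (1)–(5) can be sketched as follows (the learning rate, the fixed number of updates standing in for the convergence check of step (5), and the function name are simplifying assumptions):

```python
import numpy as np

def kmeans_centers(X, K, eta=0.1, n_updates=500, seed=0):
    """Self-organized selection of the RBF centers, steps (1)-(4):
    initialize with K random training samples, then repeatedly move
    the center nearest a randomly drawn sample toward that sample."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=K, replace=False)].astype(float)
    for _ in range(n_updates):
        x = X[rng.integers(len(X))]                         # step (2): random sample
        k = np.argmin(np.sum((centers - x) ** 2, axis=1))   # step (3): nearest center
        centers[k] += eta * (x - centers[k])                # step (4): update rule
    return centers
```

On two well-separated groups of samples, the two returned centers settle near the two groups.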
3. Take the cluster means μ̂_i as the centers of the Gaussian functions φ_i(·) of the hidden units i = 1, 2, …, K, and use formula (1.3) to compute the output of each RBF hidden node for an input sample. To simplify the design, the kernel width can be taken as σ = d_max/√(2K), where d_max is the maximum distance between all center points.
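The width rule above can be sketched as (the pairwise-distance computation is an illustrative implementation):

```python
import numpy as np

def rbf_width(centers):
    """Width heuristic sigma = d_max / sqrt(2K), with d_max the maximum
    distance between any two of the K center points."""
    diff = centers[:, None, :] - centers[None, :, :]   # all pairwise differences
    d_max = np.sqrt((diff ** 2).sum(axis=-1)).max()    # largest center distance
    return d_max / np.sqrt(2.0 * len(centers))
```

For two centers at distance 5, this gives σ = 5/√4 = 2.5.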
4. Use formula (1.7) to complete the dual-polarization processing of φ_i(x).
5. Set the MLP network threshold ε as the iteration stopping condition. Use the standardized output h(x) = (h_1(x), h_2(x), …, h_K(x)) of the RBF hidden layer as the input of the MLP network and compute the MLP output.
6. Complete the forward computation of the MLP network using formulas (1.4)-(1.6) and (1.8).
7. Compute the total mean squared error of the MLP network,
J(ω) = (1/2) Σ_{k=1..C} (d_k − o_k)²,
where d_k is the target output of the MLP network, o_k is the actual network output, and C is the number of output nodes. If J(ω) < ε, the algorithm stops; otherwise proceed to the next step.
8. Backward computation of the MLP network: compute the local gradients of the network.
9. Adjust the synaptic weights of layer l of the MLP network, where η is the learning rate, α is the momentum constant, and n is the iteration step.
10. Iterative computation of the MLP network. Present a new round of samples to the kernel-prepended hybrid RBF-MLP network, compute according to steps 5-9, and terminate the algorithm when J(ω) < ε.
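Steps 1-10 can be sketched end to end as follows (the single-hidden-layer size, learning rate η, momentum α, targets and a plain tanh in place of the scaled sigmoid of Eq. (1.5) are illustrative assumptions, and the RBF centers are supplied precomputed, e.g. by the K-means step above):

```python
import numpy as np

def train_rbf_mlp(X, D, centers, sigma, n_hidden=6, eta=0.1, alpha=0.5,
                  eps=1e-3, max_epochs=3000, seed=0):
    """End-to-end sketch: fixed Gaussian RBF front end (Eq. (1.3)),
    dual-polarized outputs (Eq. (1.7)), and one tanh MLP hidden layer
    trained by backpropagation with momentum (steps 5-10)."""
    rng = np.random.default_rng(seed)
    K = len(centers)
    W1 = rng.normal(0, 0.5, (n_hidden, K))           # RBF features -> MLP hidden
    W2 = rng.normal(0, 0.5, (D.shape[1], n_hidden))  # MLP hidden -> output
    dW1, dW2 = np.zeros_like(W1), np.zeros_like(W2)

    def features(x):
        phi = np.exp(-np.sum((centers - x) ** 2, axis=1) / (2 * sigma ** 2))
        return 2 * phi - 1                           # Eq. (1.7): dual polarization

    for _ in range(max_epochs):
        J = 0.0
        for x, d in zip(X, D):
            h = features(x)
            y1 = np.tanh(W1 @ h)                     # hidden-layer output signal
            o = np.tanh(W2 @ y1)                     # network output o_k
            e = d - o
            J += 0.5 * np.sum(e ** 2)                # step 7: total squared error
            g2 = e * (1 - o ** 2)                    # step 8: output local gradient
            g1 = (W2.T @ g2) * (1 - y1 ** 2)         # step 8: hidden local gradient
            dW2 = alpha * dW2 + eta * np.outer(g2, y1)   # step 9: momentum update
            dW1 = alpha * dW1 + eta * np.outer(g1, h)
            W2 += dW2
            W1 += dW1
        if J < eps:                                  # step 10: stopping criterion
            break
    return W1, W2, features
```

Using each training sample as a center (the K = M setting of the theorem) on XOR-style data, the trained classifier recovers the correct class signs.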
Supervised learning on the training samples with the method of the invention yields effective neural network classifier parameters, which are then used to classify samples of unknown class.
The present invention is not limited to the above optional embodiments; anyone may derive products of various other forms under the inspiration of the present invention. The above specific embodiments should not be understood as limiting the protection scope of the present invention, which shall be defined by the claims; the specification may be used to interpret the claims.
Claims (7)
1. An RBF-MLP hybrid-structure neural network classifier with a prepended kernel, comprising an input layer and an output layer, characterized in that it further comprises:
an RBF hidden layer, composed of a group of Gaussian kernel functions with different parameters; and
an MLP hidden layer, connecting each node of the RBF hidden layer to the output layer.
2. The RBF-MLP hybrid-structure neural network classifier with a prepended kernel according to claim 1, characterized in that the input layer is composed of t source nodes, where t is the dimension of the input vector x, i.e. x ∈ R^t.
3. The RBF-MLP hybrid-structure neural network classifier with a prepended kernel according to claim 2, characterized in that the computation of the RBF hidden layer is expressed by the following formula:
φ_i(x) = exp(−‖x − μ_i‖² / (2σ_i²)), i = 1, 2, …, K,
where K is the number of nodes in the RBF hidden layer, μ_i is the center of the i-th Gaussian radial basis function, and σ_i is the kernel width of the corresponding i-th Gaussian radial basis function.
4. The RBF-MLP hybrid-structure neural network classifier with a prepended kernel according to claim 3, characterized in that the MLP hidden layer connects each node of the RBF hidden layer to the output layer; the output h_i(x) of the RBF hidden layer after dual polarization serves as the input of the first MLP hidden layer, and the induced local field at node i of the l-th MLP hidden layer is then expressed as
v_i^(l) = Σ_j w_ij^(l) y_j^(l−1),
where w_ij^(l) is the weight pointing from node j of layer l−1 to node i of layer l, and y_j^(l−1) is the output signal of node j in the preceding layer l−1.
5. The RBF-MLP hybrid-structure neural network classifier with a prepended kernel according to claim 4, characterized in that, for the output layer, the depth of the MLP network is L, where the depth equals the total number of layers of the MLP network, i.e. its input layer, hidden layers and output layer counted together; at the k-th node of the output layer, the network output o_k is computed.
6. The RBF-MLP hybrid-structure neural network classifier with a prepended kernel according to claim 1, characterized in that the MLP hidden layer is a single layer.
7. The RBF-MLP hybrid-structure neural network classifier with a prepended kernel according to claim 1, characterized in that there are two MLP hidden layers.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201910049409.5A | 2019-01-18 | 2019-01-18 | An RBF-MLP hybrid-structure neural network classifier with a prepended kernel |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN109858533A | 2019-06-07 |

Family
ID=66895197

Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201910049409.5A | An RBF-MLP hybrid-structure neural network classifier with a prepended kernel | 2019-01-18 | 2019-01-18 |

Country Status (1)
| Country | Link |
|---|---|
| CN | CN109858533A |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN110704478A | 2019-10-14 | 2020-01-17 | 南京我爱我家信息科技有限公司 | Method for checking existence of sensitive data in data |
| CN110765700A | 2019-10-21 | 2020-02-07 | 国家电网公司华中分部 | Ultrahigh voltage transmission line loss prediction method based on quantum ant colony optimization RBF network |
| CN113705594A | 2020-05-21 | 2021-11-26 | 北京沃东天骏信息技术有限公司 | Method and device for identifying image |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | RJ01 | Rejection of invention patent application after publication | Application publication date: 20190607 |