CN108550165A - Image matching method based on local invariant features - Google Patents
Image matching method based on local invariant features
- Publication number
- CN108550165A CN108550165A CN201810221834.3A CN201810221834A CN108550165A CN 108550165 A CN108550165 A CN 108550165A CN 201810221834 A CN201810221834 A CN 201810221834A CN 108550165 A CN108550165 A CN 108550165A
- Authority
- CN
- China
- Prior art keywords
- feature
- image
- point
- characteristic point
- matching
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
Abstract
The present invention provides an image matching method based on local invariant features: (1) compute the integral image and the Hessian matrix determinant of the initial image; (2) build a scale-space pyramid and locate the feature points; (3) determine the principal direction of each feature point with Haar wavelets, completing feature point extraction; (4) compute the rotation-invariant LBP features of the image region around each feature point and construct the feature descriptor; (5) complete coarse feature matching with the nearest-neighbor method under the Euclidean distance; (6) reject the mismatched points remaining after coarse matching with the random sample consensus method, completing fine feature matching. While keeping matching time and accuracy under control, the method of the invention remains robust when the image undergoes scale, illumination and rotation changes.
Description
Technical field
The invention relates to an image feature extraction and image processing method, in particular to an image matching method.
Background art
Image matching is the analysis and processing technique of aligning two images of the same scene and determining their correspondence. It is widely used in navigation, map and terrain matching, biometric recognition, text recognition, medical image analysis, computer vision and other fields. In practical applications, the images to be matched are often acquired by different sensors at different times and under different conditions, so they differ in translation, scale, rotation, illumination, noise, viewpoint and so on, which poses a great challenge to image matching methods. Correlation methods based on pixel gray values (such as SSAD and NNPROD) have the disadvantage of being sensitive to scale, rotation, illumination and viewpoint changes of the image; in contrast, methods based on local features (such as corners and SURF (Speeded Up Robust Features) feature points) perform better. The local invariant features of an image are invariant under a variety of image transformations (geometric transformations, illumination changes, etc.), have low redundancy, require no prior image segmentation and are distinctive, and have therefore been widely applied to image matching, object recognition, image classification, image retrieval and other fields.
Using local features in image matching converts a complicated image matching problem into a metric problem between feature vectors, improving the speed and robustness of the algorithm. The basic idea of image matching based on local invariant features is usually to first detect the feature point set of the image, then generate feature vectors from the feature points and their local neighborhood information, and finally measure the similarity between the feature vectors to complete the matching.
SIFT (Scale Invariant Feature Transform), published in IJCV in 2004, and SURF, proposed later to improve the running speed of SIFT using the Hessian matrix and Haar wavelets, are the two most representative local feature algorithms in the image matching field, but they are inefficient on images with illumination changes and their accuracy there is not high. The LBP (Local Binary Patterns) method, which is robust to illumination changes, is a binary description of the magnitude relationship between a gray-image pixel and its surrounding pixels, and was initially applied to image texture description. In recent years many extensions of LBP have been proposed; it is simple to compute and has the advantages of local scale, rotation and brightness invariance.
In 《The image matching algorithm combining SIFT and rotation-invariant LBP》, the local image region around a SIFT key point is described with rotation-invariant LBP features. This method is very robust to scale, rotation, illumination and other image transformations, but in terms of running speed it cannot meet high real-time requirements.
In the patent document entitled 《An image matching method combining LBP feature extraction and SURF feature extraction》, the Haar feature descriptor of the original SURF method is retained in the feature point description and combined with a rotation-invariant LBP descriptor, so that the image matching method has a better matching effect than the original SURF method; however, the two feature descriptors increase the complexity of the method and reduce its running speed.
As mentioned in 《A survey of local invariant features》, the local invariant feature matching methods most used in fields such as image and video retrieval and object recognition are threshold-based matching, nearest-neighbor matching and nearest-neighbor distance ratio matching. The three methods each have advantages and disadvantages: threshold-based matching is simple and computationally cheap, while nearest-neighbor distance ratio matching has the highest accuracy.
Because of the many factors at play between images — geometric and photometric transformations, noise, quantization error, and possibly similar local structures — mismatches may still remain in feature matching results based on similarity measures. The random sample consensus method computes the parameters of a mathematical model of the data from a sample data set containing abnormal data and thereby obtains the valid sample data; it was first proposed by Fischler and Bolles in 1981. It is widely used in image registration and stitching, and finds the optimal parameter model by continuous iteration over a data set that contains "outliers" (points that do not fit the optimal model).
Summary of the invention
The purpose of the present invention is to provide an image matching method based on local invariant features that, while keeping matching time and accuracy under control, remains robust when the image undergoes scale, illumination and rotation changes.
The object of the present invention is achieved as follows:
(1) compute the integral image and the Hessian matrix determinant of the initial image;
(2) build a scale-space pyramid and locate the feature points;
(3) determine the principal direction of each feature point with Haar wavelets, completing feature point extraction;
(4) compute the rotation-invariant LBP features of the image region around each feature point and construct the feature descriptor;
(5) complete coarse feature matching with the nearest-neighbor method under the Euclidean distance;
(6) reject the mismatched points remaining after coarse matching with the random sample consensus method, completing fine feature matching.
Compared with the background art, the main characteristics of the invention are:
1. The present invention introduces rotation-invariant LBP features into the key point description, constructing a description method that is simple to compute and low-dimensional, which enhances the illumination robustness of the image matching method.
2. In the feature point extraction part, the present invention uses the Hessian matrix: the feature point extraction method provided by SURF, together with the integral image, enables fast computation, and, while maintaining matching performance, efficiently remedies the slow computation speed of the gradient-based SIFT method.
3. In the feature description part, the present invention replaces the Haar feature descriptor of the original SURF method with a rotation-invariant LBP descriptor, which copes well with rotation changes, illumination changes and viewpoint changes, without affecting the image matching speed.
4. The present invention performs coarse matching with the more accurate Euclidean-distance nearest-neighbor method to improve the accuracy of image matching.
5. In fine matching, the present invention uses the random sample consensus method to reject the mismatches present in the coarse matching result, further improving the accuracy of image matching.
The present invention studies the key issues of local feature point extraction and description, feature matching and mismatch removal, and proposes an image matching method based on local invariant features that guarantees good image matching performance together with good scale, illumination and rotation robustness. While keeping matching time and accuracy under control, it remains robust when the image undergoes scale, illumination and rotation changes.
The beneficial effects of the invention are: in the feature point extraction part, the Hessian matrix and the integral image method enable fast computation, and the scale image pyramid guarantees the speed and scale invariance of image matching; in the feature description part, rotation-invariant LBP features are introduced into the key point description, constructing a simple, low-dimensional description method that enhances robustness to scale and illumination; in the coarse feature matching part, the Euclidean-distance nearest-neighbor method makes corner matching faster; in the fine feature matching part, the random sample consensus method rejects the mismatched points remaining after coarse matching, optimizing the matching result and raising the corner matching accuracy.
Description of the drawings
Fig. 1 is the flow chart of the present invention.
Detailed description of the embodiments
In order to make the objects, technical solutions and advantages of the present invention clearer, the invention is further described below with reference to the accompanying drawings.
With reference to Fig. 1, the image matching method of the invention based on local feature point extraction and description includes the following steps:
Step 1: image feature point extraction, including:
(1) Computing the integral image and the Hessian matrix determinant of the initial image
The initial image to be matched is traversed to compute its integral image, and the Hessian matrix determinant is computed at every point of the image.
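As an illustrative sketch (not part of the claimed method; the function names are chosen for illustration), the integral image of step (1) can be computed in pure Python as a summed-area table, after which any rectangular pixel sum costs only four lookups — this is what makes the later box-filter computations fast:

```python
def integral_image(img):
    """Summed-area table: ii[y][x] = sum of img over rows 0..y, cols 0..x.
    img is a 2-D list of numbers with rows of equal length."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            # add the running row sum to the column total above
            ii[y][x] = row_sum + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, top, left, bottom, right):
    """Sum of img[top..bottom][left..right] via four integral-image lookups."""
    total = ii[bottom][right]
    if top > 0:
        total -= ii[top - 1][right]
    if left > 0:
        total -= ii[bottom][left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1][left - 1]
    return total
```
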
For a pixel (x, y) with gray value f(x, y) in image 1 to be matched, the Hessian matrix H(f(x, y)) of the pixel is:
H(f(x, y)) = [∂²f/∂x², ∂²f/∂x∂y; ∂²f/∂x∂y, ∂²f/∂y²] (1)
The discriminant det(H) of the H matrix (short for Hessian matrix, likewise below) is:
det(H) = (∂²f/∂x²)·(∂²f/∂y²) − (∂²f/∂x∂y)² (2)
where ∂²f/∂x², ∂²f/∂y² and ∂²f/∂x∂y are the second-order partial derivatives of the pixel value f(x, y).
The value of the discriminant is the product of the eigenvalues of the H matrix; all points can be classified by the sign of this result, i.e. the sign of the discriminant decides whether the point is an extremum.
A Hessian matrix can be computed for every pixel. In order to give the feature points scale independence, the image must be Gaussian filtered before the H matrix is constructed, and the H matrix is computed on the filtered image. The Hessian matrix H(x, σ) of pixel x at scale σ is then defined as:
H(x, σ) = [Lxx(x, σ), Lxy(x, σ); Lxy(x, σ), Lyy(x, σ)] (3)
where Lxx(x, σ) is the convolution of the second-order derivative ∂²g(σ)/∂x² of the Gaussian kernel with image 1 at point x, Lxy(x, σ) and Lyy(x, σ) are computed in the same way, and g(σ) = exp(−(x² + y²)/(2σ²))/(2πσ²).
In this way the sign of the H determinant can be computed for every pixel of the image, and this value is used to judge feature points. Replacing Lxx, Lxy and Lyy with the results Dxx, Dxy and Dyy of convolving the image with box filters yields the approximate Hessian matrix Happrox, whose determinant det(Happrox) is:
det(Happrox) = DxxDyy − (0.9Dxy)² (4)
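For illustration only, formula (4) can be evaluated with simple central-difference second derivatives standing in for the box-filter responses Dxx, Dyy and Dxy (a simplification of my own, not the patent's integral-image box filters; the function name is an assumption):

```python
def hessian_det(img, x, y, w=0.9):
    """Formula (4): det = Dxx*Dyy - (w*Dxy)**2, here with the responses
    approximated by central second differences on a plain 2-D list image
    (the patent instead uses box filters over the integral image)."""
    dxx = img[y][x - 1] - 2 * img[y][x] + img[y][x + 1]
    dyy = img[y - 1][x] - 2 * img[y][x] + img[y + 1][x]
    # mixed derivative from the four diagonal neighbours
    dxy = (img[y + 1][x + 1] - img[y + 1][x - 1]
           - img[y - 1][x + 1] + img[y - 1][x - 1]) / 4.0
    return dxx * dyy - (w * dxy) ** 2
```

A bright isolated spot gives a strongly positive determinant, which is why the sign of det(H) can flag blob-like extrema.
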
(2) Building the scale-space pyramid and locating the feature points
In order to obtain scale invariance, a scale-space pyramid must be built. The image size is kept constant and the scale image pyramid is built by changing the template size of the box filter. The specific construction is: the image scale-space pyramid has four octaves and each octave is filtered four times. The template size of the first filtering of octave 1 is 9×9, and adjacent template sizes in octave 1 differ by 6 pixels, i.e. the first to fourth filter template sizes in octave 1 are 9×9, 15×15, 21×21 and 27×27 respectively. Adjacent filter template sizes differ by 12 pixels in octave 2, by 24 pixels in octave 3 and by 48 pixels in octave 4, and the first template size of each octave equals the second template size of the previous octave; hence the first to fourth filter template sizes are 15×15, 27×27, 39×39 and 51×51 in octave 2, 27×27, 51×51, 75×75 and 99×99 in octave 3, and 51×51, 99×99, 147×147 and 195×195 in octave 4. The approximate scale corresponding to each filtering can be computed by formula (5):
σapprox = 1.2 × (current template size)/9 (5)
where σapprox denotes the scale, the base 9×9 template corresponding to the scale 1.2.
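The template-size scheme above can be sketched as follows (illustrative only; the names are my own, and the relation σapprox = 1.2 × size/9 is the standard SURF convention, assumed here for formula (5)):

```python
def filter_sizes(octaves=4):
    """Box-filter template sizes per octave, reproducing the scheme in the
    text: octave 1 starts at 9x9 with step 6, each octave doubles the step,
    and its first size equals the second size of the previous octave."""
    sizes, start, step = [], 9, 6
    for _ in range(octaves):
        layer = [start + i * step for i in range(4)]
        sizes.append(layer)
        start, step = layer[1], step * 2
    return sizes

def approx_scale(template_size):
    # Formula (5) (assumed SURF relation): the 9x9 base template
    # corresponds to sigma = 1.2, and scale grows linearly with size.
    return 1.2 * template_size / 9.0
```
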
Feature point localization: a threshold is set for the Hessian matrix response and all points below this threshold are removed; then, by non-maximum suppression, the points whose responses are all larger or all smaller than those of their neighbors in the same and adjacent scale layers are taken as feature points; finally a three-dimensional quadratic function is fitted to determine the position and scale of each feature point accurately.
(3) Determining the principal direction of each feature point by Haar wavelets
In the circular neighborhood of radius 6σ centered on the feature point, where σ is the scale at which the point was detected, the Haar wavelet responses in the X and Y directions are computed with Haar wavelet templates of side length 4σ, and the responses are weighted with a Gaussian function of standard deviation 2σ. Then a sector with central angle π/3, centered on the feature point, sweeps once around the circular neighborhood; the Haar wavelet responses of the image points falling inside each π/3 sector position are summed, and the direction of the maximum summed response is taken as the principal direction of the feature point. Here X and Y are the two directions of the plane rectangular coordinate system in which the circular neighborhood lies.
Step 2: feature descriptor generation, including:
The basic LBP feature is a binary description of a 3×3 neighborhood; its disadvantage is that it is rotation dependent. In order to obtain rotation invariance, the present invention describes the region around each key point with rotation-invariant LBP features.
Let pi(r, c, σ, θ) be a key point obtained by the key point extraction of step 1, where (r, c) is its position on the original image and σ and θ are respectively the scale and direction of pi. According to the size of σ, on the corresponding layer of the Gaussian pyramid where pi lies, the image region of 9×9 size centered on pi is taken as the region to be described. In order to obtain rotation invariance, this image region is rotated to the reference direction according to θ. The rotation-invariant LBP description of the region to be described proceeds as follows:
(1) In the 8×8 region to be described, compute, centered on each pixel pj in turn, the rotation-invariant LBP feature, denoted lbpj (j = 1, 2, …, 64).
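A minimal sketch of the rotation-invariant LBP of step (1) for a single 3×3 neighborhood (illustrative; the clockwise neighbor ordering and the ≥ threshold are assumptions of mine, since the patent does not fix them):

```python
def lbp_rotation_invariant(neighborhood):
    """Rotation-invariant LBP code of the centre pixel of a 3x3 patch.
    The 8 neighbours (clockwise from top-left) are thresholded against the
    centre to form an 8-bit pattern; taking the minimum over all 8 circular
    bit-rotations makes the code rotation invariant."""
    c = neighborhood[1][1]
    ring = [neighborhood[0][0], neighborhood[0][1], neighborhood[0][2],
            neighborhood[1][2], neighborhood[2][2], neighborhood[2][1],
            neighborhood[2][0], neighborhood[1][0]]
    bits = [1 if p >= c else 0 for p in ring]
    best = 255
    for r in range(8):
        rotated = bits[r:] + bits[:r]
        value = sum(b << i for i, b in enumerate(rotated))
        best = min(best, value)
    return best
```

Note in the test below that rotating the bright neighbor around the centre leaves the code unchanged, which is the property the descriptor relies on.
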
(2) Intuitively, the farther pixel pj is from the center, the less information it contributes to describing pi, so lbpj is weighted with the coefficient wj:
wj = exp{−[(rj − ri)² + (cj − ci)²]/(2σ0²)}/(2πσ0²) (6)
where (rj, cj) and (ri, ci) are the coordinates of pixel pj and of the center point pi in the image region to be described, and σ0 is a chosen constant.
(3) All the weighted LBP feature values computed above are assembled into a one-dimensional vector Ti:
Ti = [w1·lbp1 w2·lbp2 … w64·lbp64] (7)
(4) In order to eliminate the influence of illumination changes, Ti is normalized, i.e.
Ti ← Ti/‖Ti‖ (8)
In summary, the 64-dimensional vector Ti finally obtained is the description of the region around key point pi.
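Steps (2)–(4) can be sketched as follows (illustrative only; the normalization of formula (8) is assumed to be the usual L2 normalization, and the sigma0 default is a placeholder constant, not a value fixed by the patent):

```python
import math

def describe(lbp_values, coords, center, sigma0=2.0):
    """Weight each LBP value by a Gaussian of its distance to the key point
    (formula (6)), concatenate (formula (7)) and normalise (formula (8),
    assumed L2) so that uniform illumination scaling cancels out."""
    ri, ci = center
    t = []
    for lbp, (r, c) in zip(lbp_values, coords):
        w = (math.exp(-((r - ri) ** 2 + (c - ci) ** 2) / (2 * sigma0 ** 2))
             / (2 * math.pi * sigma0 ** 2))
        t.append(w * lbp)
    norm = math.sqrt(sum(v * v for v in t)) or 1.0
    return [v / norm for v in t]
```
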
Step 3: coarse feature matching, including:
After the key point description vectors have been generated, the present invention uses the distance
d(TA, TB) = ‖TA − TB‖1 = |a1 − b1| + |a2 − b2| + … + |an − bn| (9)
of formula (9) as the similarity measure between key points, where TA = [a1 a2 … an] and TB = [b1 b2 … bn] are the description vectors of key points A and B respectively. Matching strategy: for a key point A in image 1, find in image 2 the two key points B and C whose description vectors are nearest to that of A; if the ratio of the nearest distance ‖TA − TB‖1 to the second-nearest distance ‖TA − TC‖1 is smaller than a threshold t, i.e.
‖TA − TB‖1/‖TA − TC‖1 < t (10)
then key point A is considered matched with the nearest key point B.
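The nearest-neighbor ratio test above can be sketched as follows (illustrative; the threshold default t = 0.8 is an assumption of mine, the text leaves t unspecified):

```python
def match_ratio_test(desc1, desc2, t=0.8):
    """For each descriptor in desc1, find its two nearest neighbours in
    desc2 under the L1 distance of formula (9) and accept the match only
    when nearest/second-nearest < t (formula (10))."""
    def l1(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    matches = []
    for i, da in enumerate(desc1):
        dists = sorted((l1(da, db), j) for j, db in enumerate(desc2))
        if len(dists) >= 2 and dists[0][0] < t * dists[1][0]:
            matches.append((i, dists[0][1]))
    return matches
```

An ambiguous key point, whose two nearest neighbours are nearly equidistant, is rejected rather than matched, which is the point of the ratio test.
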
Step 4: fine feature matching, including:
The random sample consensus method computes the parameters of a mathematical model of the data from a sample data set containing abnormal data, thereby obtaining the valid sample data. Its basic assumption is that the sample contains correct data (data that fit the model) as well as abnormal data (data that do not fit the model), and that given a group of correct data there exists a method to compute the model parameters that fit them.
Let f be the set of corresponding matched feature points of the two images 1 and 2 to be stitched. The steps by which the random sample consensus method removes mismatched points are as follows:
(1) Arbitrarily select three non-collinear pairs of matched points from the matched feature point set f and compute their transformation matrix M.
(2) Compute the corresponding transformed points according to the transformation matrix M; e.g. for the pair (a, b), compute b' = M(a).
(3) Judge the distance between b' and b: if it is within the threshold, (a, b) is an inlier, otherwise it is an outlier.
(4) If the number of inliers meets the requirement, exit; if after k iterations the required number of inliers has never been reached, take the model with the largest number of inliers. If the exit condition is not satisfied, return to step (1).
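A compact sketch of the random sample consensus loop (illustrative only: to stay short it fits a 1-D affine model b = m·a + c from two pairs instead of the transform computed from three non-collinear pairs; all names and defaults are my own):

```python
import random

def ransac(pairs, threshold=1.0, iterations=100, min_inliers=None, seed=0):
    """RANSAC sketch on scalar point pairs (a, b) assumed related by
    b = m*a + c. Repeatedly fit a model from a minimal random sample,
    count pairs within `threshold` of the model (inliers), and keep the
    largest inlier set found."""
    rng = random.Random(seed)
    best = []
    for _ in range(iterations):
        (a1, b1), (a2, b2) = rng.sample(pairs, 2)
        if a1 == a2:
            continue  # degenerate sample, cannot fit a model
        m = (b2 - b1) / (a2 - a1)
        c = b1 - m * a1
        inliers = [(a, b) for a, b in pairs if abs((m * a + c) - b) <= threshold]
        if len(inliers) > len(best):
            best = inliers
        if min_inliers is not None and len(best) >= min_inliers:
            break  # early exit once the inlier count meets the requirement
    return best
```

The gross mismatches (the "outliers" of the text) fall far from every model fitted through consistent pairs, so they never enter the best inlier set.
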
Claims (1)
1. An image matching method based on local invariant features, characterized in that:
(1) the integral image and the Hessian matrix determinant of the initial image are computed;
(2) a scale-space pyramid is built and the feature points are located;
(3) the principal direction of each feature point is determined with Haar wavelets, completing feature point extraction;
(4) the rotation-invariant LBP features of the image region around each feature point are computed and the feature descriptor is constructed;
(5) coarse feature matching is completed with the nearest-neighbor method under the Euclidean distance;
(6) the mismatched points remaining after coarse matching are rejected by the random sample consensus method, completing fine feature matching.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810221834.3A CN108550165A (en) | 2018-03-18 | 2018-03-18 | A kind of image matching method based on local invariant feature |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810221834.3A CN108550165A (en) | 2018-03-18 | 2018-03-18 | A kind of image matching method based on local invariant feature |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108550165A true CN108550165A (en) | 2018-09-18 |
Family
ID=63516598
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810221834.3A Pending CN108550165A (en) | 2018-03-18 | 2018-03-18 | A kind of image matching method based on local invariant feature |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108550165A (en) |
2018-03-18: CN CN201810221834.3A patent/CN108550165A/en, status: active (Pending)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103903237A (en) * | 2014-03-21 | 2014-07-02 | 上海大学 | Dual-frequency identification sonar image sequence splicing method |
CN104933434A (en) * | 2015-06-16 | 2015-09-23 | 同济大学 | Image matching method combining length between perpendiculars (LBP) feature extraction method and surf feature extraction method |
CN105608671A (en) * | 2015-12-30 | 2016-05-25 | 哈尔滨工业大学 | Image connection method based on SURF algorithm |
CN106657789A (en) * | 2016-12-29 | 2017-05-10 | 核动力运行研究所 | Thread panoramic image synthesis method |
CN107481273A (en) * | 2017-07-12 | 2017-12-15 | 南京航空航天大学 | A kind of Spacecraft Autonomous Navigation rapid image matching method |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109727239A (en) * | 2018-12-27 | 2019-05-07 | 北京航天福道高技术股份有限公司 | Based on SURF feature to the method for registering of inspection figure and reference map |
CN109816674A (en) * | 2018-12-27 | 2019-05-28 | 北京航天福道高技术股份有限公司 | Registration figure edge extracting method based on Canny operator |
CN109815822A (en) * | 2018-12-27 | 2019-05-28 | 北京航天福道高技术股份有限公司 | Inspection figure components target identification method based on Generalized Hough Transform |
CN110189368A (en) * | 2019-05-31 | 2019-08-30 | 努比亚技术有限公司 | Method for registering images, mobile terminal and computer readable storage medium |
CN110189368B (en) * | 2019-05-31 | 2023-09-19 | 努比亚技术有限公司 | Image registration method, mobile terminal and computer readable storage medium |
CN110208795A (en) * | 2019-06-13 | 2019-09-06 | 成都汇蓉国科微系统技术有限公司 | A kind of low slow small target detection identifying system of mobile platform high-precision and method |
CN110208795B (en) * | 2019-06-13 | 2021-10-15 | 成都汇蓉国科微系统技术有限公司 | High-precision low-speed small target detection and identification system and method for mobile platform |
CN111238488A (en) * | 2020-03-18 | 2020-06-05 | 湖南云顶智能科技有限公司 | Aircraft accurate positioning method based on heterogeneous image matching |
CN115588204A (en) * | 2022-09-23 | 2023-01-10 | 神州数码系统集成服务有限公司 | Single character image matching and identifying method based on DS evidence theory |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Sirmacek et al. | A probabilistic framework to detect buildings in aerial and satellite images | |
CN108550165A (en) | A kind of image matching method based on local invariant feature | |
CN110334762B (en) | Feature matching method based on quad tree combined with ORB and SIFT | |
Sirmacek et al. | Urban-area and building detection using SIFT keypoints and graph theory | |
WO2017219391A1 (en) | Face recognition system based on three-dimensional data | |
EP2385483B1 (en) | Recognition and pose determination of 3D objects in 3D scenes using geometric point pair descriptors and the generalized Hough Transform | |
CN110021024B (en) | Image segmentation method based on LBP and chain code technology | |
CN108052942B (en) | Visual image recognition method for aircraft flight attitude | |
CN109903313B (en) | Real-time pose tracking method based on target three-dimensional model | |
CN109919960B (en) | Image continuous edge detection method based on multi-scale Gabor filter | |
CN107025449B (en) | Oblique image straight line feature matching method constrained by local area with unchanged visual angle | |
CN110222661B (en) | Feature extraction method for moving target identification and tracking | |
Tang et al. | Robust pattern decoding in shape-coded structured light | |
Zhang et al. | Saliency-driven oil tank detection based on multidimensional feature vector clustering for SAR images | |
CN111753119A (en) | Image searching method and device, electronic equipment and storage medium | |
CN105139013B (en) | A kind of object identification method merging shape feature and point of interest | |
Waheed et al. | Exploiting Human Pose and Scene Information for Interaction Detection | |
CN110348307B (en) | Path edge identification method and system for crane metal structure climbing robot | |
CN111199558A (en) | Image matching method based on deep learning | |
CN114821358A (en) | Optical remote sensing image marine ship target extraction and identification method | |
Cui et al. | Global propagation of affine invariant features for robust matching | |
CN108447092B (en) | Method and device for visually positioning marker | |
CN104268550A (en) | Feature extraction method and device | |
CN116703895B (en) | Small sample 3D visual detection method and system based on generation countermeasure network | |
CN112861776A (en) | Human body posture analysis method and system based on dense key points |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20180918 |