CN103383775A - Method for evaluating remote-sensing image fusion effect - Google Patents
- Publication number
- CN103383775A CN103383775A CN2013102723674A CN201310272367A CN103383775A CN 103383775 A CN103383775 A CN 103383775A CN 2013102723674 A CN2013102723674 A CN 2013102723674A CN 201310272367 A CN201310272367 A CN 201310272367A CN 103383775 A CN103383775 A CN 103383775A
- Authority
- CN
- China
- Prior art keywords
- image
- sensing image
- remote sensing
- fusion
- precision
- Prior art date
- Legal status
- Granted
Landscapes
- Image Analysis (AREA)
Abstract
The invention relates to evaluation of remote-sensing image fusion, and in particular to a method for evaluating the effect of remote-sensing image fusion. The method addresses the shortcomings of existing evaluation approaches: subjective visual assessment is strongly affected by human factors and prone to error, while objective statistical analysis lacks a unified standard for index selection, making comprehensive evaluation of image quality difficult. The method comprises the following steps: first, the original images to be fused are preprocessed; second, the preprocessed multispectral image and panchromatic image are fused; third, object-oriented segmentation is applied to the images before and after fusion; fourth, classification rules are used to classify the segmented remote-sensing images; fifth, the classification results are evaluated from the three aspects of producer's accuracy, user's accuracy, and the Kappa coefficient, and the classification accuracies of the images before and after fusion are compared to evaluate the fusion effect. The method can be applied in the technical field of remote-sensing image processing.
Description
Technical field
The present invention relates to the evaluation of remote-sensing image fusion effects, and in particular to a method for evaluating the fusion effect of remote-sensing images. It belongs to the technical field of remote-sensing image processing.
Background art
With the rapid development of modern remote sensing and related technologies, many algorithms for remote-sensing image fusion have been proposed, but no unified theoretical framework has emerged, and the effects of the various fusion methods are difficult to quantify. Two viewpoints currently exist in fusion quality assessment. One holds that since every image is ultimately viewed by people, evaluation should be based on visual effect, taking subjective visual judgment as the basis and deriving standards from statistics. The other holds that subjective interpretation is incomplete and one-sided and does not withstand re-examination, because the result of the evaluation may change when the observation conditions change; it therefore advocates objective evaluation methods that are independent of the evaluator. Both approaches have shortcomings, so a new perspective on fusion effect evaluation is urgently needed. Classification is one of the most important, and most fundamental, applications of remote-sensing imagery. Evaluating the fused image from the perspective of classification accuracy therefore makes the evaluation result directly application-oriented.
Traditional evaluation of the remote-sensing image fusion effect proceeds from the perspectives of subjective visual inspection and objective statistical analysis, but both have shortcomings: visual inspection is strongly affected by human factors and prone to error, while objective statistical analysis lacks a unified standard for index selection and cannot comprehensively evaluate image quality. Addressing the needs of applications, the present invention proposes evaluating remote-sensing image fusion from the perspective of image classification accuracy, so that the fusion result is directly oriented to its application.
Summary of the invention
The present invention aims to solve the problems of existing fusion effect evaluation methods, namely that subjective visual evaluation is strongly affected by human factors and prone to error, that objective statistical analysis lacks a unified standard for index selection, and that comprehensive evaluation of image quality is difficult, and provides a method for evaluating the remote-sensing image fusion effect.
The method for evaluating the remote-sensing image fusion effect mainly comprises the following steps:
Step 1: preprocess the original images to be fused;
Step 2: fuse the preprocessed multispectral image with the panchromatic image to obtain the fused result image;
Step 3: apply object-oriented segmentation to the images before and after fusion to obtain segmented remote-sensing images;
Step 4: classify the segmented remote-sensing images according to classification rules to obtain classification results for the images before and after fusion;
Step 5: evaluate the classification results from the three aspects of producer's accuracy, user's accuracy, and the Kappa coefficient, and compare the classification accuracies of the images before and after fusion to evaluate the fusion effect.
Effects of the invention:
The new fusion effect evaluation method proposed by the present invention overcomes the shortcomings of existing evaluation methods. It evaluates image fusion through classification accuracy: the remote-sensing images before and after fusion are classified, and the fusion effect is assessed by comparing their classification accuracies, so that the evaluation result is directly application-oriented.
With the new evaluation method, classification accuracy can be improved by 3%, and the evaluation result is objective.
Description of drawings
Fig. 1 is the flow chart of the present invention;
Fig. 2 is the SPOT multispectral image used in the embodiment;
Fig. 3 is the SPOT panchromatic image used in the embodiment;
Fig. 4 is the fused result image in the embodiment;
Fig. 5 shows the classification result of the original multispectral image in the embodiment; blue denotes transportation land, red denotes residential areas, dark green denotes dry farmland, yellow denotes reservoirs/ponds, and green denotes paddy fields;
Fig. 6 shows the classification result of the fused image in the embodiment; blue denotes transportation land, red denotes residential areas, dark green denotes dry farmland, yellow denotes reservoirs/ponds, and green denotes paddy fields.
Embodiment
Embodiment one: the method for evaluating the remote-sensing image fusion effect of the present embodiment mainly comprises the following steps:
Step 1: preprocess the original images to be fused;
Step 2: fuse the preprocessed multispectral image with the panchromatic image to obtain the fused result image;
Step 3: apply object-oriented segmentation to the images before and after fusion to obtain segmented remote-sensing images;
Step 4: classify the segmented remote-sensing images according to classification rules to obtain classification results for the images before and after fusion;
Step 5: evaluate the classification results from the three aspects of producer's accuracy, user's accuracy, and the Kappa coefficient, and compare the classification accuracies of the images before and after fusion to evaluate the fusion effect.
Effects of the present embodiment:
The new fusion effect evaluation method proposed by the present embodiment overcomes the shortcomings of existing evaluation methods. It evaluates image fusion through classification accuracy: the remote-sensing images before and after fusion are classified, and the fusion effect is assessed by comparing their classification accuracies, so that the evaluation result is directly application-oriented.
With the new evaluation method, classification accuracy can be improved by 3%, and the evaluation result is objective.
Embodiment two: this embodiment differs from embodiment one in that the preprocessing of the original images to be fused in step 1 comprises the following steps:
The original multispectral image is geometrically registered to the panchromatic image using a quadratic polynomial model, so that the two images are geometrically consistent;
The multispectral image is resampled by nearest-neighbor interpolation, so that the pixel sizes of the images to be fused are identical;
On this basis, the test-region images are cropped out, yielding the original multispectral and panchromatic images of the same area to be fused. The other steps and parameters are identical to embodiment one.
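The nearest-neighbor resampling named above can be sketched as follows; this is a minimal NumPy illustration (the function name and toy band values are illustrative, not from the patent, which would in practice rely on standard remote-sensing software):

```python
import numpy as np

def nearest_neighbor_resample(band, out_shape):
    """Resample a single band to out_shape by nearest-neighbor lookup."""
    # map each output row/column back to its nearest source row/column
    rows = (np.arange(out_shape[0]) * band.shape[0] / out_shape[0]).astype(int)
    cols = (np.arange(out_shape[1]) * band.shape[1] / out_shape[1]).astype(int)
    return band[rows[:, None], cols]

# upsample a 2x2 multispectral band onto a 4x4 panchromatic grid
ms_band = np.array([[10, 20],
                    [30, 40]])
up = nearest_neighbor_resample(ms_band, (4, 4))
```

Each source pixel is simply replicated into a 2x2 block of the output, which is why nearest-neighbor resampling preserves the original radiometric values, a common reason for choosing it before fusion.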
Embodiment three: this embodiment differs from embodiment one or two in that, in step 2, the preprocessed multispectral image and panchromatic image are fused to obtain the fused result image; the principle is that a chosen fusion method is applied to the preprocessed multispectral and panchromatic images. The other steps and parameters are identical to embodiment one or two.
Embodiment four: this embodiment differs from one of embodiments one to three in that, in step 3, object-oriented segmentation is applied to the images before and after fusion to obtain segmented remote-sensing images. The principle of object-oriented segmentation is to select texture, mean, color, or shape features as required, partition the image into feature regions, and extract the regions of interest, so that pixels with identical or similar characteristics fall in the same region while pixels with significantly different characteristics fall in different regions; each region then serves as a unit for the subsequent image classification. The other steps and parameters are identical to one of embodiments one to three.
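As a toy illustration of the region-forming idea (not the patent's actual multi-scale, multi-feature segmentation), the following sketch grows regions by flood fill, grouping adjacent pixels whose value is close to the region's seed; it assumes a single intensity feature and a fixed tolerance:

```python
import numpy as np
from collections import deque

def segment_regions(img, tol):
    """Toy region-growing segmentation: flood-fill 4-connected pixels whose
    value is within tol of the region's seed pixel. Each resulting region
    would serve as a classification unit, as described above."""
    labels = np.full(img.shape, -1, dtype=int)
    next_label = 0
    for seed in np.ndindex(img.shape):
        if labels[seed] != -1:
            continue
        seed_val = img[seed]
        labels[seed] = next_label
        q = deque([seed])
        while q:
            r, c = q.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < img.shape[0] and 0 <= nc < img.shape[1]
                        and labels[nr, nc] == -1
                        and abs(img[nr, nc] - seed_val) <= tol):
                    labels[nr, nc] = next_label
                    q.append((nr, nc))
        next_label += 1
    return labels

# two homogeneous areas: a dark 2x2 block and a bright L-shaped block
img = np.array([[0., 0., 9.],
                [0., 0., 9.],
                [9., 9., 9.]])
labels = segment_regions(img, tol=1.0)
```

Real object-oriented segmentation tools additionally weight shape and texture criteria and operate at a chosen scale parameter, but the grouping principle is the same: similar neighboring pixels end up in one labeled object.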
Embodiment five: this embodiment differs from one of embodiments one to four in that, in step 4, the segmented remote-sensing images are classified according to classification rules to obtain classification results for the images before and after fusion. The classification rules mainly comprise the following: first, the NDVI (normalized difference vegetation index) combined with a shape index distinguishes cropland, and a wetland index then distinguishes dry farmland from paddy fields; second, a brightness index extracts residential areas and transportation land, and the shape index is then used to distinguish residential areas from transportation land; finally, the wetland index and brightness index together extract reservoirs/ponds. The other steps and parameters are identical to one of embodiments one to four.
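Two of the indices used in these rules can be computed directly from the band values. The sketch below shows NDVI and a simple brightness index on a toy scene; the threshold 0.3 and the definition of brightness as a per-pixel band mean are illustrative assumptions, not values from the patent:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R)."""
    return (nir - red) / (nir + red + 1e-9)  # epsilon guards against 0/0

def brightness(bands):
    """Per-pixel mean over all bands, a simple brightness index."""
    return bands.mean(axis=0)

# toy 2x2 scene: top row is vegetation (high NIR), bottom row is not
nir = np.array([[0.6, 0.6], [0.2, 0.3]])
red = np.array([[0.1, 0.1], [0.2, 0.3]])
veg_mask = ndvi(nir, red) > 0.3          # candidate cropland pixels
bright = brightness(np.stack([nir, red]))
```

In the rule set above, such masks would be combined per segmented object (e.g. mean NDVI of the object) rather than per pixel, with shape and wetland indices refining the vegetation and built-up classes.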
Embodiment six: this embodiment differs from one of embodiments one to five in that, in step 5, the classification results are evaluated from the three aspects of producer's accuracy, user's accuracy, and Kappa analysis, and the classification accuracies of the images before and after fusion are compared to evaluate the fusion effect. Producer's accuracy is the class total minus the omission errors, divided by the class total; user's accuracy is the class total minus the commission errors, divided by the class total. Kappa analysis is a discrete multivariate technique that takes all elements of the error matrix into account and measures the agreement, or accuracy, between two maps. Its formula is:

Kappa = (N Σ_{i=1}^{r} x_{ii} − Σ_{i=1}^{r} x_{i+} x_{+i}) / (N² − Σ_{i=1}^{r} x_{i+} x_{+i})

where r is the number of rows (i.e. the number of classes) in the error matrix; x_{ii} is the number of pixels in row i and column i of the error matrix (i.e. the number of correctly classified pixels); x_{i+} and x_{+i} are the total numbers of pixels in row i and column i, respectively; and N is the total number of pixels used for the accuracy assessment. The other steps and parameters are identical to one of embodiments one to five.
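The three accuracy measures follow directly from the error matrix. The sketch below implements the Kappa formula above together with producer's and user's accuracy; the function name and the example matrix are illustrative (the convention of rows as classified classes and columns as reference classes is an assumption, matching the row/column totals in the formula):

```python
import numpy as np

def accuracy_metrics(cm):
    """Producer's accuracy, user's accuracy, and Kappa from an r x r error
    matrix. Rows are assumed to be classified classes, columns reference
    classes; the diagonal holds correctly classified pixel counts x_ii."""
    n = cm.sum()                 # N: total pixels assessed
    diag = np.diag(cm)           # x_ii
    row_tot = cm.sum(axis=1)     # x_{i+}
    col_tot = cm.sum(axis=0)     # x_{+i}
    producers = diag / col_tot   # correct / reference-class total
    users = diag / row_tot       # correct / classified-class total
    chance = (row_tot * col_tot).sum()
    kappa = (n * diag.sum() - chance) / (n ** 2 - chance)
    return producers, users, kappa

# toy 2-class error matrix: 90 of 100 pixels classified correctly
cm = np.array([[45, 5],
               [5, 45]])
prod, user, kappa = accuracy_metrics(cm)
```

For this symmetric example all producer's and user's accuracies are 0.9 and Kappa is 0.8, lower than the raw agreement because Kappa discounts chance agreement via the row-by-column products.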
Embodiment seven: this embodiment differs from one of embodiments one to six in that, in step 5, the fusion effect is evaluated by comparing the classification accuracies of the images before and after fusion: the effect of the remote-sensing image fusion is determined from the change in classification accuracy; if the classification accuracy of the fused image is higher, its quality is judged to be higher, and conversely lower, thereby evaluating the remote-sensing image fusion effect. The other steps and parameters are identical to one of embodiments one to six.
Embodiment
(1) Input the original SPOT multispectral and panchromatic images, and geometrically correct and resample the SPOT multispectral image so that its pixels correspond to those of the panchromatic image; then crop out the SPOT multispectral and panchromatic images to be fused, as shown in Figs. 2 and 3;
(2) Fuse the preprocessed SPOT multispectral and panchromatic images using the PCA-transform fusion method to obtain the fused result image, as shown in Fig. 4;
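PCA-transform fusion of this kind can be sketched as follows: compute the principal components of the multispectral bands, replace the first component with the panchromatic band (matched to the component's mean and standard deviation), and invert the transform. This is a generic sketch under stated assumptions, not the patent's exact procedure; the function name and the mean/std matching step are assumptions:

```python
import numpy as np

def pca_fusion(ms, pan):
    """PCA pan-sharpening sketch.

    ms:  (bands, H, W) multispectral image, already resampled to pan's grid.
    pan: (H, W) panchromatic band.
    """
    b, h, w = ms.shape
    X = ms.reshape(b, -1).astype(float)
    mean = X.mean(axis=1, keepdims=True)
    Xc = X - mean
    # eigen-decomposition of the band covariance matrix, largest first
    cov = Xc @ Xc.T / Xc.shape[1]
    vals, vecs = np.linalg.eigh(cov)
    vecs = vecs[:, np.argsort(vals)[::-1]]
    pcs = vecs.T @ Xc                      # principal components
    # match pan to PC1's statistics, then substitute it for PC1
    p = pan.reshape(-1).astype(float)
    p = (p - p.mean()) / (p.std() + 1e-9) * (pcs[0].std() + 1e-9) + pcs[0].mean()
    pcs[0] = p
    # invert the transform back to band space
    return (vecs @ pcs + mean).reshape(b, h, w)

# toy example: 3-band 8x8 multispectral with a matching panchromatic band
rng = np.random.default_rng(0)
ms = rng.random((3, 8, 8))
pan = rng.random((8, 8))
fused = pca_fusion(ms, pan)
```

The substitution works because the first principal component carries most of the shared spatial detail of the bands, so replacing it with the higher-resolution panchromatic signal injects spatial detail while the remaining components preserve the spectral information.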
(3) With a segmentation scale of 5, segment the SPOT images before and after fusion to obtain the segmented images;
(4) Classify the segmented images according to the classification rules to obtain the classification results of the SPOT multispectral image before and after fusion, as shown in Figs. 5 and 6;
(5) Evaluate the classification results from the three aspects of producer's accuracy, user's accuracy, and Kappa analysis. The statistics show that the producer's accuracy, user's accuracy, and Kappa coefficient of the original SPOT multispectral image classification are 84%, 86%, and 0.82, respectively, while those of the classification of the PCA-fused image are 90%, 91%, and 0.89. The classification accuracy of the fused image is thus significantly improved, indicating that the PCA-transform fusion method performs well and meets the needs of the application.
Claims (7)
1. A method for evaluating the remote-sensing image fusion effect, characterized in that the method mainly comprises the following steps:
Step 1: preprocess the original images to be fused;
Step 2: fuse the preprocessed multispectral image with the panchromatic image to obtain the fused result image;
Step 3: apply object-oriented segmentation to the images before and after fusion to obtain segmented remote-sensing images;
Step 4: classify the segmented remote-sensing images according to classification rules to obtain classification results for the images before and after fusion;
Step 5: evaluate the classification results from the three aspects of producer's accuracy, user's accuracy, and the Kappa coefficient, and compare the classification accuracies of the images before and after fusion to evaluate the fusion effect.
2. The method for evaluating the remote-sensing image fusion effect according to claim 1, characterized in that the preprocessing of the original images to be fused in step 1 comprises the following steps:
The original multispectral image is geometrically registered to the panchromatic image using a quadratic polynomial model, so that the two images are geometrically consistent;
The multispectral image is resampled by nearest-neighbor interpolation, so that the pixel sizes of the images to be fused are identical;
On this basis, the test-region images are cropped out, yielding the original multispectral and panchromatic images of the same area to be fused.
3. The method for evaluating the remote-sensing image fusion effect according to claim 1, characterized in that, in step 2, the preprocessed multispectral image and panchromatic image are fused to obtain the fused result image, the principle being that a chosen fusion method is applied to the preprocessed multispectral and panchromatic images.
4. The method for evaluating the remote-sensing image fusion effect according to claim 1, characterized in that, in step 3, object-oriented segmentation is applied to the images before and after fusion to obtain segmented remote-sensing images, the principle of object-oriented segmentation being to select texture, mean, color, or shape features as required, partition the image into feature regions, and extract the regions of interest, so that pixels with identical or similar characteristics fall in the same region while pixels with significantly different characteristics fall in different regions, each region serving as a unit for the subsequent image classification.
5. The method for evaluating the remote-sensing image fusion effect according to claim 1, characterized in that, in step 4, the segmented remote-sensing images are classified according to classification rules to obtain classification results for the images before and after fusion, wherein the classification rules mainly comprise the following: first, the NDVI combined with a shape index distinguishes cropland, and a wetland index then distinguishes dry farmland from paddy fields; second, a brightness index extracts residential areas and transportation land, and the shape index is then used to distinguish residential areas from transportation land; finally, the wetland index and brightness index together extract reservoirs/ponds.
6. The method for evaluating the remote-sensing image fusion effect according to claim 1, characterized in that, in step 5, the classification results are evaluated from the three aspects of producer's accuracy, user's accuracy, and Kappa analysis, and the classification accuracies of the images before and after fusion are compared to evaluate the fusion effect, wherein producer's accuracy is the class total minus the omission errors, divided by the class total; user's accuracy is the class total minus the commission errors, divided by the class total; and Kappa analysis is a discrete multivariate technique that takes all elements of the error matrix into account and measures the agreement, or accuracy, between two maps, computed as:

Kappa = (N Σ_{i=1}^{r} x_{ii} − Σ_{i=1}^{r} x_{i+} x_{+i}) / (N² − Σ_{i=1}^{r} x_{i+} x_{+i})

where r is the number of rows (i.e. the number of classes) in the error matrix; x_{ii} is the number of pixels in row i and column i of the error matrix (i.e. the number of correctly classified pixels); x_{i+} and x_{+i} are the total numbers of pixels in row i and column i, respectively; and N is the total number of pixels used for the accuracy assessment.
7. The method for evaluating the remote-sensing image fusion effect according to claim 6, characterized in that, in step 5, the fusion effect is evaluated by comparing the classification accuracies of the images before and after fusion: the effect of the remote-sensing image fusion is determined from the change in classification accuracy before and after fusion; if the classification accuracy of the fused image is higher, its quality is judged to be higher, and conversely lower, thereby evaluating the remote-sensing image fusion effect.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310272367.4A CN103383775B (en) | 2013-07-02 | 2013-07-02 | A kind of Remote Sensing Image Fusion effect evaluation method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310272367.4A CN103383775B (en) | 2013-07-02 | 2013-07-02 | A kind of Remote Sensing Image Fusion effect evaluation method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103383775A true CN103383775A (en) | 2013-11-06 |
CN103383775B CN103383775B (en) | 2016-08-10 |
Family
ID=49491558
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310272367.4A Expired - Fee Related CN103383775B (en) | 2013-07-02 | 2013-07-02 | A kind of Remote Sensing Image Fusion effect evaluation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103383775B (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104794481A (en) * | 2015-03-25 | 2015-07-22 | 北京师范大学 | Fused panchromatic image and multispectral image classifying method and device |
CN104867154A (en) * | 2015-05-29 | 2015-08-26 | 华中科技大学 | Remote-sensing image quality evaluation method based on gradient |
CN105430378A (en) * | 2015-11-26 | 2016-03-23 | 航天恒星科技有限公司 | Image quality evaluation system and method |
CN106023111A (en) * | 2016-05-23 | 2016-10-12 | 中国科学院深圳先进技术研究院 | Image fusion quality evaluating method and system |
CN110310246A (en) * | 2019-07-05 | 2019-10-08 | 广西壮族自治区基础地理信息中心 | A kind of cane -growing region remote sensing information extracting method based on three-line imagery |
CN111681207A (en) * | 2020-05-09 | 2020-09-18 | 宁波大学 | Remote sensing image fusion quality evaluation method |
CN112506104A (en) * | 2020-12-07 | 2021-03-16 | 沈阳工程学院 | Method and system for supervising wetland resources and ecological environment |
CN112633155A (en) * | 2020-12-22 | 2021-04-09 | 生态环境部卫星环境应用中心 | Natural conservation place human activity change detection method based on multi-scale feature fusion |
CN113191440A (en) * | 2021-05-12 | 2021-07-30 | 济南大学 | Remote sensing image instance classification method, system, terminal and storage medium |
CN114937038A (en) * | 2022-07-21 | 2022-08-23 | 北京数慧时空信息技术有限公司 | Remote sensing image quality evaluation method oriented to usability |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101916436A (en) * | 2010-08-30 | 2010-12-15 | 武汉大学 | Multi-scale spatial projecting and remote sensing image fusing method |
CN102005037A (en) * | 2010-11-12 | 2011-04-06 | 湖南大学 | Multimodality image fusion method combining multi-scale bilateral filtering and direction filtering |
- 2013-07-02: application CN201310272367.4A, granted as CN103383775B (not active; Expired - Fee Related)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101916436A (en) * | 2010-08-30 | 2010-12-15 | 武汉大学 | Multi-scale spatial projecting and remote sensing image fusing method |
CN102005037A (en) * | 2010-11-12 | 2011-04-06 | 湖南大学 | Multimodality image fusion method combining multi-scale bilateral filtering and direction filtering |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104794481B (en) * | 2015-03-25 | 2018-01-19 | 北京师范大学 | Merge the sorting technique and device of panchromatic image and multispectral image |
CN104794481A (en) * | 2015-03-25 | 2015-07-22 | 北京师范大学 | Fused panchromatic image and multispectral image classifying method and device |
CN104867154A (en) * | 2015-05-29 | 2015-08-26 | 华中科技大学 | Remote-sensing image quality evaluation method based on gradient |
CN105430378A (en) * | 2015-11-26 | 2016-03-23 | 航天恒星科技有限公司 | Image quality evaluation system and method |
CN106023111A (en) * | 2016-05-23 | 2016-10-12 | 中国科学院深圳先进技术研究院 | Image fusion quality evaluating method and system |
CN110310246B (en) * | 2019-07-05 | 2023-04-11 | 广西壮族自治区基础地理信息中心 | Sugarcane planting area remote sensing information extraction method based on three-linear array image |
CN110310246A (en) * | 2019-07-05 | 2019-10-08 | 广西壮族自治区基础地理信息中心 | A kind of cane -growing region remote sensing information extracting method based on three-line imagery |
CN111681207A (en) * | 2020-05-09 | 2020-09-18 | 宁波大学 | Remote sensing image fusion quality evaluation method |
CN111681207B (en) * | 2020-05-09 | 2023-10-27 | 四维高景卫星遥感有限公司 | Remote sensing image fusion quality evaluation method |
CN112506104A (en) * | 2020-12-07 | 2021-03-16 | 沈阳工程学院 | Method and system for supervising wetland resources and ecological environment |
CN112633155A (en) * | 2020-12-22 | 2021-04-09 | 生态环境部卫星环境应用中心 | Natural conservation place human activity change detection method based on multi-scale feature fusion |
CN112633155B (en) * | 2020-12-22 | 2021-08-31 | 生态环境部卫星环境应用中心 | Natural conservation place human activity change detection method based on multi-scale feature fusion |
CN113191440A (en) * | 2021-05-12 | 2021-07-30 | 济南大学 | Remote sensing image instance classification method, system, terminal and storage medium |
CN114937038A (en) * | 2022-07-21 | 2022-08-23 | 北京数慧时空信息技术有限公司 | Remote sensing image quality evaluation method oriented to usability |
CN114937038B (en) * | 2022-07-21 | 2022-09-20 | 北京数慧时空信息技术有限公司 | Usability-oriented remote sensing image quality evaluation method |
Also Published As
Publication number | Publication date |
---|---|
CN103383775B (en) | 2016-08-10 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20160810 Termination date: 20200702 |
CF01 | Termination of patent right due to non-payment of annual fee |