Summary of the invention
The present invention provides a remote sensing image fusion effect evaluation method in order to solve the problems of existing evaluation methods: evaluation based mainly on visual effect is strongly affected by human interference and prone to error, the selection of indicators in objective mathematical-statistical analysis lacks a unified standard, and a comprehensive evaluation of image quality is therefore difficult to achieve.
The remote sensing image fusion effect evaluation method mainly comprises the following steps:
Step 1: preprocess the original images to be fused;
Step 2: fuse the preprocessed multispectral image with the panchromatic image to obtain the fused result image;
Step 3: perform object-oriented segmentation on the images before and after fusion to obtain the segmented remote sensing images;
Step 4: classify the segmented remote sensing images according to classification rules to obtain the classification results of the images before and after fusion;
Step 5: evaluate the classification results in terms of producer's accuracy, user's accuracy and the Kappa coefficient, and compare the classification accuracy of the images before and after fusion to realize the fusion effect evaluation.
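The five steps above can be read as a single evaluation pipeline. The following minimal Python sketch only illustrates how the steps chain together; the callables `preprocess`, `fuse`, `segment`, `classify` and `accuracy` are hypothetical placeholders for the concrete routines described in the embodiments below, not part of the disclosed method itself.

```python
def evaluate_fusion(ms_image, pan_image, reference_labels,
                    preprocess, fuse, segment, classify, accuracy):
    """Compare classification accuracy before and after fusion (steps 1-5)."""
    ms_prep, pan_prep = preprocess(ms_image, pan_image)   # step 1: preprocessing
    fused = fuse(ms_prep, pan_prep)                       # step 2: image fusion
    seg_before = segment(ms_prep)                         # step 3: object-oriented
    seg_after = segment(fused)                            #         segmentation
    cls_before = classify(ms_prep, seg_before)            # step 4: rule-based
    cls_after = classify(fused, seg_after)                #         classification
    acc_before = accuracy(cls_before, reference_labels)   # step 5: accuracy
    acc_after = accuracy(cls_after, reference_labels)     #         comparison
    return acc_before, acc_after
```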
Effects of the invention:
The new remote sensing image fusion effect evaluation method proposed by the present invention overcomes the deficiencies of existing evaluation methods. It evaluates the fusion effect through classification accuracy: the remote sensing images before and after fusion are classified, and the effect is assessed by comparing their classification accuracies, which makes the evaluation result directly application-oriented.
The proposed method thus compensates for the deficiencies of traditional evaluation methods and makes the evaluation result directly application-oriented. With the present invention, classification accuracy can be improved by 3%, and the evaluation result obtained with the new evaluation method is objective.
Embodiments
Embodiment one: The remote sensing image fusion effect evaluation method of this embodiment mainly comprises the following steps:
Step 1: preprocess the original images to be fused;
Step 2: fuse the preprocessed multispectral image with the panchromatic image to obtain the fused result image;
Step 3: perform object-oriented segmentation on the images before and after fusion to obtain the segmented remote sensing images;
Step 4: classify the segmented remote sensing images according to classification rules to obtain the classification results of the images before and after fusion;
Step 5: evaluate the classification results in terms of producer's accuracy, user's accuracy and the Kappa coefficient, and compare the classification accuracy of the images before and after fusion to realize the fusion effect evaluation.
Effects of this embodiment:
The new remote sensing image fusion effect evaluation method proposed in this embodiment overcomes the deficiencies of existing evaluation methods. It evaluates the fusion effect through classification accuracy: the remote sensing images before and after fusion are classified, and the effect is assessed by comparing their classification accuracies, which makes the evaluation result directly application-oriented.
This embodiment thus compensates for the deficiencies of traditional evaluation methods and makes the evaluation result directly application-oriented. With this embodiment, classification accuracy can be improved by 3%, and the evaluation result obtained with the new evaluation method is objective.
Embodiment two: This embodiment differs from embodiment one in that the preprocessing of the original images to be fused described in step 1 comprises the following steps:
register the original multispectral image geometrically by the quadratic polynomial method so that it is geometrically consistent with the panchromatic image;
resample the multispectral image by the nearest-neighbour interpolation method so that the pixel sizes of the images to be fused are consistent;
on the basis of the above processing, crop the test-area images to obtain the original multispectral and panchromatic images to be fused covering the same area. The other steps and parameters are the same as in embodiment one.
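As an illustration of this preprocessing step, the sketch below fits a second-order (quadratic) polynomial mapping from ground control points and resamples a band by nearest-neighbour selection, using plain NumPy; the control points, image sizes and the simple grid-based resampling are illustrative assumptions rather than the exact procedure of the embodiment.

```python
import numpy as np


def quadratic_transform(src_pts, dst_pts):
    """Fit a 2nd-order polynomial mapping (x, y) -> (x', y') from control points."""
    x, y = src_pts[:, 0], src_pts[:, 1]
    # Quadratic design matrix: 1, x, y, x*y, x^2, y^2 (six coefficients per axis)
    A = np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])
    coeff_x, *_ = np.linalg.lstsq(A, dst_pts[:, 0], rcond=None)
    coeff_y, *_ = np.linalg.lstsq(A, dst_pts[:, 1], rcond=None)
    return coeff_x, coeff_y


def nearest_resample(band, out_shape):
    """Nearest-neighbour resampling of one band onto a new pixel grid."""
    rows = np.rint(np.linspace(0, band.shape[0] - 1, out_shape[0])).astype(int)
    cols = np.rint(np.linspace(0, band.shape[1] - 1, out_shape[1])).astype(int)
    return band[np.ix_(rows, cols)]


# Tiny usage example with synthetic data: at least six control points are
# needed to solve for the six quadratic coefficients of each axis.
src = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 5], [2, 8], [8, 2]], float)
dst = src * 2.0 + 1.0                               # made-up reference geometry
cx, cy = quadratic_transform(src, dst)
ms_band = np.arange(100.0).reshape(10, 10)
ms_resampled = nearest_resample(ms_band, (20, 20))  # match the panchromatic grid
```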
Embodiment three: This embodiment differs from embodiment one or two in that, when the preprocessed multispectral and panchromatic images are fused in step 2 to obtain the fused result image, the fusion principle is that the preprocessed multispectral image and the panchromatic image are fused with a selected fusion method. The other steps and parameters are the same as in embodiment one or two.
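The embodiment leaves the choice of fusion algorithm open; the working example at the end of this description uses PCA-transform fusion. The NumPy sketch below shows one common way such a fusion can be written, assuming the multispectral image has already been resampled to the panchromatic pixel grid; the mean/standard-deviation matching used here is a simplified stand-in for full histogram matching.

```python
import numpy as np


def pca_fusion(ms, pan):
    """PCA-transform fusion: substitute the first principal component of the
    multispectral bands with the (statistically matched) panchromatic band."""
    h, w, b = ms.shape
    X = ms.reshape(-1, b).astype(float)
    mean = X.mean(axis=0)
    Xc = X - mean

    # Principal components of the multispectral bands
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    eigvecs = eigvecs[:, np.argsort(eigvals)[::-1]]   # sort by decreasing variance
    pcs = Xc @ eigvecs

    # Match the panchromatic band to the mean/std of PC1, then substitute it
    pan_flat = pan.reshape(-1).astype(float)
    pc1 = pcs[:, 0]
    pan_matched = (pan_flat - pan_flat.mean()) / (pan_flat.std() + 1e-12)
    pcs[:, 0] = pan_matched * pc1.std() + pc1.mean()

    # Inverse PCA transform back to band space
    fused = pcs @ eigvecs.T + mean
    return fused.reshape(h, w, b)
```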
Embodiment four: This embodiment differs from one of embodiments one to three in that, for the object-oriented segmentation of the images before and after fusion described in step 3, the segmentation principle is as follows: texture, mean value, colour or shape features are selected as required, the image is divided into feature regions, and regions of interest are extracted from them, so that pixels with identical or similar feature properties fall within the same region while pixels with markedly different features fall in different regions; each region then serves as the unit of the subsequent image classification. The other steps and parameters are the same as one of embodiments one to three.
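A minimal sketch of such a segmentation is given below. It uses scikit-image's Felzenszwalb graph-based segmentation as a stand-in for the multi-scale object-oriented segmentation described above (its scale argument loosely plays the role of the segmentation scale), and derives one simple per-object feature for the later rule-based classification; all parameter values are illustrative assumptions.

```python
import numpy as np
from skimage.measure import regionprops
from skimage.segmentation import felzenszwalb


def segment_objects(band, scale=5.0):
    """Segment one band into labelled regions (image objects)."""
    # Labels returned by felzenszwalb start at 0; shift by 1 so that
    # regionprops treats every segment as a foreground object.
    return felzenszwalb(band, scale=scale, sigma=0.8, min_size=20) + 1


def object_mean(band, labels):
    """Mean pixel value per object, a simple per-object feature."""
    return {r.label: r.mean_intensity
            for r in regionprops(labels, intensity_image=band)}


# Usage with a synthetic band
band = np.random.default_rng(0).random((64, 64))
labels = segment_objects(band, scale=5.0)
features = object_mean(band, labels)
```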
Embodiment five: This embodiment differs from one of embodiments one to four in that, for the classification of the segmented remote sensing images with classification rules described in step 4, the classification rules mainly comprise the following: first, the NDVI (normalized difference vegetation index) combined with a shape index is used to distinguish arable land, and a wetland index is then used to separate dry farmland from paddy fields; secondly, a brightness index is used to extract residential land and traffic land, and a shape index is then used to distinguish residential land from traffic land; finally, the wetland index and the brightness index are used together to extract reservoirs and ponds. The other steps and parameters are the same as one of embodiments one to four.
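The sketch below illustrates how such a per-object rule set might look in code. The index definitions used here (standard NDVI, an NDWI-style stand-in for the wetland index, mean brightness, and a border-length/area shape index) and all threshold values are illustrative assumptions, not values specified by this embodiment.

```python
import numpy as np


def classify_object(green, red, nir, border_len, area):
    """Assign one image object to a land-cover class by simple index rules."""
    ndvi = (nir - red) / (nir + red + 1e-6)            # vegetation strength
    wetness = (green - nir) / (green + nir + 1e-6)     # NDWI-style wetland index
    brightness = (green + red + nir) / 3.0             # mean brightness of the object
    shape_index = border_len / (4.0 * np.sqrt(area))   # elongated objects score high

    if ndvi > 0.3:                                     # arable land first
        return "paddy field" if wetness > 0.0 else "dry farmland"
    if brightness > 0.4:                               # bright built-up objects
        return "traffic land" if shape_index > 2.0 else "residential area"
    if wetness > 0.2 and brightness < 0.2:             # dark, wet objects
        return "reservoir/pond"
    return "unclassified"


# Example call with made-up per-object band means and geometry
print(classify_object(green=0.12, red=0.10, nir=0.45, border_len=40.0, area=100.0))
```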
Embodiment six: This embodiment differs from one of embodiments one to five in that, for the accuracy evaluation of the classification results in terms of producer's accuracy, user's accuracy and Kappa analysis described in step 5, in which the classification accuracies of the images before and after fusion are compared to realize the fusion effect evaluation: the producer's accuracy is the class total minus the omission errors, divided by the class total; the user's accuracy is the class total minus the commission errors, divided by the class total; Kappa analysis is a discrete multivariate technique that takes all elements of the error matrix into account and serves as an index of the agreement, or accuracy, between two maps. Its computation formula is:

K = \frac{N\sum_{i=1}^{r} x_{ii} - \sum_{i=1}^{r} x_{i+}\,x_{+i}}{N^{2} - \sum_{i=1}^{r} x_{i+}\,x_{+i}}

where r is the total number of rows in the error matrix (i.e., the total number of classes); x_{ii} is the number of pixels in row i and column i of the error matrix (i.e., the number of correctly classified pixels); x_{i+} and x_{+i} are the total numbers of pixels in row i and column i, respectively; and N is the total number of pixels used for accuracy evaluation. The other steps and parameters are the same as one of embodiments one to five.
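A compact NumPy sketch of these accuracy measures is given below, assuming an r x r error matrix whose rows are the classified classes and whose columns are the reference classes; it returns the per-class producer's and user's accuracies and the overall Kappa coefficient defined above.

```python
import numpy as np


def accuracy_report(error_matrix):
    """Producer's/user's accuracy per class and Kappa from an r x r error matrix."""
    m = np.asarray(error_matrix, dtype=float)
    n = m.sum()                       # N: total pixels used for evaluation
    diag = np.diag(m)                 # x_ii: correctly classified pixels
    row_tot = m.sum(axis=1)           # x_i+: classified (map) totals
    col_tot = m.sum(axis=0)           # x_+i: reference totals
    producers = diag / col_tot        # 1 - omission error per class
    users = diag / row_tot            # 1 - commission error per class
    chance = (row_tot * col_tot).sum()
    kappa = (n * diag.sum() - chance) / (n ** 2 - chance)
    return producers, users, kappa


# Example with a small made-up 3-class error matrix
cm = [[50, 3, 2],
      [4, 45, 6],
      [1, 2, 47]]
prod, user, kappa = accuracy_report(cm)
print(prod, user, round(kappa, 3))
```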
Embodiment seven: This embodiment differs from one of embodiments one to six in that the comparison of the classification accuracy of the images before and after fusion described in step 5 realizes the fusion effect evaluation as follows: the fusion effect of the remote sensing images is determined from the classification accuracy of the images before and after fusion; if the classification accuracy of the fused image is higher, the image quality is higher, and conversely it is lower, whereby the remote sensing image fusion effect evaluation is realized. The other steps and parameters are the same as one of embodiments one to six.
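Expressed as code, the decision rule of this embodiment reduces to a simple comparison of the accuracies obtained before and after fusion; the values below are placeholders, not measured results.

```python
def fusion_effect(kappa_before: float, kappa_after: float) -> str:
    """Judge the fusion effect from the change in classification accuracy."""
    return "good fusion effect" if kappa_after > kappa_before else "poor fusion effect"


print(fusion_effect(kappa_before=0.80, kappa_after=0.85))  # placeholder values
```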
Example
(1) Input the original SPOT multispectral and panchromatic images, perform geometric correction and resampling on the SPOT multispectral image so that its pixels correspond consistently to those of the panchromatic image, and then crop the processed SPOT multispectral and panchromatic images to be fused, as shown in Figs. 2 and 3;
(2) Fuse the preprocessed SPOT multispectral and panchromatic images by the PCA-transform fusion method to obtain the fusion result image, as shown in Fig. 4;
(3) Select a segmentation scale of 5 and segment the SPOT images before and after fusion to obtain the segmented images;
(4) Classify the segmented images with the classification rules to obtain the classification results of the SPOT multispectral images before and after fusion, as shown in Figs. 5 and 6;
(5) Evaluate the classification results in terms of producer's accuracy, user's accuracy and the Kappa coefficient. Statistics of the classification accuracy show that the producer's accuracy, user's accuracy and Kappa coefficient of the classification of the original SPOT multispectral image are 84%, 86% and 0.82, respectively, while those of the classification of the PCA-transform fusion result image are 90%, 91% and 0.89, respectively. The fused image therefore improves the classification accuracy significantly, which shows that the PCA-transform fusion method performs well and meets the application requirements.