CN110942451A - Method for evaluating fusion performance of remote sensing image without reference image - Google Patents


Info

Publication number: CN110942451A (application CN201911142852.3A; granted as CN110942451B)
Authority: CN (China)
Original language: Chinese (zh)
Inventor: 窦闻 (Dou Wen)
Original and current assignee: Southeast University
Application filed by Southeast University
Legal status: Granted; currently Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing


Abstract

The invention provides a method for evaluating the fusion performance of remote sensing images when no high-resolution reference image is available, addressing both the evaluation itself and the validity of the evaluation indexes. Its core mechanism is to decompose fusion performance into three dimensions: spectral retention, spatial smoothness, and detail retention. The fused image is decomposed into sub-bands, each sub-band is compared with the multispectral and panchromatic images that participated in the fusion, correlation-coefficient-based indexes are computed, and the sub-indexes are finally combined into an overall evaluation index. The resulting performance evaluation agrees with expert experience, characterizes the different performance aspects of a fusion algorithm well, runs fast, and is well suited to fusion evaluation of remote sensing images.

Description

Method for evaluating fusion performance of remote sensing image without reference image
Technical Field
The invention relates to the technical fields of aerospace optical remote sensing imaging, ground processing systems, and digital image processing, and in particular to a method for evaluating the fusion performance of remote sensing images without a reference image.
Background
Satellite remote sensing can observe the earth's surface at large scale with full coverage. It plays a major role in surveying and mapping, meteorology, oceanography, agriculture, natural-resource investigation, disaster monitoring, national defense and security, and other fields, and is of great importance to countries worldwide. Because sensor signal-to-noise ratio is limited, the spectral resolution and spatial resolution of an optical remote sensing image are in tension, so optical remote sensing satellites are currently equipped with one high-resolution panchromatic band together with several low-resolution multispectral bands.
Remote sensing image fusion aims to combine, by a software algorithm, a panchromatic image of high spatial but low spectral resolution with a multispectral image of low spatial but high spectral resolution into an image that a higher-performance sensor (or the same sensor on a lower orbit) could have acquired: one with the spectral resolution of the multispectral image and the spatial resolution of the panchromatic image.
According to how remote sensing data are used, evaluation of a fused image covers two aspects: spatial performance and spectral performance. Spatial performance concerns whether the fused image is sufficiently improved in visual quality over the original multispectral image; spectral performance is oriented toward subsequent quantitative computer processing and concerns whether the fused image fully retains the spectral information of the multispectral image. Fusion performance evaluation therefore includes subjective evaluation and objective (quantitative) evaluation. With the development of quantitative remote sensing, quantitative evaluation has gradually become mainstream in academia. However, although many quantitative schemes exist, they focus mainly on spectral performance, and their overall conclusions often conflict sharply with subjective evaluation. The core of the problem is that existing quantitative methods do not express spatial performance well.
Quantitative evaluation methods fall into two types: evaluation with a reference image and evaluation without one. Reference-based evaluation compares the fused image produced by a given method against an ideal reference image. Since no ideal reference exists, it is currently simulated in one of two ways. The first uses higher-resolution data from other sources, mainly airborne data, to simulate the characteristics of the target sensor; it is expensive, narrowly applicable, and the simulation itself is controversial. Academia therefore mainly adopts the second, the resolution-reduction scheme: the panchromatic and multispectral images are degraded according to their resolution ratio, the degraded pair is fused, and the original multispectral image serves as the reference. Although this scheme is widely applied, research shows that the degradation step can seriously bias the evaluation result, and in practice methods that score very highly under resolution reduction have been found to fuse poorly at the original scale.
Reference-free evaluation quantifies fusion performance from the quantitative relationship between the fused image and the panchromatic and multispectral inputs, under certain theoretical assumptions. It is more widely applicable and more stable, and is the focus of current research, but existing reference-free indexes have not been constructed successfully: they deviate considerably in representing spatial and spectral performance. How to construct a reference-free fusion evaluation index and method that agree closely with subjective evaluation remains an open problem.
Disclosure of Invention
To evaluate the fusion performance of remote sensing images effectively when no high-resolution reference image is available, and to ensure the validity of the evaluation indexes, the invention provides a reference-free evaluation method. It decomposes fusion performance into three dimensions (spectral retention, spatial smoothness, and detail retention), performs sub-band decomposition of the fused image, compares each sub-band with the multispectral and panchromatic images that participated in the fusion, computes correlation-coefficient-based indexes, and finally combines the indexes into an overall evaluation index. To this end, the method comprises the following steps:
Step 1: initialize parameters: set the optical MTF value m of the low-resolution multispectral image M and the nominal resolution ratio r between the multispectral image M and the panchromatic image P;
Step 2: linearly combine the bands of the fused image F to construct a luminance component I;
Step 3: extract the image frequency components, as follows:
Step 3.1: construct a frequency-domain low-pass filter G^(0) and the corresponding high-pass filter H^(0) from m and r;
Step 3.2: low-pass filter each band of the fused image F with G^(0), then down-resample by a factor of r to the size of the multispectral image M using box sampling, obtaining the small-size degraded image F^(1);
Step 3.3: construct a frequency-domain low-pass filter G^(1) from m; low-pass filter each band of F^(1) and M to obtain their low-frequency components F^(1),l and M^l and the corresponding high-frequency components F^(1),h and M^h;
Step 3.4: high-pass filter I and P with H^(0) to obtain their high-frequency components I^h and P^h; then apply H^(0) to I^h and P^h to further separate frequency components, obtaining the high-frequency components I^hh and P^hh and the low-frequency components I^hl and P^hl;
Step 4: calculate the sub-indexes: from F^(1),l and M^l calculate the spectral retention Q_λ; from F^(1),h and M^h calculate the first spatial-smoothness component Q_s1; from I^hl and P^hl calculate the second spatial-smoothness component Q_s2; from I^hh and P^hh calculate the detail (spatial) retention Q_σ;
Step 5: construct a comprehensive evaluation index from the sub-indexes.
As a further development of the invention, the optical MTF value m of the multispectral image M used in step 1 is estimated as
m = m_tot / 0.6366
where m_tot is the MTF value of the multispectral image, either provided by the data provider or measured; its default value is 0.5.
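As a purely arithmetic illustration of this estimate (a Python sketch; the function name is ours, not the patent's):

```python
def estimate_optical_mtf(m_tot: float = 0.5) -> float:
    """Estimate the optical MTF value m from the total MTF m_tot,
    per m = m_tot / 0.6366 (0.6366 is approximately 2/pi)."""
    return m_tot / 0.6366
```

With the default m_tot = 0.5 this gives m of roughly 0.785.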
As a further improvement of the invention, in step 2 the bands of F are linearly combined to construct the luminance component I. The combination method is not restricted and the weight coefficients need not sum to 1; the band weights can be obtained by linear regression of F onto P, and the default weight is 1.
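For illustration, such regression weights could be computed by ordinary least squares of the panchromatic band on the fused bands (a Python/NumPy sketch; function names are ours, and this is one possible realization rather than the patent's prescribed one):

```python
import numpy as np

def luminance_weights(imgF: np.ndarray, imgP: np.ndarray) -> np.ndarray:
    """Regress P onto the bands of F (shape H x W x n) to get per-band weights."""
    n = imgF.shape[2]
    A = imgF.reshape(-1, n)          # one column per band
    b = imgP.reshape(-1)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

def luminance(imgF: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Weighted band combination: the luminance component I."""
    return imgF @ w
```

If P happens to be an exact linear combination of the bands of F, the recovered weights reproduce it.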
As a further improvement of the invention, in step 3 the fused image F is low-pass filtered and then down-resampled by a factor of r to the size of the multispectral image M with box sampling, giving the small-size degraded image F^(1). The low-pass filter G^(0) has a Gaussian-shaped frequency response
G^(0)(f) = exp(-f^2 / (2 σ0^2))
where σ0 is the scale parameter of the Gaussian filter; it is chosen so that the response at the Nyquist frequency of the multispectral image equals the MTF value m, which for an N-point filter gives
σ0 = (N / (2r)) / sqrt(-2 ln m).
Filtering the i-th band of the fused image gives
F_i^l = G^(0) * F_i.
The corresponding high-pass filter H^(0) is G^(0) negated with 1 added to its central element, i.e. H^(0) = δ - G^(0).
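The filter pair can be sketched in Python/NumPy as follows. This mirrors the frequency-domain Gaussian design described above but omits the Kaiser-window step used in the MATLAB reference code, so it is an approximation; function names are ours:

```python
import numpy as np

def gaussian_lowpass(N: int, m: float, r: float = 1.0) -> np.ndarray:
    """N x N low-pass FIR kernel whose Gaussian frequency response
    equals the MTF value m at the (multispectral) Nyquist frequency."""
    alpha = (N * 0.5 / r) / np.sqrt(-2.0 * np.log(m))   # frequency-domain std
    f = np.arange(N) - N // 2
    fx, fy = np.meshgrid(f, f)
    Hd = np.exp(-(fx**2 + fy**2) / (2 * alpha**2))      # desired response, max 1
    # spatial kernel = inverse DFT of the (de-centered) response
    g = np.real(np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(Hd))))
    return g

def highpass_from(g: np.ndarray) -> np.ndarray:
    """H = delta - G: negate G and add 1 at the central element."""
    h = -g.copy()
    c = g.shape[0] // 2
    h[c, c] += 1.0
    return h
```

By construction the low-pass kernel sums to 1 (unit DC gain) and the high-pass kernel sums to 0.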
As a further improvement of the invention, in step 3 each band of the degraded fused image F^(1) and of M is low-pass filtered to obtain the low-frequency components F^(1),l and M^l and the corresponding high-frequency components F^(1),h and M^h. The low-pass filter G^(1) has the same Gaussian form as G^(0), with scale parameter σ1 chosen so that the response at the Nyquist frequency equals m, i.e. for an N-point filter
σ1 = (N / 2) / sqrt(-2 ln m).
Filtering the i-th band of the degraded fused image gives
F_i^(1),l = G^(1) * F_i^(1)
and filtering the i-th band of the multispectral image gives
M_i^l = G^(1) * M_i.
The high-frequency component of each band is obtained directly by subtracting its low-frequency component from the band itself, i.e.
F^(1),h = F^(1) - F^(1),l
M^h = M - M^l
As a further improvement of the invention, step 3 uses H^(0) to high-pass filter I and P, obtaining the high-frequency components I^h and P^h; H^(0) is then applied to I^h and P^h to separate frequency components further, giving the high-frequency components I^hh and P^hh and the low-frequency components I^hl = I^h - I^hh and P^hl = P^h - P^hh.
As a further improvement of the invention, step 4 computes the UIQI values between corresponding bands of F^(1),l and M^l and takes their mean as the spectral retention index Q_λ of the fused image:
Q_λ = (1/n) Σ_{i=1}^{n} UIQI(F_i^(1),l, M_i^l)
where n is the number of bands. For given images A and B, UIQI is calculated as
UIQI(A, B) = 4 σ_AB μ_A μ_B / ((σ_A^2 + σ_B^2)(μ_A^2 + μ_B^2))
where μ_A, μ_B are the means of images A and B, σ_A, σ_B are their standard deviations, and σ_AB is their covariance.
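A direct Python transcription of the UIQI formula above (function name ours):

```python
import numpy as np

def uiqi(A: np.ndarray, B: np.ndarray) -> float:
    """Universal Image Quality Index computed over two whole images."""
    a = A.astype(float).ravel()
    b = B.astype(float).ravel()
    mu_a, mu_b = a.mean(), b.mean()
    var_a, var_b = a.var(), b.var()
    cov_ab = ((a - mu_a) * (b - mu_b)).mean()
    return 4 * cov_ab * mu_a * mu_b / ((var_a + var_b) * (mu_a**2 + mu_b**2))
```

An image compared with itself yields the maximum value 1.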
As a further improvement of the invention, step 4 computes the correlation coefficient C_MH between corresponding bands of F^(1),h and M^h and takes the mean over all bands as the first spatial-smoothness component of the fused image:
Q_s1 = (1/n) Σ_{i=1}^{n} C_MH(F_i^(1),h, M_i^h)
where n is the number of bands.
As a further improvement of the invention, step 4 computes the correlation coefficient C_PL between I^hl and P^hl as the second spatial-smoothness component Q_s2, and the correlation coefficient C_PH between I^hh and P^hh as the detail (spatial) retention index Q_σ.
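Both components reduce to the plain 2-D correlation coefficient (MATLAB's corr2); an equivalent Python sketch for illustration:

```python
import numpy as np

def corr2(A: np.ndarray, B: np.ndarray) -> float:
    """2-D correlation coefficient between two equally sized images."""
    a = A.astype(float) - A.mean()
    b = B.astype(float) - B.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))
```

The value is invariant to positive affine rescaling of either image and lies in [-1, 1].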
As a further improvement of the invention, the comprehensive evaluation index Q is constructed from the sub-indexes as
Q = Q_λ Q_s Q_σ
wherein Q_s is defined as
Q_s = sqrt(Q_s1 · Q_s2).
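Combining the sub-indexes is then a one-line computation (a small Python helper; the geometric-mean form of Q_s follows the formula above):

```python
import math

def overall_index(q_lambda: float, q_s1: float, q_s2: float, q_sigma: float) -> float:
    """Q = Q_lambda * sqrt(Q_s1 * Q_s2) * Q_sigma."""
    return q_lambda * math.sqrt(q_s1 * q_s2) * q_sigma
```

A perfect fusion, with all sub-indexes equal to 1, gives Q = 1.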
Compared with the prior art, the method of the invention offers the following improvements: 1. wider applicability, since no reference image is needed; 2. fewer theoretical assumptions, avoiding the problems of existing reference-free and resolution-reduction schemes, so that the performance evaluation of the fused image is more stable and reliable and agrees with expert experience; 3. clear exposure of the different performance aspects of a fusion algorithm, helping developers improve their algorithms in a targeted way; 4. fast computation, suitable for fusion evaluation of remote sensing images.
Detailed Description
The present invention is described in further detail below with reference to a specific embodiment.
The invention provides a reference-free method for evaluating the fusion performance of remote sensing images: it decomposes fusion performance into three dimensions (spectral retention, spatial smoothness, and detail retention), performs sub-band decomposition of the fused image, compares each sub-band with the multispectral and panchromatic images that participated in the fusion, computes correlation-coefficient-based indexes, and finally combines them into the overall evaluation index.
As a specific embodiment of the invention, real WorldView-2 spaceborne multispectral and panchromatic images are fused with the IHS, Brovey, PCA, GS, GSA, HPF, and ATWT methods, and each fusion result, together with the multispectral image up-sampled to panchromatic size by cubic convolution, is evaluated with the proposed method. The implementation of the invention comprises the following steps:
Step 1: initialize parameters. According to the satellite data, the optical MTF value m of the low-resolution multispectral image imgM is set to 0.6, and the nominal resolution ratio r between imgM and the panchromatic image imgP is set to 4.
Step 2: construct the luminance component imgI of the image under test imgF. MATLAB example code is as follows:
imgI=mean(imgF,3);
Step 3: construct the low-pass filters G0 and G1 and the corresponding high-pass filter H0. MATLAB example code is as follows:
N=41;                                          % filter size
alpha=sqrt((N*(0.5/r))^2/(-2*log(m)));         % response = m at the MS Nyquist frequency
H=fspecial('gaussian',N,alpha);
Hd=H./max(H(:));                               % desired frequency response, max 1
h=fwind1(Hd,kaiser(N));                        % 2-D FIR design by windowing
G0=real(h);
H0=padarray(1,[(N-1)/2 (N-1)/2],0,'both')-G0;  % H0 = delta - G0
alpha=sqrt((N*0.5)^2/(-2*log(m)));             % response = m at the full Nyquist frequency
H=fspecial('gaussian',N,alpha);
Hd=H./max(H(:));
h=fwind1(Hd,kaiser(N));
G1=real(h);
Step 4: low-pass filter each band of the image under test, imgF, with G0, and down-resample it by a factor of r to the multispectral image size using box sampling. MATLAB example code is as follows:
imgF_L=double(imfilter(imgF,G0,'circular')); % use G0 (the r-scaled filter), as step 3.2 specifies
imgF_L=imresize(imgF_L,1/r,'box');
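Box downsampling by an integer factor r is block averaging; an illustrative NumPy equivalent of imresize(..., 'box') for a single band, assuming the image dimensions are divisible by r:

```python
import numpy as np

def box_downsample(img: np.ndarray, r: int) -> np.ndarray:
    """Average non-overlapping r x r blocks of a 2-D image."""
    H, W = img.shape
    return img.reshape(H // r, r, W // r, r).mean(axis=(1, 3))
```

Each output pixel is the mean of one r x r block of input pixels.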
Step 5: filter each band of imgF_L and imgM with G1 and extract the low- and high-frequency components of each band. MATLAB example code is as follows:
imgF_LL=double(imfilter(imgF_L,G1,'circular'));
imgF_LH=imgF_L-imgF_LL;
imgM_L=double(imfilter(imgM,G1,'circular'));
imgM_H=imgM-imgM_L;
Step 6: compute the spectral retention index Q_Lamb and the first spatial-smoothness component Q_Sig1. MATLAB example code is as follows:
for ii=1:nbands
Q_ML(ii)=img_qi(imgF_LL(:,:,ii),imgM_L(:,:,ii)); % img_qi computes the UIQI between two bands
Q_MH(ii)=corr2(imgF_LH(:,:,ii),imgM_H(:,:,ii));
end
Q_Lamb=mean(Q_ML);
Q_Sig1=mean(Q_MH);
Step 7: filter imgI and imgP with H0 and then G0 to extract their high- and low-frequency components. MATLAB example code is as follows:
imgI_H=double(imfilter(imgI,H0,'circular'));
imgI_HL=double(imfilter(imgI_H,G0,'circular'));
imgI_HH=imgI_H-imgI_HL;
imgP_H=double(imfilter(imgP,H0,'circular'));
imgP_HL=double(imfilter(imgP_H,G0,'circular'));
imgP_HH=imgP_H-imgP_HL;
Step 8: compute the second spatial-smoothness component Q_Sig2 and the detail retention index Q_Spa. MATLAB example code is as follows:
Q_Sig2=corr2(imgI_HL,imgP_HL);
Q_Spa=corr2(imgI_HH,imgP_HH);
Step 9: compute and output the evaluation result Q. MATLAB example code is as follows:
Q_Sig=sqrt(Q_Sig1*Q_Sig2);
Q=Q_Lamb*Q_Sig*Q_Spa;
the fused image evaluation results were as follows:
the up-sampled MS is good in spectrum preservation, but insufficient in spatial smoothness and completely not enhanced in details, so the score is very low, only 0.155; IHS and Brovey are both component replacement methods based on color space, and the scores are very close to each other and accord with theoretical expectation; the PCA method developed subsequently mainly improves the spatial smoothness at the edges, and the GS method further improves the spectrum retention capability of the PCA (the PCA is proved by theory to be a special case of the GS method); the GS method is improved in the aspect of brightness component construction by the GSA method, so that the spectrum retention capability is greatly improved; the multi-scale analysis method represented by HPF has good spectrum retention capability naturally, so that the spectrum retention is high, but the space retention is reduced, and the fused image is not sharp enough; ATWT improves spatial detail retention by filter design, with the highest overall score. As can be seen from the evaluation results, the three sub-indexes can well reflect the performances of the image in the three aspects, and the results are consistent with the existing research and subjective evaluation conclusions.
The above is only a preferred embodiment of the present invention and does not limit it in any way; any modification or equivalent variation made according to the technical spirit of the invention falls within the claimed scope.

Claims (10)

1. A method for evaluating the fusion performance of a remote sensing image without a reference image, comprising the following steps:
Step 1: initialize parameters: set the optical MTF value m of the low-resolution multispectral image M and the nominal resolution ratio r between the multispectral image M and the panchromatic image P;
Step 2: linearly combine the bands of the fused image F to construct a luminance component I;
Step 3: extract the image frequency components, as follows:
Step 3.1: construct a frequency-domain low-pass filter G^(0) and the corresponding high-pass filter H^(0) from m and r;
Step 3.2: low-pass filter each band of the fused image F with G^(0), then down-resample by a factor of r to the size of the multispectral image M using box sampling, obtaining the small-size degraded image F^(1);
Step 3.3: construct a frequency-domain low-pass filter G^(1) from m; low-pass filter each band of F^(1) and M to obtain their low-frequency components F^(1),l and M^l and the corresponding high-frequency components F^(1),h and M^h;
Step 3.4: high-pass filter I and P with H^(0) to obtain their high-frequency components I^h and P^h; then apply H^(0) to I^h and P^h to further separate frequency components, obtaining the high-frequency components I^hh and P^hh and the low-frequency components I^hl and P^hl;
Step 4: calculate the sub-indexes: from F^(1),l and M^l calculate the spectral retention Q_λ; from F^(1),h and M^h calculate the first spatial-smoothness component Q_s1; from I^hl and P^hl calculate the second spatial-smoothness component Q_s2; from I^hh and P^hh calculate the detail (spatial) retention Q_σ;
Step 5: construct a comprehensive evaluation index from the sub-indexes.
2. The method for evaluating the fusion performance of a remote sensing image without a reference image according to claim 1, characterized in that the optical MTF value m of the multispectral image M used in step 1 is estimated as
m = m_tot / 0.6366
where m_tot is the MTF value of the multispectral image, either provided by the data provider or measured; its default value is 0.5.
3. The method for evaluating the fusion performance of a remote sensing image without a reference image according to claim 1, characterized in that in step 2 the bands of F are linearly combined to construct the luminance component I; the combination method is not restricted, the weight coefficients need not sum to 1, the band weights can be obtained by linear regression of F onto P, and the default weight is 1.
4. The method for evaluating the fusion performance of a remote sensing image without a reference image according to claim 1, characterized in that in step 3 the fused image F is low-pass filtered and then down-resampled by a factor of r to the size of the multispectral image M with box sampling, giving the small-size degraded image F^(1); the low-pass filter G^(0) has a Gaussian-shaped frequency response
G^(0)(f) = exp(-f^2 / (2 σ0^2))
where σ0 is the scale parameter of the Gaussian filter, chosen so that the response at the Nyquist frequency of the multispectral image equals m; filtering the i-th band of the fused image gives
F_i^l = G^(0) * F_i;
the corresponding high-pass filter H^(0) is G^(0) negated with 1 added to its central element.
5. The method for evaluating the fusion performance of a remote sensing image without a reference image according to claim 1, characterized in that in step 3 each band of the degraded fused image F^(1) and of M is low-pass filtered to obtain the low-frequency components F^(1),l and M^l and the corresponding high-frequency components F^(1),h and M^h; the low-pass filter G^(1) has the same Gaussian form as G^(0), with scale parameter σ1 chosen so that the response at the Nyquist frequency equals m; filtering the i-th band of the degraded fused image gives
F_i^(1),l = G^(1) * F_i^(1)
and filtering the i-th band of the multispectral image gives
M_i^l = G^(1) * M_i;
the high-frequency component of each band is obtained directly by subtracting its low-frequency component from the band itself, i.e.
F^(1),h = F^(1) - F^(1),l
M^h = M - M^l
6. The method for evaluating the fusion performance of a remote sensing image without a reference image according to claim 1, characterized in that step 3 uses H^(0) to high-pass filter I and P, obtaining the high-frequency components I^h and P^h; H^(0) is then applied to I^h and P^h to separate frequency components further, giving the high-frequency components I^hh and P^hh and the low-frequency components I^hl = I^h - I^hh and P^hl = P^h - P^hh.
7. The method for evaluating the fusion performance of a remote sensing image without a reference image according to claim 1, characterized in that step 4 computes the UIQI values between corresponding bands of F^(1),l and M^l and takes their mean as the spectral retention index Q_λ of the fused image:
Q_λ = (1/n) Σ_{i=1}^{n} UIQI(F_i^(1),l, M_i^l)
where n is the number of bands; for given images A and B, UIQI is calculated as
UIQI(A, B) = 4 σ_AB μ_A μ_B / ((σ_A^2 + σ_B^2)(μ_A^2 + μ_B^2))
where μ_A, μ_B are the means of images A and B, σ_A, σ_B are their standard deviations, and σ_AB is their covariance.
8. The method for evaluating the fusion performance of a remote sensing image without a reference image according to claim 1, characterized in that step 4 computes the correlation coefficient C_MH between corresponding bands of F^(1),h and M^h and takes the mean over all bands as the first spatial-smoothness component of the fused image:
Q_s1 = (1/n) Σ_{i=1}^{n} C_MH(F_i^(1),h, M_i^h)
where n is the number of bands.
9. The method for evaluating the fusion performance of a remote sensing image without a reference image according to claim 1, characterized in that step 4 computes the correlation coefficient C_PL between I^hl and P^hl as the second spatial-smoothness component Q_s2, and the correlation coefficient C_PH between I^hh and P^hh as the detail (spatial) retention index Q_σ.
10. The method for evaluating the fusion performance of a remote sensing image without a reference image according to claim 1 or claim 6, characterized in that the comprehensive evaluation index Q is constructed from the sub-indexes as
Q = Q_λ Q_s Q_σ
wherein Q_s is defined as
Q_s = sqrt(Q_s1 · Q_s2).
CN201911142852.3A — priority and filing date 2019-11-20 — Method for evaluating fusion performance of remote sensing image without reference image — Active — granted as CN110942451B


Publications (2)

CN110942451A — published 2020-03-31
CN110942451B — published 2022-11-18



Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106023111A (en) * 2016-05-23 2016-10-12 中国科学院深圳先进技术研究院 Image fusion quality evaluating method and system
CN106157317A (en) * 2016-07-21 2016-11-23 武汉大学 The high-resolution remote sensing image fusion rules method guided based on dispersion tensor


Non-Patent Citations (1)

Title
王力彦 et al., "Research on ZY-3 satellite image fusion algorithms based on the spectral response function," 《宇航学报》 (Journal of Astronautics) *

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN111948653A (en) * 2020-07-31 2020-11-17 上海卫星工程研究所 Method and system for detecting forest target based on P-band synthetic aperture radar
CN113436069A (en) * 2021-06-16 2021-09-24 中国电子科技集团公司第五十四研究所 Remote sensing image fusion method based on maximum signal-to-noise ratio projection
CN113436069B (en) * 2021-06-16 2022-03-01 中国电子科技集团公司第五十四研究所 Remote sensing image fusion method based on maximum signal-to-noise ratio projection



Legal Events

PB01 — Publication
SE01 — Entry into force of request for substantive examination
GR01 — Patent grant