CN110533600B - Same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method - Google Patents
- Publication number
- CN110533600B (granted from application CN201910619429.1A / CN201910619429A)
- Authority
- CN
- China
- Prior art keywords
- remote sensing
- sensing image
- spatial
- spectral
- fusion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
- G01S13/90—Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
- G01S13/9021—SAR image post-processing techniques
- G01S13/9027—Pattern recognition for feature extraction
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
- G01S13/90—Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N2021/1793—Remote sensing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30181—Earth observation
Abstract
The invention discloses a high-fidelity generalized spatial-spectral fusion method for homogeneous/heterogeneous remote sensing images. The method simultaneously supports high-fidelity fusion of homogeneous image pairs (panchromatic/multispectral, panchromatic/hyperspectral) and of heterogeneous pairs (high-spatial-resolution SAR with low-spatial-resolution multispectral images), generating a remote sensing image with both high spatial resolution and high spectral resolution. In view of the fast-processing demands of practical engineering applications, the high-frequency information of the high-spatial-resolution image is extracted within a simple and fast component-substitution fusion framework. The method is robust to spatial misregistration between heterogeneous images and jointly accounts for spatial enhancement, spectral compensation, and noise, so that a single pipeline satisfies the spatial-spectral high-fidelity fusion requirements of both homogeneous and heterogeneous imagery.
Description
Technical Field
The invention relates to remote sensing image processing technology, and in particular to a high-fidelity generalized spatial-spectral fusion method for homogeneous/heterogeneous remote sensing images.
Background
Remote sensing images are an important carrier of earth-surface information, and images with both high spatial resolution and high spectral resolution play an important role in information interpretation, ground-object classification and identification, and related fields. However, owing to the limitations of satellite sensors and other factors, a single acquired image cannot offer high spatial resolution and high spectral resolution at the same time. Remote sensing satellites at home and abroad, such as IKONOS, QuickBird, Gaofen-1 and Gaofen-2, therefore provide a panchromatic image and a multispectral image simultaneously: the panchromatic image has high spatial resolution but only one band, while the multispectral image has relatively good spectral resolution but usually low spatial resolution. Spatial-spectral fusion technology can integrate the complementary advantages of high spatial resolution and high spectral resolution across multi-source remote sensing images to generate a remote sensing image that has both at once.
At present there are many spatial-spectral fusion methods, such as panchromatic/multispectral, panchromatic/hyperspectral and multispectral/hyperspectral image fusion; representative families include component-substitution methods, multiresolution-analysis methods, variational-optimization methods and deep-learning-based methods. These generally refer to spatial-spectral fusion in the narrow sense, i.e. spatial-spectral fusion between homogeneous remote sensing images. Pixel-level SAR/multispectral image fusion, by contrast, is heterogeneous remote sensing image fusion; when its goal is to obtain a high-spatial-resolution multispectral image by exploiting the complementary spatial and spectral information of the SAR and multispectral images, it also falls within the scope of generalized spatial-spectral fusion. Most existing SAR/multispectral fusion methods adopt or adapt panchromatic/multispectral fusion frameworks, but because SAR images differ greatly from multispectral images, spatial enhancement and spectral fidelity of the fused image are more challenging for these methods than in homogeneous spatial-spectral fusion.
In summary, on the one hand, although some homogeneous spatial-spectral fusion methods, such as component substitution and multiresolution analysis, are integrated into professional remote sensing software platforms, most existing methods struggle to deliver both fusion accuracy and efficiency; on the other hand, although panchromatic/multispectral homogeneous fusion and SAR/multispectral heterogeneous fusion mostly use similar or identical frameworks, it remains difficult to handle homogeneous and heterogeneous spatial-spectral fusion within one collaborative pipeline. Developing a homogeneous/heterogeneous collaborative fusion method for multi-source remote sensing images that satisfies high-fidelity fusion of homogeneous pairs such as panchromatic/multispectral and panchromatic/hyperspectral images as well as of SAR/multispectral heterogeneous pairs is therefore undoubtedly of important significance and practical application value.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a high-fidelity generalized spatial-spectral fusion method for homogeneous/heterogeneous remote sensing images, which fuses the complementary spatial and spectral information of multi-source homogeneous or heterogeneous images into a single remote sensing image with both high spatial resolution and high spectral resolution, and which satisfies high-fidelity spatial-spectral fusion of homogeneous pairs (panchromatic/multispectral, panchromatic/hyperspectral, multispectral/hyperspectral) and of SAR/multispectral heterogeneous pairs.
The technical scheme adopted by the invention to solve the above problem is a high-fidelity generalized spatial-spectral fusion method for homogeneous/heterogeneous remote sensing images, characterized by comprising the following steps:
Step 1: select multi-source remote sensing images of the same scene, namely an original high-spatial-resolution image and an original high-spectral-resolution image, denoted I_gk and I_gg; then preprocess I_gk and I_gg, and denote the preprocessed high-spatial-resolution and high-spectral-resolution images I*_gk and I*_gg, respectively.
Step 2: judge whether I*_gk contains noise. If it does, apply a fast denoising algorithm to I*_gk and take the result as the denoised high-spatial-resolution image; if not, take I*_gk itself as the denoised high-spatial-resolution image.
Step 3_1: when the spectral ranges of I_gk and I_gg are known, select, according to those ranges, the bands of I_gg whose spectra are covered by the spectral range of I_gk; then spatially resample I*_gg to the size of the denoised high-spatial-resolution image, and linearly combine the selected bands of the resampled high-spectral-resolution image to obtain its luminance component.

Alternatively, when the spectral ranges of I_gk and I_gg are unknown, spatially resample I*_gg to the size of the denoised high-spatial-resolution image, and linearly combine all bands of the resampled high-spectral-resolution image to obtain its luminance component.

In both cases, the combination coefficients used for the linear combination are obtained as follows: spatially downsample the denoised high-spatial-resolution image to the size of I*_gg; when the spectral ranges of I_gk and I_gg are known, apply the least-squares method to the downsampled image and the selected bands of I*_gg and solve for the coefficients; when they are unknown, apply the least-squares method to the downsampled image and all bands of I*_gg and solve for the coefficients.
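The least-squares solution of the combination coefficients described above can be sketched in NumPy as follows (an illustrative sketch; the function names and the use of `np.linalg.lstsq` are assumptions of this example, not specified by the patent):

```python
import numpy as np

def luminance_coefficients(high_spatial_ds, bands_ds):
    """Solve for the linear combination coefficients by least squares.

    high_spatial_ds : 2-D array, the denoised high-spatial-resolution image
        spatially downsampled to the size of the spectral image.
    bands_ds : 3-D array (bands, rows, cols), the selected (or all) bands
        of the high-spectral-resolution image I*_gg.
    """
    A = bands_ds.reshape(bands_ds.shape[0], -1).T  # pixels x bands
    b = high_spatial_ds.ravel()                    # target luminance values
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs

def luminance_component(bands, coeffs):
    """Linearly combine bands with the solved coefficients."""
    return np.tensordot(coeffs, bands, axes=1)
```

With the coefficients solved at the coarse scale, `luminance_component` is then applied to the resampled (fine-scale) spectral image to produce the luminance component used in steps 3_2 and 3_3.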
Step 3_2: taking the luminance component as reference, perform moment matching on the denoised high-spatial-resolution image to attenuate the radiometric difference between it and the luminance component; take the result as the moment-matched high-spatial-resolution image.
Step 3_3: subtract the luminance component from the moment-matched high-spatial-resolution image to obtain the basic high-frequency component; then spatially enhance it with a Laplacian sharpening filter to obtain the adjustable high-frequency component; finally, combine the basic and adjustable high-frequency components by weighting to obtain the high-frequency information, where the weight of the adjustable high-frequency component lies in the range [0, 5].
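Step 3_3 can be sketched as follows (an illustrative sketch: the 4-neighbour Laplacian kernel and the assumption that the sharpening filter is applied to the basic high-frequency component are choices of this example; the patent fixes neither):

```python
import numpy as np
from scipy.ndimage import convolve

# Standard 4-neighbour Laplacian sharpening kernel (an assumption).
LAPLACIAN = np.array([[ 0, -1,  0],
                      [-1,  4, -1],
                      [ 0, -1,  0]], dtype=float)

def high_frequency(matched_hs, luminance, w=1.0):
    """Weighted combination of basic and adjustable high-frequency components.

    matched_hs : moment-matched high-spatial-resolution image (step 3_2).
    luminance  : luminance component of the resampled spectral image (step 3_1).
    w          : weight of the adjustable component, in [0, 5].
    """
    basic = matched_hs - luminance                            # basic HF component
    adjustable = convolve(basic, LAPLACIAN, mode='nearest')   # spatially enhanced
    return basic + w * adjustable                             # combined HF information
```

Setting `w = 0` falls back to plain component-substitution high frequencies; larger weights strengthen edges at the risk of ringing.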
Step 4: using the gradient of each band of I*_gg, calculate the injection weight of the high-frequency information required for each band of I*_gg.
Step 5: multiply the high-frequency information by the injection weight required for each band of I*_gg; then inject the results into the resampled high-spectral-resolution image to obtain a preliminary spatially sharpened fused image with both high spatial resolution and high spectral resolution, denoted I_rh.
Step 6: using I*_gg, apply spectral compensation and correction to I_rh to obtain the spectrally high-fidelity, optimally sharpened fused image.
In step 1, I*_gk and I*_gg are obtained as follows:
Step 1_1: according to the spatial-resolution ratio of I_gk and I_gg, spatially resample I_gg so that the resampled high-spectral-resolution image matches the size of I_gk; then, taking the resampled image as reference, geometrically register I_gk, and take the registered image as the first high-spatial-resolution image, denoted I'_gk; take I_gg itself as the first high-spectral-resolution image, denoted I'_gg.
Step 1_2: taking the luminance component of the resampled high-spectral-resolution image as reference, radiometrically register I'_gk to attenuate the radiometric difference between I'_gk and I'_gg; take the result as the second high-spatial-resolution image, denoted I*_gk, and take I'_gg as the second high-spectral-resolution image, denoted I*_gg.
In step 1_1, geometric registration is performed with professional remote sensing image processing software (ENVI/ERDAS).
In step 1_2, I'_gk is radiometrically registered by the moment-matching technique:

I*_gk = (I'_gk − P_gk,mean) × (P_lum,std / P_gk,std) + P_lum,mean

where P_gk,std and P_gk,mean are the standard deviation and mean of the pixel values of all pixels of I'_gk, and P_lum,std and P_lum,mean are the standard deviation and mean of the pixel values of all pixels of the luminance component of the resampled high-spectral-resolution image.
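The moment matching used here (and again in step 3_2) is a two-line operation; the sketch below is illustrative, with the function name chosen for this example:

```python
import numpy as np

def moment_match(src, ref):
    """Rescale src so that its mean and standard deviation match ref.

    Implements out = (src - mean(src)) * (std(ref) / std(src)) + mean(ref),
    i.e. the moment-matching formula of step 1_2.
    """
    return (src - src.mean()) * (ref.std() / src.std()) + ref.mean()
```

By construction the output has exactly the reference mean and standard deviation, which attenuates the global radiometric difference between the two images.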
The luminance component above is obtained by averaging all bands of the resampled high-spectral-resolution image.
In step 2, the fast denoising of I*_gk proceeds as follows: for homogeneous remote sensing image fusion, Wiener filtering is adopted as the denoising algorithm; for heterogeneous remote sensing image fusion, Lee filtering is adopted as the denoising algorithm.
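The two filtering choices can be sketched as follows. `scipy.signal.wiener` provides Wiener filtering directly; the Lee speckle filter is built from local statistics. The window size and the global-variance noise proxy are illustrative assumptions — the patent specifies neither:

```python
import numpy as np
from scipy.signal import wiener
from scipy.ndimage import uniform_filter

def lee_filter(img, size=5):
    """Simple local-statistics Lee filter for SAR speckle (a common sketch;
    the noise variance is approximated by the global image variance)."""
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img * img, size)
    var = np.maximum(sq_mean - mean * mean, 0.0)   # local variance
    noise_var = img.var()                          # crude noise estimate (assumption)
    weight = var / (var + noise_var + 1e-12)
    return mean + weight * (img - mean)

def denoise(img, heterogeneous=False):
    """Wiener filtering for homogeneous (optical) fusion,
    Lee filtering for heterogeneous (SAR) fusion, as in step 2."""
    return lee_filter(img) if heterogeneous else wiener(img, mysize=5)
```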
The specific process of step 4 is as follows:

Step 4_1: average all bands of I*_gg to obtain the mean component.

Step 4_2: calculate the average gradient of each band of I*_gg and the average gradient of the mean component.

Step 4_3: divide the average gradient of each band of I*_gg by the average gradient of the mean component to obtain the injection weight of the high-frequency information required for that band; for the q-th band of I*_gg, the quotient of its average gradient and the average gradient of the mean component is the injection weight required for the q-th band. Here q is a positive integer with initial value 1, 1 ≤ q ≤ Q, and Q is the number of bands of I*_gg.
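Steps 4_1 through 4_3 can be sketched as follows. The average-gradient definition (mean magnitude of the per-pixel gradient) is a common convention assumed by this example; the patent does not spell out the formula:

```python
import numpy as np

def average_gradient(img):
    """Average gradient of an image: mean magnitude of local gradients."""
    gy, gx = np.gradient(img.astype(float))
    return np.sqrt((gx ** 2 + gy ** 2) / 2.0).mean()

def injection_weights(spectral):
    """Per-band injection weights for a (bands, rows, cols) spectral cube:
    each band's average gradient divided by that of the mean component."""
    mean_component = spectral.mean(axis=0)       # step 4_1
    ag_mean = average_gradient(mean_component)   # step 4_2
    return np.array([average_gradient(b) / ag_mean for b in spectral])  # step 4_3
```

Bands with more spatial detail than the mean component thus receive weights above 1 and absorb more of the injected high-frequency information.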
The specific process of step 6 is as follows:

Step 6_1: blur I_rh, and denote the blurred fused image I'_rh; the blur function is the modulation transfer function (MTF) of the image sensor.

Step 6_2: downsample I'_rh so that the resulting fused image matches the size of I*_gg; denote the downsampled fused image I''_rh.

Step 6_3: subtract I''_rh from I*_gg to obtain the residual component.

Step 6_4: resample the residual component so that its size matches that of I_rh.

Step 6_5: add the resampled residual component to I_rh to obtain the optimally sharpened fused image.
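The spectral compensation of step 6 can be sketched as follows, with a Gaussian blur standing in for the sensor MTF and bilinear interpolation for the residual resampling (both are assumptions of this example; the patent prescribes the sensor's actual MTF and leaves the resampling kernel open):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def spectral_compensation(fused, spectral_ref, sigma=1.5):
    """Spectral compensation of step 6.

    fused        : (bands, H, W) preliminary fused image I_rh.
    spectral_ref : (bands, h, w) original spectral image I*_gg, with H = scale*h.
    sigma        : width of the Gaussian stand-in for the sensor MTF (assumption).
    """
    scale = fused.shape[1] // spectral_ref.shape[1]
    out = np.empty_like(fused)
    for q in range(fused.shape[0]):
        blurred = gaussian_filter(fused[q], sigma)     # step 6_1: MTF-like blur
        degraded = blurred[::scale, ::scale]           # step 6_2: downsample
        residual = spectral_ref[q] - degraded          # step 6_3: residual component
        residual_up = zoom(residual, scale, order=1)   # step 6_4: resample to I_rh size
        out[q] = fused[q] + residual_up                # step 6_5: compensate
    return out
```

Because the residual is measured at the native scale of I*_gg, the correction restores the original spectral statistics while leaving the injected high-frequency detail largely intact.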
Compared with the prior art, the invention has the following advantages:

1) The method fully considers the influence of noise in remote sensing images and the spectral distortion that heterogeneous fusion often produces, and introduces a spectral-compensation strategy for the fused image; it therefore satisfies the fusion requirements of homogeneous panchromatic/multispectral, panchromatic/hyperspectral and multispectral/hyperspectral imagery as well as of heterogeneous imagery such as SAR/multispectral.

2) The method considers spatial enhancement and spectral compensation jointly: it raises the spatial resolution of the high-spectral-resolution image to obtain a spatially sharpened fused image that meets visual-interpretation requirements, while the optimally sharpened fused image has high spectral fidelity and can well serve quantitative applications.

3) The method is simple and efficient, and meets the fast-processing requirements of practical engineering applications.
Drawings
FIG. 1 is a general flow diagram of the method of the present invention;

FIG. 2a is a panchromatic remote sensing image;

FIG. 2b is the spatially resampled multispectral image of the same scene as FIG. 2a, obtained by the processing of step 3_1 of the method;

FIG. 2c is the spectrally high-fidelity, optimally sharpened fused image obtained by processing FIGS. 2a and 2b with the method;

FIG. 3a is a panchromatic remote sensing image;

FIG. 3b is the spatially resampled hyperspectral image of the same scene as FIG. 3a, obtained by the processing of step 3_1 of the method;

FIG. 3c is the spectrally high-fidelity, optimally sharpened fused image obtained by processing FIGS. 3a and 3b with the method;

FIG. 4a is a SAR remote sensing image;

FIG. 4b is the spatially resampled multispectral image of the same scene as FIG. 4a, obtained by the processing of step 3_1 of the method;

FIG. 4c is the spectrally high-fidelity, optimally sharpened fused image obtained by processing FIGS. 4a and 4b with the method.
Detailed Description
The invention is described in further detail below with reference to the accompanying drawings and embodiments.
The invention provides a high-fidelity generalized spatial-spectral fusion method for homogeneous/heterogeneous remote sensing images that generates an optimally spatially sharpened fused image with high spectral fidelity, and that satisfies the requirements of homogeneous and heterogeneous remote sensing image fusion simultaneously.
The general flow of the method is shown in the block diagram of FIG. 1; it comprises the following steps:
Step 1: select multi-source remote sensing images of the same scene, namely an original high-spatial-resolution image and an original high-spectral-resolution image, denoted I_gk and I_gg; then preprocess I_gk and I_gg, and denote the preprocessed high-spatial-resolution and high-spectral-resolution images I*_gk and I*_gg, respectively.
In this embodiment, in step 1, I*_gk and I*_gg are obtained as follows:
Step 1_1: according to the spatial-resolution ratio of I_gk and I_gg, spatially resample I_gg so that the resampled high-spectral-resolution image matches the size of I_gk; then, taking the resampled image as reference, geometrically register I_gk, and take the registered image as the first high-spatial-resolution image, denoted I'_gk; take I_gg itself as the first high-spectral-resolution image, denoted I'_gg.
Step 1_2: taking the luminance component of the resampled high-spectral-resolution image as reference, radiometrically register I'_gk to attenuate the radiometric difference between I'_gk and I'_gg; take the result as the second high-spatial-resolution image, denoted I*_gk, and take I'_gg as the second high-spectral-resolution image, denoted I*_gg.
In this embodiment, in step 1_1, geometric registration is performed with professional remote sensing image processing software (ENVI/ERDAS).
In this embodiment, in step 1_2, considering that the acquisition times of the high-spatial-resolution and high-spectral-resolution images may differ — which is especially common in heterogeneous fusion — the moment-matching technique is adopted to radiometrically register I'_gk:

I*_gk = (I'_gk − P_gk,mean) × (P_lum,std / P_gk,std) + P_lum,mean

where P_gk,std and P_gk,mean are the standard deviation and mean of the pixel values of all pixels of I'_gk, and P_lum,std and P_lum,mean are the standard deviation and mean of the pixel values of all pixels of the luminance component of the resampled high-spectral-resolution image.
In this embodiment, the luminance component is obtained by averaging all bands of the resampled high-spectral-resolution image.
Step 2: judge whether I*_gk contains noise. If it does, apply a fast denoising algorithm to I*_gk and take the result as the denoised high-spatial-resolution image; if not, take I*_gk itself as the denoised high-spatial-resolution image.
In this embodiment, in step 2, the fast denoising of I*_gk proceeds as follows: for homogeneous remote sensing image fusion, e.g. when I_gk is a panchromatic image and I_gg is a multispectral or hyperspectral image, Wiener filtering is adopted as the denoising algorithm; for heterogeneous remote sensing image fusion, e.g. when I_gk is a SAR image and I_gg is a multispectral or hyperspectral image, Lee filtering is adopted as the denoising algorithm.
Step 3_1: when the spectral ranges of I_gk and I_gg are known, select, according to those ranges, the bands of I_gg whose spectra are covered by the spectral range of I_gk; then spatially resample I*_gg to the size of the denoised high-spatial-resolution image, and linearly combine the selected bands of the resampled high-spectral-resolution image to obtain its luminance component.

Alternatively, when the spectral ranges of I_gk and I_gg are unknown, spatially resample I*_gg to the size of the denoised high-spatial-resolution image, and linearly combine all bands of the resampled high-spectral-resolution image to obtain its luminance component.

In both cases, the combination coefficients used for the linear combination are obtained as follows. In response to the demand for fast image processing, spatially downsample the denoised high-spatial-resolution image to the size of I*_gg; when the spectral ranges of I_gk and I_gg are known, apply the least-squares method to the downsampled image and the selected bands of I*_gg and solve for the coefficients; when they are unknown, apply the least-squares method to the downsampled image and all bands of I*_gg and solve for the coefficients.
Step 3_2: taking the luminance component as reference, perform moment matching on the denoised high-spatial-resolution image to attenuate the radiometric difference between it and the luminance component; take the result as the moment-matched high-spatial-resolution image.
Step 3_3: subtract the luminance component from the moment-matched high-spatial-resolution image to obtain the basic high-frequency component; then spatially enhance it with a Laplacian sharpening filter to obtain the adjustable high-frequency component; finally, combine the basic and adjustable high-frequency components by weighting to obtain the high-frequency information. The weight of the adjustable high-frequency component lies in the range [0, 5] and is set manually as required in actual processing.
Step 4: using the gradient of each band of I*_gg, calculate the injection weight of the high-frequency information required for each band of I*_gg.
In this embodiment, the specific process of step 4 is:
Step 4_1: average all bands of I*gg to obtain a mean component.
Step 4_ 2: calculation of I* ggAnd the average gradient of the mean component is calculated.
Step 4_ 3: will I* ggIs divided by the average gradient of the mean component to obtain I* ggRequired for the corresponding bandThe injection weight of the high frequency information of (2); for I* ggQ wave band of (1), will I* ggIs divided by the average gradient of the mean component to obtain the result as I* ggRequired for the q-th bandThe injection weight of the high frequency information of (1); wherein Q is a positive integer, the initial value of Q is 1, Q is more than or equal to 1 and less than or equal to Q, and Q represents I* ggThe number of bands of (c).
Step 5: multiply the high-frequency information by the injection weight required by each band of I*gg; then inject the result into the spatially resampled high-spectral-resolution remote sensing image to obtain a preliminary spatially sharpened fusion image with both high spatial resolution and high spectral resolution, denoted Irh.
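The injection of step 5 is then a per-band weighted addition. In this sketch (names hypothetical), `ms_up` is the spatially resampled high-spectral-resolution image, `hf` the single-band high-frequency information from step 3_3, and `weights` the output of step 4:

```python
import numpy as np

def inject(ms_up, hf, weights):
    """I_rh: add the per-band weighted high-frequency information."""
    return ms_up + hf[:, :, None] * weights[None, None, :]
```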
Step 6: by means of I* ggTo IrhAnd carrying out spectrum compensation and correction to obtain a spectrum high-fidelity optimal space height sharpening fusion image.
In this embodiment, the specific process of step 6 is:
Step 6_1: blur Irh and denote the blurred fused image I'rh; the blur kernel used for the blurring is the modulation transfer function (MTF) of the image sensor.
Step 6_ 2: to l'rhPerforming down-sampling to obtain a fused image with size and I* ggThe sizes of the images are consistent, and the fused image obtained after the down-sampling treatment is recorded as I "rh。
Step 6_ 3: will I "rhAnd I* ggThe subtraction yields the residual component.
Step 6_ 4: carrying out resampling treatment on the residual error component to ensure that the size and I of the residual error component obtained after resampling treatmentrhAre consistent in size.
Step 6_ 5: adding residual components obtained after resampling into IrhThe spectral fidelity of the fused image is further improved, and the optimal spatial height sharpened fused image is obtained.
To verify the feasibility and effectiveness of the method of the invention, experiments were carried out.
Fig. 2a shows a panchromatic remote sensing image; fig. 2b shows the multispectral remote sensing image of the same scene as fig. 2a after the spatial resampling of step 3_1 of the method of the invention; and fig. 2c shows the spectrally high-fidelity, optimal spatially sharpened fusion image obtained by processing fig. 2a and fig. 2b with the method of the invention. Fig. 3a shows a panchromatic remote sensing image; fig. 3b shows the hyperspectral remote sensing image of the same scene as fig. 3a after the spatial resampling of step 3_1; and fig. 3c shows the corresponding fusion image obtained by the method of the invention. Fig. 4a shows a SAR remote sensing image; fig. 4b shows the multispectral remote sensing image of the same scene as fig. 4a after the spatial resampling of step 3_1; and fig. 4c shows the corresponding fusion image. As can be seen from figs. 2c, 3c and 4c, for panchromatic/multispectral, panchromatic/hyperspectral and SAR/multispectral remote sensing image fusion alike, the fusion result has sharper spatial structure information, its spectral colors are highly consistent with those of the original high-spectral-resolution remote sensing image, and the spectral preservation performance is good.
Claims (6)
1. A same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method is characterized by comprising the following steps:
Step 1: select multi-source remote sensing images of the same scene, namely an original high-spatial-resolution remote sensing image and an original high-spectral-resolution remote sensing image, correspondingly denoted Igk and Igg; then preprocess Igk and Igg, and denote the preprocessed high-spatial-resolution and high-spectral-resolution remote sensing images I*gk and I*gg, correspondingly;
Step 2: judgment of I* gkWhether or not to include noise thereinIf the noise is contained, adopting a denoising algorithm to the I* gkCarrying out rapid denoising treatment, and recording the high-spatial-resolution remote sensing image obtained after denoising treatment asIf no noise is contained, directly adding I* gkIs newly recorded as
In the step 2, a denoising algorithm is adopted for I* gkThe process of carrying out the rapid denoising treatment comprises the following steps: for homogeneous remote sensing image fusion, wiener filtering technology is adopted as a denoising algorithm pair I* gkCarrying out rapid denoising treatment; for heterogeneous remote sensing image fusion, Lee filtering technology is adopted as a denoising algorithm pair I* gkCarrying out rapid denoising treatment;
Step 3_1: when the spectral ranges of Igk and Igg are known, select, according to the spectral ranges of Igk and Igg, the bands of I*gg that are covered by the spectral range of Igk; then spatially resample I*gg according to the size of the denoised high-spatial-resolution remote sensing image and record the high-spectral-resolution remote sensing image obtained after the spatial resampling; then linearly combine the selected bands of the resampled image to obtain its luminance component;
or, when the spectral ranges of Igk and Igg are unknown, spatially resample I*gg according to the size of the denoised high-spatial-resolution remote sensing image, record the high-spectral-resolution remote sensing image obtained after the spatial resampling, and linearly combine all bands of the resampled image to obtain its luminance component;
the combination coefficients used for the linear combination above are obtained as follows: spatially down-sample the denoised high-spatial-resolution remote sensing image and record the high-spatial-resolution remote sensing image obtained after the down-sampling; when the spectral ranges of Igk and Igg are known, select, according to the spectral ranges of Igk and Igg, the bands of I*gg covered by the spectral range of Igk, apply the least-squares method to the down-sampled high-spatial-resolution image and the selected bands of I*gg, and solve for the combination coefficients; or, when the spectral ranges of Igk and Igg are unknown, apply the least-squares method to the down-sampled high-spatial-resolution image and all bands of I*gg, and solve for the combination coefficients;
Step 3_2: taking the luminance component obtained in step 3_1 as the reference, perform moment matching on the denoised high-spatial-resolution remote sensing image to attenuate the radiation difference between that luminance component and the high-spatial-resolution image, and record the high-spatial-resolution remote sensing image obtained after the moment matching;
Step 3_3: subtract the luminance component from the moment-matched high-spatial-resolution remote sensing image and take the resulting difference image as the base high-frequency component; then spatially enhance the moment-matched image with a Laplacian sharpening filter to obtain the adjustable high-frequency component; then combine the base high-frequency component and the adjustable high-frequency component by weighting to obtain the high-frequency information, wherein the weight of the adjustable high-frequency component takes values in the range [0, 5];
Step 4: using the gradient of each band of I*gg, calculate the injection weight of the high-frequency information required by each band of I*gg;
Step 5: multiply the high-frequency information by the injection weight required by each band of I*gg; then inject the result into the spatially resampled high-spectral-resolution remote sensing image to obtain a preliminary spatially sharpened fusion image with both high spatial resolution and high spectral resolution, denoted Irh;
Step 6: using I*gg, apply spectral compensation and correction to Irh to obtain the spectrally high-fidelity, optimal spatially sharpened fusion image;
the specific process of the step 6 is as follows:
Step 6_1: blur Irh and denote the blurred fused image I'rh, wherein the blur kernel used for the blurring is the modulation transfer function (MTF) of the image sensor;
Step 6_2: down-sample I'rh so that the resulting fused image has the same size as I*gg, and denote the fused image obtained after the down-sampling I''rh;
Step 6_3: take the difference between I''rh and I*gg to obtain the residual component;
Step 6_4: resample the residual component so that its size after resampling is consistent with that of Irh;
Step 6_5: add the resampled residual component to Irh to obtain the optimal spatially sharpened fusion image.
2. The same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method according to claim 1, characterized in that in step 1 the acquisition process of I*gk and I*gg is:
Step 1_1: according to the spatial resolution ratio between Igk and Igg, spatially resample Igg and record the high-spectral-resolution remote sensing image obtained after the spatial resampling, whose size is consistent with that of Igk; then, taking the resampled image as the reference, geometrically register Igk and take the high-spatial-resolution remote sensing image obtained after the geometric registration as the first high-spatial-resolution remote sensing image, denoted I'gk; and take the resampled Igg as the first high-spectral-resolution remote sensing image, denoted I'gg;
Step 1_2: taking I'gg as the reference, radiometrically register I'gk to attenuate the radiation difference between I'gk and I'gg; take the high-spatial-resolution remote sensing image obtained after the radiometric registration as the second high-spatial-resolution remote sensing image, denoted I*gk; and take I'gg as the second high-spectral-resolution remote sensing image, denoted I*gg.
3. The same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method according to claim 2, characterized in that in step 1_1 the professional remote sensing image processing software ENVI/ERDAS is used for the geometric registration.
4. The same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method according to claim 2 or 3, characterized in that in step 1_2 the moment matching technique is used to radiometrically register I'gk: I*gk = (I'gk − Pgk,mean) × (σref / Pgk,std) + μref, where Pgk,std denotes the standard deviation of the pixel values of all pixels in I'gk, Pgk,mean denotes the mean of the pixel values of all pixels in I'gk, and σref and μref denote, respectively, the standard deviation and the mean of the pixel values of all pixels in the luminance component of the reference image.
6. The same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method according to claim 1, characterized in that the specific process of step 4 is:
Step 4_1: average all bands of I*gg to obtain a mean component;
Step 4_2: calculate the average gradient of each band of I*gg and the average gradient of the mean component;
Step 4_3: divide the average gradient of each band of I*gg by the average gradient of the mean component to obtain the injection weight of the high-frequency information required by the corresponding band; for the q-th band of I*gg, the quotient of the average gradient of the q-th band of I*gg and the average gradient of the mean component is the injection weight of the high-frequency information required by the q-th band, where q is a positive integer with an initial value of 1, 1 ≤ q ≤ Q, and Q denotes the number of bands of I*gg.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910619429.1A CN110533600B (en) | 2019-07-10 | 2019-07-10 | Same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110533600A CN110533600A (en) | 2019-12-03 |
CN110533600B true CN110533600B (en) | 2022-07-19 |
Family
ID=68659619
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910619429.1A Active CN110533600B (en) | 2019-07-10 | 2019-07-10 | Same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110533600B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111145132B (en) * | 2019-12-05 | 2022-11-18 | 东南大学 | Spectrum-preserving type rapid remote sensing image fusion method |
CN111091113A (en) * | 2019-12-30 | 2020-05-01 | 贵阳欧比特宇航科技有限公司 | Hyperspectral image data fusion method |
CN113436123B (en) * | 2021-06-22 | 2022-02-01 | 宁波大学 | High-resolution SAR and low-resolution multispectral image fusion method based on cloud removal-resolution improvement cooperation |
CN117253125B (en) * | 2023-10-07 | 2024-03-22 | 珠江水利委员会珠江水利科学研究院 | Space-spectrum mutual injection image fusion method, system and readable storage medium |
CN117274763B (en) * | 2023-11-21 | 2024-04-05 | 珠江水利委员会珠江水利科学研究院 | Remote sensing image space-spectrum fusion method, system, equipment and medium based on balance point analysis |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102446351A (en) * | 2010-10-15 | 2012-05-09 | 江南大学 | Multispectral and high-resolution full-color image fusion method study |
CN103236047A (en) * | 2013-03-28 | 2013-08-07 | 北京航空航天大学 | Method for fusing full-color and multi-spectral images on basis of fitting for substituted components |
CN103942769A (en) * | 2013-12-10 | 2014-07-23 | 珠江水利委员会珠江水利科学研究院 | Satellite remote sensing image fusion method |
CN112166692B (en) * | 2012-06-29 | 2014-07-30 | 二十一世纪空间技术应用股份有限公司 | Remote sensing image cross fusion method |
CN104851091A (en) * | 2015-04-28 | 2015-08-19 | 中山大学 | Remote sensing image fusion method based on convolution enhancement and HCS transform |
CN106651800A (en) * | 2016-12-23 | 2017-05-10 | 中国科学院遥感与数字地球研究所 | PAN modulation and multiple linear regression-based MS and PAN image fusion method |
CN108765359A (en) * | 2018-05-31 | 2018-11-06 | 安徽大学 | Fusion method of hyperspectral remote sensing image and full-color image based on JSK model and NSCT technology |
CN109166089A (en) * | 2018-07-24 | 2019-01-08 | 重庆三峡学院 | The method that a kind of pair of multispectral image and full-colour image are merged |
CN109509160A (en) * | 2018-11-28 | 2019-03-22 | 长沙理工大学 | Hierarchical remote sensing image fusion method utilizing layer-by-layer iteration super-resolution |
CN109886870A (en) * | 2018-12-29 | 2019-06-14 | 西北大学 | Remote sensing image fusion method based on binary channels neural network |
Non-Patent Citations (3)
Title |
---|
Multi spectral image fusion by deep convolutional neural network;Sajjad Eghbalian等;《International Journal of Remote Sensing》;20181230;第3983-4002页 * |
多源时—空—谱光学遥感影像的变分融合方法;孟祥超;《中国优秀博硕士学位论文全文数据库(博士)基础科学辑》;20180615;第2-5章 * |
顾及相机抖动与运动目标的多重曝光影像融合;朱吉;《中国优秀博硕士学位论文全文数据库(硕士)信息科技辑》;20190615;第4章 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110533600B (en) | Same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method | |
CN107563964B (en) | Rapid splicing method for large-area-array sub-meter-level night scene remote sensing images | |
Klonus et al. | Image fusion using the Ehlers spectral characteristics preservation algorithm | |
Huang et al. | Cloud removal from optical satellite imagery with SAR imagery using sparse representation | |
Aly et al. | A regularized model-based optimization framework for pan-sharpening | |
US8761506B1 (en) | Pan sharpening digital imagery | |
CN111986134B (en) | Remote sensing imaging method and device for area-array camera | |
CN108090872B (en) | Single-frame multispectral image super-resolution reconstruction method and system based on gradient extraction | |
Song et al. | An adaptive pansharpening method by using weighted least squares filter | |
CN111340895B (en) | Image color uniformizing method based on pyramid multi-scale fusion | |
Tao et al. | Hyperspectral image recovery based on fusion of coded aperture snapshot spectral imaging and RGB images by guided filtering | |
Seo et al. | UPSNet: Unsupervised pan-sharpening network with registration learning between panchromatic and multi-spectral images | |
KR20190060481A (en) | Method for satellite image processing and recording medium Therefor | |
Duran et al. | Restoration of pansharpened images by conditional filtering in the PCA domain | |
KR20210096925A (en) | Flexible Color Correction Method for Massive Aerial Orthoimages | |
CN115100075A (en) | Hyperspectral panchromatic sharpening method based on spectral constraint and residual error attention network | |
CN116109535A (en) | Image fusion method, device and computer readable storage medium | |
Zhong et al. | Attention_FPNet: Two-branch remote sensing image pansharpening network based on attention feature fusion | |
He et al. | Pansharpening using total variation regularization | |
Kundu et al. | Enhanced IHS Pan-sharpening using K-means segmentation guided adaptive intensity histogram matching and CLAHE enhancement | |
KR102397148B1 (en) | Color Correction Method Using Low Resolution Color Image Of Large-capacity Aerial Orthoimage | |
CN111524079B (en) | Multispectral remote sensing image full-color sharpening method based on component replacement and low-pass filtering | |
Duran et al. | Pansharpening by a nonlocal channel-decoupled variational method | |
CN113888421A (en) | Fusion method of multispectral satellite remote sensing image | |
Alsmadi et al. | Pansharpening via deep guided filtering network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||