CN110533600B - Same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method - Google Patents


Info

Publication number
CN110533600B
CN110533600B (application CN201910619429.1A)
Authority
CN
China
Prior art keywords
remote sensing
sensing image
spatial
spectral
fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910619429.1A
Other languages
Chinese (zh)
Other versions
CN110533600A (en)
Inventor
孟祥超
邵枫
杨刚
孙伟伟
符冉迪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo University
Original Assignee
Ningbo University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo University
Priority to CN201910619429.1A
Publication of CN110533600A
Application granted
Publication of CN110533600B
Legal status: Active

Classifications

    • G01N 21/00 — Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17 — Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N 2021/1793 — Remote sensing
    • G01S 13/86 — Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/89 — Radar or analogous systems specially adapted for mapping or imaging
    • G01S 13/90 — Mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR]
    • G01S 13/9021 — SAR image post-processing techniques
    • G01S 13/9027 — Pattern recognition for feature extraction
    • G06T 5/70 — Image enhancement or restoration: denoising; smoothing
    • G06T 5/73 — Image enhancement or restoration: deblurring; sharpening
    • G06T 2207/10032 — Satellite or aerial image; remote sensing
    • G06T 2207/30181 — Earth observation

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Artificial Intelligence (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a high-fidelity generalized space-spectrum fusion method for same/heterogeneous remote sensing images. The method simultaneously supports high-fidelity fusion of homogeneous image pairs (panchromatic/multispectral, panchromatic/hyperspectral) and of heterogeneous pairs (high-spatial-resolution SAR with low-spatial-resolution multispectral), generating a remote sensing image with both high spatial resolution and high spectral resolution. Considering the fast-processing demands of practical engineering applications, it extracts the high-frequency information of the high-spatial-resolution image within a simple and fast component-substitution fusion framework. The method is more robust to spatial mis-registration between heterogeneous images, and takes the influences of spatial enhancement, spectral compensation and noise into account simultaneously, so it can satisfy space-spectrum high-fidelity fusion of homogeneous and heterogeneous images alike.

Description

Same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method
Technical Field
The invention relates to a remote sensing image processing technology, in particular to a same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method.
Background
Remote sensing images are an important carrier of earth-surface information; images with both high spatial resolution and high spectral resolution play an important role in fields such as information interpretation and ground-object classification and identification. However, owing to limitations of satellite sensors and other factors, an acquired remote sensing image cannot offer high spatial resolution and high spectral resolution at the same time. Remote sensing satellites at home and abroad such as IKONOS, QuickBird, Gaofen-1 and Gaofen-2 provide a panchromatic image and a multispectral image simultaneously: the panchromatic image has high spatial resolution but only one band, whereas the multispectral image has relatively good spectral resolution but tends to have low spatial resolution. Space-spectrum fusion technology can integrate the complementary advantages of high spatial resolution and high spectral resolution among multi-source remote sensing images to generate remote sensing images that possess both simultaneously.
At present a large number of space-spectrum fusion methods exist, such as panchromatic/multispectral, panchromatic/hyperspectral and multispectral/hyperspectral image fusion; representative approaches include component-substitution fusion methods, multiresolution-analysis fusion methods, variational-optimization fusion methods and deep-learning-based fusion methods. These generally refer to space-spectrum fusion in the narrow sense, i.e. space-spectrum fusion between homogeneous remote sensing images. Moreover, although pixel-level SAR/multispectral image fusion belongs to heterogeneous remote sensing image fusion, when it aims to obtain a high-spatial-resolution multispectral image by exploiting the complementary spatial and spectral information of the SAR and multispectral images, it also falls within the category of generalized space-spectrum fusion. Current SAR/multispectral fusion methods mostly adopt, or borrow from, panchromatic/multispectral fusion frameworks; however, because the difference between SAR and multispectral images is far larger than that between homogeneous images, these methods face greater challenges in the spatial enhancement and spectral fidelity of the fused image.
In general, on the one hand, although some homogeneous space-spectrum fusion methods such as component-substitution and multiresolution-analysis methods are integrated in professional remote sensing software platforms, most existing methods struggle to achieve both fusion accuracy and efficiency; on the other hand, although most existing panchromatic/multispectral homogeneous fusion and SAR/multispectral heterogeneous fusion methods adopt similar or consistent frameworks, they can hardly handle homogeneous and heterogeneous space-spectrum fusion cooperatively. Therefore, developing a same/heterogeneous collaborative fusion method for multi-source remote sensing images that satisfies high-fidelity fusion of homogeneous images such as panchromatic/multispectral and panchromatic/hyperspectral as well as of SAR/multispectral heterogeneous images undoubtedly has important significance and practical application value.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method which, by fusing the complementary spatial and spectral information of multi-source homogeneous or heterogeneous remote sensing images, generates a remote sensing image with both high spatial resolution and high spectral resolution, and which can satisfy high-fidelity space-spectrum fusion of homogeneous images (panchromatic/multispectral, panchromatic/hyperspectral, multispectral/hyperspectral) as well as of SAR/multispectral heterogeneous images.
The technical scheme adopted by the invention to solve the above problem is a same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method, characterized by comprising the following steps:
step 1: selecting multi-source remote sensing images of the same scene, namely an original high-spatial-resolution remote sensing image and an original high-spectral-resolution remote sensing image, denoted I_gk and I_gg respectively; then preprocessing I_gk and I_gg, and denoting the preprocessed high-spatial-resolution and high-spectral-resolution remote sensing images I*_gk and I*_gg correspondingly;
step 2: judging whether I*_gk contains noise; if it does, applying a denoising algorithm to perform fast denoising on I*_gk, and denoting the denoised high-spatial-resolution remote sensing image Î_gk; if it contains no noise, directly re-denoting I*_gk as Î_gk;
step 3: extracting the high-frequency information of Î_gk, the specific process being as follows:
step 3_1: when the spectral ranges of I_gk and I_gg are both known, selecting, according to those two spectral ranges, the bands of I*_gg that are covered by the spectral range of I_gk; then spatially resampling I*_gg to the size of Î_gk, and denoting the resampled high-spectral-resolution remote sensing image Ĩ_gg; then linearly combining the selected bands of Ĩ_gg, using combination coefficients obtained from I*_gg, to obtain the luminance component of Ĩ_gg;
or, when the spectral ranges of I_gk and I_gg are unknown, spatially resampling I*_gg to the size of Î_gk, denoting the resampled high-spectral-resolution remote sensing image Ĩ_gg; then linearly combining all bands of Ĩ_gg, using combination coefficients obtained from I*_gg, to obtain the luminance component of Ĩ_gg;
in both cases above, the combination coefficients used for the linear combination of the bands of Ĩ_gg are obtained as follows: spatially downsample Î_gk, and denote the downsampled high-spatial-resolution remote sensing image I↓_gk; when the spectral ranges of I_gk and I_gg are known, select the bands of I*_gg covered by the spectral range of I_gk and solve for the combination coefficients by applying the least-squares method to I↓_gk and the selected bands of I*_gg; or, when the spectral ranges of I_gk and I_gg are unknown, solve for the combination coefficients by applying the least-squares method to I↓_gk and all bands of I*_gg;
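By way of illustration only (not part of the patent text), the least-squares solution of the combination coefficients can be sketched in Python; the function names and array layout (bands stacked on the first axis) are assumptions:

```python
import numpy as np

def combination_coefficients(pan_down, ms_bands):
    """Solve least-squares weights so that a linear combination of the
    low-resolution bands approximates the downsampled high-spatial image.
    pan_down: (H, W) array; ms_bands: (Q, H, W) array of Q bands."""
    A = ms_bands.reshape(ms_bands.shape[0], -1).T   # (H*W, Q) design matrix
    b = pan_down.ravel()                            # (H*W,) target vector
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs

def luminance(ms_up, coeffs):
    """Apply the solved coefficients to the (up)sampled bands (Q, H', W')."""
    return np.tensordot(coeffs, ms_up, axes=1)
```

The coefficients are solved at the low resolution (for speed, as the patent notes) and then applied to the resampled bands Ĩ_gg to form the luminance component.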
step 3_2: taking the luminance component of Ĩ_gg as reference, performing moment matching on Î_gk to attenuate the radiation difference between the luminance component of Ĩ_gg and Î_gk, and denoting the moment-matched high-spatial-resolution remote sensing image Ī_gk;
step 3_3: subtracting the luminance component of Ĩ_gg from Ī_gk, and taking the difference image as the basic high-frequency component of Î_gk; then spatially enhancing the basic high-frequency component with a Laplacian sharpening filter to obtain the adjustable high-frequency component of Î_gk; then combining the basic high-frequency component and the adjustable high-frequency component of Î_gk by weighting to obtain the high-frequency information of Î_gk; wherein the weight of the adjustable high-frequency component takes values in [0, 5];
step 4: using the gradient of each band of I*_gg, calculating the injection weight of the high-frequency information of Î_gk required by each band of I*_gg;
step 5: multiplying the injection weight required by each band of I*_gg by the high-frequency information of Î_gk; then injecting the result into Ĩ_gg to obtain a preliminary spatially highly sharpened fused image with both high spatial resolution and high spectral resolution, denoted I_rh;
step 6: using I*_gg to apply spectral compensation and correction to I_rh, obtaining a spectrally high-fidelity, optimal spatially highly sharpened fused image.
In step 1, the acquisition process of I*_gk and I*_gg is:
step 1_1: according to the spatial-resolution ratio of I_gk and I_gg, spatially resampling I_gg, and denoting the resampled high-spectral-resolution remote sensing image I↑_gg, whose size is consistent with that of I_gk; then, with I↑_gg as reference, geometrically registering I_gk, and taking the geometrically registered high-spatial-resolution remote sensing image as the first high-spatial-resolution remote sensing image, denoted I'_gk; and taking I_gg as the first high-spectral-resolution remote sensing image, denoted I'_gg;
step 1_2: with I↑_gg as reference, radiometrically registering I'_gk to attenuate the radiation difference between I'_gk and I'_gg; taking the radiometrically registered high-spatial-resolution remote sensing image as the second high-spatial-resolution remote sensing image, denoted I*_gk; and taking I'_gg as the second high-spectral-resolution remote sensing image, denoted I*_gg.
In step 1_1, the geometric registration is performed with professional remote sensing image processing software such as ENVI or ERDAS.
In step 1_2, moment matching is used to radiometrically register I'_gk:

I*_gk = (I'_gk − P_gk,mean) × (P̃_gg,std / P_gk,std) + P̃_gg,mean

where P_gk,std denotes the standard deviation of the pixel values of all pixels in I'_gk, P_gk,mean denotes the mean of the pixel values of all pixels in I'_gk, P̃_gg,std denotes the standard deviation of the pixel values of all pixels in the luminance component of I↑_gg, and P̃_gg,mean denotes the mean of the pixel values of all pixels in that luminance component.
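The moment-matching operation above is a two-line computation; a Python sketch (illustrative only; the function name is an assumption):

```python
import numpy as np

def moment_match(src, ref):
    """Rescale src so its global mean and standard deviation match those of
    ref (here, the luminance component of the resampled high-spectral image)."""
    return (src - src.mean()) * (ref.std() / src.std()) + ref.mean()
```

The same routine serves both the radiometric registration of step 1_2 and the moment matching of step 3_2, since both align first- and second-order statistics.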
The luminance component of I↑_gg is obtained by taking the mean over all bands of I↑_gg.
In step 2, the fast denoising of I*_gk proceeds as follows: for homogeneous remote sensing image fusion, Wiener filtering is adopted as the denoising algorithm; for heterogeneous remote sensing image fusion, Lee filtering is adopted as the denoising algorithm.
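A sketch of the two denoising branches in Python (illustrative only; `scipy.signal.wiener` is a standard Wiener filter, while the Lee filter is a common textbook formulation — the window sizes and noise-variance estimate are assumptions, not values from the patent):

```python
import numpy as np
from scipy.signal import wiener
from scipy.ndimage import uniform_filter

def lee_filter(img, size=7):
    """Classic Lee speckle filter: local-statistics adaptive smoothing,
    with the noise variance estimated as the mean local variance."""
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img * img, size)
    var = np.maximum(sq_mean - mean * mean, 0.0)
    noise_var = var.mean()
    gain = var / (var + noise_var)
    return mean + gain * (img - mean)

def denoise(img, heterogeneous):
    # Wiener filtering for optical (homogeneous) inputs, Lee filtering for SAR
    return lee_filter(img) if heterogeneous else wiener(img, mysize=5)
```

The Lee filter suits SAR's multiplicative speckle because it smooths homogeneous regions (low local variance) while preserving edges (high local variance).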
The specific process of step 4 is:
step 4_1: taking the mean over all bands of I*_gg to obtain a mean component;
step 4_2: calculating the average gradient of each band of I*_gg and the average gradient of the mean component;
step 4_3: dividing the average gradient of each band of I*_gg by the average gradient of the mean component to obtain the injection weight of the high-frequency information of Î_gk required by the corresponding band; for the q-th band of I*_gg, the result of dividing the average gradient of the q-th band by the average gradient of the mean component is the injection weight of the high-frequency information of Î_gk required by the q-th band; wherein q is a positive integer with initial value 1, 1 ≤ q ≤ Q, and Q denotes the number of bands of I*_gg.
The specific process of step 6 is:
step 6_1: blurring I_rh, and denoting the blurred fused image I'_rh; wherein the blur function adopted is the modulation transfer function (MTF) of the image sensor;
step 6_2: downsampling I'_rh so that its size is consistent with that of I*_gg, and denoting the downsampled fused image I''_rh;
step 6_3: subtracting I''_rh and I*_gg to obtain a residual component;
step 6_4: resampling the residual component so that its size is consistent with that of I_rh;
step 6_5: adding the resampled residual component to I_rh to obtain the optimal spatially highly sharpened fused image.
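The spectral-compensation loop of step 6 can be sketched in Python (illustrative only; a Gaussian blur stands in for the sensor MTF, decimation stands in for the downsampling, and the residual is taken as observed-minus-degraded so that adding it back restores spectral consistency — the sigma and interpolation order are assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def spectral_compensation(fused, ms_low, ratio, sigma=1.5):
    """Step 6 sketch: blur each fused band (Gaussian as MTF stand-in),
    downsample to the low-resolution grid, form the residual against the
    observed low-resolution bands, then upsample it and add it back."""
    out = np.empty_like(fused)
    for q in range(fused.shape[0]):
        blurred = gaussian_filter(fused[q], sigma)        # step 6_1
        degraded = blurred[::ratio, ::ratio]              # step 6_2 (decimation)
        residual = ms_low[q] - degraded                   # step 6_3
        out[q] = fused[q] + zoom(residual, ratio, order=1)  # steps 6_4 and 6_5
    return out
```

If the fused image already degrades exactly to the observed bands, the residual is zero and the image is returned unchanged, which is the fixed point the compensation drives toward.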
Compared with the prior art, the invention has the following advantages:
1) The method fully considers the influence of noise in remote sensing images and the spectral distortion that heterogeneous image fusion often produces, and introduces a spectral compensation strategy for the fused image; it therefore satisfies not only the fusion requirements of homogeneous panchromatic/multispectral, panchromatic/hyperspectral and multispectral/hyperspectral remote sensing images, but also those of heterogeneous remote sensing images such as SAR/multispectral.
2) The method considers spatial enhancement and spectral compensation simultaneously; it improves the spatial resolution of the high-spectral-resolution remote sensing image and yields a spatially highly sharpened fused image that meets visual-interpretation requirements, while the optimal spatially highly sharpened fused image it produces has high spectral fidelity and can well satisfy quantitative applications.
3) The method is simple and efficient, and can meet the fast fusion-processing requirements of practical engineering applications.
Drawings
FIG. 1 is a general flow diagram of the process of the present invention;
FIG. 2a is a panchromatic remote sensing image;
FIG. 2b is the spatially resampled multispectral remote sensing image obtained by processing, in step 3_1 of the method, the multispectral remote sensing image of the same scene as FIG. 2a;
FIG. 2c is the spectrally high-fidelity, optimal spatially highly sharpened fused image obtained by processing FIGS. 2a and 2b with the method of the invention;
FIG. 3a is a panchromatic remote sensing image;
FIG. 3b is the spatially resampled hyperspectral remote sensing image obtained by processing, in step 3_1 of the method, the hyperspectral remote sensing image of the same scene as FIG. 3a;
FIG. 3c is the spectrally high-fidelity, optimal spatially highly sharpened fused image obtained by processing FIGS. 3a and 3b with the method of the invention;
FIG. 4a is an SAR remote sensing image;
FIG. 4b is the spatially resampled multispectral remote sensing image obtained by processing, in step 3_1 of the method, the multispectral remote sensing image of the same scene as FIG. 4a;
FIG. 4c is the spectrally high-fidelity, optimal spatially highly sharpened fused image obtained by processing FIGS. 4a and 4b with the method of the invention.
Detailed Description
The invention is described in further detail below with reference to the accompanying examples.
The invention provides a same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method which generates an optimal spatially highly sharpened fused image with high spectral fidelity, and which satisfies the requirements of homogeneous and heterogeneous remote sensing image fusion simultaneously.
The invention provides a same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method, the general flow block diagram of which is shown in figure 1, and the method comprises the following steps:
Step 1: select multi-source remote sensing images of the same scene, namely an original high-spatial-resolution remote sensing image and an original high-spectral-resolution remote sensing image, denoted I_gk and I_gg; then preprocess I_gk and I_gg, and denote the preprocessed high-spatial-resolution and high-spectral-resolution remote sensing images I*_gk and I*_gg correspondingly.
In this embodiment, the acquisition process of I*_gk and I*_gg in step 1 is:
Step 1_1: according to the spatial-resolution ratio of I_gk and I_gg, spatially resample I_gg, and denote the resampled high-spectral-resolution remote sensing image I↑_gg; the size of I↑_gg is consistent with that of I_gk. Then, with I↑_gg as reference, geometrically register I_gk, and take the result as the first high-spatial-resolution remote sensing image, denoted I'_gk; and take I_gg as the first high-spectral-resolution remote sensing image, denoted I'_gg.
Step 1_2: with I↑_gg as reference, radiometrically register I'_gk to attenuate the radiation difference between I'_gk and I'_gg; take the result as the second high-spatial-resolution remote sensing image, denoted I*_gk; and take I'_gg as the second high-spectral-resolution remote sensing image, denoted I*_gg.
In this embodiment, the geometric registration in step 1_1 uses professional remote sensing image processing software such as ENVI or ERDAS.
In this embodiment, in step 1_2, considering the possible difference between the acquisition times of the high-spatial-resolution and high-spectral-resolution remote sensing images, which is especially common when heterogeneous remote sensing images are fused, moment matching is adopted to radiometrically register I'_gk:

I*_gk = (I'_gk − P_gk,mean) × (P̃_gg,std / P_gk,std) + P̃_gg,mean

where P_gk,std and P_gk,mean denote the standard deviation and mean of the pixel values of all pixels in I'_gk, and P̃_gg,std and P̃_gg,mean denote the standard deviation and mean of the pixel values of all pixels in the luminance component of I↑_gg.
In this embodiment, the luminance component of I↑_gg is obtained by taking the mean over all bands of I↑_gg.
Step 2: judge whether I*_gk contains noise; if it does, apply a denoising algorithm to perform fast denoising on I*_gk, and denote the denoised high-spatial-resolution remote sensing image Î_gk; if it contains no noise, directly re-denote I*_gk as Î_gk.
In this embodiment, the fast denoising of I*_gk in step 2 proceeds as follows: for homogeneous remote sensing image fusion, e.g. when I_gk is a panchromatic image and I_gg is a multispectral or hyperspectral image, Wiener filtering is adopted as the denoising algorithm; for heterogeneous remote sensing image fusion, e.g. when I_gk is an SAR image and I_gg is a multispectral or hyperspectral image, Lee filtering is adopted as the denoising algorithm.
And step 3: extraction of
Figure BDA0002124701380000086
The specific process of the high-frequency information is as follows:
step 3_ 1: in IgkSpectral range and IggSpectrum ofWith known range, according to IgkSpectral range and IggIs selected from the spectral range ofgkCoverage of the spectral range ofggIs selected bygkCovering the spectral range of
Figure BDA0002124701380000087
The band of (2); then according to
Figure BDA0002124701380000088
Size of (1), pair I* ggCarrying out spatial resampling, and recording the high-spectral-resolution remote sensing image obtained after spatial resampling as
Figure BDA0002124701380000089
Then according to I* ggAnd by selecting
Figure BDA00021247013800000810
Is linearly combined to obtain
Figure BDA00021247013800000811
The luminance component of (a).
Or in IgkSpectral range and IggIn the case of unknown spectral range, according to
Figure BDA00021247013800000812
Size of (1), pair I* ggCarrying out spatial resampling, and recording the high-spectral-resolution remote sensing image obtained after spatial resampling as
Figure BDA00021247013800000813
Then according to I* ggAnd pass through
Figure BDA00021247013800000814
Is linearly combined to obtain
Figure BDA00021247013800000815
Luminance component of。
In the above, the combination coefficients used for the linear combination of the bands of Ĩ_gg are obtained as follows: to meet the demand for fast image processing, Ĩ_gk is spatially down-sampled, and the high-spatial-resolution remote sensing image obtained after spatial down-sampling is denoted Ĩ_gk↓. In the case where the spectral range of I_gk and the spectral range of I_gg are known, the bands of I*_gg covered by the spectral range of I_gk are selected according to the two spectral ranges, and the least-squares method is applied to Ĩ_gk↓ and the selected bands of I*_gg to solve for the combination coefficients; or, in the case where the spectral range of I_gk and the spectral range of I_gg are unknown, the least-squares method is applied to Ĩ_gk↓ and all bands of I*_gg to solve for the combination coefficients.
Step 3_ 2: will be provided with
Figure BDA0002124701380000094
As a reference, to
Figure BDA0002124701380000095
Performing moment matching to attenuate
Figure BDA0002124701380000096
Luminance component of and
Figure BDA0002124701380000097
the radiation difference between them, the high spatial resolution remote sensing image obtained after the moment matching is recorded as
Figure BDA0002124701380000098
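Moment matching reduces to rescaling one image so that its mean and standard deviation equal those of the reference; a minimal sketch, with `moment_match` as a hypothetical name:

```python
def moment_match(src, ref):
    """Rescale src so its mean/std match ref's (flat pixel lists).

    Standard moment matching: out = (src - mean_src) * (std_ref / std_src)
    + mean_ref. A sketch; the patent's exact expression may differ in detail.
    """
    def stats(v):
        m = sum(v) / len(v)
        s = (sum((x - m) ** 2 for x in v) / len(v)) ** 0.5
        return m, s
    m_src, s_src = stats(src)
    m_ref, s_ref = stats(ref)
    scale = s_ref / s_src if s_src > 0 else 1.0
    return [(x - m_src) * scale + m_ref for x in src]
```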
Step 3_ 3: will be provided with
Figure BDA0002124701380000099
And
Figure BDA00021247013800000910
subtracting the luminance components of the image data to obtain a difference image
Figure BDA00021247013800000911
A base high-frequency component of (a); then passes through a laplacian sharpening filter pair
Figure BDA00021247013800000912
Is spatially enhanced to obtain
Figure BDA00021247013800000913
The adjustable high frequency component of (a); then will
Figure BDA00021247013800000914
Of the fundamental high-frequency component of
Figure BDA00021247013800000915
Is obtained by weighted combination of the adjustable high-frequency components
Figure BDA00021247013800000916
High frequency information of (2); wherein,
Figure BDA00021247013800000917
the value range of the weight of the adjustable high-frequency component is [0,5 ]]At the time of actual processing
Figure BDA00021247013800000918
The weight of the adjustable high-frequency component is manually set according to requirements.
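Step 3_3 can be sketched as below. Assumption: the adjustable component is taken as the Laplacian response of the moment-matched image (one reading of the step; the patent's lost formula images define the exact operands), and `laplacian` and `high_frequency` are hypothetical names.

```python
def laplacian(img):
    """4-neighbour Laplacian response over a 2-D list (zero on the border)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i][j] = (4 * img[i][j] - img[i - 1][j] - img[i + 1][j]
                         - img[i][j - 1] - img[i][j + 1])
    return out

def high_frequency(matched_pan, lum, weight=1.0):
    """Base HF = matched pan minus luminance; adjustable HF = Laplacian of
    the matched pan; combined HF = base + weight * adjustable, with the
    weight in [0, 5] per the patent's stated range."""
    if not 0.0 <= weight <= 5.0:
        raise ValueError("weight must lie in [0, 5]")
    lap = laplacian(matched_pan)
    h, w = len(matched_pan), len(matched_pan[0])
    return [[matched_pan[i][j] - lum[i][j] + weight * lap[i][j]
             for j in range(w)] for i in range(h)]
```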
Step 4: Using the gradient of each band of I*_gg, the injection weight of the high-frequency information of Ĩ_gk required for each band of I*_gg is calculated.
In this embodiment, the specific process of step 4 is:
Step 4_1: All bands of I*_gg are averaged to obtain a mean component.
Step 4_2: The average gradient of each band of I*_gg and the average gradient of the mean component are calculated.
Step 4_3: The average gradient of each band of I*_gg is divided by the average gradient of the mean component, giving the injection weight of the high-frequency information of Ĩ_gk required for the corresponding band of I*_gg; that is, for the q-th band of I*_gg, the average gradient of the q-th band divided by the average gradient of the mean component is taken as the injection weight of the high-frequency information of Ĩ_gk required for the q-th band of I*_gg; where q is a positive integer with initial value 1, 1 ≤ q ≤ Q, and Q denotes the number of bands of I*_gg.
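The gradient-ratio weights of steps 4_1 to 4_3 can be sketched with a simple forward-difference average gradient; `average_gradient` and `injection_weights` are hypothetical names, and this particular gradient definition is an assumption (the patent does not fix one).

```python
def average_gradient(img):
    """Mean magnitude of forward differences, a simple average-gradient proxy."""
    h, w = len(img), len(img[0])
    total, count = 0.0, 0
    for i in range(h - 1):
        for j in range(w - 1):
            dx = img[i][j + 1] - img[i][j]
            dy = img[i + 1][j] - img[i][j]
            total += ((dx * dx + dy * dy) / 2) ** 0.5
            count += 1
    return total / count if count else 0.0

def injection_weights(bands):
    """Per-band weight = band's average gradient / mean component's."""
    h, w = len(bands[0]), len(bands[0][0])
    mean_comp = [[sum(b[i][j] for b in bands) / len(bands)
                  for j in range(w)] for i in range(h)]
    g_mean = average_gradient(mean_comp)
    return [average_gradient(b) / g_mean for b in bands]
```

Bands with sharper spatial detail than the mean component thus receive weights above 1 and get more of the injected high-frequency information.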
Step 5: The injection weight of the high-frequency information of Ĩ_gk required for each band of I*_gg is multiplied by the high-frequency information of Ĩ_gk; the results are then injected into the corresponding bands of Ĩ_gg, yielding a preliminary spatially sharpened fused image with both high spatial resolution and high spectral resolution, denoted I_rh.
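The injection of step 5 is then a per-band weighted addition of the high-frequency image; a minimal sketch with `inject` as a hypothetical name:

```python
def inject(resampled_bands, weights, hf):
    """Step 5 sketch: fused band q = band q + weights[q] * hf.

    resampled_bands: list of 2-D band lists at the pan size; weights: one
    scalar per band; hf: the 2-D high-frequency image of the pan.
    """
    h, w = len(hf), len(hf[0])
    return [[[band[i][j] + wt * hf[i][j] for j in range(w)]
             for i in range(h)]
            for band, wt in zip(resampled_bands, weights)]
```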
Step 6: by means of I* ggTo IrhAnd carrying out spectrum compensation and correction to obtain a spectrum high-fidelity optimal space height sharpening fusion image.
In this embodiment, the specific process of step 6 is:
Step 6_1: I_rh is blurred, and the blurred fused image is denoted I′_rh; the blur function adopted for the blurring is the modulation transfer function (MTF) of the image sensor.
Step 6_2: I′_rh is down-sampled so that the size of the resulting fused image is consistent with that of I*_gg; the fused image obtained after down-sampling is denoted I″_rh.
Step 6_3: I″_rh and I*_gg are subtracted to obtain a residual component.
Step 6_4: The residual component is resampled so that the size of the residual component obtained after resampling is consistent with that of I_rh.
Step 6_5: The resampled residual component is added to I_rh, which further improves the spectral fidelity of the fused image and yields the optimally sharpened fused image.
To verify the feasibility and effectiveness of the proposed method, experiments were carried out.
Fig. 2a shows a panchromatic remote sensing image; Fig. 2b shows the multispectral remote sensing image of the same scene after the spatial resampling of step 3_1; Fig. 2c shows the spectrally high-fidelity, optimally sharpened fused image obtained by applying the proposed method to Figs. 2a and 2b. Fig. 3a shows a panchromatic remote sensing image; Fig. 3b shows the hyperspectral remote sensing image of the same scene after the spatial resampling of step 3_1; Fig. 3c shows the fused image obtained from Figs. 3a and 3b. Fig. 4a shows an SAR remote sensing image; Fig. 4b shows the multispectral remote sensing image of the same scene after the spatial resampling of step 3_1; Fig. 4c shows the fused image obtained from Figs. 4a and 4b. As Figs. 2c, 3c and 4c show, for panchromatic/multispectral, panchromatic/hyperspectral and SAR/multispectral fusion alike, the fusion result has sharper spatial structure information, while its spectral colors remain highly consistent with the original high-spectral-resolution remote sensing image, indicating good spectral preservation.

Claims (6)

1. A same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method, characterized by comprising the following steps:
Step 1: Multi-source remote sensing images of the same scene are selected, namely an original high-spatial-resolution remote sensing image and an original high-spectral-resolution remote sensing image, correspondingly denoted I_gk and I_gg; then I_gk and I_gg are preprocessed, and the high-spatial-resolution and high-spectral-resolution remote sensing images obtained after preprocessing are correspondingly denoted I*_gk and I*_gg;
Step 2: Whether I*_gk contains noise is judged; if noise is contained, a denoising algorithm is applied to I*_gk for rapid denoising, and the high-spatial-resolution remote sensing image obtained after denoising is denoted Ĩ_gk; if no noise is contained, I*_gk is directly re-denoted as Ĩ_gk;
In step 2, the process of applying a denoising algorithm to rapidly denoise I*_gk is: for homogeneous remote sensing image fusion, Wiener filtering is adopted as the denoising algorithm for the rapid denoising of I*_gk; for heterogeneous remote sensing image fusion, Lee filtering is adopted as the denoising algorithm for the rapid denoising of I*_gk;
Step 3: The specific process of extracting the high-frequency information of Ĩ_gk is as follows:
Step 3_1: In the case where the spectral range of I_gk and the spectral range of I_gg are known, the bands of I*_gg covered by the spectral range of I_gk are selected according to the two spectral ranges; then I*_gg is spatially resampled to the size of Ĩ_gk, and the high-spectral-resolution remote sensing image obtained after spatial resampling is denoted Ĩ_gg; then the bands of Ĩ_gg corresponding to the selected bands of I*_gg are linearly combined to obtain the luminance component of Ĩ_gg;
or, in the case where the spectral range of I_gk and the spectral range of I_gg are unknown, I*_gg is spatially resampled to the size of Ĩ_gk, and the high-spectral-resolution remote sensing image obtained after spatial resampling is denoted Ĩ_gg; then all bands of Ĩ_gg are linearly combined to obtain the luminance component of Ĩ_gg;
in the above, the combination coefficients used for the linear combination of the bands of Ĩ_gg are obtained as follows: Ĩ_gk is spatially down-sampled, and the high-spatial-resolution remote sensing image obtained after spatial down-sampling is denoted Ĩ_gk↓; in the case where the spectral range of I_gk and the spectral range of I_gg are known, the bands of I*_gg covered by the spectral range of I_gk are selected according to the two spectral ranges, and the least-squares method is applied to Ĩ_gk↓ and the selected bands of I*_gg to solve for the combination coefficients; or, in the case where the spectral range of I_gk and the spectral range of I_gg are unknown, the least-squares method is applied to Ĩ_gk↓ and all bands of I*_gg to solve for the combination coefficients;
Step 3_2: With the luminance component of Ĩ_gg as reference, moment matching is performed on Ĩ_gk to attenuate the radiation difference between Ĩ_gk and the luminance component of Ĩ_gg; the high-spatial-resolution remote sensing image obtained after moment matching is denoted Î_gk;
Step 3_ 3: will be provided with
Figure FDA0003612000760000029
And
Figure FDA00036120007600000210
is subtracted from the luminance component of (b), and the resulting difference image is taken as
Figure FDA00036120007600000211
A base high-frequency component of (a); then passes through a laplacian sharpening filter pair
Figure FDA00036120007600000212
Is spatially enhanced to obtain
Figure FDA00036120007600000213
The adjustable high frequency component of (a); then will be
Figure FDA00036120007600000214
Of the fundamental high-frequency component of
Figure FDA00036120007600000215
Is weighted and combined to obtain the adjustable high frequency component
Figure FDA00036120007600000216
High frequency information of (2); wherein,
Figure FDA00036120007600000217
the value range of the weight of the adjustable high-frequency component is [0,5 ]];
Step 4: Using the gradient of each band of I*_gg, the injection weight of the high-frequency information of Ĩ_gk required for each band of I*_gg is calculated;
Step 5: The injection weight of the high-frequency information of Ĩ_gk required for each band of I*_gg is multiplied by the high-frequency information of Ĩ_gk; the results are then injected into the corresponding bands of Ĩ_gg, yielding a preliminary spatially sharpened fused image with both high spatial resolution and high spectral resolution, denoted I_rh;
Step 6: by means of I* ggTo IrhPerforming spectral compensation and correction to obtain a spectral high-fidelity optimal space height sharpening fusion image;
the specific process of the step 6 is as follows:
step 6_ 1: to IrhBlurring treatment was performed, and the blurred fused image was designated as I'rh(ii) a Wherein, the fuzzy function adopted by the fuzzy processing selects the modulation and demodulation function MTF of the image sensor;
step 6_ 2: to l'rhPerforming down-sampling to obtain a fused image with size and I* ggIs identical in size, and the fused image obtained after down-sampling is recorded as I'rh
Step 6_ 3: is prepared from'rhAnd I* ggSubtracting to obtain a residual component;
step 6_ 4: carrying out resampling treatment on the residual error component to ensure that the size and I of the residual error component obtained after resampling treatmentrhThe sizes of the components are consistent;
step 6_ 5: adding residual components obtained after resampling into IrhAnd obtaining the optimal spatial height sharpened fusion image.
2. The same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method according to claim 1, characterized in that in step 1 the acquisition process of I*_gk and I*_gg is:
Step 1_1: According to the spatial resolution ratio between I_gk and I_gg, I_gg is spatially resampled, and the high-spectral-resolution remote sensing image obtained after spatial resampling is denoted Ĩ′_gg, whose size is consistent with that of I_gk; then, with Ĩ′_gg as reference, geometric registration is performed on I_gk, and the high-spatial-resolution remote sensing image obtained after geometric registration is taken as the first high-spatial-resolution remote sensing image, denoted I′_gk; and I_gg is taken as the first high-spectral-resolution remote sensing image, denoted I′_gg;
Step 1_2: With Ĩ′_gg as reference, radiation registration is performed on I′_gk to attenuate the radiation difference between I′_gk and I′_gg; the high-spatial-resolution remote sensing image obtained after radiation registration is taken as the second high-spatial-resolution remote sensing image, denoted I*_gk; and I′_gg is taken as the second high-spectral-resolution remote sensing image, denoted I*_gg.
3. The same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method according to claim 2, characterized in that in step 1_1 the geometric registration is performed using the professional remote sensing image processing software ENVI/ERDAS.
4. The same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method according to claim 2 or 3, characterized in that in step 1_2 a moment matching technique is adopted to perform the radiation registration on I′_gk:

I*_gk = (I′_gk − P_gk,mean) × (P̃_gg,std / P_gk,std) + P̃_gg,mean

wherein P_gk,std denotes the standard deviation of the pixel values of all pixels in I′_gk, P_gk,mean denotes the mean of the pixel values of all pixels in I′_gk, P̃_gg,std denotes the standard deviation of the pixel values of all pixels in the luminance component of Ĩ′_gg, and P̃_gg,mean denotes the mean of the pixel values of all pixels in the luminance component of Ĩ′_gg.
5. The same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method according to claim 4, characterized in that the luminance component of Ĩ′_gg is obtained by averaging all bands of Ĩ′_gg.
6. The same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method according to claim 1, characterized in that the specific process of step 4 is:
Step 4_1: All bands of I*_gg are averaged to obtain a mean component;
Step 4_2: The average gradient of each band of I*_gg and the average gradient of the mean component are calculated;
Step 4_3: The average gradient of each band of I*_gg is divided by the average gradient of the mean component, giving the injection weight of the high-frequency information of Ĩ_gk required for the corresponding band of I*_gg; that is, for the q-th band of I*_gg, the average gradient of the q-th band divided by the average gradient of the mean component is taken as the injection weight of the high-frequency information of Ĩ_gk required for the q-th band of I*_gg; wherein q is a positive integer with initial value 1, 1 ≤ q ≤ Q, and Q denotes the number of bands of I*_gg.
CN201910619429.1A 2019-07-10 2019-07-10 Same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method Active CN110533600B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910619429.1A CN110533600B (en) 2019-07-10 2019-07-10 Same/heterogeneous remote sensing image high-fidelity generalized space-spectrum fusion method


Publications (2)

Publication Number Publication Date
CN110533600A CN110533600A (en) 2019-12-03
CN110533600B true CN110533600B (en) 2022-07-19



Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102446351A (en) * 2010-10-15 2012-05-09 江南大学 Multispectral and high-resolution full-color image fusion method study
CN103236047A (en) * 2013-03-28 2013-08-07 北京航空航天大学 Method for fusing full-color and multi-spectral images on basis of fitting for substituted components
CN103942769A (en) * 2013-12-10 2014-07-23 珠江水利委员会珠江水利科学研究院 Satellite remote sensing image fusion method
CN112166692B (en) * 2012-06-29 2014-07-30 二十一世纪空间技术应用股份有限公司 Remote sensing image cross fusion method
CN104851091A (en) * 2015-04-28 2015-08-19 中山大学 Remote sensing image fusion method based on convolution enhancement and HCS transform
CN106651800A (en) * 2016-12-23 2017-05-10 中国科学院遥感与数字地球研究所 PAN modulation and multiple linear regression-based MS and PAN image fusion method
CN108765359A (en) * 2018-05-31 2018-11-06 安徽大学 Fusion method of hyperspectral remote sensing image and full-color image based on JSK model and NSCT technology
CN109166089A (en) * 2018-07-24 2019-01-08 重庆三峡学院 The method that a kind of pair of multispectral image and full-colour image are merged
CN109509160A (en) * 2018-11-28 2019-03-22 长沙理工大学 Hierarchical remote sensing image fusion method utilizing layer-by-layer iteration super-resolution
CN109886870A (en) * 2018-12-29 2019-06-14 西北大学 Remote sensing image fusion method based on binary channels neural network


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Multi spectral image fusion by deep convolutional neural network; Sajjad Eghbalian et al.; International Journal of Remote Sensing; 2018-12-30; pp. 3983-4002 *
Variational fusion methods for multi-source temporal-spatial-spectral optical remote sensing images; Meng Xiangchao; China Doctoral Dissertations Full-text Database (Doctoral), Basic Sciences; 2018-06-15; chapters 2-5 *
Multi-exposure image fusion accounting for camera shake and moving targets; Zhu Ji; China Master's Theses Full-text Database (Master's), Information Science and Technology; 2019-06-15; chapter 4 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant