CN114331936B - Remote sensing image fusion method based on wavelet decomposition and IHS algorithm improvement - Google Patents

Publication number: CN114331936B
Application number: CN202111598647.5A
Authority: CN (China)
Legal status: Active (granted)
Other publications: CN114331936A (application)
Prior art keywords: component, image, multispectral, fusion, fused
Inventors: 杨鑫, 张文亮, 武志强, 王贺彬, 李志明, 张路路
Assignee: Zhengzhou Xinda Institute of Advanced Technology
Classification: Image Processing (AREA)
Abstract

The invention provides a remote sensing image fusion method based on wavelet decomposition and an improved IHS algorithm, comprising the following steps: extract the I components of a multispectral image and a high-resolution remote sensing image; fuse them and apply high-pass filtering and weighted averaging to obtain a fused I_new1 component; histogram-match the fused I_new1 component against the panchromatic image to obtain a matched fused I_new2 component; apply the inverse IHS transform to the fused I_new2 component and the remaining components to obtain a multispectral image MI_2; and then obtain a brand-new fused image by wavelet decomposition. The invention solves the problem of spectral distortion introduced into the fused image by the IHS transform: it effectively suppresses spectral degradation and increases the definition of the fused image. It also effectively solves the problem of brightness reduction caused by high-pass filtering, enhancing the low-frequency information of the image, i.e., its overall brightness.

Description

Remote sensing image fusion method based on wavelet decomposition and IHS algorithm improvement
Technical Field
The invention relates to the field of remote sensing information processing, in particular to a remote sensing image fusion method based on wavelet decomposition and an improved IHS algorithm.
Background
In recent years, the explosive growth of remote sensing image data and the availability of high-performance computers have laid the foundation for multi-source data fusion. Remote sensing data fusion makes it possible to obtain imagery that is simultaneously multispectral and high-resolution, which a single satellite image can rarely provide, and offers a sound basis for small-area ground-feature identification, crop monitoring, precision-agriculture monitoring, and the like.
At present, remote sensing image fusion mainly adopts the IHS transform, the Brovey transform, principal component analysis and other methods to fuse multispectral and panchromatic images. However, existing multispectral fusion methods suffer from problems such as spectral distortion and low definition in the fused image.
In addition, historical remote sensing images from each year serve as the basis of geographical-conditions monitoring and play a key role in studying coastline change, urban-rural proportion change, land-use change, vegetation change, water-system change, and the like. For historical remote sensing images, besides the problems above, existing multispectral fusion methods also suffer from low brightness in the fused image.
In order to solve the above problems, an ideal technical solution has long been sought.
Disclosure of Invention
The invention aims at overcoming the defects of the prior art, and provides a remote sensing image fusion method based on wavelet decomposition and an improved IHS algorithm.
In order to achieve the above purpose, the technical scheme adopted by the invention is as follows:
The invention provides a remote sensing image fusion method based on wavelet decomposition and an improved IHS algorithm, which is characterized by comprising the following steps:
Acquiring an original multispectral image MI_0, an original high-resolution image HRI_0 and an original full-color image PI_0 of the same region;
Resampling the original multispectral image MI_0 to obtain a target multispectral image MI_1; the resolution of the target multispectral image MI_1 is the same as that of the original high-resolution image HRI_0;
Performing the IHS transform on the target multispectral image MI_1 to obtain a multispectral I_h component, a multispectral H_h component and a multispectral S_h component;
Performing the IHS transform on the original high-resolution image HRI_0 to obtain a high-resolution I_m component;
Performing high-pass filtering on the multispectral I_h component and the high-resolution I_m component to obtain a fused I_F component;
Performing weighted average processing on the fused I_F component and the high-resolution I_m component to obtain a fused I_new1 component;
Simulating a new panchromatic image PI_1 from the original panchromatic image PI_0, and performing histogram matching on the image histogram of the fused I_new1 component against the new panchromatic image PI_1 to obtain a matched fused I_new2 component;
Performing the inverse IHS transform on the fused I_new2 component, the multispectral H_h component and the multispectral S_h component to obtain a multispectral image MI_2 after the inverse IHS transform;
Performing the wavelet transform on the multispectral image MI_2 to obtain its high-frequency and low-frequency components; performing the wavelet transform on the new panchromatic image PI_1 to obtain its high-frequency and low-frequency components;
Fusing the high-frequency component of the multispectral image MI_2 with that of the new panchromatic image PI_1 to obtain a fused high-frequency component; fusing the low-frequency component of the multispectral image MI_2 with that of the new panchromatic image PI_1 to obtain a fused low-frequency component;
Performing the inverse wavelet transform on the fused high-frequency and low-frequency components to obtain the fused remote sensing image.
The second aspect of the present invention provides a remote sensing image fusion apparatus based on a wavelet decomposition and an improved IHS algorithm, the remote sensing image fusion apparatus comprising a memory, a processor and a remote sensing image fusion program based on the wavelet decomposition and the improved IHS algorithm stored on the memory and operable on the processor, wherein the remote sensing image fusion program based on the wavelet decomposition and the improved IHS algorithm implements the steps of the remote sensing image fusion method based on the wavelet decomposition and the improved IHS algorithm as described above when being executed by the processor.
A third aspect of the present invention provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of a remote sensing image fusion method based on wavelet decomposition and an improved IHS algorithm as described above.
The beneficial effects of the invention are as follows:
1) The invention can quickly and effectively fuse remote sensing images. By applying high-pass filtering to the multispectral I_h component and the high-resolution I_m component, it solves the problem of spectral distortion introduced into the fused image by the IHS transform; the improved method not only effectively suppresses spectral degradation but also increases the definition of the fused image.
The invention then uses histogram matching and the wavelet transform to solve the problem of brightness reduction caused by high-pass filtering; it can effectively enhance the overall brightness of the image while suppressing noise interference in the high-frequency information;
2) The method is simple, achieves a good fusion effect, and is well suited to fusing remote sensing images for ground-object extraction, among other advantages.
Drawings
FIG. 1 is a first flow chart of the remote sensing image fusion method based on wavelet decomposition and an improved IHS algorithm of the present invention;
FIG. 2 is a second flow chart of the remote sensing image fusion method based on wavelet decomposition and an improved IHS algorithm of the present invention.
Detailed Description
The technical scheme of the invention is further described in detail through the following specific embodiments.
Example 1
As shown in fig. 1 and fig. 2, a remote sensing image fusion method based on wavelet decomposition and improved IHS algorithm comprises the following steps:
Acquiring an original multispectral image MI_0, an original high-resolution image HRI_0 and an original full-color image PI_0 of the same region;
Resampling the original multispectral image MI_0 to obtain a target multispectral image MI_1; the resolution of the target multispectral image MI_1 is the same as that of the original high-resolution image HRI_0;
Performing the IHS transform on the target multispectral image MI_1 to obtain a multispectral I_h component, a multispectral H_h component and a multispectral S_h component;
Performing the IHS transform on the original high-resolution image HRI_0 to obtain a high-resolution I_m component;
Performing high-pass filtering on the multispectral I_h component and the high-resolution I_m component to obtain a fused I_F component;
Performing weighted average processing on the fused I_F component and the high-resolution I_m component to obtain a fused I_new1 component;
Simulating a new panchromatic image PI_1 from the original panchromatic image PI_0, and performing histogram matching on the image histogram of the fused I_new1 component against the new panchromatic image PI_1 to obtain a matched fused I_new2 component;
Performing the inverse IHS transform on the fused I_new2 component, the multispectral H_h component and the multispectral S_h component to obtain a multispectral image MI_2 after the inverse IHS transform;
Performing the wavelet transform on the multispectral image MI_2 to obtain its high-frequency and low-frequency components; performing the wavelet transform on the new panchromatic image PI_1 to obtain its high-frequency and low-frequency components;
Fusing the high-frequency component of the multispectral image MI_2 with that of the new panchromatic image PI_1 to obtain a fused high-frequency component; fusing the low-frequency component of the multispectral image MI_2 with that of the new panchromatic image PI_1 to obtain a fused low-frequency component;
Performing the inverse wavelet transform on the fused high-frequency and low-frequency components to obtain the fused remote sensing image.
It should be noted that the multispectral image offers multiple spectral bands but relatively low resolution, while the high-resolution remote sensing image offers high spatial resolution but a single band range. The invention therefore performs the IHS transform on each image separately, extracts the I components of the two images, fuses them, and applies high-pass filtering and weighted averaging to obtain a new I component (the fused I_new1 component); it then performs histogram matching on the image histogram of the fused I_new1 component against the new panchromatic image PI_1 to obtain the matched fused I_new2 component. The fused I_new2 component replaces the approximate component of the multispectral remote sensing image before the inverse IHS transform, which is how the IHS method is improved.
Specifically, the original multispectral image MI_0, the original high-resolution image HRI_0 and the original panchromatic image PI_0 are remote sensing images of the same area. Taking a MODIS image as an example, the multispectral remote sensing image of a certain area is downloaded, and the panchromatic and high-resolution images are obtained through Google Maps or similar sources.
In a specific embodiment, the original multispectral image MI_0 is resampled using bilinear interpolation. Bilinear interpolation extends linear interpolation to a two-dimensional rectangular grid and is used to interpolate a bivariate function; its core idea is to interpolate linearly in each of the two directions in turn. Details are not repeated here.
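As a concrete illustration, the bilinear resampling step can be sketched in NumPy as follows; the function name and the toy 4×4 band are illustrative, not from the patent:

```python
import numpy as np

def bilinear_resample(img, out_h, out_w):
    """Resample a 2-D gray image to (out_h, out_w) by bilinear interpolation."""
    in_h, in_w = img.shape
    # Map each output pixel back to fractional source coordinates.
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1)
    x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    # Linear interpolation in x on the two bracketing rows, then in y.
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

mi_0 = np.arange(16, dtype=float).reshape(4, 4)  # toy multispectral band
mi_1 = bilinear_resample(mi_0, 8, 8)             # match the high-resolution grid
```

The corner pixels are preserved exactly, since the output grid is mapped onto the full extent of the input grid.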
Further, the IHS transform of the target multispectral image MI_1 into a multispectral I_h component, a multispectral H_h component and a multispectral S_h component adopts the following formula:
Wherein R_h, G_h, B_h are the image gray matrices of the red, green and blue spectral bands of the target multispectral image MI_1; the multispectral I_h component reflects the spatial detail of the target multispectral image MI_1, the multispectral H_h component and the multispectral S_h component reflect its spectral information, and V_h1 and V_h2 are intermediate variables.
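The patent's image-based formula does not survive the text extraction. As an assumption, the sketch below uses the common linear IHS transform whose symbols match I_h, V_h1, V_h2 above (other transform-matrix variants exist in the literature):

```python
import numpy as np

def ihs_forward(r, g, b):
    """Linear IHS transform: intensity I plus intermediates v1, v2,
    from which hue H and saturation S are derived."""
    i = (r + g + b) / 3.0
    v1 = (np.sqrt(2) / 6.0) * (-r - g + 2.0 * b)
    v2 = (r - g) / np.sqrt(2)
    h = np.arctan2(v2, v1)           # hue
    s = np.sqrt(v1**2 + v2**2)       # saturation
    return i, h, s, v1, v2

rng = np.random.default_rng(0)
r, g, b = rng.random((3, 4, 4))      # toy red/green/blue gray matrices
i_h, h_h, s_h, v1, v2 = ihs_forward(r, g, b)
```

By construction, I is the band average and S is the norm of the (V1, V2) pair.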
It can be appreciated that the IHS transform of the original high-resolution image HRI_0 into the high-resolution I_m component proceeds analogously and is not repeated here.
Further, the high-pass filtering of the multispectral I_h component and the high-resolution I_m component into the fused I_F component proceeds as follows:
Read the multispectral I_h component to obtain the gray value G(I_h(i,j)) of each pixel (i,j);
Read the high-resolution I_m component to obtain the gray value G(I_m(i,j)) of each pixel (i,j) and the mean gray value M(I_m(i,j,m,n)) of the m×n area centered on pixel (i,j);
For each pixel (i,j) in the high-resolution I_m component, subtract the mean gray value M(I_m(i,j,m,n)) from the gray value G(I_m(i,j)) to obtain the high-frequency information P_n(I_m(i,j)), which represents the high-frequency information extracted at each pixel (i,j) of the high-resolution I_m component;
For each pixel (i,j) in the multispectral I_h component, superpose the gray value G(I_h(i,j)) and the high-frequency information P_n(I_m(i,j)) to obtain the gray value G(I_F(i,j)) of each pixel (i,j) of the filtered image in the spatial domain;
Derive the fused I_F component from the gray values G(I_F(i,j)) of the filtered image in the spatial domain.
It will be appreciated that once every pixel has been traversed and the gray value G(I_F(i,j)) of each pixel (i,j) of the filtered image obtained, a whole fused image component (the fused I_F component) is available.
The high-pass filtering of the multispectral I_h component and the high-resolution I_m component extracts the high-frequency (detail) information of the high-resolution I_m component; this detail is then superposed onto the multispectral I_h component (the low-resolution image) by pixel-wise addition. The detail of the high-resolution panchromatic image is thus added while preserving the low-resolution multispectral content as far as possible, realizing data fusion between the multispectral low-resolution image and the high-resolution panchromatic image. This solves the problem of spectral distortion introduced by the IHS transform, effectively suppresses spectral degradation, and increases the definition of the fused image.
Specifically, for each pixel (i,j) in the multispectral I_h component, the superposition of the gray value G(I_h(i,j)) and the high-frequency information P_n(I_m(i,j)) into the gray value G(I_F(i,j)) of the filtered image in the spatial domain adopts the following formulas:
G(I_F(i,j)) = G(I_h(i,j)) + P_n(I_m(i,j))
P_n(I_m(i,j)) = G(I_m(i,j)) − M(I_m(i,j,m,n))
Wherein G(I_F(i,j)) is the gray value of each pixel (i,j) in the fused I_F component, G(I_h(i,j)) is the gray value of each pixel (i,j) in the multispectral I_h component, and P_n(I_m(i,j)) is the high-frequency information of each pixel (i,j) in the high-resolution I_m component;
G(I_m(i,j)) is the gray value of each pixel (i,j) in the high-resolution I_m component, and M(I_m(i,j,m,n)) is the mean gray value of the m×n region centered on pixel (i,j) in the high-resolution I_m component.
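The two formulas above can be sketched in NumPy; the 3×3 window size and edge-replicating padding are illustrative assumptions, since the patent leaves the m×n neighbourhood unspecified:

```python
import numpy as np

def local_mean_3x3(img):
    """Mean of the 3x3 neighbourhood around each pixel (edges replicated)."""
    p = np.pad(img, 1, mode='edge')
    acc = np.zeros_like(img, dtype=float)
    for dy in range(3):
        for dx in range(3):
            acc += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return acc / 9.0

def high_pass_fuse(i_h, i_m):
    """P_n = I_m minus its local mean (high-frequency detail);
    I_F = I_h plus that detail, per the pixel-addition rule above."""
    p_n = i_m - local_mean_3x3(i_m)
    return i_h + p_n

rng = np.random.default_rng(1)
i_h = rng.random((6, 6))   # toy multispectral intensity component
i_m = rng.random((6, 6))   # toy high-resolution intensity component
i_f = high_pass_fuse(i_h, i_m)
```

A constant I_m carries no detail, so fusing with it leaves I_h unchanged, which is a quick sanity check on the filter.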
Further, the weighted average of the fused I_F component and the high-resolution I_m component into the fused I_new1 component proceeds as follows:
The fused I_new1 component is obtained using the following formulas:
I_new1 = w1 × I_m + w2 × I_F
w2 = 1 − w1
Wherein I_new1 is the I component obtained by the weighted average of the fused I_F component and the high-resolution I_m component; I_m is the I component obtained by the IHS transform of the original high-resolution image HRI_0; I_F is the I component obtained by high-pass filtering the multispectral I_h component and the high-resolution I_m component; w1 is the first weight and w2 the second weight.
It should be noted that the weighted average of the fused I_F component and the high-resolution I_m component yields the new image (the fused I_new1 component) directly by pixel-level selection, averaging and weighted averaging on the source images, without applying any further transform to the fused I_F component (the source image).
Specifically, the first weight w1 weights the high-resolution I_m component and the second weight w2 weights the fused I_F component. In general weighted fusion the two images are treated as roughly equal, so the weighting coefficients sum to 1; the invention, however, considers the proportion of spatial information carried by the high-resolution image to be extremely high, and experiments led to w1 = 1 and w2 = 0.2.
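A minimal sketch of the weighting step with the experimentally reported values; the toy constant arrays are illustrative, and note that the reported weights (1 and 0.2) do not satisfy the w2 = 1 − w1 relation given earlier, so the snippet simply applies them as stated:

```python
import numpy as np

w1, w2 = 1.0, 0.2             # weights reported in the patent's experiments
i_m = np.full((4, 4), 0.5)    # high-resolution I component (toy values)
i_f = np.full((4, 4), 0.25)   # high-pass fused I component (toy values)
i_new1 = w1 * i_m + w2 * i_f  # I_new1 = w1*I_m + w2*I_F
```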
Specifically, the simulation of the new panchromatic image PI_1 from the original panchromatic image PI_0 adopts the following formula:
Wherein PAN_new is the image gray matrix of the panchromatic band of the new panchromatic image PI_1, R_p, G_p, B_p are the image gray matrices of the red, green and blue bands of the original panchromatic image PI_0, IR is the image gray matrix of its near-infrared band, and PAN is the image gray matrix of the panchromatic band of the original panchromatic image PI_0.
Further, the histogram matching of the image histogram of the fused I_new1 component against the new panchromatic image PI_1 into the matched fused I_new2 component proceeds as follows:
From the fused I_new1 component, obtain its image histogram and generate the corresponding continuous probability density function p_r(r), where r is the gray level of the image histogram of the fused I_new1 component;
Read the image histogram of the new panchromatic image PI_1 and generate its continuous probability density function p_z(z), where z is the gray level of the histogram of the new panchromatic image PI_1;
Generate the intermediate-image gray level s from the continuous probability density function p_r(r); generate the transformation function G(z) from the probability density function p_z(z); obtain the mapping between the gray level z of the histogram of the new panchromatic image PI_1 and the gray level r of the image histogram of the fused I_new1 component from the relation G(z) = s;
Replace the gray values in the image histogram of the fused I_new1 component according to this mapping to obtain the matched fused I_new2 component.
It should be noted that histogram matching of the image histogram of the fused I_new1 component against the new panchromatic image PI_1 yields the matched fused I_new2 component; this eliminates the illumination-intensity difference between the fused I_new1 component and the new panchromatic image PI_1 and effectively solves the problem of brightness reduction caused by high-pass filtering.
The gray level s of the intermediate image is computed as:
s = T(r) = ∫₀^r p_r(w) dw
Wherein s is the gray level of the intermediate image and w is a dummy variable of integration;
The transformation function G(z) is computed as:
G(z) = ∫₀^z p_z(t) dt
Wherein t is a dummy variable of integration;
The mapping between the gray level z of the histogram of the new panchromatic image PI_1 and the gray level r of the image histogram of the fused I_new1 component is expressed as:
z = G⁻¹[T(r)] = G⁻¹(s)
From the above formulas the transformation from r to z is obtained, and the fused I_new2 component (the new I component) is finally obtained by histogram matching applied to the image histogram of the input fused I_new1 component.
It will be appreciated that the fused I_new2 component is an image whose gray values follow the required probability density function.
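A discrete NumPy counterpart of the relations s = T(r) and z = G⁻¹(s) can be sketched as follows; the 256-level quantization and the nearest-CDF lookup via `searchsorted` are implementation assumptions:

```python
import numpy as np

def histogram_match(source, reference, levels=256):
    """Map source gray levels so their CDF matches the reference CDF
    (discrete counterpart of s = T(r), z = G^{-1}(s))."""
    src = source.ravel()
    ref = reference.ravel()
    s_hist, _ = np.histogram(src, bins=levels, range=(0, levels))
    r_hist, _ = np.histogram(ref, bins=levels, range=(0, levels))
    s_cdf = np.cumsum(s_hist) / src.size   # T(r)
    r_cdf = np.cumsum(r_hist) / ref.size   # G(z)
    # For each source level r, find the smallest z with G(z) >= T(r).
    mapping = np.searchsorted(r_cdf, s_cdf)
    mapping = np.clip(mapping, 0, levels - 1)
    return mapping[src.astype(int)].reshape(source.shape)

rng = np.random.default_rng(2)
i_new1 = rng.integers(0, 128, size=(32, 32))   # darker fused component
pi_1 = rng.integers(64, 256, size=(32, 32))    # brighter panchromatic image
i_new2 = histogram_match(i_new1, pi_1)
```

Matching a dark component against a brighter reference raises its gray levels, which is exactly the brightness-restoration effect described above.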
Further, the inverse IHS transform of the fused I_new2 component, the multispectral H_h component and the multispectral S_h component adopts the following formula:
Wherein R_MI_2, G_MI_2, B_MI_2 are the image gray matrices of the red, green and blue bands of the multispectral image MI_2; the multispectral image MI_2 after the inverse IHS transform is obtained from R_MI_2, G_MI_2, B_MI_2.
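The patent's inverse-transform formula likewise does not survive the extraction. Assuming the linear forward transform sketched earlier, its exact inverse can be written as below; a round trip on random bands reconstructs them up to floating-point error:

```python
import numpy as np

def ihs_forward(r, g, b):
    """Assumed linear IHS transform (see the earlier sketch)."""
    i = (r + g + b) / 3.0
    v1 = (np.sqrt(2) / 6.0) * (-r - g + 2.0 * b)
    v2 = (r - g) / np.sqrt(2)
    return i, np.arctan2(v2, v1), np.sqrt(v1**2 + v2**2)

def ihs_inverse(i, h, s):
    """Inverse IHS: recover R, G, B from intensity I, hue H, saturation S."""
    v1 = s * np.cos(h)                      # undo the polar (H, S) form
    v2 = s * np.sin(h)
    r = i - v1 / np.sqrt(2) + v2 / np.sqrt(2)
    g = i - v1 / np.sqrt(2) - v2 / np.sqrt(2)
    b = i + np.sqrt(2) * v1
    return r, g, b

rng = np.random.default_rng(3)
r0, g0, b0 = rng.random((3, 5, 5))
# In the method, I is replaced by the fused I_new2 before inverting;
# here the unchanged I simply reconstructs the original bands.
r1, g1, b1 = ihs_inverse(*ihs_forward(r0, g0, b0))
```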
Further, the fusion of the high-frequency component of the multispectral image MI_2 with that of the new panchromatic image PI_1 into the fused high-frequency component proceeds as follows:
Obtain the high-frequency component C_N,H(MI_2) after N-layer wavelet decomposition of the multispectral image MI_2 and the high-frequency component C_N,H(PI_1) after N-layer wavelet decomposition of the new panchromatic image PI_1;
Take the weighted average of the high-frequency components C_N,H(MI_2) and C_N,H(PI_1) to obtain the fused high-frequency component C_N,H(F).
Further, the fusion of the low-frequency component of the multispectral image MI_2 with that of the new panchromatic image PI_1 into the fused low-frequency component proceeds as follows:
Obtain the low-frequency component C_N,L(MI_2) after N-layer wavelet decomposition of the multispectral image MI_2 and the low-frequency component C_N,L(PI_1) after N-layer wavelet decomposition of the new panchromatic image PI_1;
Take the weighted average of the low-frequency components C_N,L(MI_2) and C_N,L(PI_1) to obtain the fused low-frequency component C_N,L(F).
In one specific embodiment, the wavelet decomposition of the multispectral image MI_2 and the new panchromatic image PI_1 into high- and low-frequency components, followed by image reconstruction into the fused remote sensing image, mainly comprises:
(1) Select a suitable wavelet basis and number of decomposition layers, and perform multi-layer wavelet decomposition of the multispectral image MI_2 and the new panchromatic image PI_1 (the original images). The wavedec2 function of the MATLAB wavelet toolbox performs the decomposition; its calling format is [c, s] = wavedec2(X, N, 'haar'), where X is the original image signal, N is the number of decomposition layers, 'haar' is the mother wavelet of the transform, c is the vector of decomposition coefficients of all layers, and s is the bookkeeping matrix of coefficient lengths per layer;
(2) Select the fusion rules for the wavelet coefficients:
(A) Wavelet high-frequency coefficient fusion rule:
The fused high-frequency component is the weighted average of the high-frequency components of the multispectral image MI_2 and the new panchromatic image PI_1 after wavelet decomposition.
Wherein C_N,H(F) is the high-frequency part after N-layer wavelet decomposition and weighted averaging of the multispectral image MI_2 and the new panchromatic image PI_1, i.e. the fused high-frequency component; C_N,H(MI_2) is the high-frequency component after N-layer wavelet decomposition of the multispectral image MI_2, and C_N,H(PI_1) that of the new panchromatic image PI_1;
(b) Wavelet low-frequency coefficient fusion rule:
The fused low-frequency component is the weighted average of the low-frequency components of the multispectral image MI_2 and the new panchromatic image PI_1 after wavelet decomposition.
Wherein C_N,L(F) is the low-frequency part after N-layer wavelet decomposition and weighted averaging of the multispectral image MI_2 and the new panchromatic image PI_1, i.e. the fused low-frequency component; C_N,L(MI_2) is the low-frequency component after N-layer wavelet decomposition of the multispectral image MI_2, and C_N,L(PI_1) that of the new panchromatic image PI_1.
On the highest decomposition layer, the wavelet coefficients of the high-frequency components in the 3 directions of the multispectral image MI_2 and the new panchromatic image PI_1 are compared, and the coefficient with the larger absolute value is taken as the wavelet coefficient of the fused high-frequency component. On the intermediate decomposition layers, the wavelet coefficient (of the multispectral image MI_2 or the new panchromatic image PI_1) whose pixel-centered local area (3×3 here) has the larger variance is taken as the corresponding wavelet coefficient of the fused high-frequency component. The full information of the multispectral image MI_2 and the new panchromatic image PI_1 is thereby concentrated into the wavelet coefficients of large magnitude.
It can be understood that the multispectral image MI_2 and the new panchromatic image PI_1 have relatively rich high-frequency components and relatively high brightness and contrast, so the above wavelet high-frequency coefficient fusion rule is suitable for obtaining accurate fused high-frequency components.
After the high-frequency component and the low-frequency component are fused, inverse wavelet transformation is carried out, and image reconstruction is carried out based on the high-frequency part and the low-frequency part of the fused remote sensing image, so that the finally output remote sensing image is obtained.
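The decompose-fuse-reconstruct step above can be sketched with a hand-rolled single-level 2-D Haar transform; in Python one would typically reach for PyWavelets' wavedec2/waverec2 instead, and the equal weights and single decomposition level here are simplifying assumptions:

```python
import numpy as np

def haar2(x):
    """One-level 2-D Haar decomposition -> (LL, (LH, HL, HH))."""
    p, q = x[0::2, 0::2], x[0::2, 1::2]
    r, s = x[1::2, 0::2], x[1::2, 1::2]
    ll = (p + q + r + s) / 2.0   # low-frequency (approximation) subband
    lh = (p - q + r - s) / 2.0   # high-frequency detail subbands
    hl = (p + q - r - s) / 2.0
    hh = (p - q - r + s) / 2.0
    return ll, (lh, hl, hh)

def ihaar2(ll, details):
    """Exact inverse of haar2 (perfect reconstruction)."""
    lh, hl, hh = details
    x = np.empty((ll.shape[0] * 2, ll.shape[1] * 2))
    x[0::2, 0::2] = (ll + lh + hl + hh) / 2.0
    x[0::2, 1::2] = (ll - lh + hl - hh) / 2.0
    x[1::2, 0::2] = (ll + lh - hl - hh) / 2.0
    x[1::2, 1::2] = (ll - lh - hl + hh) / 2.0
    return x

def fuse(mi_2, pi_1, w=0.5):
    """Weighted-average fusion of low- and high-frequency subbands,
    then inverse transform to reconstruct the fused image."""
    ll_a, det_a = haar2(mi_2)
    ll_b, det_b = haar2(pi_1)
    ll_f = w * ll_a + (1 - w) * ll_b
    det_f = tuple(w * da + (1 - w) * db for da, db in zip(det_a, det_b))
    return ihaar2(ll_f, det_f)

rng = np.random.default_rng(4)
mi_2 = rng.random((8, 8))   # toy IHS-inverted multispectral band
pi_1 = rng.random((8, 8))   # toy simulated panchromatic image
fused = fuse(mi_2, pi_1)
```

Because every step is linear, equal-weight subband fusion reduces to a pixel average, which makes the pipeline easy to sanity-check before swapping in the abs-max or local-variance coefficient rules described above.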
Example 2
The present embodiment provides a remote sensing image fusion apparatus based on wavelet decomposition and an improved IHS algorithm, comprising a memory, a processor and a remote sensing image fusion program based on wavelet decomposition and an improved IHS algorithm stored on the memory and executable on the processor, the program implementing, when executed by the processor, the steps of the remote sensing image fusion method based on wavelet decomposition and an improved IHS algorithm of Embodiment 1.
The present embodiment also provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program performs the steps of the remote sensing image fusion method based on wavelet decomposition and an improved IHS algorithm of Embodiment 1.
Each of the foregoing embodiments is described with its own emphasis; for parts not detailed or illustrated in a particular embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the above-described division of modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules described above, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the method of the above embodiment, or may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the steps of each method embodiment may be implemented. The computer program comprises computer program code, and the computer program code can be in a source code form, an object code form, an executable file or some intermediate form and the like.
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solution of the present invention, not to limit it. Although the invention has been described in detail with reference to the preferred embodiments, those skilled in the art will appreciate that modifications may be made to the specific embodiments, or equivalents substituted for some of their technical features, without departing from the spirit of the invention; all such modifications and substitutions are intended to fall within the scope of the invention as claimed.

Claims (8)

1. The remote sensing image fusion method based on wavelet decomposition and improved IHS algorithm is characterized by comprising the following steps:
Acquiring an original multispectral image MI_0, an original high-resolution image HRI_0 and an original full-color image PI_0 of the same region;
Resampling the original multispectral image MI_0 to obtain a target multispectral image MI_1; the resolution of the target multispectral image MI_1 is the same as that of the original high-resolution image HRI_0;
Performing IHS transformation on the target multispectral image MI_1 to obtain a multispectral I_h component, a multispectral H_h component and a multispectral S_h component;
Performing IHS transformation on the original high-resolution image HRI_0 to obtain a high-resolution I_m component;
Performing high-pass filtering on the multispectral I_h component and the high-resolution I_m component to obtain a fused I_F component;
Performing weighted average processing on the fused I_F component and the high-resolution I_m component to obtain a fused I_new1 component;
Simulating and generating a new full-color image PI_1 from the original full-color image PI_0, and performing histogram matching on the image histogram of the fused I_new1 component based on the new full-color image PI_1 to obtain a matched fused I_new2 component;
Performing inverse IHS transformation on the fused I_new2 component, the multispectral H_h component and the multispectral S_h component to obtain a multispectral image MI_2;
Performing wavelet transformation on the multispectral image MI_2 to obtain a high-frequency component and a low-frequency component of the multispectral image MI_2; performing wavelet transformation on the new full-color image PI_1 to obtain a high-frequency component and a low-frequency component of the new full-color image PI_1;
Fusing the high-frequency component of the multispectral image MI_2 and the high-frequency component of the new full-color image PI_1 to obtain a fused high-frequency component; fusing the low-frequency component of the multispectral image MI_2 and the low-frequency component of the new full-color image PI_1 to obtain a fused low-frequency component;
And carrying out wavelet inverse transformation on the fused high-frequency component and the fused low-frequency component to obtain a fused remote sensing image.
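Claim 1 repeatedly invokes the IHS transform and its inverse. As an illustration only, the following numpy sketch uses one common linear IHS variant from the pan-sharpening literature with I = (R+G+B)/3; the claim does not fix a particular IHS variant, so this transform matrix is an assumption, not the patent's definition.

```python
import numpy as np

# One common linear IHS transform used in pan-sharpening (an assumption:
# the claim does not specify which IHS variant is meant).  Row 0 gives
# the intensity I = (R+G+B)/3; rows 1-2 carry the chromatic information.
_IHS = np.array([
    [1 / 3,               1 / 3,               1 / 3],
    [-np.sqrt(2) / 6,     -np.sqrt(2) / 6,     2 * np.sqrt(2) / 6],
    [1 / np.sqrt(2),      -1 / np.sqrt(2),     0.0],
])
_IHS_INV = np.linalg.inv(_IHS)

def rgb_to_ihs(rgb):
    """rgb: (H, W, 3) float array -> (H, W, 3) array of (I, v1, v2)."""
    return rgb @ _IHS.T

def ihs_to_rgb(ihs):
    """Inverse transform; substituting a new I channel before calling
    this is the component-substitution step of the claim."""
    return ihs @ _IHS_INV.T
```

Because the transform is linear and invertible, replacing only the I channel and inverting leaves the chromatic components H_h and S_h untouched, which is what the claim relies on.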
2. The remote sensing image fusion method based on wavelet decomposition and improved IHS algorithm according to claim 1, wherein, when the multispectral I_h component and the high-resolution I_m component are subjected to high-pass filtering to obtain the fused I_F component, the following steps are performed:
Reading the multispectral I_h component to obtain a gray value G(I_h(i,j)) of each pixel (i,j) in the multispectral I_h component;
Reading the high-resolution I_m component to obtain a gray value G(I_m(i,j)) of each pixel (i,j) in the high-resolution I_m component and a gray-value mean M(I_m(i,j,m,n)) of an m×n area centered on the pixel (i,j);
For each pixel (i,j) in the high-resolution I_m component, subtracting the gray-value mean M(I_m(i,j,m,n)) from the gray value G(I_m(i,j)) to obtain high-frequency information P_n(I_m(i,j)), where P_n(I_m(i,j)) denotes the high-frequency information extracted at each pixel (i,j) of the high-resolution I_m component;
For each pixel (i,j) in the multispectral I_h component, superimposing the high-frequency information P_n(I_m(i,j)) on the gray value G(I_h(i,j)) to obtain the gray value G(I_F(i,j)) of each pixel (i,j) of the filtered image in the spatial domain;
Deriving the fused I_F component from the gray values G(I_F(i,j)) of the filtered image in the spatial domain.
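Read this way, the high-pass step of claim 2 is: compute the m×n local mean of I_m, subtract it to isolate the detail P_n, and add that detail to I_h. A plain-numpy sketch follows; the window size and edge padding are free parameters not fixed by the claim.

```python
import numpy as np

def high_pass_fuse(I_h, I_m, m=3, n=3):
    """Sketch of the claim-2 steps.  I_h, I_m: 2-D float arrays of equal
    shape.  m, n: local-window size (assumed; the claim leaves it open).
    Edges are handled by replicate padding, also an assumption."""
    pad = ((m // 2, m // 2), (n // 2, n // 2))
    padded = np.pad(I_m, pad, mode='edge')
    H, W = I_m.shape
    local_mean = np.empty_like(I_m, dtype=float)
    for i in range(H):                      # M(I_m(i,j,m,n)): mean of the
        for j in range(W):                  # m x n window centered at (i,j)
            local_mean[i, j] = padded[i:i + m, j:j + n].mean()
    detail = I_m - local_mean               # P_n(I_m(i,j)): high-frequency part
    return I_h + detail                     # G(I_F(i,j)) = G(I_h(i,j)) + P_n
```

The double loop keeps the sketch readable; a production version would use a separable box filter instead.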
3. The remote sensing image fusion method based on wavelet decomposition and improved IHS algorithm according to claim 1, wherein, when the fused I_F component and the high-resolution I_m component are subjected to weighted average processing to obtain the fused I_new1 component, the following is performed:
The fused I_new1 component is obtained using the following formula:
I_new1 = w_1 · I_F + w_2 · I_m
wherein I_new1 represents the I component obtained by weighted average processing of the fused I_F component and the high-resolution I_m component; I_m represents the I component obtained by IHS transformation of the original high-resolution image HRI_0; I_F represents the I component obtained by high-pass filtering of the multispectral I_h component and the high-resolution I_m component; w_1 represents a first weight, and w_2 represents a second weight.
4. The remote sensing image fusion method based on wavelet decomposition and improved IHS algorithm according to claim 1, wherein, when histogram matching is performed on the image histogram of the fused I_new1 component based on the new full-color image PI_1 to obtain the matched fused I_new2 component, the following is performed:
Based on the fused I_new1 component, obtaining its image histogram and generating the corresponding continuous probability density function p_r(r);
Reading the new full-color image PI_1 and generating its continuous probability density function p_z(z);
Generating intermediate image gray levels s from the continuous probability density function p_r(r); generating a transformation function G(z) from the probability density function p_z(z); obtaining the mapping relationship between the gray levels z of the histogram of the new full-color image PI_1 and the gray levels r of the image histogram of the fused I_new1 component according to the relationship G(z) = s;
Replacing gray values in the image histogram of the fused I_new1 component according to this mapping relationship to obtain the matched fused I_new2 component.
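Claim 4 describes standard histogram specification: build the CDF of the source (s), the CDF-derived transformation G(z) of the reference, and invert G to map source levels onto reference levels. A discrete numpy sketch for integer gray levels follows; the 256-level default and the nearest-CDF inversion via `searchsorted` are my assumptions, not the patent's prescription.

```python
import numpy as np

def match_histogram(source, reference, levels=256):
    """Discrete histogram specification.  `source` plays the role of the
    fused I_new1 component, `reference` the simulated panchromatic PI_1.
    Both must be non-negative integer arrays with values < `levels`."""
    src = source.ravel()
    ref = reference.ravel()
    # Empirical CDFs stand in for the continuous p_r(r) and p_z(z).
    s_cdf = np.cumsum(np.bincount(src, minlength=levels)) / src.size
    z_cdf = np.cumsum(np.bincount(ref, minlength=levels)) / ref.size
    # Invert G(z) = s: for each source level, find the reference level
    # whose CDF value first reaches the source CDF value.
    mapping = np.searchsorted(z_cdf, s_cdf)
    mapping = np.clip(mapping, 0, levels - 1)
    return mapping[source]          # replace gray values via the mapping
```

Matching an image against itself leaves it unchanged, and matching onto a shifted reference reproduces the reference's level distribution.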
5. The remote sensing image fusion method based on wavelet decomposition and improved IHS algorithm according to claim 1, wherein, when the high-frequency component of the multispectral image MI_2 and the high-frequency component of the new full-color image PI_1 are fused to obtain the fused high-frequency component, the following is performed:
Obtaining the high-frequency component C_{N,H}^{MI_2} after N-layer wavelet decomposition of the multispectral image MI_2 and the high-frequency component C_{N,H}^{PI_1} after N-layer wavelet decomposition of the new full-color image PI_1;
Performing a weighted average of the high-frequency component C_{N,H}^{MI_2} and the high-frequency component C_{N,H}^{PI_1} to obtain the fused high-frequency component C_{N,H}^{F}.
6. The remote sensing image fusion method based on wavelet decomposition and improved IHS algorithm according to claim 1, wherein, when the low-frequency component of the multispectral image MI_2 and the low-frequency component of the new full-color image PI_1 are fused to obtain the fused low-frequency component, the following is performed:
Obtaining the low-frequency component C_{N,L}^{MI_2} after N-layer wavelet decomposition of the multispectral image MI_2 and the low-frequency component C_{N,L}^{PI_1} after N-layer wavelet decomposition of the new full-color image PI_1;
Performing a weighted average of the low-frequency component C_{N,L}^{MI_2} and the low-frequency component C_{N,L}^{PI_1} to obtain the fused low-frequency component C_{N,L}^{F}.
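Claims 5 and 6 fuse like sub-bands by weighted averaging before the inverse transform of claim 1's final step. The sketch below uses a hand-rolled one-level 2-D Haar transform so it is self-contained; the patent does not specify the wavelet family or the level count N, so Haar, N = 1, and the single blend weight `w` are all assumptions.

```python
import numpy as np

def haar_dwt2(x):
    """One-level 2-D Haar decomposition of an even-sized array.
    Returns the low-frequency band and the three high-frequency bands."""
    p00, p01 = x[0::2, 0::2], x[0::2, 1::2]
    p10, p11 = x[1::2, 0::2], x[1::2, 1::2]
    a = (p00 + p01 + p10 + p11) / 4          # low-frequency (approximation)
    h = (p00 + p01 - p10 - p11) / 4          # horizontal detail
    v = (p00 - p01 + p10 - p11) / 4          # vertical detail
    d = (p00 - p01 - p10 + p11) / 4          # diagonal detail
    return a, (h, v, d)

def haar_idwt2(a, hvd):
    """Exact inverse of haar_dwt2."""
    h, v, d = hvd
    H, W = a.shape
    out = np.empty((2 * H, 2 * W))
    out[0::2, 0::2] = a + h + v + d
    out[0::2, 1::2] = a + h - v - d
    out[1::2, 0::2] = a - h + v - d
    out[1::2, 1::2] = a - h - v + d
    return out

def fuse(mi2, pi1, w=0.5):
    """Claims 5-6 sketch: weighted average of matching sub-bands of the
    two images, then inverse transform (claim 1, final step)."""
    a1, (h1, v1, d1) = haar_dwt2(mi2)
    a2, (h2, v2, d2) = haar_dwt2(pi1)
    a = w * a1 + (1 - w) * a2                # fused low-frequency band
    bands = tuple(w * b1 + (1 - w) * b2      # fused high-frequency bands
                  for b1, b2 in [(h1, h2), (v1, v2), (d1, d2)])
    return haar_idwt2(a, bands)
```

A multilevel version would simply recurse `haar_dwt2` on the approximation band N times and fuse at each level, or use an off-the-shelf library such as PyWavelets.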
7. A remote sensing image fusion device based on wavelet decomposition and an improved IHS algorithm, characterized by comprising: a memory, a processor, and a remote sensing image fusion program based on wavelet decomposition and an improved IHS algorithm that is stored on the memory and executable on the processor, wherein the program, when executed by the processor, implements the steps of the remote sensing image fusion method based on wavelet decomposition and an improved IHS algorithm as claimed in any one of claims 1-6.
8. A computer-readable storage medium having stored thereon a computer program, characterized by: the computer program, when executed by a processor, performs the steps of the remote sensing image fusion method based on wavelet decomposition and improved IHS algorithm as claimed in any one of claims 1-6.
CN202111598647.5A 2021-12-24 2021-12-24 Remote sensing image fusion method based on wavelet decomposition and IHS algorithm improvement Active CN114331936B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111598647.5A CN114331936B (en) 2021-12-24 2021-12-24 Remote sensing image fusion method based on wavelet decomposition and IHS algorithm improvement


Publications (2)

Publication Number Publication Date
CN114331936A CN114331936A (en) 2022-04-12
CN114331936B (en) 2024-04-16

Family

ID=81012135

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1472544A (en) * 2003-06-05 2004-02-04 Shanghai Jiao Tong University Remote sensing image fusion method with optimized combination of pixels and features
CN1581230A (en) * 2004-05-20 2005-02-16 Shanghai Jiao Tong University Remote sensing image fusion method based on local spectral characteristics of the image
CN102063710A (en) * 2009-11-13 2011-05-18 Yantai Institute of Coastal Zone Research Method for realizing fusion and enhancement of remote sensing images
CN102915523A (en) * 2012-09-13 2013-02-06 Northeast Institute of Geography and Agroecology, Chinese Academy of Sciences Improved wavelet-transform remote sensing image fusion method and system
KR101291219B1 (en) * 2012-03-16 2013-07-31 Korea Aerospace Research Institute Method for data fusion of panchromatic and multi-spectral images and apparatus thereof
CN109447922A (en) * 2018-07-10 2019-03-08 China Centre for Resources Satellite Data and Application Improved IHS-transform remote sensing image fusion method and system


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A new multi-source remote sensing image fusion method based on IHS transform and wavelet transform; Liang Susu; Zhang Heming; Duan Caimei; Modern Surveying and Mapping; 2013-03-25 (Issue 02); full text *
An adaptive remote sensing image fusion method based on local wavelet coefficient features; Song Yang; Wan Youchuan; Remote Sensing Information; 2007-02-28 (Issue 01); full text *
An IHS-wavelet transform fusion method with the fusion model established through experimental analysis; Song Pengfei; Zhang Xia; Wang Jinnian; Tian Qingjiu; Remote Sensing Technology and Application; 2009-12-15 (Issue 06); full text *


Similar Documents

Publication Publication Date Title
CN108830796B (en) Hyperspectral image super-resolution reconstruction method based on spectral-spatial combination and gradient domain loss
CN107123089B (en) Remote sensing image super-resolution reconstruction method and system based on depth convolution network
Shao et al. Remote sensing image fusion with deep convolutional neural network
CN107194904B (en) NSCT area image fusion method based on supplement mechanism and PCNN
CN109509160A (en) Hierarchical remote sensing image fusion method utilizing layer-by-layer iteration super-resolution
CN111260580B (en) Image denoising method, computer device and computer readable storage medium
CN110189286B (en) Infrared and visible light image fusion method based on ResNet
CN104021532A (en) Image detail enhancement method for infrared image
CN113793289B (en) Multispectral image and full-color image fuzzy fusion method based on CNN and NSCT
CN112184591A (en) Image restoration method based on deep learning image Moire elimination
Masi et al. CNN-based pansharpening of multi-resolution remote-sensing images
CN113870110A (en) Image fusion method and device for remote sensing image, electronic equipment and storage medium
CN108491869B (en) Main component transformation remote sensing image fusion method for panchromatic waveband gray value self-adaptive inversion
CN111563866B (en) Multisource remote sensing image fusion method
Xiao et al. Physics-based GAN with iterative refinement unit for hyperspectral and multispectral image fusion
Mahmood et al. Human visual enhancement using multi scale Retinex
CN111311503A (en) Night low-brightness image enhancement system
Chen et al. Neural classification of SPOT imagery through integration of intensity and fractal information
CN103400360A (en) Multi-source image fusing method based on Wedgelet and NSCT (Non Subsampled Contourlet Transform)
CN114331936B (en) Remote sensing image fusion method based on wavelet decomposition and IHS algorithm improvement
CN113284067A (en) Hyperspectral panchromatic sharpening method based on depth detail injection network
Huang Wavelet for image fusion
CN109886904B (en) SAR image and low-resolution multispectral image fusion method and system
CN114638761B (en) Full-color sharpening method, equipment and medium for hyperspectral image
CN114897757B (en) NSST and parameter self-adaptive PCNN-based remote sensing image fusion method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant