CN111476746A - Remote sensing image fusion method based on IHS transformation and self-adaptive region characteristics - Google Patents
Remote sensing image fusion method based on IHS transformation and self-adaptive region characteristics

Info
- Publication number
- CN111476746A (application number CN202010196505.5A)
- Authority
- CN
- China
- Prior art keywords
- image
- remote sensing
- fusion
- self-adaptive
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Abstract
The invention relates to a remote sensing image fusion method based on IHS transformation and self-adaptive region characteristics, which comprises the following steps: acquiring the remote sensing images to be fused; repairing and correcting the remote sensing images by means of the adaptive region features; converting the R, G and B bands of the multispectral (MS) remote sensing image into IHS space; fusing the panchromatic (PAN) image with the intensity component I obtained from the IHS transformation of the MS image to obtain a new fused component I', while keeping the H and S components unchanged; replacing the intensity component with the fused component I' and applying the inverse IHS transformation, which maps I', H and S back to RGB space to yield the enhanced multispectral fused image; and evaluating the fusion result. Applying adaptive region features to remote sensing image fusion solves the image distortion that occurs in the prior art when the MS and PAN images are fused directly, and the resulting fused image has high resolution.
Description
Technical Field
The invention relates to the technical field of image fusion, in particular to a remote sensing image fusion method based on IHS transformation and self-adaptive region characteristics.
Background
With the successive launch of more and more remote sensing satellite platforms, there is an urgent need for remote sensing images with both high spectral resolution and high spatial resolution. However, constrained by current materials and manufacturing processes, remote sensing imaging equipment cannot readily achieve high spatial resolution and high spectral resolution at the same time. Image fusion technology provides an effective technical means of addressing this problem. According to the processing level of the data, fusion algorithms can be divided into pixel-level, feature-level and decision-level fusion; pixel-level fusion preserves the detail information of the image, is the simplest of the three, and is the most widely studied in current fusion research. Existing fusion methods cannot be adjusted according to the properties of the remote sensing images and generally suffer from loss of spatial information or distortion of spectral information, so that the fused result has low resolution, loses part of the color information, and appears distorted.
Disclosure of Invention
The technical problem to be solved by the invention is to overcome the above defects in the prior art and to provide a remote sensing image fusion method based on IHS transformation and adaptive region features.
The invention is realized by the following technical scheme:
a remote sensing image fusion method based on IHS transformation and self-adaptive region features is characterized by comprising the following steps: selecting a multispectral MS image and a full-color PAN image shot for the same target from a database as remote sensing images to be fused; repairing and correcting the remote sensing image through the self-adaptive region characteristics; converting R, G, B three wave bands of the multispectral remote sensing image into an IHS space to obtain I, H, S three components; fusing a brightness component I obtained by IHS transformation of the PAN image and the multispectral MS image under a certain fusion rule to obtain a new fusion component I', and keeping the components H and S unchanged; IHS inverse transformation is carried out to obtain an enhanced multispectral image, the luminance component is replaced by the fusion component I', and the multispectral image and the H, S component image are transformed into an RGB space together to obtain a fusion image; the fusion results were evaluated.
According to the above technical solution, preferably, "repairing and correcting the remote sensing image through the adaptive region features" comprises the following steps: taking the remote sensing image to be fused as the input image; determining the region to be repaired from the pixel values constituting the input image; designating the region to be repaired as an adaptive region and dividing the input image into a plurality of adaptive regions; calculating a point spread function for the representative pixel of each adaptive region; interpolating point spread functions for the pixels located between the representative pixels from the calculated point spread functions; and generating the output image by correcting the input image using the interpolated point spread functions.
According to the above technical solution, preferably, the representative pixel is a pixel representing each of the plurality of adaptive regions.
According to the above technical solution, preferably, the "evaluating the fusion result" includes subjective observation evaluation and objective quantitative evaluation.
According to the technical scheme, preferably, the objective quantitative evaluation comprises mean evaluation, root mean square error (RMSE) evaluation, entropy evaluation, gradient evaluation and correlation coefficient evaluation.
The invention has the beneficial effects that:
the remote sensing image is repaired and corrected through the self-adaptive region characteristics, the problem of image distortion when a multispectral MS image and a panchromatic image PAN image are directly adopted for fusion in the prior art is solved, the defects of loss of space information and distortion of spectral information in the traditional method are overcome through a new algorithm, and when the self-adaptive region characteristics are applied to the fusion of the remote sensing image, the obtained fusion result image is high in resolution, the evaluation parameters of the fusion image are high, and the method has extremely important significance for improving and promoting the performance of the fusion method based on IHS transformation.
Drawings
FIG. 1 is a schematic flow diagram of the process of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood by those skilled in the art, the present invention will be further described in detail with reference to the accompanying drawings and preferred embodiments.
As shown in the figure, the invention comprises the following steps: selecting, from a database, a multispectral (MS) image and a panchromatic (PAN) image of the same target as the remote sensing images to be fused; repairing and correcting the remote sensing images by means of the adaptive region features; converting the R, G and B bands of the MS remote sensing image into IHS space to obtain the three components I, H and S; fusing the PAN image with the intensity component I obtained from the IHS transformation of the MS image under a chosen fusion rule to obtain a new fused component I', while keeping the H and S components unchanged; replacing the intensity component with the fused component I' and applying the inverse IHS transformation, which maps I', H and S back to RGB space to yield the enhanced multispectral fused image; and evaluating the fusion result. Repairing and correcting the remote sensing images through the adaptive region features solves the image distortion that occurs in the prior art when the MS and PAN images are fused directly, overcomes the loss of spatial information and the distortion of spectral information found in traditional methods, and yields a fused image with high resolution and high evaluation scores, which is of great significance for improving the performance of IHS-transformation-based fusion methods.
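For illustration only, the following is a minimal NumPy sketch of the IHS substitution step described above, assuming the MS bands have already been repaired, co-registered and resampled to the PAN grid. It uses the linear IHS (triangle) transform, in which the V1 and V2 components jointly encode hue and saturation, and a simple weighted combination stands in for the unspecified fusion rule; the function name `ihs_fuse` and the `weight` parameter are assumptions, not part of the patent.

```python
import numpy as np

# Linear IHS forward matrix commonly used for pansharpening; rows give I, V1, V2.
# V1 and V2 jointly encode hue and saturation, so leaving them unchanged
# corresponds to "keeping the components H and S unchanged" in the text.
_FWD = np.array([[1/3, 1/3, 1/3],
                 [-np.sqrt(2)/6, -np.sqrt(2)/6, 2*np.sqrt(2)/6],
                 [1/np.sqrt(2), -1/np.sqrt(2), 0.0]])
_INV = np.linalg.inv(_FWD)

def ihs_fuse(ms_rgb, pan, weight=0.7):
    """IHS substitution fusion sketch.

    ms_rgb : (H, W, 3) float array, MS bands R, G, B resampled to the PAN grid.
    pan    : (H, W) float array, panchromatic image.
    weight : illustrative weight for the fusion rule I' = w*PAN + (1 - w)*I.
    """
    # Forward IHS transform: R, G, B -> I, V1, V2
    ivv = np.einsum('kc,hwc->hwk', _FWD, np.asarray(ms_rgb, dtype=float))
    i, v1, v2 = ivv[..., 0], ivv[..., 1], ivv[..., 2]

    # Match the PAN image to the intensity component's mean/std before fusing (common practice).
    pan = np.asarray(pan, dtype=float)
    pan_m = (pan - pan.mean()) / (pan.std() + 1e-12) * i.std() + i.mean()

    # Fusion rule (illustrative): new intensity component I'
    i_new = weight * pan_m + (1.0 - weight) * i

    # Inverse IHS transform with I replaced by I'; V1 and V2 (hue, saturation) are unchanged.
    fused = np.einsum('kc,hwc->hwk', _INV, np.stack([i_new, v1, v2], axis=-1))
    return np.clip(fused, 0.0, None)
```

Because V1 and V2 are left untouched, replacing I with I' before the inverse transform gives the same result as carrying H and S through the inverse IHS transform unchanged.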
According to the above embodiment, preferably, "repairing and correcting the remote sensing image through the adaptive region features" comprises the following steps: taking the remote sensing image to be fused as the input image; determining the region to be repaired from the pixel values constituting the input image; designating the region to be repaired as an adaptive region and dividing the input image into a plurality of adaptive regions; calculating a point spread function for the representative pixel of each adaptive region; interpolating point spread functions for the pixels located between the representative pixels from the calculated point spread functions; and generating the output image by correcting the input image using the interpolated point spread functions.
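The patent does not give formulas for estimating or applying the point spread functions, so the sketch below is only a schematic rendering of the listed steps under stated assumptions: the adaptive regions are taken as a square tiling, the PSF of each region is assumed Gaussian with a width estimated from local gradient energy (a hypothetical proxy), the widths are bilinearly interpolated between the representative pixels (tile centres), and a gradient-weighted unsharp mask stands in for true PSF-based restoration. All function and parameter names are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def adaptive_region_correct(img, region=64, amount=1.0):
    """Region-adaptive repair/correction sketch for one band of a remote sensing image."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    ys = np.arange(region // 2, h, region)   # representative pixel rows (tile centres)
    xs = np.arange(region // 2, w, region)   # representative pixel cols

    # 1) Estimate a Gaussian PSF width per adaptive region from local gradient energy
    #    (hypothetical proxy: blurrier regions have weaker gradients -> larger sigma).
    gy, gx = np.gradient(img)
    grad = np.hypot(gx, gy)
    sigma_grid = np.empty((len(ys), len(xs)))
    for a, y in enumerate(ys):
        for b, x in enumerate(xs):
            tile = grad[max(0, y - region // 2):y + region // 2,
                        max(0, x - region // 2):x + region // 2]
            sharpness = tile.mean() / (img.std() + 1e-12)
            sigma_grid[a, b] = np.clip(1.5 / (sharpness + 1e-3), 0.5, 5.0)

    # 2) Interpolate the PSF width between the representative pixels to get a per-pixel map.
    yy, xx = np.mgrid[0:h, 0:w]
    coords = np.stack([(yy - ys[0]) / region, (xx - xs[0]) / region])
    sigma_map = map_coordinates(sigma_grid, coords, order=1, mode='nearest')

    # 3) Correct the image using the interpolated PSF widths: a gradient-weighted
    #    unsharp mask here stands in for true PSF-based deconvolution.
    blurred = gaussian_filter(img, sigma=float(sigma_map.mean()))
    return img + amount * (sigma_map / sigma_map.max()) * (img - blurred)
```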
According to the above-described embodiment, preferably, the representative pixel is a pixel representing each of the plurality of adaptive regions.
According to the embodiment, preferably, "evaluating the fusion result" includes subjective observation evaluation and objective quantitative evaluation. Evaluating the fusion effect of the remote sensing images from the perspectives of subjective visual inspection and objective statistical analysis allows the image quality to be assessed comprehensively, so that the fused result is directly suited to its application.
According to the above embodiments, preferably, the objective quantitative evaluation includes mean evaluation, root mean square error (RMSE) evaluation, entropy evaluation, gradient evaluation, and correlation coefficient evaluation.
(1) Mean evaluation: the mean is the average gray level of the pixels in the image and reflects the average brightness perceived by the human eye. The mean of the image is computed as

$$\bar{Z} = \frac{1}{M N}\sum_{i=1}^{M}\sum_{j=1}^{N} Z(x_i, y_j)$$

where $\bar{Z}$ is the image mean, $Z(x_i, y_j)$ is the gray value of the image at row $i$ and column $j$, $M$ is the total number of rows, $N$ is the total number of columns, and $i$ and $j$ index the rows and columns of the pixels taken in turn.
(2) Root mean square error (RMSE) evaluation: used to evaluate the degree of difference between the fused image and the reference image; the smaller the difference, the better the fusion. The root mean square error is computed as

$$RMSE = \sqrt{\frac{1}{M N}\sum_{i=1}^{M}\sum_{j=1}^{N} \left[R(x_i, y_j) - F(x_i, y_j)\right]^{2}}$$

where $RMSE$ is the root mean square error, $R(x_i, y_j)$ is the gray value of the fused image at row $i$ and column $j$, $F(x_i, y_j)$ is the gray value of the image before fusion at row $i$ and column $j$, $M$ is the total number of rows, $N$ is the total number of columns, and $i$ and $j$ index the rows and columns of the pixels taken in turn.
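A minimal NumPy sketch of the RMSE metric, assuming the fused image R and the reference (pre-fusion) image F have the same shape:

```python
import numpy as np

def rmse(fused, reference):
    """Root mean square error between the fused image R and the reference image F."""
    r = np.asarray(fused, dtype=float)
    f = np.asarray(reference, dtype=float)
    return float(np.sqrt(np.mean((r - f) ** 2)))
```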
(3) Entropy evaluation: entropy is an important index of the richness of the information in an image and can be used to evaluate the amount of information it contains; the larger the entropy, the richer the information and the better the image quality. The entropy is computed as

$$E = -\sum_{i=0}^{L-1} P_i \log_2 P_i$$

where $E$ is the entropy of the image, $L$ is the total number of gray levels, and $P_i = N_i / N$ is the ratio of the number of pixels $N_i$ with gray value $i$ to the total number of pixels $N$ of the image.
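A short sketch of the entropy metric, assuming an integer-valued image with gray levels 0 to L-1 (L = 256 for 8-bit data); empty gray levels are skipped so that 0·log 0 is treated as 0:

```python
import numpy as np

def entropy(img, levels=256):
    """Shannon entropy E of the image, with P_i = N_i / N."""
    hist = np.bincount(np.asarray(img, dtype=np.int64).ravel(), minlength=levels)
    p = hist / hist.sum()
    p = p[p > 0]                     # drop empty gray levels (0 * log2 0 := 0)
    return float(-(p * np.log2(p)).sum())
```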
(4) Gradient evaluation: the gradient reflects the sharpness of the image and is denoted by $G$; the larger $G$ is, the sharper the image. The average image gradient is computed as

$$G = \frac{1}{M N}\sum_{i=1}^{M}\sum_{j=1}^{N}\sqrt{\frac{1}{2}\left[\left(\frac{\partial Z(x_i, y_j)}{\partial x}\right)^{2} + \left(\frac{\partial Z(x_i, y_j)}{\partial y}\right)^{2}\right]}$$

where $G$ is the image gradient value, $Z(x_i, y_j)$ is the gray value of the image at row $i$ and column $j$, $M$ is the total number of rows, $N$ is the total number of columns, $i$ and $j$ index the rows and columns of the pixels taken in turn, and $\partial$ denotes partial differentiation of the function.
(5) Correlation coefficient evaluation: the correlation coefficient reflects the degree of correlation between two images; comparing the correlation coefficients of the images before and after fusion shows how much the spectral information of the multispectral image has changed. The correlation coefficient is computed as

$$\rho = \frac{\sum_{i=1}^{M}\sum_{j=1}^{N}\left[F(x_i, y_j) - \bar{F}\right]\left[A(x_i, y_j) - \bar{A}\right]}{\sqrt{\sum_{i=1}^{M}\sum_{j=1}^{N}\left[F(x_i, y_j) - \bar{F}\right]^{2}\,\sum_{i=1}^{M}\sum_{j=1}^{N}\left[A(x_i, y_j) - \bar{A}\right]^{2}}}$$

where $\rho$ is the correlation coefficient, $F(x_i, y_j)$ is the gray value of the image before fusion at row $i$ and column $j$, $A(x_i, y_j)$ is the gray value of the fused image at row $i$ and column $j$, $\bar{F}$ and $\bar{A}$ are the means of the images before and after fusion, $M$ is the total number of rows, and $N$ is the total number of columns.
Repairing and correcting the remote sensing images through the adaptive region features solves the image distortion that occurs in the prior art when the multispectral (MS) image and the panchromatic (PAN) image are fused directly; the new algorithm overcomes the loss of spatial information and the distortion of spectral information found in traditional methods, and when the adaptive region features are applied to remote sensing image fusion, the resulting fused image has high resolution and high evaluation scores, which is of great significance for improving the performance of IHS-transformation-based fusion methods.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and refinements without departing from the principle of the present invention, and such modifications and refinements shall also fall within the protection scope of the present invention.
Claims (5)
1. A remote sensing image fusion method based on IHS transformation and self-adaptive region features, characterized by comprising the following steps: selecting, from a database, a multispectral (MS) image and a panchromatic (PAN) image of the same target as the remote sensing images to be fused; repairing and correcting the remote sensing images by means of the adaptive region features; converting the R, G and B bands of the MS remote sensing image into IHS space to obtain the three components I, H and S; fusing the PAN image with the intensity component I obtained from the IHS transformation of the MS image under a chosen fusion rule to obtain a new fused component I', while keeping the H and S components unchanged; replacing the intensity component with the fused component I' and applying the inverse IHS transformation, which maps I', H and S back to RGB space to yield the enhanced multispectral fused image; and evaluating the fusion result.
2. The remote sensing image fusion method based on IHS transformation and adaptive regional characteristics according to claim 1, wherein the step of repairing and correcting the remote sensing image through the adaptive region features comprises: taking the remote sensing image to be fused as the input image; determining the region to be repaired from the pixel values constituting the input image; designating the region to be repaired as an adaptive region and dividing the input image into a plurality of adaptive regions; calculating a point spread function for the representative pixel of each adaptive region; interpolating point spread functions for the pixels located between the representative pixels from the calculated point spread functions; and generating the output image by correcting the input image using the interpolated point spread functions.
3. The remote sensing image fusion method based on IHS transformation and adaptive region characteristics as claimed in claim 2, wherein the representative pixel is a pixel representing each of the plurality of adaptive regions.
4. The remote sensing image fusion method based on IHS transformation and adaptive regional characteristics of claim 3, wherein the evaluation of the fusion result comprises subjective observation evaluation and objective quantitative evaluation.
5. The remote sensing image fusion method based on IHS transformation and adaptive regional features of claim 4, wherein the objective quantitative evaluation comprises mean evaluation, Root Mean Square Error (RMSE) evaluation, entropy evaluation, gradient evaluation and correlation coefficient evaluation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010196505.5A CN111476746A (en) | 2020-03-19 | 2020-03-19 | Remote sensing image fusion method based on IHS transformation and self-adaptive region characteristics |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010196505.5A CN111476746A (en) | 2020-03-19 | 2020-03-19 | Remote sensing image fusion method based on IHS transformation and self-adaptive region characteristics |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111476746A (en) | 2020-07-31
Family
ID=71747644
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010196505.5A Pending CN111476746A (en) | 2020-03-19 | 2020-03-19 | Remote sensing image fusion method based on IHS transformation and self-adaptive region characteristics |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111476746A (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102017607A (en) * | 2009-02-25 | 2011-04-13 | 松下电器产业株式会社 | Image correction device and image correction method |
CN102063710A (en) * | 2009-11-13 | 2011-05-18 | 烟台海岸带可持续发展研究所 | Method for realizing fusion and enhancement of remote sensing image |
CN109447922A (en) * | 2018-07-10 | 2019-03-08 | 中国资源卫星应用中心 | A kind of improved IHS transformation remote sensing image fusing method and system |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112505776A (en) * | 2020-10-29 | 2021-03-16 | 中国石油集团工程咨询有限责任公司 | Color code adjusting method based on RGB-IHS multi-attribute fusion |
CN112505776B (en) * | 2020-10-29 | 2021-10-26 | 中国石油集团工程咨询有限责任公司 | Color code adjusting method based on RGB-IHS multi-attribute fusion |
CN117058053A (en) * | 2023-07-18 | 2023-11-14 | 珠江水利委员会珠江水利科学研究院 | IHS space-spectrum fusion method, system, equipment and medium based on mean value filtering |
CN117058053B (en) * | 2023-07-18 | 2024-04-05 | 珠江水利委员会珠江水利科学研究院 | IHS space-spectrum fusion method, system, equipment and medium based on mean value filtering |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||