CN105023261A - Remote sensing image fusion method based on AGIHS and low-pass filter - Google Patents

Remote sensing image fusion method based on AGIHS and low-pass filter

Info

Publication number
CN105023261A
Authority
CN
China
Prior art keywords
img
wave band
wight
opt
image
Prior art date
Legal status
Granted
Application number
CN201510433681.5A
Other languages
Chinese (zh)
Other versions
CN105023261B (en)
Inventor
刘帆 (Liu Fan)
陈宏涛 (Chen Hongtao)
柴晶 (Chai Jing)
Current Assignee
Taiyuan University of Technology
Original Assignee
Taiyuan University of Technology
Priority date
Filing date
Publication date
Application filed by Taiyuan University of Technology
Priority to CN201510433681.5A
Publication of CN105023261A
Application granted
Publication of CN105023261B
Expired - Fee Related
Anticipated expiration

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to remote sensing image fusion technology, and specifically to a remote sensing image fusion method based on AGIHS (adaptive generalized IHS) and a low-pass filter. It addresses two problems common to existing remote sensing image fusion methods: spectral loss and blurring of detail components. The method comprises the following steps: 1) an adaptive generalized IHS transform is applied to the source multispectral image to obtain the optimal luminance component; 2) a low-pass filter is used to obtain the high-frequency component; and 3) the high-frequency component is added to each band of the source multispectral image to obtain the new fused image. The method is suitable for fusing remote sensing satellite images, radar images, general natural images, and medical images.

Description

Remote sensing image fusion method based on AGIHS and a low-pass filter
Technical field
The present invention relates to remote sensing image fusion, and specifically to a remote sensing image fusion method based on AGIHS and a low-pass filter.
Background art
As an important branch of multi-source data fusion, remote sensing image fusion has become a research hotspot in the image engineering field in recent years. Remote sensing image fusion refers to a technique that jointly processes images from separate sources, with the aim of extracting and combining information from several source images so as to obtain a more accurate, comprehensive, and reliable description of the same scene or target. Remote sensing image fusion can improve the reliability and the degree of automation of target recognition. Remote sensing images fall into two types: multispectral images (Multi-Spectral, MS) and panchromatic images (Panchromatic, PAN). A multispectral image has spectral characteristics and comprises four bands: red, green, blue, and near-infrared. A panchromatic image has higher spatial resolution but no spectral characteristics. Therefore, to better express the characteristics of a target region, fusing a multispectral image with a panchromatic image to obtain a single image that has both spectral characteristics and high spatial resolution is a cost-effective approach. The fused image is commonly applied to terrain classification, mapping, spectral analysis, and other remote sensing applications.
Existing remote sensing image fusion methods fall mainly into two classes. The first class comprises the traditional methods, which divide into three types: component substitution methods, spectral methods, and the spatial-resolution enhancement framework (Amélioration de la Résolution Spatiale par Injection de Structures, ARSIS). Component substitution methods divide into four types: the fusion method based on the intensity-hue-saturation (IHS) transform, the fusion method based on principal component analysis (PCA), the fusion method based on Gram-Schmidt orthogonalization, and the fusion method based on the generalized IHS transform (Generalized IHS, GIHS). Spectral methods mainly refer to the fusion method based on the Brovey transform, in which each new spectral subband is derived from the ratio of the corresponding source multispectral subband to the whole multispectral image. The spatial-resolution enhancement framework is also called multi-scale modeling; common multi-scale tools include wavelets, curvelets, contourlets, and the support value transform. The above methods have the advantage of simplicity, but they share a common problem: in pursuing increased detail information, they easily produce spectral loss. The second class comprises improvements on the traditional methods and divides into two types: fusion methods based on the adaptive IHS transform and fusion methods based on the nonlinear IHS transform. The basic idea of the second class is to improve how the luminance component I in the IHS transform is obtained, so that the luminance component is closer to the source panchromatic image and spectral loss is avoided. The common problem of the second class is that it cannot mine the detail information in the image well and easily produces blurring of detail components. It is therefore necessary to devise a brand-new remote sensing image fusion method to solve the above problems of existing methods.
Summary of the invention
To solve the problems that existing remote sensing image fusion methods easily produce spectral loss and easily produce blurring of detail components, the present invention provides a remote sensing image fusion method based on AGIHS and a low-pass filter.
The present invention adopts the following technical scheme: a remote sensing image fusion method based on AGIHS and a low-pass filter, realized by the following steps:
1) Apply an adaptive generalized IHS transform to the source multispectral image to obtain the optimal luminance component. The concrete steps of the adaptive generalized IHS transform comprise:
1.1) Compute the weighted sum of the red, green, blue, and near-infrared bands of the source multispectral image to obtain the luminance component. The summing formula is:

I_wight = w_1·img_ms(R) + w_2·img_ms(G) + w_3·img_ms(B) + w_4·img_ms(N)   (1)

In formula (1): I_wight is the luminance component; w_1, w_2, w_3, w_4 are weights, each in the range (0, 1); img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image.
1.2) Compute the difference between the luminance component and the source panchromatic image:

d_wight = img_pan − I_wight   (2)

In formula (2): d_wight is the difference between the luminance component and the source panchromatic image; img_pan is the source panchromatic image; I_wight is the luminance component.
1.3) Add the difference between the luminance component and the source panchromatic image to each band of the source multispectral image to obtain a new multispectral image:

img_wight(R) = img_ms(R) + d_wight
img_wight(G) = img_ms(G) + d_wight
img_wight(B) = img_ms(B) + d_wight   (3)
img_wight(N) = img_ms(N) + d_wight

In formula (3): img_wight(R), img_wight(G), img_wight(B), img_wight(N) are respectively the red, green, blue, and near-infrared bands of the new multispectral image; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image; d_wight is the difference between the luminance component and the source panchromatic image.
1.4) Compute the global quality index of the new multispectral image and take it as the fitness function; then use a particle swarm algorithm to find the optimal fitness value of the fitness function, thereby obtaining the optimal weights.
1.5) Substitute the optimal weights into formula (1) to obtain the optimal luminance component:

I_opt = w_opt,1·img_ms(R) + w_opt,2·img_ms(G) + w_opt,3·img_ms(B) + w_opt,4·img_ms(N)   (8)

In formula (8): I_opt is the optimal luminance component; w_opt,1, w_opt,2, w_opt,3, w_opt,4 are the optimal weights; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image.
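By way of illustration, the following Python/NumPy sketch implements steps 1.1)-1.3); it assumes co-registered floating-point arrays of identical spatial size, and the function name and argument layout are illustrative only, not taken from the patent:

```python
import numpy as np

def gihs_inject(img_ms, img_pan, w):
    """Steps 1.1)-1.3): weighted intensity (formula (1)), difference from
    the panchromatic image (formula (2)), injection into every band (formula (3)).

    img_ms  : (H, W, 4) float array, bands ordered R, G, B, NIR
    img_pan : (H, W) float array, co-registered panchromatic image
    w       : length-4 weight vector, each weight in (0, 1)
    """
    I_wight = img_ms @ np.asarray(w, dtype=float)   # formula (1): weighted sum over bands
    d_wight = img_pan - I_wight                     # formula (2)
    img_wight = img_ms + d_wight[..., None]         # formula (3): add to all four bands
    return I_wight, d_wight, img_wight
```

With the optimal weights w_opt,1, ..., w_opt,4 found in step 1.4), the same weighted sum yields the optimal luminance component I_opt of formula (8).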
2) Use a low-pass filter to obtain the high-frequency component. The concrete steps comprise:
2.1) Perform gray-level histogram matching between the source panchromatic image and the optimal luminance component to obtain the matched panchromatic image.
2.2) Compute the difference between the matched panchromatic image and the optimal luminance component:

d_opt = img_pan_matched − I_opt   (9)

In formula (9): d_opt is the difference between the matched panchromatic image and the optimal luminance component; img_pan_matched is the matched panchromatic image; I_opt is the optimal luminance component.
2.3) Use the low-pass filter to filter the difference between the matched panchromatic image and the optimal luminance component, thereby obtaining the high-frequency and low-frequency components respectively. The coefficient matrix of the low-pass filter is expressed as follows:
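The coefficient matrix itself is not reproduced in this text, so any concrete kernel below is an assumption. The sketch stands in a 5×5 normalized averaging kernel for the patent's filter and assumes that the high-frequency component is the residual of d_opt after low-pass filtering, consistent with obtaining both components from the same filtering step; match_histogram implements step 2.1) by classic quantile mapping:

```python
import numpy as np
from scipy.ndimage import convolve

def match_histogram(src, ref):
    """Step 2.1): match the gray-level histogram of `src` to that of `ref`
    by mapping the cumulative distribution of src onto that of ref."""
    s_vals, s_idx, s_cnt = np.unique(src.ravel(), return_inverse=True,
                                     return_counts=True)
    r_vals, r_cnt = np.unique(ref.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_cnt) / src.size
    r_cdf = np.cumsum(r_cnt) / ref.size
    mapped = np.interp(s_cdf, r_cdf, r_vals)   # quantile mapping
    return mapped[s_idx].reshape(src.shape)

def split_frequencies(img_pan_matched, I_opt, kernel=None):
    """Steps 2.2)-2.3): difference d_opt (formula (9)), then a low-pass
    split into low- and high-frequency parts. The patent's exact
    coefficient matrix is not given here, so a 5x5 averaging kernel
    is assumed as a stand-in."""
    if kernel is None:
        kernel = np.full((5, 5), 1.0 / 25.0)
    d_opt = img_pan_matched - I_opt                  # formula (9)
    d_low = convolve(d_opt, kernel, mode='nearest')  # low-frequency component
    d_high = d_opt - d_low                           # high-frequency component
    return d_low, d_high
```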
3) Add the high-frequency component to each band of the source multispectral image to obtain the new fused image:

img_fus(R) = img_ms(R) + d_high
img_fus(G) = img_ms(G) + d_high
img_fus(B) = img_ms(B) + d_high   (11)
img_fus(N) = img_ms(N) + d_high

In formula (11): img_fus(R), img_fus(G), img_fus(B), img_fus(N) are respectively the red, green, blue, and near-infrared bands of the new fused image; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image; d_high is the high-frequency component.
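Chaining the sketches above, the whole method condenses to a few lines; here w_opt is assumed to have been found by the particle swarm search of step 1.4), and match_histogram and split_frequencies are the illustrative helpers defined earlier:

```python
import numpy as np

# img_ms: (H, W, 4) source multispectral image; img_pan: (H, W) panchromatic image.
I_opt = img_ms @ np.asarray(w_opt)                 # formula (8)
pan_matched = match_histogram(img_pan, I_opt)      # step 2.1)
_, d_high = split_frequencies(pan_matched, I_opt)  # steps 2.2)-2.3)
img_fus = img_ms + d_high[..., None]               # formula (11)
```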
Compared with existing remote sensing image fusion methods, the method of the present invention based on AGIHS and a low-pass filter has the following advantages. First, compared with the first class of image fusion methods, the present method, by adopting the adaptive generalized IHS transform, avoids spectral loss while pursuing increased detail information, and thus effectively overcomes the tendency of the first class of methods to produce spectral loss. Second, compared with the second class of image fusion methods, the present method, by adopting a low-pass filter, mines the detail information in the image well, and thus effectively overcomes the tendency of the second class of methods to produce blurring of detail components. Experiments show that, on remote sensing images, the fusion results of the present method are better on four fusion indices than those of the fusion method based on the generalized IHS transform, the fusion method based on principal component analysis, the spectral method, and the tradeoff parameter method, as shown in Figs. 2a-2g, 3a-3g, and 4a-4g.
The present invention effectively solves the problems that existing remote sensing image fusion methods easily produce spectral loss and blurring of detail components, and is suitable for fusing remote sensing satellite images, radar images, general natural images, and medical images.
Brief description of the drawings
Fig. 1 is the implementation flow chart of the present invention.
Figs. 2a-2g compare the fusion results of remote sensing images obtained by different methods:
Fig. 2a is the source multispectral image to be fused;
Fig. 2b is the source panchromatic image to be fused;
Fig. 2c is the result obtained by the fusion method based on the generalized IHS transform;
Fig. 2d is the result obtained by the fusion method based on principal component analysis;
Fig. 2e is the result obtained by the spectral method;
Fig. 2f is the result obtained by the tradeoff parameter method;
Fig. 2g is the result obtained by the present invention.
Figs. 3a-3g compare the fusion results of remote sensing images obtained by different methods:
Fig. 3a is the source multispectral image to be fused;
Fig. 3b is the source panchromatic image to be fused;
Fig. 3c is the result obtained by the fusion method based on the generalized IHS transform;
Fig. 3d is the result obtained by the fusion method based on principal component analysis;
Fig. 3e is the result obtained by the spectral method;
Fig. 3f is the result obtained by the tradeoff parameter method;
Fig. 3g is the result obtained by the present invention.
Figs. 4a-4g compare the fusion results of remote sensing images obtained by different methods:
Fig. 4a is the source multispectral image to be fused;
Fig. 4b is the source panchromatic image to be fused;
Fig. 4c is the result obtained by the fusion method based on the generalized IHS transform;
Fig. 4d is the result obtained by the fusion method based on principal component analysis;
Fig. 4e is the result obtained by the spectral method;
Fig. 4f is the result obtained by the tradeoff parameter method;
Fig. 4g is the result obtained by the present invention.
Embodiment
A remote sensing image fusion method based on AGIHS and a low-pass filter is realized by the following steps:
1) Apply an adaptive generalized IHS transform to the source multispectral image to obtain the optimal luminance component. The concrete steps of the adaptive generalized IHS transform comprise:
1.1) Compute the weighted sum of the red, green, blue, and near-infrared bands of the source multispectral image to obtain the luminance component. The summing formula is:

I_wight = w_1·img_ms(R) + w_2·img_ms(G) + w_3·img_ms(B) + w_4·img_ms(N)   (1)

In formula (1): I_wight is the luminance component; w_1, w_2, w_3, w_4 are weights, each in the range (0, 1); img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image.
1.2) Compute the difference between the luminance component and the source panchromatic image:

d_wight = img_pan − I_wight   (2)

In formula (2): d_wight is the difference between the luminance component and the source panchromatic image; img_pan is the source panchromatic image; I_wight is the luminance component.
1.3) Add the difference between the luminance component and the source panchromatic image to each band of the source multispectral image to obtain a new multispectral image:

img_wight(R) = img_ms(R) + d_wight
img_wight(G) = img_ms(G) + d_wight
img_wight(B) = img_ms(B) + d_wight   (3)
img_wight(N) = img_ms(N) + d_wight

In formula (3): img_wight(R), img_wight(G), img_wight(B), img_wight(N) are respectively the red, green, blue, and near-infrared bands of the new multispectral image; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image; d_wight is the difference between the luminance component and the source panchromatic image.
1.4) Compute the global quality index of the new multispectral image and take it as the fitness function; then use a particle swarm algorithm to find the optimal fitness value of the fitness function, thereby obtaining the optimal weights.
1.5) Substitute the optimal weights into formula (1) to obtain the optimal luminance component:

I_opt = w_opt,1·img_ms(R) + w_opt,2·img_ms(G) + w_opt,3·img_ms(B) + w_opt,4·img_ms(N)   (8)

In formula (8): I_opt is the optimal luminance component; w_opt,1, w_opt,2, w_opt,3, w_opt,4 are the optimal weights; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image.
2) Use a low-pass filter to obtain the high-frequency component. The concrete steps comprise:
2.1) Perform gray-level histogram matching between the source panchromatic image and the optimal luminance component to obtain the matched panchromatic image.
2.2) Compute the difference between the matched panchromatic image and the optimal luminance component:

d_opt = img_pan_matched − I_opt   (9)

In formula (9): d_opt is the difference between the matched panchromatic image and the optimal luminance component; img_pan_matched is the matched panchromatic image; I_opt is the optimal luminance component.
2.3) Use the low-pass filter to filter the difference between the matched panchromatic image and the optimal luminance component, thereby obtaining the high-frequency and low-frequency components respectively. The coefficient matrix of the low-pass filter is expressed as follows:
3) Add the high-frequency component to each band of the source multispectral image to obtain the new fused image:

img_fus(R) = img_ms(R) + d_high
img_fus(G) = img_ms(G) + d_high
img_fus(B) = img_ms(B) + d_high   (11)
img_fus(N) = img_ms(N) + d_high

In formula (11): img_fus(R), img_fus(G), img_fus(B), img_fus(N) are respectively the red, green, blue, and near-infrared bands of the new fused image; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image; d_high is the high-frequency component.
In step 1.4), the global quality index of the new multispectral image is computed as:

Q_4 = (|σ_z1z2| / (σ_z1·σ_z2)) · (2·σ_z1·σ_z2 / (σ_z1² + σ_z2²)) · (2·|z̄_1|·|z̄_2| / (|z̄_1|² + |z̄_2|²))   (4)

z_1 = a_1 + i·b_1 + j·c_1 + k·d_1
z_2 = a_2 + i·b_2 + j·c_2 + k·d_2

In formula (4): Q_4 is the global quality index of the new multispectral image; σ_z1z2 is the covariance between the quaternion representation of the source multispectral image and that of the new multispectral image; σ_z1² is the variance of the quaternion representation of the source multispectral image; σ_z2² is the variance of the quaternion representation of the new multispectral image; z_1 is the quaternion representation of the source multispectral image; z_2 is the quaternion representation of the new multispectral image, and z̄_1, z̄_2 are their means; a, b, c, d are real numbers; i, j, k are imaginary units with i² = j² = k² = ijk = −1.
The fitness function is expressed as:

fitness = −Q_4(img_wight)   (5)

In formula (5): fitness is the fitness function; Q_4 is the global quality index of the new multispectral image; img_wight is the new multispectral image.
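A compact Python/NumPy sketch of Q_4 and the fitness of formula (5) follows; the quaternion algebra is written out by hand, whole-image (global) statistics are assumed here, and all function names are illustrative:

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of quaternion arrays, components in the last axis."""
    a, b, c, d = np.moveaxis(p, -1, 0)
    e, f, g, h = np.moveaxis(q, -1, 0)
    return np.stack([a*e - b*f - c*g - d*h,
                     a*f + b*e + c*h - d*g,
                     a*g - b*h + c*e + d*f,
                     a*h + b*g - c*f + d*e], axis=-1)

def qconj(q):
    """Quaternion conjugate: negate the three imaginary components."""
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def q4_index(ms_src, ms_new):
    """Global Q4 of formula (4): each pixel's four bands form one quaternion
    z = a + i*b + j*c + k*d; combines hypercomplex correlation, contrast
    similarity, and mean similarity (ideal value 1)."""
    z1 = ms_src.reshape(-1, 4)
    z2 = ms_new.reshape(-1, 4)
    m1, m2 = z1.mean(axis=0), z2.mean(axis=0)
    zc1, zc2 = z1 - m1, z2 - m2
    var1 = (zc1 ** 2).sum(axis=1).mean()         # sigma_z1^2
    var2 = (zc2 ** 2).sum(axis=1).mean()         # sigma_z2^2
    cov = qmul(zc1, qconj(zc2)).mean(axis=0)     # quaternion covariance
    cov_mod = np.sqrt((cov ** 2).sum())          # |sigma_z1z2|
    m1_mod = np.sqrt((m1 ** 2).sum())
    m2_mod = np.sqrt((m2 ** 2).sum())
    # Compact product form of the three factors in formula (4):
    return (4.0 * cov_mod * m1_mod * m2_mod) / ((var1 + var2) * (m1_mod**2 + m2_mod**2))

def fitness(ms_src, img_wight):
    return -q4_index(ms_src, img_wight)          # formula (5)
```

Note that in the evaluation section Q_4 is averaged over 32×32 blocks; the global computation above matches its use as a fitness function in step 1.4).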
In step 1.4), the concrete steps of the particle swarm algorithm comprise:
1.4.1) Set the acceleration constants c_1 and c_2 and the maximum iteration count T_max. Let the current iteration count t = 1. In the definition space, randomly generate the position vectors po_1, po_2, ..., po_m of m particles, where the position vector of each particle is a 4-dimensional vector:

po_i = (w_i,1, w_i,2, w_i,3, w_i,4), i = 1, 2, ..., m

The position vectors of the m particles form the initial population Po(t). At the same time, randomly generate the velocity vectors v_1, v_2, ..., v_m of the m particles; the velocity vectors form the velocity matrix V(t).
1.4.2) According to steps 1.1)-1.3), the position vector of each particle yields a corresponding new multispectral image; substitute each new multispectral image into formula (5), thereby computing the fitness value of each particle at its position vector in the initial population Po(t).
1.4.3) Compare the fitness values of the particles at their position vectors in the initial population Po(t), select the optimal fitness value, take the corresponding position vector as the optimal position vector opt_m, and record the position of that particle as the current position in the search space.
1.4.4) Update the position vector of each of the m particles according to its velocity vector, producing the new population Po(t+1). The update formulas are:

v_d^(k+1) = v_d^k + c_1·r_1·(p_d^k − po_d^k) + c_2·r_2·(opt_m^k − po_d^k)   (6)
po_d^(k+1) = po_d^k + v_d^(k+1)   (7)

In formulas (6)-(7): po_d^(k+1) is the updated position vector; po_d^k is the position vector of the d-th particle at the k-th iteration; v_d^(k+1) is the updated velocity vector; v_d^k is the velocity vector of the d-th particle at the k-th iteration; c_1 and c_2 are the acceleration constants; r_1 and r_2 are random numbers between 0 and 1; p_d^k is the optimal position vector of the d-th particle at the k-th iteration; opt_m^k is the optimal position vector of the population at the k-th iteration.
1.4.5) Check the termination condition; if it is satisfied, end the search. Otherwise set t = t + 1 and return to step 1.4.2), until the maximum iteration count T_max is reached or the assessed value is smaller than the given precision ε.
1.4.6) According to the optimal fitness value, output the optimal position vector opt = (w_opt,1, w_opt,2, w_opt,3, w_opt,4), thereby obtaining the optimal weights.
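The following Python/NumPy sketch mirrors steps 1.4.1)-1.4.6) under the experimental settings given later (m = 10 particles, c_1 = c_2 = 1, T_max = 50); the clipping that keeps the weights inside (0, 1) and the omission of the ε early stop are simplifications of this sketch, not details from the patent:

```python
import numpy as np

def pso_optimal_weights(fitness_fn, m=10, c1=1.0, c2=1.0, t_max=50, seed=None):
    """Particle swarm search for the four band weights (steps 1.4.1-1.4.6).
    fitness_fn maps a length-4 weight vector in (0,1)^4 to the fitness of
    formula (5); fitness = -Q4, so smaller is better."""
    rng = np.random.default_rng(seed)
    po = rng.uniform(0.0, 1.0, size=(m, 4))   # step 1.4.1: random positions
    v = np.zeros((m, 4))                      # and zero initial velocities
    pbest = po.copy()                         # per-particle best positions
    pbest_fit = np.array([fitness_fn(p) for p in po])   # step 1.4.2
    best = int(np.argmin(pbest_fit))                    # step 1.4.3
    g, g_fit = pbest[best].copy(), pbest_fit[best]      # global best opt_m
    for t in range(t_max):                              # steps 1.4.4-1.4.5
        r1, r2 = rng.random((m, 1)), rng.random((m, 1))
        v = v + c1 * r1 * (pbest - po) + c2 * r2 * (g - po)   # formula (6)
        po = np.clip(po + v, 1e-6, 1.0 - 1e-6)                # formula (7)
        fit = np.array([fitness_fn(p) for p in po])
        improved = fit < pbest_fit
        pbest[improved] = po[improved]
        pbest_fit[improved] = fit[improved]
        best = int(np.argmin(pbest_fit))
        if pbest_fit[best] < g_fit:
            g, g_fit = pbest[best].copy(), pbest_fit[best]
    return g                                            # step 1.4.6: w_opt
```

Combined with the earlier sketches, the optimal weights could then be obtained as, e.g., w_opt = pso_optimal_weights(lambda w: fitness(img_ms, gihs_inject(img_ms, img_pan, w)[2])).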
The advantages of the present invention can be illustrated by the following simulation experiments.
All experimental results of the present invention were obtained under the following environment: Windows 7 Professional 64-bit, dual-core E5-1603 2.8 GHz CPU, 8 GB RAM, MATLAB R2012a.
Experiment 1: simulation of the remote sensing image fusion method based on AGIHS and a low-pass filter.
This experiment tests the performance of the present invention on remote sensing image fusion. The remote sensing images used all come from the QuickBird satellite; QuickBird imagery comprises multispectral and panchromatic images. The multispectral image comprises the near-infrared, red, green, and blue bands with a spatial resolution of 2.4 m; the panchromatic image has a spatial resolution of 0.6 m. In this experiment, the acceleration constants c_1 and c_2 of the particle swarm algorithm are both set to 1, the maximum iteration count T_max is 50, and 10 particles are generated at random.
To check the validity of the experimental results, several groups of fusion evaluation indices were adopted: the correlation coefficient (Correlation Coefficient, CC), spectral distortion (Spectral Distortion, SD), relative average spectral error (Relative Average Spectral Error, RASE), the global dimensionless relative synthesis error (erreur relative globale adimensionnelle de synthèse, ERGAS), the average quality index Q_avg (computed in moving windows of 16×16, 32×32, 64×64, and 128×128 pixels respectively), and the global quality index Q_4.
a) Correlation coefficient (Correlation Coefficient, CC). The correlation coefficient measures the relation between the source multispectral image and the fused multispectral image:

CC = Σ_{i,j} (Fus(B_n)(i,j) − mean(Fus(B_n))) · (B_n(i,j) − mean(B_n)) / sqrt( Σ_{i,j} (Fus(B_n)(i,j) − mean(Fus(B_n)))² · Σ_{i,j} (B_n(i,j) − mean(B_n))² )   (12)

In formula (12): mean(Fus(B_n)) and mean(B_n) denote respectively the mean of each band of the fused multispectral image and the mean of each band of the source multispectral image. The optimal value of the correlation coefficient is 1; the closer its value is to 1, the smaller the spectral loss of the fused multispectral image.
b) Spectral distortion (Spectral Distortion, SD). Spectral distortion describes the difference between the source multispectral image and the fused multispectral image:

SD = (1 / (M·N)) · Σ_{i,j} |Fus(B_n)(i,j) − B_n(i,j)|   (13)

In formula (13): B_n denotes each band of the source multispectral image, n = 1, 2, 3, 4; Fus(B_n) denotes the fused multispectral image; each band of the multispectral image has size M×N. A smaller SD value indicates better spectral properties of the fused multispectral image.
c) Relative average spectral error (Relative Average Spectral Error, RASE) and the global dimensionless relative synthesis error (erreur relative globale adimensionnelle de synthèse, ERGAS). The smaller these two indices, the better the spectral properties of the fused multispectral image:

RASE = (100 / rad) · sqrt( (1/4) · Σ_{n=1}^{4} RMSE²[Fus(B_n), B_n] )   (14)

In formula (14): rad is the average radiance of the bands B_n of the source multispectral image, and RMSE is obtained from:

RMSE²[Fus(B_n), B_n] = bias²[Fus(B_n), B_n] + STD²[Fus(B_n), B_n]   (15)

The index ERGAS describes the relative global spectral error:

ERGAS = 100 · (high / low) · sqrt( (1/4) · Σ_{n=1}^{4} RMSE²[Fus(B_n), B_n] / rad_n² )   (16)

In formula (16): high is the resolution of the panchromatic image with high spatial resolution, and low is the resolution of the source multispectral image.
d) Average quality index Q_avg. The average quality index Q_avg is built on the universal quality index Q, which represents the combined loss of correlation, luminance distortion, and contrast distortion between two images. Q_avg is the mean of the Q values computed in moving windows of 16×16, 32×32, 64×64, and 128×128 pixels. The closer Q_avg is to 1, the better the spectral properties of the fused image:

Q = (σ_uv / (σ_u·σ_v)) · (2·ū·v̄ / (ū² + v̄²)) · (2·σ_u·σ_v / (σ_u² + σ_v²))   (17)

In formula (17): u denotes Fus(B_n) and v denotes B_n, where n = 1, 2, 3, 4 indexes the bands of the multispectral image; ū and v̄ are the means of u and v; σ_u² and σ_v² are the variances of u and v; σ_uv is the covariance between u and v.
e) Q_4. Q_4 is a quality index for spectral images and a global index of spectral properties. Q_4 is suitable for multispectral images with four spectral subbands; a multispectral image is represented in quaternion (hypercomplex) theory as z = a + i·b + j·c + k·d, where a, b, c, d are real numbers and i, j, k are imaginary units with i² = j² = k² = ijk = −1. Let z_1 = a_1 + i·b_1 + j·c_1 + k·d_1 and z_2 = a_2 + i·b_2 + j·c_2 + k·d_2 represent the source multispectral image and the fused multispectral image respectively. The index Q_4 is defined as:

Q_4 = (|σ_z1z2| / (σ_z1·σ_z2)) · (2·σ_z1·σ_z2 / (σ_z1² + σ_z2²)) · (2·|z̄_1|·|z̄_2| / (|z̄_1|² + |z̄_2|²))   (18)

In formula (18): the computation of Q_4 consists of three parts: the first part is the correlation coefficient of z_1 and z_2, the second part describes the similarity of the standard deviations of the two images, and the third part expresses the similarity of the means of the two images. The ideal value of Q_4 is 1. When measuring the quality of a fused image with Q_4, the Q_4 values are computed on sub-blocks of size N×N (N = 32) and the final Q_4 value is their average. Therefore, the criterion for evaluating multispectral image fusion performance is: for the same group of fusion experiments, if the bias, standard deviation, spectral distortion, RASE, and ERGAS of the image obtained by a fusion method are smaller, SAM tends toward zero, and at the same time CC, Q_avg, and Q_4 are closer to 1, then the spectral properties of that method's fusion result are relatively better.
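For reference, a Python/NumPy sketch of the scalar spectral indices follows, assuming (H, W, 4) arrays; the per-band averaging in cc and the use of the mean band radiance for rad are assumptions where the text leaves the aggregation unstated:

```python
import numpy as np

def cc(fus, src):
    """Correlation coefficient (formula (12)), computed per band and averaged; ideal 1."""
    vals = []
    for n in range(4):
        f, b = fus[..., n].ravel(), src[..., n].ravel()
        fc, bc = f - f.mean(), b - b.mean()
        vals.append((fc * bc).sum() / np.sqrt((fc ** 2).sum() * (bc ** 2).sum()))
    return float(np.mean(vals))

def sd(fus, src):
    """Spectral distortion (formula (13)): mean absolute band difference; smaller is better."""
    return float(np.mean(np.abs(fus - src)))

def rase_ergas(fus, src, ratio):
    """RASE (formula (14)) and ERGAS (formula (16)); `ratio` is high/low
    resolution, e.g. 0.6 m / 2.4 m = 1/4 for QuickBird."""
    rmse2 = np.array([np.mean((fus[..., n] - src[..., n]) ** 2) for n in range(4)])
    means = np.array([src[..., n].mean() for n in range(4)])
    rase = 100.0 / means.mean() * np.sqrt(rmse2.mean())
    ergas = 100.0 * ratio * np.sqrt(np.mean(rmse2 / means ** 2))
    return float(rase), float(ergas)
```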
Table 1
Fig. 2 shows the fusion results of different methods on remote sensing images. Figs. 2a and 2b are respectively the source multispectral image and the panchromatic image; the region shown is the Sundarbans National Park, India. The test image contains a large amount of linear information, mostly roads and tracks, plus some buildings. Fig. 2c is the result obtained with the traditional generalized IHS transform (GIHS); information such as roads is fairly clear, but the green areas suffer spectral loss and differ noticeably from the green areas of the source image. Fig. 2d is the result obtained with the adaptive generalized IHS transform (AGIHS); visually it differs little from Fig. 2c, but on several groups of fusion evaluation indices its spectral preservation is better than the result of Fig. 2c. Fig. 2e is the result obtained with the spectral analysis method (Spectral Analysis, SA), which computes the luminance component from the ratio of each of the red, green, blue, and near-infrared bands of the source multispectral image to the whole multispectral image and obtains the fusion result in combination with the panchromatic image; spectral loss still occurs in its green areas. Fig. 2f is the result obtained with the tradeoff parameter method (Tradeoff Parameters, TP), where the tradeoff parameter seeks a balance between increased spatial resolution and spectral loss; as the figure shows, the spectral loss in the green areas is reduced compared with the preceding methods. Fig. 2g is the fusion result of the image fusion method of the present invention; compared with the other methods, the spectrum of the present result is closest to the source multispectral image, and details are better preserved. Table 1 gives the evaluation indices of the results of the several fusion methods; bold numbers are the relatively optimal values.
Table 2
Fig. 3 likewise shows the fusion results of different methods on remote sensing images. Figs. 3a and 3b are respectively the source multispectral image and the panchromatic image; the region shown is the Sundarbans National Park, India. The test image mainly contains a bridge together with riverbank buildings and road information. As in Fig. 2, Fig. 3c is the result obtained with GIHS, in which the lines are fairly clear but the spectrum differs from the source image; Fig. 3d is the result obtained with AGIHS, visually little different from Fig. 3c; Fig. 3e is the result obtained with SA, whose spectral result is better than those of Figs. 3c and 3d; Fig. 3f is the result obtained with TP, which is better than Figs. 3c-3e in both spectrum and detail; Fig. 3g is the fusion result of the image fusion method of the present invention, from which the same conclusion as for Fig. 2 can be drawn: the spectrum of the present result is closest to the source multispectral image, and details are better preserved. Table 2 gives the fusion evaluation indices of the results shown in Fig. 3; bold numbers are the relatively optimal values.
The remote sensing images used in the fusion results of Fig. 4 come from the same remote sensing satellite as those in Figs. 2 and 3, and likewise depict the Sundarbans National Park, India. Figs. 4a and 4b are respectively the source multispectral image and the panchromatic image; the test region is dominated by dense buildings, with some trees and roads. As in Figs. 2 and 3, Fig. 4c is the result obtained with GIHS; Fig. 4d is the result obtained with AGIHS; Fig. 4e is the result obtained with SA; Fig. 4f is the result obtained with TP; Fig. 4g is the fusion result of the image fusion method of the present invention. Table 3 gives the fusion evaluation indices of the results shown in Fig. 4; bold numbers are the relatively optimal values.
Table 3

Claims (3)

1. A remote sensing image fusion method based on AGIHS and a low-pass filter, characterized in that the method is realized by the following steps:
1) applying an adaptive generalized IHS transform to the source multispectral image to obtain the optimal luminance component, wherein the concrete steps of the adaptive generalized IHS transform comprise:
1.1) computing the weighted sum of the red, green, blue, and near-infrared bands of the source multispectral image to obtain the luminance component, the summing formula being:

I_wight = w_1·img_ms(R) + w_2·img_ms(G) + w_3·img_ms(B) + w_4·img_ms(N)   (1)

in formula (1): I_wight is the luminance component; w_1, w_2, w_3, w_4 are weights, each in the range (0, 1); img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image;
1.2) computing the difference between the luminance component and the source panchromatic image:

d_wight = img_pan − I_wight   (2)

in formula (2): d_wight is the difference between the luminance component and the source panchromatic image; img_pan is the source panchromatic image; I_wight is the luminance component;
1.3) adding the difference between the luminance component and the source panchromatic image to each band of the source multispectral image to obtain a new multispectral image:

img_wight(R) = img_ms(R) + d_wight
img_wight(G) = img_ms(G) + d_wight
img_wight(B) = img_ms(B) + d_wight   (3)
img_wight(N) = img_ms(N) + d_wight

in formula (3): img_wight(R), img_wight(G), img_wight(B), img_wight(N) are respectively the red, green, blue, and near-infrared bands of the new multispectral image; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image; d_wight is the difference between the luminance component and the source panchromatic image;
1.4) computing the global quality index of the new multispectral image, taking the global quality index as the fitness function, and then using a particle swarm algorithm to find the optimal fitness value of the fitness function, thereby obtaining the optimal weights;
1.5) substituting the optimal weights into formula (1) to obtain the optimal luminance component:

I_opt = w_opt,1·img_ms(R) + w_opt,2·img_ms(G) + w_opt,3·img_ms(B) + w_opt,4·img_ms(N)   (8)

in formula (8): I_opt is the optimal luminance component; w_opt,1, w_opt,2, w_opt,3, w_opt,4 are the optimal weights; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image;
2) using a low-pass filter to obtain the high-frequency component, the concrete steps comprising:
2.1) performing gray-level histogram matching between the source panchromatic image and the optimal luminance component to obtain the matched panchromatic image;
2.2) computing the difference between the matched panchromatic image and the optimal luminance component:

d_opt = img_pan_matched − I_opt   (9)

in formula (9): d_opt is the difference between the matched panchromatic image and the optimal luminance component; img_pan_matched is the matched panchromatic image; I_opt is the optimal luminance component;
2.3) using the low-pass filter to filter the difference between the matched panchromatic image and the optimal luminance component, thereby obtaining the high-frequency and low-frequency components respectively, the coefficient matrix of the low-pass filter being expressed as follows:
3) adding the high-frequency component to each band of the source multispectral image to obtain the new fused image:

img_fus(R) = img_ms(R) + d_high
img_fus(G) = img_ms(G) + d_high
img_fus(B) = img_ms(B) + d_high   (11)
img_fus(N) = img_ms(N) + d_high

in formula (11): img_fus(R), img_fus(G), img_fus(B), img_fus(N) are respectively the red, green, blue, and near-infrared bands of the new fused image; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image; d_high is the high-frequency component.
2. The remote sensing image fusion method based on AGIHS and a low-pass filter according to claim 1, characterized in that in step 1.4) the global quality index of the new multispectral image is computed as:

Q_4 = (|σ_z1z2| / (σ_z1·σ_z2)) · (2·σ_z1·σ_z2 / (σ_z1² + σ_z2²)) · (2·|z̄_1|·|z̄_2| / (|z̄_1|² + |z̄_2|²))   (4)

z_1 = a_1 + i·b_1 + j·c_1 + k·d_1
z_2 = a_2 + i·b_2 + j·c_2 + k·d_2

in formula (4): Q_4 is the global quality index of the new multispectral image; σ_z1z2 is the covariance between the quaternion representation of the source multispectral image and that of the new multispectral image; σ_z1² and σ_z2² are the variances of the quaternion representations of the source and the new multispectral images; z_1 and z_2 are the quaternion representations of the source and the new multispectral images, and z̄_1, z̄_2 their means; a, b, c, d are real numbers; i, j, k are imaginary units with i² = j² = k² = ijk = −1;
the fitness function is expressed as:

fitness = −Q_4(img_wight)   (5)

in formula (5): fitness is the fitness function; Q_4 is the global quality index of the new multispectral image; img_wight is the new multispectral image.
3. The remote sensing image fusion method based on AGIHS and a low-pass filter according to claim 1 or 2, characterized in that in step 1.4) the concrete steps of the particle swarm algorithm comprise:
1.4.1) setting the acceleration constants c_1 and c_2 and the maximum iteration count T_max; letting the current iteration count t = 1; randomly generating, in the definition space, the position vectors po_1, po_2, ..., po_m of m particles, the position vector of each particle being a 4-dimensional vector:

po_i = (w_i,1, w_i,2, w_i,3, w_i,4), i = 1, 2, ..., m

the position vectors of the m particles forming the initial population Po(t); at the same time, randomly generating the velocity vectors v_1, v_2, ..., v_m of the m particles, the velocity vectors forming the velocity matrix V(t);
1.4.2) according to steps 1.1)-1.3), obtaining from the position vector of each particle a corresponding new multispectral image, substituting each new multispectral image into formula (5), and thereby computing the fitness value of each particle at its position vector in the initial population Po(t);
1.4.3) comparing the fitness values of the particles at their position vectors in the initial population Po(t), selecting the optimal fitness value, taking the corresponding position vector as the optimal position vector opt_m, and recording the position of the corresponding particle as the current position in the search space;
1.4.4) updating the position vector of each of the m particles according to its velocity vector, thereby producing the new population Po(t+1), the update formulas being:

v_d^(k+1) = v_d^k + c_1·r_1·(p_d^k − po_d^k) + c_2·r_2·(opt_m^k − po_d^k)   (6)
po_d^(k+1) = po_d^k + v_d^(k+1)   (7)

in formulas (6)-(7): po_d^(k+1) is the updated position vector; po_d^k is the position vector of the d-th particle at the k-th iteration; v_d^(k+1) is the updated velocity vector; v_d^k is the velocity vector of the d-th particle at the k-th iteration; c_1 and c_2 are the acceleration constants; r_1 and r_2 are random numbers between 0 and 1; p_d^k is the optimal position vector of the d-th particle at the k-th iteration; opt_m^k is the optimal position vector of the population at the k-th iteration;
1.4.5) checking the termination condition and, if it is satisfied, ending the search; otherwise setting t = t + 1 and returning to step 1.4.2), until the maximum iteration count T_max is reached or the assessed value is smaller than the given precision ε;
1.4.6) outputting, according to the optimal fitness value, the optimal position vector opt = (w_opt,1, w_opt,2, w_opt,3, w_opt,4), thereby obtaining the optimal weights.
CN201510433681.5A 2015-07-22 2015-07-22 Remote sensing image fusion method based on AGIHS and low pass filter Expired - Fee Related CN105023261B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510433681.5A CN105023261B (en) 2015-07-22 2015-07-22 Remote sensing image fusion method based on AGIHS and low pass filter

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510433681.5A CN105023261B (en) 2015-07-22 2015-07-22 Remote sensing image fusion method based on AGIHS and low pass filter

Publications (2)

Publication Number Publication Date
CN105023261A true CN105023261A (en) 2015-11-04
CN105023261B CN105023261B (en) 2017-08-04

Family

ID=54413203

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510433681.5A Expired - Fee Related CN105023261B (en) 2015-07-22 2015-07-22 Remote sensing image fusion method based on AGIHS and low pass filter

Country Status (1)

Country Link
CN (1) CN105023261B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1489111A (en) * 2003-08-21 2004-04-14 上海交通大学 Remote-sensing image mixing method based on local statistical property and colour space transformation
CN103065282A (en) * 2012-12-27 2013-04-24 浙江大学 Image fusion method based on sparse linear system
CN103065293A (en) * 2012-12-31 2013-04-24 中国科学院东北地理与农业生态研究所 Correlation weighted remote-sensing image fusion method and fusion effect evaluation method thereof
CN103198463A (en) * 2013-04-07 2013-07-10 北京航空航天大学 Spectrum image panchromatic sharpening method based on fusion of whole structure and space detail information
CN104156911A (en) * 2014-07-18 2014-11-19 苏州阔地网络科技有限公司 Processing method and system for image fusion

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
SHAOHUI CHEN et al.: "SAR and Multispectral Image Fusion Using Generalized IHS Transform Based on à Trous Wavelet and EMD Decompositions", IEEE Sensors Journal *
LIU Fan (刘帆) et al.: "Remote sensing image fusion method combining the optimal luminance component" (in Chinese), Journal of Xidian University (Natural Science Edition) *
LIU Bin (刘斌) et al.: "Multispectral image fusion method based on Red-Black wavelet transform" (in Chinese), Chinese Journal of Scientific Instrument *
WANG Wantong (王万同), LI Rui (李锐): "Adaptive fusion algorithm for high-resolution remote sensing images based on IHS" (in Chinese), Computer Engineering and Applications *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109949270A (en) * 2019-01-28 2019-06-28 西北工业大学 Multispectral and full-colour image based on region convolutional network merges space quality evaluation method
CN113362425A (en) * 2021-06-18 2021-09-07 中科三清科技有限公司 Image fusion method and device, electronic equipment and storage medium
CN117078563A (en) * 2023-10-16 2023-11-17 武汉大学 Full-color sharpening method and system for hyperspectral image of first satellite of staring star
CN117078563B (en) * 2023-10-16 2024-02-02 武汉大学 Full-color sharpening method and system for hyperspectral image of first satellite of staring star

Also Published As

Publication number Publication date
CN105023261B (en) 2017-08-04

Similar Documents

Publication Publication Date Title
US20210201452A1 (en) Image dehazing and restoration
CN108682026B (en) Binocular vision stereo matching method based on multi-matching element fusion
CN108256419B (en) A method of port and pier image is extracted using multispectral interpretation
US9686527B2 (en) Non-feature extraction-based dense SFM three-dimensional reconstruction method
KR101918007B1 (en) Method and apparatus for data fusion of polarimetric synthetic aperature radar image and panchromatic image
JP2017528685A (en) Estimation of vehicle position
CN103735269B (en) A kind of height measurement method followed the tracks of based on video multi-target
Lu et al. Underwater image descattering and quality assessment
KR102170260B1 (en) Apparatus and method for fusing synthetic aperture radar image and multispectral image, method for detecting change using it
WO2018076138A1 (en) Target detection method and apparatus based on large-scale high-resolution hyper-spectral image
CN107194936B (en) Hyperspectral image target detection method based on superpixel combined sparse representation
KR101928391B1 (en) Method and apparatus for data fusion of multi spectral image and radar image
CN105023261A (en) Remote sensing image fusion method based on AGIHS and low-pass filter
CN110310246A (en) A kind of cane -growing region remote sensing information extracting method based on three-line imagery
CN109615637A (en) A kind of improved remote sensing image Hybrid Techniques
Liu et al. Farmland aerial images fast-stitching method and application based on improved SIFT algorithm
Al-Wassai et al. Multisensor images fusion based on feature-level
CN113327271B (en) Decision-level target tracking method and system based on double-optical twin network and storage medium
CN113222871B (en) Multi-view satellite image digital surface model fusion method
Bouhennache et al. Extraction of urban land features from TM Landsat image using the land features index and Tasseled cap transformation
CN109946670A (en) A kind of polarization radar information extracting method of optical image driving
CN111563866B (en) Multisource remote sensing image fusion method
US7433540B1 (en) Decomposing natural image sequences
KR102160687B1 (en) Aviation image fusion method
CN116309139A (en) Demosaicing method suitable for one-to-many infrared multispectral image

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170804

Termination date: 20190722