CN105023261B - Remote sensing image fusion method based on AGIHS and low pass filter - Google Patents

Remote sensing image fusion method based on AGIHS and low pass filter

Info

Publication number
CN105023261B
CN105023261B CN201510433681.5A
Authority
CN
China
Prior art keywords
img
image
formula
multispectral image
source
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201510433681.5A
Other languages
Chinese (zh)
Other versions
CN105023261A (en)
Inventor
刘帆 (Liu Fan)
陈宏涛 (Chen Hongtao)
柴晶 (Chai Jing)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taiyuan University of Technology
Original Assignee
Taiyuan University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taiyuan University of Technology filed Critical Taiyuan University of Technology
Priority to CN201510433681.5A priority Critical patent/CN105023261B/en
Publication of CN105023261A publication Critical patent/CN105023261A/en
Application granted granted Critical
Publication of CN105023261B publication Critical patent/CN105023261B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to remote sensing image fusion, and specifically to a remote sensing image fusion method based on AGIHS (adaptive generalized IHS) and a low pass filter. The invention addresses the problems that existing remote sensing image fusion methods readily produce spectral losses and readily blur the detail components. The remote sensing image fusion method based on AGIHS and a low pass filter is realized by the following steps: 1) an adaptive generalized IHS transform is applied to the source multispectral image to obtain the optimal luminance component; 2) the high frequency component is obtained using a low pass filter; 3) the high frequency component is added to each band of the source multispectral image to obtain the new fused image. The present invention is suitable for the fusion of remote sensing satellite images, radar images, general natural images, and medical images.

Description

Remote sensing image fusion method based on AGIHS and low pass filter
Technical field
The present invention relates to remote sensing image fusion, and specifically to a remote sensing image fusion method based on AGIHS and a low pass filter.
Background technology
As an important branch of multi-source data fusion, remote sensing image fusion has become a research hotspot in the image engineering field in recent years. Remote sensing image fusion refers to techniques that integrate images from separate sources in order to extract and synthesize information from several source images, so as to obtain a more accurate, comprehensive, and reliable description of the same scene or target. Remote sensing image fusion can improve the reliability and degree of automation of target recognition. Remote sensing images can be divided into two types: multispectral images (Multi-Spectral, MS) and panchromatic images (Panchromatic, PAN). A multispectral image has spectral characteristics and comprises four bands: red, green, blue, and near-infrared. A panchromatic image has a higher spatial resolution but no spectral characteristics. Therefore, in order to better express the regional characteristics of a target, fusing a multispectral image with a panchromatic image to obtain a single image that has both spectral characteristics and higher spatial resolution is a cost-effective approach. Fused images are commonly applied to terrain classification, mapping, spectral analysis, and other remote sensing data applications.
Existing remote sensing image fusion methods are broadly divided into two types. The first type comprises traditional remote sensing image fusion methods, which fall into three categories: component substitution methods, spectral analysis methods, and the improved spatial resolution framework (Amélioration de la Résolution Spatiale par Injection de Structures, ARSIS). Component substitution methods fall into four categories: fusion methods based on the intensity-hue-saturation (Intensity-Hue-Saturation, IHS) transform, fusion methods based on principal component analysis (Principal Component Analysis, PCA), fusion methods based on Gram-Schmidt orthogonalization, and fusion methods based on the generalized IHS transform (Generalized IHS, GIHS). Spectral analysis methods mainly refer to fusion methods based on the Brovey transform, in which each new spectral subband is derived from the ratio of the corresponding source multispectral subband to the whole multispectral image. The improved spatial resolution framework is also known as multiscale modelling; common multiscale tools include wavelets, curvelets, contourlets, the support value transform, and so on. All of the above methods are simple and easy to apply, but they share a common problem: while pursuing increased detail information, they readily produce spectral losses. The second type comprises improvements on the traditional methods and falls into two categories: fusion methods based on the adaptive IHS transform and fusion methods based on the nonlinear IHS transform. The basic idea of the second type is to improve how the luminance component I of the IHS transform is obtained, making it closer to the source panchromatic image so as to avoid spectral losses. The common problem of the second type is that the detail information in the image cannot be mined well, so the detail components readily become blurred. A brand-new remote sensing image fusion method is therefore needed to solve the above problems of the existing methods.
Summary of the invention
In order to solve the problems that existing remote sensing image fusion methods readily produce spectral losses and readily blur the detail components, the present invention provides a remote sensing image fusion method based on AGIHS and a low pass filter.
The present invention adopts the following technical scheme: a remote sensing image fusion method based on AGIHS and a low pass filter, realized by the following steps:
1) An adaptive generalized IHS transform is applied to the source multispectral image to obtain the optimal luminance component. The specific steps of the adaptive generalized IHS transform are as follows:
1.1) The red, green, blue, and near-infrared bands of the source multispectral image are weighted and summed to obtain the luminance component. The summation formula is:
I_wight=w1×img_ms(R)+w2×img_ms(G)+w3×img_ms(B)+w4×img_ms(N) (1);
In formula (1): I_wight is the luminance component; w1, w2, w3, w4 are weights in the range (0, 1); img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image.
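Step 1.1) amounts to a per-pixel weighted sum of the four bands. A minimal sketch of formula (1) follows; holding the bands in a Python dict keyed 'R', 'G', 'B', 'N' and using numpy are illustration assumptions, not part of the patent:

```python
import numpy as np

def luminance_component(img_ms, weights):
    """Formula (1): I_wight = w1*R + w2*G + w3*B + w4*N.

    img_ms  -- dict of the four source MS bands, each an H x W array
               (the dict layout is an assumption for this sketch)
    weights -- (w1, w2, w3, w4), each expected to lie in (0, 1)
    """
    w1, w2, w3, w4 = weights
    return (w1 * img_ms['R'] + w2 * img_ms['G']
            + w3 * img_ms['B'] + w4 * img_ms['N'])
```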
1.2) The difference between the luminance component and the source panchromatic image is calculated. The calculation formula is:
D_wight=img_pan-I_wight (2);
In formula (2): D_wight is the difference between the luminance component and the source panchromatic image; img_pan is the source panchromatic image; I_wight is the luminance component.
1.3) The difference between the luminance component and the source panchromatic image is added to each band of the source multispectral image to obtain a new multispectral image. The formula is:
img_wight(X)=img_ms(X)+D_wight, X∈{R,G,B,N} (3);
In formula (3): img_wight(R), img_wight(G), img_wight(B), img_wight(N) are respectively the red, green, blue, and near-infrared bands of the new multispectral image; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image; D_wight is the difference between the luminance component and the source panchromatic image.
1.4) The global quality index of the new multispectral image is calculated and used as the fitness function; then the optimal fitness value of the fitness function is found using the particle swarm algorithm, thereby obtaining the optimal weights.
1.5) The optimal weights are substituted into formula (1) to obtain the optimal luminance component. The formula is:
I_opt=w_opt,1×img_ms(R)+w_opt,2×img_ms(G)+w_opt,3×img_ms(B)+w_opt,4×img_ms(N) (8);
In formula (8): I_opt is the optimal luminance component; w_opt,1, w_opt,2, w_opt,3, w_opt,4 are the optimal weights; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image.
2) The high frequency component is obtained using a low pass filter. The specific steps are:
2.1) Grey-level histogram matching is performed between the source panchromatic image and the optimal luminance component, thereby obtaining the matched panchromatic image.
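Step 2.1) is ordinary grey-level histogram matching. The CDF-mapping sketch below is one standard way to do it, not necessarily the exact routine the patent uses; numpy and the function name are assumptions:

```python
import numpy as np

def match_histogram(source, reference):
    """Map the grey levels of `source` so its histogram matches `reference`.

    Standard CDF-based matching: each source grey level is sent to the
    reference grey level with the closest cumulative frequency.
    """
    src_flat = source.ravel()
    src_vals, src_idx, src_counts = np.unique(
        src_flat, return_inverse=True, return_counts=True)
    ref_vals, ref_counts = np.unique(reference.ravel(), return_counts=True)
    src_cdf = np.cumsum(src_counts) / src_flat.size
    ref_cdf = np.cumsum(ref_counts) / reference.size
    # Interpolate source CDF positions into the reference grey-level range.
    mapped = np.interp(src_cdf, ref_cdf, ref_vals)
    return mapped[src_idx].reshape(source.shape)
```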
2.2) The difference between the matched panchromatic image and the optimal luminance component is calculated. The calculation formula is:
D_opt=img_pan_matched-I_opt (9);
In formula (9): D_opt is the difference between the matched panchromatic image and the optimal luminance component; img_pan_matched is the matched panchromatic image; I_opt is the optimal luminance component.
2.3) The difference between the matched panchromatic image and the optimal luminance component is filtered using a low pass filter, thereby obtaining the high frequency component and the low frequency component respectively. The coefficient matrix of the low pass filter is expressed as formula (10).
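Step 2.3) can be sketched as a convolution that yields the low-frequency part, followed by the subtraction D_high = D_opt − D_low. The 3×3 averaging kernel below is only a stand-in assumption, since the patent's formula (10) coefficient matrix is not reproduced in this text:

```python
import numpy as np

def split_frequencies(diff, kernel=None):
    """Split D_opt into low- and high-frequency components.

    The default kernel is a 3x3 box filter -- an assumption standing in
    for the patent's formula (10) matrix.
    """
    if kernel is None:
        kernel = np.full((3, 3), 1.0 / 9.0)
    pad = kernel.shape[0] // 2
    padded = np.pad(diff, pad, mode='edge')
    low = np.zeros_like(diff, dtype=float)
    kh, kw = kernel.shape
    for i in range(diff.shape[0]):
        for j in range(diff.shape[1]):
            low[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    high = diff - low  # D_high = D_opt - D_low
    return low, high
```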
3) The high frequency component is added to each band of the source multispectral image to obtain the new fused image. The formula is:
img_fus(X)=img_ms(X)+D_high, X∈{R,G,B,N} (11);
In formula (11): img_fus(R), img_fus(G), img_fus(B), img_fus(N) are respectively the red, green, blue, and near-infrared bands of the new fused image; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image; D_high is the high frequency component.
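Step 3) injects the same high-frequency component into every band of the source multispectral image. A sketch of formula (11), with the bands held in a Python dict (an illustration assumption):

```python
import numpy as np

def inject_high_frequency(img_ms, d_high):
    """Formula (11): img_fus(X) = img_ms(X) + D_high for each MS band X."""
    return {band: img_ms[band] + d_high for band in ('R', 'G', 'B', 'N')}
```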
Compared with existing remote sensing image fusion methods, the method of the present invention based on AGIHS and a low pass filter has the following advantages. First, compared with the first type of image fusion methods, the method of the present invention uses the adaptive generalized IHS transform to avoid spectral losses while pursuing increased detail information, thereby effectively overcoming the problem that the first type of methods readily produce spectral losses. Second, compared with the second type of image fusion methods, the method of the present invention uses a low pass filter to mine the detail information in the image well, thereby effectively overcoming the problem that the second type of methods readily blur the detail components. Test experiments show that, for the fusion of remote sensing images, the four fusion indexes of the results of the method of the present invention are better than the corresponding indexes of the results of the fusion method based on the generalized IHS transform, the fusion method based on principal component analysis, the spectral analysis method, and the tradeoff parameter method, as shown in Fig. 2a-2g, Fig. 3a-3g, and Fig. 4a-4g.
The present invention effectively solves the problems that existing remote sensing image fusion methods readily produce spectral losses and readily blur the detail components, and is suitable for the fusion of remote sensing satellite images, radar images, general natural images, and medical images.
Brief description of the drawings
Fig. 1 is the implementation process figure of the present invention.
Fig. 2a-2g are schematic diagrams comparing the fusion results obtained for remote sensing images using different methods:
Fig. 2a is a schematic diagram of the source multispectral image to be fused;
Fig. 2b is a schematic diagram of the source panchromatic image to be fused;
Fig. 2c is a schematic diagram of the result obtained using the fusion method based on the generalized IHS transform;
Fig. 2d is a schematic diagram of the result obtained using the fusion method based on principal component analysis;
Fig. 2e is a schematic diagram of the result obtained using the spectral analysis method;
Fig. 2f is a schematic diagram of the result obtained using the tradeoff parameter method;
Fig. 2g is a schematic diagram of the result obtained using the present invention.
Fig. 3a-3g are schematic diagrams comparing the fusion results obtained for remote sensing images using different methods:
Fig. 3a is a schematic diagram of the source multispectral image to be fused;
Fig. 3b is a schematic diagram of the source panchromatic image to be fused;
Fig. 3c is a schematic diagram of the result obtained using the fusion method based on the generalized IHS transform;
Fig. 3d is a schematic diagram of the result obtained using the fusion method based on principal component analysis;
Fig. 3e is a schematic diagram of the result obtained using the spectral analysis method;
Fig. 3f is a schematic diagram of the result obtained using the tradeoff parameter method;
Fig. 3g is a schematic diagram of the result obtained using the present invention.
Fig. 4a-4g are schematic diagrams comparing the fusion results obtained for remote sensing images using different methods:
Fig. 4a is a schematic diagram of the source multispectral image to be fused;
Fig. 4b is a schematic diagram of the source panchromatic image to be fused;
Fig. 4c is a schematic diagram of the result obtained using the fusion method based on the generalized IHS transform;
Fig. 4d is a schematic diagram of the result obtained using the fusion method based on principal component analysis;
Fig. 4e is a schematic diagram of the result obtained using the spectral analysis method;
Fig. 4f is a schematic diagram of the result obtained using the tradeoff parameter method;
Fig. 4g is a schematic diagram of the result obtained using the present invention.
Embodiment
Remote sensing image fusion method based on AGIHS and a low pass filter, realized by the following steps:
1) An adaptive generalized IHS transform is applied to the source multispectral image to obtain the optimal luminance component. The specific steps of the adaptive generalized IHS transform are as follows:
1.1) The red, green, blue, and near-infrared bands of the source multispectral image are weighted and summed to obtain the luminance component. The summation formula is:
I_wight=w1×img_ms(R)+w2×img_ms(G)+w3×img_ms(B)+w4×img_ms(N) (1);
In formula (1): I_wight is the luminance component; w1, w2, w3, w4 are weights in the range (0, 1); img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image.
1.2) The difference between the luminance component and the source panchromatic image is calculated:
D_wight=img_pan-I_wight (2);
In formula (2): D_wight is the difference between the luminance component and the source panchromatic image; img_pan is the source panchromatic image; I_wight is the luminance component.
1.3) The difference between the luminance component and the source panchromatic image is added to each band of the source multispectral image to obtain a new multispectral image:
img_wight(X)=img_ms(X)+D_wight, X∈{R,G,B,N} (3);
In formula (3): img_wight(R), img_wight(G), img_wight(B), img_wight(N) are respectively the red, green, blue, and near-infrared bands of the new multispectral image; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image; D_wight is the difference between the luminance component and the source panchromatic image.
1.4) The global quality index of the new multispectral image is calculated and used as the fitness function; then the optimal fitness value of the fitness function is found using the particle swarm algorithm, thereby obtaining the optimal weights.
1.5) The optimal weights are substituted into formula (1) to obtain the optimal luminance component:
I_opt=w_opt,1×img_ms(R)+w_opt,2×img_ms(G)+w_opt,3×img_ms(B)+w_opt,4×img_ms(N) (8);
In formula (8): I_opt is the optimal luminance component; w_opt,1, w_opt,2, w_opt,3, w_opt,4 are the optimal weights; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image.
2) The high frequency component is obtained using a low pass filter. The specific steps are:
2.1) Grey-level histogram matching is performed between the source panchromatic image and the optimal luminance component, thereby obtaining the matched panchromatic image.
2.2) The difference between the matched panchromatic image and the optimal luminance component is calculated:
D_opt=img_pan_matched-I_opt (9);
In formula (9): D_opt is the difference between the matched panchromatic image and the optimal luminance component; img_pan_matched is the matched panchromatic image; I_opt is the optimal luminance component.
2.3) The difference between the matched panchromatic image and the optimal luminance component is filtered using a low pass filter, thereby obtaining the high frequency component and the low frequency component respectively; the coefficient matrix of the low pass filter is expressed as formula (10).
3) The high frequency component is added to each band of the source multispectral image to obtain the new fused image:
img_fus(X)=img_ms(X)+D_high, X∈{R,G,B,N} (11);
In formula (11): img_fus(R), img_fus(G), img_fus(B), img_fus(N) are respectively the red, green, blue, and near-infrared bands of the new fused image; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image; D_high is the high frequency component.
In step 1.4), the specific calculation formula of the global quality index of the new multispectral image is:
Q4=4×|σz1z2|×|z̄1|×|z̄2|/((σz1²+σz2²)×(|z̄1|²+|z̄2|²)) (4);
In formula (4): Q4 is the global quality index of the new multispectral image; σz1z2 is the covariance between the quaternion representation of the source multispectral image and the quaternion representation of the new multispectral image; σz1² is the variance of the quaternion representation of the source multispectral image; σz2² is the variance of the quaternion representation of the new multispectral image; z1 is the quaternion representation of the source multispectral image; z2 is the quaternion representation of the new multispectral image; z̄1 and z̄2 are the means of z1 and z2; a quaternion is written z=a+ib+jc+kd, where a, b, c, d are real numbers and i, j, k are imaginary units with i²=j²=k²=ijk=-1;
The fitness function is expressed as:
fitness=-Q4(img_wight) (5);
In formula (5): fitness is the fitness function; Q4 is the global quality index of the new multispectral image; img_wight is the new multispectral image.
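The Q4-based fitness of formulas (4)-(5) can be sketched by treating each four-band pixel as the quaternion a+ib+jc+kd and computing the covariance, variances, and means described above. The (H, W, 4) array layout and the helper names are illustration assumptions:

```python
import numpy as np

def _qmul(p, q):
    """Hamilton product of quaternions stored as (..., 4) arrays."""
    a1, b1, c1, d1 = np.moveaxis(p, -1, 0)
    a2, b2, c2, d2 = np.moveaxis(q, -1, 0)
    return np.stack([
        a1 * a2 - b1 * b2 - c1 * c2 - d1 * d2,
        a1 * b2 + b1 * a2 + c1 * d2 - d1 * c2,
        a1 * c2 - b1 * d2 + c1 * a2 + d1 * b2,
        a1 * d2 + b1 * c2 - c1 * b2 + d1 * a2,
    ], axis=-1)

def _qconj(q):
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def q4_index(z1, z2):
    """Global quality index Q4 (formula (4)).

    z1, z2 -- (H, W, 4) arrays: the four MS bands as quaternion components.
    """
    z1 = z1.reshape(-1, 4)
    z2 = z2.reshape(-1, 4)
    m1, m2 = z1.mean(axis=0), z2.mean(axis=0)
    var1 = (z1 ** 2).sum(axis=1).mean() - (m1 ** 2).sum()
    var2 = (z2 ** 2).sum(axis=1).mean() - (m2 ** 2).sum()
    cov = _qmul(z1, _qconj(z2)).mean(axis=0) - _qmul(m1, _qconj(m2))
    num = 4.0 * np.linalg.norm(cov) * np.linalg.norm(m1) * np.linalg.norm(m2)
    den = (var1 + var2) * ((m1 ** 2).sum() + (m2 ** 2).sum())
    return num / den

def fitness(z_source, z_new):
    """Formula (5): fitness = -Q4, so minimising it maximises Q4."""
    return -q4_index(z_source, z_new)
```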
In step 1.4), the specific steps of the particle swarm algorithm are as follows:
1.4.1) Assume acceleration constants c1 and c2, and let Tmax be the maximum number of iterations. With the current iteration number t=1, randomly generate the position vectors po1, po2, ..., pom of m particles in the defined search space, where the position vector of each particle is a 4-dimensional vector po_d=(w1,w2,w3,w4). The position vectors of the m particles form the initial population Po(t). Meanwhile, randomly generate the velocity vectors v1, v2, ..., vm of the m particles; the velocity vectors of the m particles form the velocity matrix V(t).
1.4.2) According to steps 1.1)-1.3), the position vector of each particle corresponds to a new multispectral image. Each new multispectral image is then substituted into formula (5), thereby calculating the fitness value of each particle in the initial population Po(t) at its corresponding position vector.
1.4.3) The fitness values of the particles in the initial population Po(t) at their corresponding position vectors are compared, and the optimal fitness value is selected to give the optimal position vector optM; the position of the particle corresponding to optM is recorded as the current position in the search space.
1.4.4) The position vectors of the m particles are updated according to their velocity vectors, thereby producing the new population Po(t+1). The update formulas are:
v_d^(k+1)=v_d^k+c1×r1×(opt_d^k-po_d^k)+c2×r2×(optM^k-po_d^k) (6);
po_d^(k+1)=po_d^k+v_d^(k+1) (7);
In formulas (6)-(7): po_d^(k+1) is the position vector after updating; po_d^k is the position vector of the d-th particle at the k-th iteration; v_d^(k+1) is the velocity vector after updating; v_d^k is the velocity vector of the d-th particle at the k-th iteration; c1 and c2 are the acceleration constants; r1 and r2 are random numbers between 0 and 1; opt_d^k is the optimal position vector of the d-th particle at the k-th iteration; optM^k is the global optimal position vector at the k-th iteration.
1.4.5) The termination condition is checked: if it is satisfied, the optimization process ends; otherwise, let t=t+1 and return to step 1.4.2), until the maximum iteration number Tmax is reached or the assessed value is smaller than the given precision ε.
1.4.6) According to the optimal fitness value, the optimal position vector opt=(w_opt,1,w_opt,2,w_opt,3,w_opt,4) is output, thereby obtaining the optimal weights.
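Steps 1.4.1)-1.4.6) can be sketched as a plain particle swarm loop over 4-dimensional weight vectors. The hyper-parameters follow the experiment section (c1 = c2 = 1, Tmax = 50, 10 particles); the inertia-free velocity update and clipping of positions to (0, 1) are simplifying assumptions of this sketch:

```python
import numpy as np

def pso_optimise(fitness_fn, n_particles=10, n_dims=4, t_max=50,
                 c1=1.0, c2=1.0, seed=0):
    """Minimise fitness_fn (e.g. -Q4) over weight vectors in (0, 1)^4."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0.0, 1.0, (n_particles, n_dims))       # step 1.4.1
    vel = rng.uniform(-0.1, 0.1, (n_particles, n_dims))
    p_best = pos.copy()                                      # personal bests
    p_fit = np.array([fitness_fn(p) for p in pos])           # step 1.4.2
    g_best = p_best[p_fit.argmin()].copy()                   # step 1.4.3
    for _ in range(t_max):                                   # steps 1.4.4-1.4.5
        r1 = rng.random((n_particles, n_dims))
        r2 = rng.random((n_particles, n_dims))
        vel = vel + c1 * r1 * (p_best - pos) + c2 * r2 * (g_best - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        fit = np.array([fitness_fn(p) for p in pos])
        improved = fit < p_fit
        p_best[improved] = pos[improved]
        p_fit[improved] = fit[improved]
        g_best = p_best[p_fit.argmin()].copy()
    return g_best                                            # step 1.4.6
```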
The advantages of the present invention can be illustrated by the following simulation experiments.
All experimental results of the present invention were obtained under the following running environment: Windows 7 Professional 64-bit, E5-1603 dual-core 2.8 GHz CPU, 8 GB memory, MATLAB R2012a.
Experiment 1: simulation experiment of the remote sensing image fusion method based on AGIHS and a low pass filter.
This experiment tests the performance of the present invention for remote sensing image fusion. The remote sensing images used in the experiments all come from the QuickBird satellite; QuickBird satellite images include multispectral images and panchromatic images. The multispectral images comprise near-infrared, red, green, and blue bands with a spatial resolution of 2.4 m, and the panchromatic images have a spatial resolution of 0.6 m. In this experiment, the acceleration constants c1 and c2 of the particle swarm algorithm are both set to 1, the maximum iteration number Tmax is 50, and 10 particles are randomly generated.
In order to examine the validity of the experimental results, several groups of fusion evaluation indexes are employed: the correlation coefficient (Correlative Coefficient, CC), the spectral distortion (Spectral Distortion, SD), the relative average spectral error (Relative Average Spectral Error, RASE), the dimensionless global relative error of synthesis (erreur relative globale adimensionnelle de synthèse, ERGAS), the average quality index Qavg (calculated in sliding windows of 16×16, 32×32, 64×64, and 128×128 pixels respectively), and the global quality index Q4.
a) Correlation coefficient (Correlation Coefficient, CC). The correlation coefficient is defined between the source multispectral image and the fused multispectral image:
CC=Σ(Fus(Bn)-mean(Fus(Bn)))×(Bn-mean(Bn))/sqrt(Σ(Fus(Bn)-mean(Fus(Bn)))²×Σ(Bn-mean(Bn))²) (12);
In formula (12): mean(Fus(Bn)) and mean(Bn) respectively denote the mean of each band of the fused multispectral image and the mean of each band of the source multispectral image. The optimal value of the correlation coefficient is 1; the closer its value is to 1, the smaller the spectral loss of the fused multispectral image.
b) Spectral distortion (Spectral Distortion, SD). The spectral distortion describes the difference between the source multispectral image and the fused multispectral image:
SD=(1/(M×N))×ΣΣ|Fus(Bn)(i,j)-Bn(i,j)| (13);
In formula (13): Bn denotes each band of the source multispectral image, n=1,2,3,4; Fus(Bn) denotes the corresponding band of the fused multispectral image; each band of the multispectral image has size M×N. The smaller the SD value, the better the spectral properties of the fused multispectral image.
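The CC and SD indexes can be sketched directly from their definitions; the numpy array layout is an illustration assumption:

```python
import numpy as np

def correlation_coefficient(fused_band, ms_band):
    """CC, formula (12): ideal value 1."""
    f = fused_band - fused_band.mean()
    m = ms_band - ms_band.mean()
    return float((f * m).sum() / np.sqrt((f ** 2).sum() * (m ** 2).sum()))

def spectral_distortion(fused_band, ms_band):
    """SD, formula (13): mean absolute band difference; smaller is better."""
    return float(np.abs(fused_band - ms_band).mean())
```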
c) Relative average spectral error (Relative Average Spectral Error, RASE) and dimensionless global relative error of synthesis (erreur relative globale adimensionnelle de synthèse, ERGAS). For both indexes, a smaller value indicates that the fused multispectral image has better spectral properties:
RASE=(100/Rad)×sqrt((1/4)×Σn RMSE²[Fus(Bn),Bn]) (14);
In formula (14): Rad is the average radiance of the bands Bn of the source multispectral image, and RMSE is obtained by the following formula:
RMSE²[Fus(Bn),Bn]=bias²[Fus(Bn),Bn]+STD²[Fus(Bn),Bn] (15);
The index ERGAS describes the relative global spectral error:
ERGAS=100×(high/low)×sqrt((1/4)×Σn(RMSE²[Fus(Bn),Bn]/Rad(Bn)²)) (16);
In formula (16): high is the resolution of the panchromatic image with high spatial resolution, low is the resolution of the source multispectral image, and Rad(Bn) is the average radiance of band Bn.
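RASE and ERGAS can be sketched from formulas (14)-(16); passing the bands as lists of arrays and the function names are illustration assumptions:

```python
import numpy as np

def rmse(fused_band, ms_band):
    """Root mean squared error between one fused and one source band."""
    return float(np.sqrt(((fused_band - ms_band) ** 2).mean()))

def rase(fused, ms):
    """RASE, formula (14): percentage relative to the mean radiance Rad."""
    rad = np.mean([b.mean() for b in ms])
    return float(100.0 / rad
                 * np.sqrt(np.mean([rmse(f, b) ** 2 for f, b in zip(fused, ms)])))

def ergas(fused, ms, high_res, low_res):
    """ERGAS, formula (16): scaled by the PAN/MS resolution ratio high/low."""
    terms = [(rmse(f, b) / b.mean()) ** 2 for f, b in zip(fused, ms)]
    return float(100.0 * (high_res / low_res) * np.sqrt(np.mean(terms)))
```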
d) Average quality index Qavg. The average quality index Qavg is obtained on the basis of the global quality index Q, which expresses the loss of correlation, the luminance distortion, and the contrast distortion between two images. Qavg is the average of the Q values calculated in sliding windows of 16×16, 32×32, 64×64, and 128×128 pixels respectively. The closer the Qavg value is to 1, the better the spectral properties of the fused image:
Q=(σuv/(σu×σv))×(2×ū×v̄/(ū²+v̄²))×(2×σu×σv/(σu²+σv²)) (17);
In formula (17): u denotes Fus(Bn) and v denotes Bn, where n=1,2,3,4 indexes the bands of the multispectral image; ū and v̄ are the means of u and v; σu² and σv² are the variances of u and v; σuv denotes the covariance between u and v.
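The window-wise Q index and its average Qavg can be sketched as follows. Non-overlapping windows are a simplifying assumption of this sketch (the patent uses sliding windows of several sizes):

```python
import numpy as np

def q_index(u, v):
    """Q, formula (17), over one window; ideal value 1.

    Uses the combined form 4*cov*mu*mv / ((var_u+var_v)*(mu^2+mv^2)),
    which equals the three-factor product in formula (17).
    """
    mu, mv = u.mean(), v.mean()
    su2, sv2 = u.var(), v.var()
    suv = ((u - mu) * (v - mv)).mean()
    return float((4.0 * suv * mu * mv) / ((su2 + sv2) * (mu ** 2 + mv ** 2)))

def q_avg(fused_band, ms_band, win=16):
    """Qavg: mean of Q over win x win windows of one band pair."""
    h, w = fused_band.shape
    vals = [q_index(fused_band[i:i + win, j:j + win],
                    ms_band[i:i + win, j:j + win])
            for i in range(0, h - win + 1, win)
            for j in range(0, w - win + 1, win)]
    return float(np.mean(vals))
```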
e) Q4. Q4 is a quality index that describes a spectral image; it is a global quality index representing spectral properties. Q4 is suitable for multispectral images with four spectral subbands, each multispectral image being represented in quaternion theory as z=a+ib+jc+kd, where a, b, c, d are real numbers and i, j, k are imaginary units with i²=j²=k²=ijk=-1. Let z1=a1+ib1+jc1+kd1 and z2=a2+ib2+jc2+kd2 respectively represent the source multispectral image and the fused multispectral image. The quality index Q4 is defined as:
Q4=(|σz1z2|/(σz1×σz2))×(2×σz1×σz2/(σz1²+σz2²))×(2×|z̄1|×|z̄2|/(|z̄1|²+|z̄2|²)) (18);
In formula (18): the calculation formula of Q4 consists of three parts: the first part represents the correlation coefficient of z1 and z2, the second part describes the similarity of the standard deviations of the two images, and the third part expresses the similarity of the means of the two images. The ideal value of Q4 is 1. When measuring the quality of a fused image with Q4, the Q4 value is calculated on sub-blocks of size N×N (N=32), and the final Q4 value is their average. Therefore, the criterion for evaluating multispectral image fusion performance is: for the same group of fusion experiments, if the bias, standard deviation, spectral distortion, RASE, and ERGAS of the image obtained by a fusion method are smaller, SAM tends to zero, and at the same time CC, Qavg, and Q4 are closer to 1, then the fusion results obtained by that method have better spectral properties.
Table 1
Fig. 2a-2g are schematic diagrams of the fusion results of different methods for remote sensing images. Fig. 2a and Fig. 2b are schematic diagrams of the source multispectral image and panchromatic image respectively; the region shown is the Sundarbans park in India. The experimental images contain a large amount of linear information, mostly roads and paths, as well as some buildings. Fig. 2c is a schematic diagram of the result obtained with the traditional generalized IHS transform (GIHS); information such as roads is clearer in the figure, but the green areas suffer from spectral losses and differ considerably from the green regions of the source image. Fig. 2d is a schematic diagram of the result obtained with the adaptive generalized IHS transform (AGIHS); intuitively it differs little from Fig. 2c, but according to the groups of fusion evaluation indexes its spectral preservation is better than the result shown in Fig. 2c. Fig. 2e is a schematic diagram of the result obtained with the spectral analysis method (Spectral Analysis, SA); this method calculates the luminance component from the respective proportions of the red, green, blue, and near-infrared bands within the source multispectral image and combines it with the panchromatic image to obtain the fusion result; its green areas still suffer from spectral losses. Fig. 2f is a schematic diagram of the result obtained with the tradeoff parameter method (Tradeoff Parameters, TP), where TP is a tradeoff parameter intended to find a balance between spatial resolution increase and spectral loss; as can be seen in the figure, it reduces the spectral losses of the green areas compared with the preceding methods. Fig. 2g is a schematic diagram of the fusion result obtained by the image fusion method of the present invention; compared with the other methods, the spectrum of the fusion result of the present invention is closest to the source multispectral image, and the details are preserved well. Table 1 gives the evaluation indexes of the results of the several fusion methods, where bold numbers are the relative optimal values.
Table 2
Fig. 3a-3g are likewise schematic diagrams of the fusion results of different methods for remote sensing images. Fig. 3a and Fig. 3b are schematic diagrams of the source multispectral image and panchromatic image respectively; the region shown is the Sundarbans park in India. The experimental images mainly comprise a bridge together with riverbank buildings and road information. As with the result schematic diagrams shown in Fig. 2a-2g: Fig. 3c is a schematic diagram of the result obtained with GIHS, in which the lines are clearer but the spectrum differs from the source image; Fig. 3d is a schematic diagram of the result obtained with AGIHS, intuitively differing little from Fig. 3c; Fig. 3e is a schematic diagram of the result obtained with SA, whose spectral result is better than those of Fig. 3c and Fig. 3d; Fig. 3f is a schematic diagram of the result obtained with TP, which is better than the results of Fig. 3c-3e in both spectrum and detail; Fig. 3g is a schematic diagram of the fusion result obtained by the image fusion method of the present invention, which supports the same conclusion as Fig. 2a-2g: the spectrum of the fusion result of the present invention is closest to the source multispectral image, and the details are preserved well. Table 2 gives the fusion evaluation indexes of the results shown in Fig. 3a-3g, where bold numbers are the relative optimal values.
The remote sensing images used in the fusion result schematic diagrams given in Fig. 4a-4g come from the same remote sensing satellite as those in Fig. 2a-2g and Fig. 3a-3g, and likewise depict the Sundarbans park in India. Fig. 4a and Fig. 4b are schematic diagrams of the source multispectral image and panchromatic image respectively; the experimental images mainly cover a region of concentrated buildings, including some trees and roads. As in Fig. 2a-2g and Fig. 3a-3g: Fig. 4c is a schematic diagram of the result obtained with GIHS; Fig. 4d is a schematic diagram of the result obtained with AGIHS; Fig. 4e is a schematic diagram of the result obtained with SA; Fig. 4f is a schematic diagram of the result obtained with TP; Fig. 4g is a schematic diagram of the fusion result obtained by the image fusion method of the present invention. Table 3 gives the fusion evaluation indexes of the results shown in Fig. 4a-4g, where bold numbers are the relative optimal values.
Table 3

Claims (3)

1. A remote sensing image fusion method based on AGIHS and a low-pass filter, characterized in that the method is realized by the following steps:
1) performing an adaptive generalized IHS transform on the source multispectral image to obtain the optimal luminance component, the specific steps of the adaptive generalized IHS transform comprising:
1.1) computing a weighted sum of the red band, green band, blue band, and near-infrared band of the source multispectral image, thereby obtaining the luminance component; the specific summation formula is:
I_wight = w1·img_ms(R) + w2·img_ms(G) + w3·img_ms(B) + w4·img_ms(N) (1);
In formula (1): I_wight is the luminance component; w1, w2, w3, w4 are weights, each in the range (0, 1); img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red band, green band, blue band, and near-infrared band of the source multispectral image;
1.2) computing the difference between the luminance component and the source panchromatic image; the specific formula is:
d_wight = img_pan − I_wight (2);
In formula (2): d_wight is the difference between the luminance component and the source panchromatic image; img_pan is the source panchromatic image; I_wight is the luminance component;
1.3) adding the difference between the luminance component and the source panchromatic image to each band of the source multispectral image, thereby obtaining the new multispectral image; the specific formulas are:
img_wight(R) = img_ms(R) + d_wight
img_wight(G) = img_ms(G) + d_wight
img_wight(B) = img_ms(B) + d_wight
img_wight(N) = img_ms(N) + d_wight (3);
In formula (3): img_wight(R), img_wight(G), img_wight(B), img_wight(N) are respectively the red, green, blue, and near-infrared bands of the new multispectral image; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image; d_wight is the difference between the luminance component and the source panchromatic image;
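Steps 1.1)-1.3) above can be sketched in a few lines of NumPy. The dict-of-bands layout and the function name are illustrative choices for readability, not part of the claim:

```python
import numpy as np

def gihs_inject(img_ms, img_pan, w):
    """Weighted-luminance GIHS injection (formulas (1)-(3)).

    img_ms : dict with keys 'R', 'G', 'B', 'N', each an HxW float array
             (the four bands of the source multispectral image).
    img_pan: HxW float array, the source panchromatic image.
    w      : sequence of four weights, each in (0, 1).
    """
    # Formula (1): luminance component as a weighted sum of the four bands.
    I_wight = (w[0] * img_ms['R'] + w[1] * img_ms['G']
               + w[2] * img_ms['B'] + w[3] * img_ms['N'])
    # Formula (2): difference between the panchromatic image and the luminance.
    d_wight = img_pan - I_wight
    # Formula (3): inject the same difference into every band.
    return {band: img_ms[band] + d_wight for band in ('R', 'G', 'B', 'N')}
```

The multispectral bands are assumed to be resampled to the panchromatic size beforehand, so all arrays share one shape.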
1.4) computing the global quality index of the new multispectral image and taking the global quality index as the fitness function; then searching for the optimal fitness value of the fitness function with the particle swarm algorithm, thereby obtaining the optimal weights;
1.5) substituting the optimal weights into formula (1), thereby obtaining the optimal luminance component; the specific formula is:
I_opt = w_opt,1·img_ms(R) + w_opt,2·img_ms(G) + w_opt,3·img_ms(B) + w_opt,4·img_ms(N) (8);
In formula (8): I_opt is the optimal luminance component; w_opt,1, w_opt,2, w_opt,3, w_opt,4 are the optimal weights; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image;
2) obtaining the high-frequency component with a low-pass filter, the specific steps comprising:
2.1) performing grey-level histogram matching between the source panchromatic image and the optimal luminance component, thereby obtaining the matched panchromatic image;
2.2) computing the difference between the matched panchromatic image and the optimal luminance component; the specific formula is:
d_opt = img_pan_matched − I_opt (9);
In formula (9): d_opt is the difference between the matched panchromatic image and the optimal luminance component; img_pan_matched is the matched panchromatic image; I_opt is the optimal luminance component;
2.3) filtering the difference between the matched panchromatic image and the optimal luminance component with the low-pass filter, thereby obtaining the high-frequency component and the low-frequency component respectively; the coefficient matrix of the low-pass filter is specifically expressed as follows:
3) adding the high-frequency component to each band of the source multispectral image, thereby obtaining the new fused image; the specific formulas are:
img_fus(R) = img_ms(R) + d_high
img_fus(G) = img_ms(G) + d_high
img_fus(B) = img_ms(B) + d_high
img_fus(N) = img_ms(N) + d_high (11);
In formula (11): img_fus(R), img_fus(G), img_fus(B), img_fus(N) are respectively the red, green, blue, and near-infrared bands of the new fused image; img_ms(R), img_ms(G), img_ms(B), img_ms(N) are respectively the red, green, blue, and near-infrared bands of the source multispectral image; d_high is the high-frequency component.
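Steps 2) and 3) can be sketched as follows. Two pieces are assumptions: the quantile-mapping histogram match (the claim does not fix a particular matching algorithm), and the 3x3 averaging kernel, which stands in for the patent's low-pass coefficient matrix of formula (10), not reproduced in this text. The filter output is taken as the low-frequency component and the residual as the high-frequency component, a common reading the claim text leaves implicit:

```python
import numpy as np

def match_histogram(src, ref):
    """Step 2.1): match the grey-level distribution of `src` to `ref` by
    quantile mapping; assumes equally sized arrays."""
    matched = np.empty(src.size)
    matched[np.argsort(src.ravel())] = np.sort(ref.ravel())
    return matched.reshape(src.shape)

def extract_high(img_pan, I_opt, kernel=None):
    """Steps 2.1)-2.3): histogram-match the panchromatic image to the optimal
    luminance, take their difference (formula (9)), and split it with a
    low-pass filter; the 3x3 averaging kernel is a placeholder for the
    patent's coefficient matrix (formula (10))."""
    if kernel is None:
        kernel = np.full((3, 3), 1.0 / 9.0)
    img_pan_matched = match_histogram(img_pan, I_opt)
    d_opt = img_pan_matched - I_opt                  # formula (9)
    # 'same'-size convolution with edge padding as the low-pass filter
    pad = kernel.shape[0] // 2
    padded = np.pad(d_opt, pad, mode='edge')
    d_low = np.zeros_like(d_opt)
    for i in range(kernel.shape[0]):
        for j in range(kernel.shape[1]):
            d_low += kernel[i, j] * padded[i:i + d_opt.shape[0],
                                           j:j + d_opt.shape[1]]
    return d_opt - d_low                             # high-frequency remainder

def fuse(img_ms, d_high):
    """Formula (11): add the high-frequency component to every band."""
    return {band: img_ms[band] + d_high for band in ('R', 'G', 'B', 'N')}
```

With a constant difference image the low-pass output reproduces it exactly, so the injected high-frequency component is zero, as expected for a detail-injection scheme.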
2. The remote sensing image fusion method based on AGIHS and a low-pass filter according to claim 1, characterized in that in step 1.4), the specific formula for the global quality index of the new multispectral image is:
Q4 = 4·|σ_z1z2|·|m1|·|m2| / ((σ²_z1 + σ²_z2)·(|m1|² + |m2|²)) (4);
In formula (4): Q4 is the global quality index of the new multispectral image; σ_z1z2 is the covariance between the quaternion representation of the source multispectral image and the quaternion representation of the new multispectral image; σ²_z1 is the variance of the quaternion representation of the source multispectral image; σ²_z2 is the variance of the quaternion representation of the new multispectral image; m1 and m2 are the means of the quaternion representations z1 and z2; z1 = a + b·i + c·j + d·k is the quaternion representation of the source multispectral image; z2 is the quaternion representation of the new multispectral image; a, b, c, d are real numbers; i, j, k are imaginary units, with i² = j² = k² = ijk = −1;
The fitness function is specifically expressed as:
fitness = −Q4(img_wight) (5);
In formula (5): fitness is the fitness function; Q4 is the global quality index of the new multispectral image; img_wight is the new multispectral image.
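The global quality index of claim 2 is the quaternion index Q4. Since the formula image is not reproduced in this text, the sketch below implements the standard single-block form of Q4 that the claim's symbol list describes; it is a reconstruction under that assumption, not the patent's verbatim formula:

```python
import numpy as np

def q_mul(p, q):
    """Hamilton product of quaternion arrays of shape (..., 4)."""
    a1, b1, c1, d1 = np.moveaxis(p, -1, 0)
    a2, b2, c2, d2 = np.moveaxis(q, -1, 0)
    return np.stack([
        a1*a2 - b1*b2 - c1*c2 - d1*d2,
        a1*b2 + b1*a2 + c1*d2 - d1*c2,
        a1*c2 - b1*d2 + c1*a2 + d1*b2,
        a1*d2 + b1*c2 - c1*b2 + d1*a2,
    ], axis=-1)

def q_conj(q):
    """Quaternion conjugate: negate the i, j, k parts."""
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def q4_index(z1, z2):
    """Global Q4 between two 4-band images given as (N, 4) arrays of pixel
    quaternions z = a + b*i + c*j + d*k (one quaternion per pixel)."""
    m1, m2 = z1.mean(axis=0), z2.mean(axis=0)        # quaternion means
    zc1, zc2 = z1 - m1, z2 - m2
    var1 = (zc1 ** 2).sum(axis=1).mean()             # sigma_{z1}^2
    var2 = (zc2 ** 2).sum(axis=1).mean()             # sigma_{z2}^2
    cov = q_mul(zc1, q_conj(zc2)).mean(axis=0)       # sigma_{z1 z2}
    cov_mod = np.sqrt((cov ** 2).sum())
    m1_mod = np.sqrt((m1 ** 2).sum())
    m2_mod = np.sqrt((m2 ** 2).sum())
    return (4.0 * cov_mod * m1_mod * m2_mod) / (
        (var1 + var2) * (m1_mod ** 2 + m2_mod ** 2))

def fitness(img_wight_quat, img_ms_quat):
    """Formula (5): fitness = -Q4, so minimising it maximises Q4."""
    return -q4_index(img_ms_quat, img_wight_quat)
```

Q4 equals 1 only for identical images and decreases under spectral distortion, which is why its negation is a sensible minimisation target for the weight search.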
3. The remote sensing image fusion method based on AGIHS and a low-pass filter according to claim 2, characterized in that in step 1.4), the specific steps of the particle swarm algorithm comprise:
1.4.1) setting the acceleration constants c1 and c2 and the maximum iteration number Tmax; setting the current iteration number t = 1; randomly generating the position vectors po1, po2, ..., pom of m particles within the defined space, where the position vector of each particle is a 4-dimensional vector of candidate weights;
the position vectors of the m particles form the initial population Po(t);
meanwhile, randomly generating the velocity vectors v1, v2, ..., vm of the m particles, the velocity vectors of the m particles forming the velocity matrix V(t);
1.4.2) according to steps 1.1)-1.3), the position vector of each particle yields a corresponding new multispectral image; then substituting each new multispectral image into formula (5), thereby computing, for each particle in the initial population Po(t), the fitness value at its corresponding position vector;
1.4.3) comparing the fitness values of all particles in the initial population Po(t) at their corresponding position vectors, selecting the optimal fitness value, and taking the corresponding position vector as the optimal position vector optM; the position of the particle corresponding to optM is recorded as the current position in the search space;
1.4.4) updating the position vectors of the m particles according to their velocity vectors, thereby producing the new population Po(t+1); the specific update formulas are:
v_d^(k+1) = v_d^k + c1·r1·(opt_d^k − po_d^k) + c2·r2·(opt_M^k − po_d^k) (6)
po_d^(k+1) = po_d^k + v_d^(k+1) (7);
In formulas (6)-(7): po_d^(k+1) is the position vector after the update; po_d^k is the position vector of the d-th particle at the k-th iteration; v_d^(k+1) is the velocity vector after the update; v_d^k is the velocity vector of the d-th particle at the k-th iteration; c1 and c2 are the acceleration constants; r1 and r2 are random numbers between 0 and 1; opt_d^k is the optimal position vector of the d-th particle at the k-th iteration; opt_M^k is the optimal position vector of the population at the k-th iteration;
1.4.5) checking the termination condition: if it is satisfied, terminating the optimization process; otherwise setting t = t + 1 and returning to step 1.4.2), until the iteration number reaches the maximum iteration number Tmax or the evaluation value falls below the given precision ε;
1.4.6) according to the optimal fitness value, outputting the optimal position vector opt = (w_opt,1, w_opt,2, w_opt,3, w_opt,4), thereby obtaining the optimal weights.
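The particle swarm of steps 1.4.1)-1.4.6) can be sketched as follows. The swarm size, iteration count, and inertia factor are illustrative additions (the update formula as listed in the claim carries no inertia term; one is added here to stabilise convergence), and any smooth fitness function standing in for formula (5) can be plugged in:

```python
import numpy as np

def pso_optimize(fitness, dim=4, m=20, t_max=60, c1=1.49, c2=1.49,
                 inertia=0.72, seed=0):
    """Minimal particle-swarm sketch: each particle's position is a `dim`-D
    weight vector in (0, 1]; `fitness` is minimised, matching formula (5)
    where fitness = -Q4."""
    rng = np.random.default_rng(seed)
    po = rng.uniform(0.0, 1.0, size=(m, dim))          # positions Po(t)
    v = rng.uniform(-0.1, 0.1, size=(m, dim))          # velocities V(t)
    pbest = po.copy()                                  # per-particle best opt_d
    pbest_val = np.array([fitness(p) for p in po])
    g_idx = np.argmin(pbest_val)
    g, g_val = pbest[g_idx].copy(), pbest_val[g_idx]   # population best opt_M
    for _ in range(t_max):
        r1 = rng.random((m, dim))
        r2 = rng.random((m, dim))
        # formula (6): velocity update (with an added inertia factor),
        # formula (7): position update
        v = inertia * v + c1 * r1 * (pbest - po) + c2 * r2 * (g - po)
        po = np.clip(po + v, 1e-6, 1.0)                # keep weights in (0, 1]
        vals = np.array([fitness(p) for p in po])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = po[better], vals[better]
        if pbest_val.min() < g_val:
            g_idx = np.argmin(pbest_val)
            g, g_val = pbest[g_idx].copy(), pbest_val[g_idx]
    return g, g_val
```

In the method of the claims the fitness evaluation rebuilds a new multispectral image from each candidate weight vector and scores it with −Q4; here a simple quadratic suffices to exercise the optimiser.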
CN201510433681.5A 2015-07-22 2015-07-22 Remote sensing image fusion method based on AGIHS and low pass filter Expired - Fee Related CN105023261B (en)

Publications (2)

Publication Number Publication Date
CN105023261A CN105023261A (en) 2015-11-04
CN105023261B true CN105023261B (en) 2017-08-04






Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant (granted publication date: 20170804)
CF01 Termination of patent right due to non-payment of annual fee (termination date: 20190722)