A detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data
Technical field
The invention belongs to the field of remote sensing image fusion, and in particular relates to a detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data.
Background art
Existing remote sensing image fusion methods operate at three different levels, namely the data level, the feature level, and the decision level. According to the fusion target, remote sensing image fusion can be divided into multi-view spatial fusion, spatial-spectral fusion, and spatio-temporal fusion.
Most existing remote sensing image data fusion methods can only integrate two of the three indices of spatial, temporal, and spectral resolution, and therefore cannot obtain a fused image with the highest temporal, spatial, and spectral resolution simultaneously. In addition, some remote sensing image data fusion methods are designed to merge complementary information from only one or two sensors, and cannot make full use of complementary observations from more sensors. In the development of integrated fusion theory, domestic scholars have made substantial contributions. Starting in 2010, Wuhan University carried out related research under the support of the National Natural Science Foundation of China, proposed for the first time the concept of spatio-temporal-spectral integrated fusion together with a fusion framework based on maximum a posteriori (MAP) theory, and achieved notable research results. The Chinese University of Hong Kong further explored the temporal, spatial, and spectral correlation models of remote sensing image data and, likewise based on MAP estimation, proposed a new spatio-temporal-spectral integrated fusion method, which is nevertheless still limited to the fusion of two sensor datasets. Meng et al. and Wu et al. made impressive progress on the fusion of three sensors, but did not comprehensively consider temporal, spatial, and spectral features. Later, Meng et al. further proposed a unified fusion framework that can effectively integrate temporal, spatial, and spectral complementary information while removing the limitation on the number of sensors. However, the prior model embedded in this method is a Gauss-Markov model; although it can effectively smooth image noise, it also smooths out image detail, which degrades the detail information of the fused image and leads to poor visual quality of the result.
Summary of the invention
The purpose of the present invention is to solve the problem that existing remote sensing image data fusion methods smooth out image detail information while smoothing image noise.
The technical solution adopted by the present invention to solve the above technical problem is as follows:
A detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data, the method comprising the following steps:
Step 1: perform edge enhancement on the multi-temporal image Y and the auxiliary multi-source observation image Z using the Laplace operator, obtaining the enhanced multi-temporal image Y′ and the enhanced auxiliary multi-source observation image Z′;
Step 2: establish the spatial degradation model between the fused image x and Y′ and the temporal-spectral relation model between the fused image x and Z′, and compute the consistency constraint p(Y′ | x) of Y′ on the fused image x, the consistency constraint p(Z′ | x) of Z′ on the fused image x, and the description p(x) of the image spatial relationship;
Step 3: obtain the expression of the objective function F(x) from p(Y′ | x), p(Z′ | x), and p(x) computed in Step 2;
Step 4: solve the objective function F(x) by the conjugate gradient algorithm, and compute the fused image x from the solution of F(x).
The beneficial effects of the present invention are as follows: the present invention provides a detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data. The method first uses the Laplace operator to perform edge enhancement on the multi-temporal image and the auxiliary multi-source observation image, and can therefore retain image detail information while smoothing image noise. It then uses the spatial degradation model between the fused image and the enhanced multi-temporal image, and the temporal-spectral relation model between the fused image and the enhanced auxiliary multi-source observation image, to compute the consistency constraint of the enhanced multi-temporal image on the fused image, the consistency constraint of the enhanced auxiliary multi-source observation image on the fused image, and the description of the image spatial relationship, from which the objective function is obtained. Finally, the objective function is solved by the conjugate gradient algorithm, and the fused image is computed from the solution. Compared with conventional methods, the method of the invention can retain 95% or more of the image detail information while smoothing out image noise.
Description of the drawings
Fig. 1 is a flowchart of the detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data of the present invention.
Specific embodiments
Specific embodiment 1: as shown in Fig. 1, the detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data described in this embodiment comprises the following steps:
Step 1: perform edge enhancement on the multi-temporal image Y and the auxiliary multi-source observation image Z using the Laplace operator, obtaining the enhanced multi-temporal image Y′ and the enhanced auxiliary multi-source observation image Z′;
Step 2: establish the spatial degradation model between the fused image x and Y′ and the temporal-spectral relation model between the fused image x and Z′, and compute the consistency constraint p(Y′ | x) of Y′ on the fused image x, the consistency constraint p(Z′ | x) of Z′ on the fused image x, and the description p(x) of the image spatial relationship;
Step 3: obtain the expression of the objective function F(x) from p(Y′ | x), p(Z′ | x), and p(x) computed in Step 2;
Step 4: solve the objective function F(x) by the conjugate gradient algorithm, and compute the fused image x from the solution of F(x).
Specific embodiment 2: this embodiment differs from specific embodiment 1 in that the detailed process of Step 1 is as follows:
The multi-temporal image is Y = {y_1, ..., y_k, ...}, where y_k is the k-th observed image and K is the total number of multi-temporal images. To obtain the high-resolution fused image at time k, the premise is that an observed image y_k is available at each of the K given times. In general, the image Y has higher spectral and temporal resolution but lower spatial resolution; Y can therefore be regarded as a spatially degraded image, and the x obtained by the corresponding enhanced fusion process is an image with higher spatial resolution.
The auxiliary multi-source observation image is Z = {z_1, z_2, ..., z_n, ..., z_N}, where z_n denotes the n-th image and N is the total number of auxiliary multi-source observation images; z_n is usually a panchromatic or multispectral (MS) image, which has higher spatial resolution but lower spectral or temporal resolution. The integrated fusion framework can accomplish different types of fusion tasks, including multi-temporal spatial fusion, spatial-spectral fusion, spatio-temporal fusion, and spatio-temporal-spectral fusion.
The Laplace operator (Laplacian) is a second-order linear differential operator. Compared with a first-order differential operator, second-order differentiation has stronger edge-localization ability and a better sharpening effect. Assume the multi-temporal image Y is a two-dimensional image whose function expression is f(x, y), where x and y are the abscissa and ordinate of the image. The Laplace operator is defined as:
∇²f = ∂²f/∂x² + ∂²f/∂y² (1)
Since differential operators of any order are linear operators, the second-order differential operator, like the first-order differential operator, can be evaluated by generating a template and convolving it with the image. From the definition of the differential:
∂²f/∂x² = f(x+1, y) + f(x−1, y) − 2f(x, y) (2)
∂²f/∂y² = f(x, y+1) + f(x, y−1) − 2f(x, y) (3)
Combining formulas (2) and (3) with the definition of the Laplace operator gives:
∇²f = f(x+1, y) + f(x−1, y) + f(x, y+1) + f(x, y−1) − 4f(x, y) (4)
Expressed as a template, the Laplace operator is
0  1  0
1 −4  1
0  1  0
and, to meet programming needs, the extended template of the Laplace operator is
1  1  1
1 −8  1
1  1  1
As can be seen from the template form, if a bright point appears in a darker region of the image, applying the Laplace operator makes this bright point brighter. Because the edges in an image are exactly the regions where the gray level jumps, the Laplacian sharpening template plays a great role in edge-detail enhancement.
Finally, the weighted difference between each pixel and its neighboring pixels is computed and added back to the original image to obtain the detail-enhanced image. The function expression g(x, y) of the enhanced multi-temporal image Y′ is then:
g(x, y) = f(x, y) + c·∇²f(x, y) (5)
where c is a weight coefficient whose value is tied to the template defined above: c = −1 when the center coefficient of the template is negative, and c = 1 otherwise.
Similarly, the enhanced auxiliary multi-source observation image Z′ is obtained.
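As an illustration, the edge enhancement of Step 1 can be sketched with NumPy/SciPy as follows; the function name, the boundary handling, and the defaults are assumptions for the sketch, not part of the invention:

```python
import numpy as np
from scipy.ndimage import convolve

def laplacian_enhance(img, extended=True, c=-1.0):
    """Detail enhancement g = f + c * Laplacian(f): compute the Laplacian
    response with the 4- or 8-neighbor template and add it back weighted
    by c (c = -1 here because the template center coefficient is negative)."""
    if extended:  # 8-neighbor extended template
        kernel = np.array([[1, 1, 1], [1, -8, 1], [1, 1, 1]], dtype=float)
    else:         # 4-neighbor template
        kernel = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=float)
    lap = convolve(img.astype(float), kernel, mode='nearest')
    return img + c * lap
```

A flat region is left unchanged (the template coefficients sum to zero), while an isolated bright pixel becomes brighter, which is exactly the sharpening behavior described above.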
Specific embodiment 3: this embodiment differs from specific embodiment 2 in that the detailed process of Step 2 is as follows:
The spatial degradation model between the fused image x and Y′ is expressed as follows:
y′_{k,b} = D S_{k,b} M_k x_b + v_{k,b}, 1 ≤ b ≤ B_x, 1 ≤ k ≤ K (6)
where y′_{k,b} denotes the degraded observed image of the b-th band of the k-th image of Y′, x_b denotes the b-th band of the fused image x, B_x is the total number of bands, M_k denotes the motion matrix, S_{k,b} denotes the optical blurring matrix, D is the down-sampling matrix, and v_{k,b} is zero-mean Gaussian noise caused by the sensor and the external environment.
Simplifying formula (6), the spatial degradation model is expressed as:
y′_{k,b} = A_{y,k,b} x_b + v_{k,b} (7)
where A_{y,k,b} = D S_{k,b} M_k denotes the spatial degradation matrix between the fused image x and Y′.
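For illustration, the action of A_{y,k,b} = D S_{k,b} M_k on one band can be sketched in operator form; the Gaussian blur standing in for S_{k,b}, the integer down-sampling for D, and the identity motion M_k are simplifying assumptions of the sketch:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def spatially_degrade(x_b, scale=2, sigma=1.0, noise_std=0.0, seed=0):
    """Apply the spatial degradation of eq. (7) to one band x_b:
    optical blur S (Gaussian), down-sampling D (factor `scale`),
    identity motion M, plus zero-mean Gaussian noise v."""
    blurred = gaussian_filter(np.asarray(x_b, dtype=float), sigma)  # S x_b
    low_res = blurred[::scale, ::scale]                             # D S x_b
    if noise_std > 0:
        rng = np.random.default_rng(seed)
        low_res = low_res + rng.normal(0.0, noise_std, low_res.shape)
    return low_res
```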
The temporal-spectral relation model between the fused image x and Z′ is expressed as follows:
z′_{n,q} = ψ_{n,q} C_{n,q} A_{z,n,q} x_q + τ_{n,q} + v_{n,q}, 1 ≤ q ≤ B_{z,n}, 1 ≤ n ≤ N (8)
where z′_{n,q} denotes the degraded observed image of the q-th band of the n-th image of Z′, x_q denotes the q-th band of the fused image x, B_{z,n} is the total number of bands, A_{z,n,q} denotes the spatial degradation matrix between the fused image x and Z′, C_{n,q} denotes the spectral correlation matrix, ψ_{n,q} is the temporal correlation matrix, τ_{n,q} is the temporal offset, and v_{n,q} denotes zero-mean sensor noise.
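The temporal-spectral relation of formula (8) can likewise be sketched for one output band; here the spectral correlation C is a simple weighted sum over the bands of x, ψ and τ are scalars, and the spatial degradation A_z is omitted (taken as identity) — all simplifying assumptions of the sketch:

```python
import numpy as np

def temporal_spectral_observe(x, weights, psi=1.0, tau=0.0):
    """Sketch of eq. (8): z' = psi * C x + tau, where C mixes the
    B bands of x (shape (B, H, W)) with the spectral weights (shape (B,))."""
    mixed = np.tensordot(weights, x, axes=(0, 0))  # C x: weighted band sum
    return psi * mixed + tau
```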
The estimated value x̂ of the fused image x is then expressed as follows:
x̂ = argmax_x p(x | Y′, Z′) (9)
where p(x | Y′, Z′) represents the consistency of the fused image x with Y′ and Z′, and argmax_x denotes the value of x at which p(x | Y′, Z′) is maximized.
According to the Bayesian formula:
p(x | Y′, Z′) ∝ p(Y′ | x) p(Z′ | x) p(x) (10)
where p(Y′ | x) denotes the consistency constraint of Y′ on the fused image x, p(Z′ | x) denotes the consistency constraint of Z′ on the fused image x, and p(x) denotes the description of the image spatial relationship.
Assuming that the zero-mean Gaussian noise v_{k,b} caused by the sensor and the external environment follows an independent Gaussian distribution, p(Y′ | x) is expressed as follows:
p(Y′ | x) = ∏_{k=1}^{K} ∏_{b=1}^{B_x} p(y′_{k,b} | x_b) (11)
p(y′_{k,b} | x_b) = (2π a_{y,k,b})^{−φ1φ2/2} exp( −‖y′_{k,b} − A_{y,k,b} x_b‖₂² / (2 a_{y,k,b}) ) (12)
where p(y′_{k,b} | x_b) denotes the consistency constraint of the degraded observed image of the b-th band of the k-th image of Y′ on the b-th band of the fused image x, a_{y,k,b} is the variance of v_{k,b}, B_x is the number of spectral bands, φ1 × φ2 denotes the spatial dimensions of y′_{k,b}, and ‖·‖₂ denotes the 2-norm.
Assuming that v_{n,q} follows an independent Gaussian distribution, p(Z′ | x) is expressed as follows:
p(Z′ | x) = ∏_{n=1}^{N} ∏_{q=1}^{B_{z,n}} p(z′_{n,q} | x_q) (13)
p(z′_{n,q} | x_q) = (2π a_{z,n,q})^{−H1H2/2} exp( −‖z′_{n,q} − ψ_{n,q} C_{n,q} A_{z,n,q} x_q − τ_{n,q}‖₂² / (2 a_{z,n,q}) ) (14)
where p(z′_{n,q} | x_q) denotes the consistency constraint of the q-th band of the n-th image of Z′ on the q-th band of the fused image x, a_{z,n,q} denotes the variance of the noise v_{n,q}, and H1 × H2 denotes the spatial dimensions of z′_{n,q}.
The third probability density function p(x) is the image prior knowledge, used to describe the spatial relationship of the image. We introduce the Huber-Markov prior model. Compared with the Gauss-Markov model, this model can eliminate the instability introduced during the fusion process while preserving image edges, i.e., it enhances detail. With the adaptive weighting of the three-dimensional spatial-spectral prior based on the Laplace operator, p(x) is expressed as:
p(x) ∝ ∏_{b=1}^{B_x} exp( −ρ(Q x_b) / (2 a_{x,b}) ) (15)
where a_{x,b} is the variance of the Gaussian-distributed noise v_{x,b}, L1 × L2 denotes the spatial dimensions, ρ(·) is the Huber function, and Q x_b denotes the adaptively weighted three-dimensional Laplace operator matrix applied to x_b. Q x_b is constructed from the initial estimate of the fused image obtained by resampling and from the parameter β, which is given by formula (21) with a threshold parameter μ and weights the gradient along the spectral dimension of the b-th band.
If the fused image has only a few spectral bands, the spectral curve is discontinuous, so setting β = 0 is feasible. Conversely, if the required image is a hyperspectral image, the spectral curve can be assumed to be continuous; in that case, the adaptively weighted prior term can effectively preserve the spectral curve and reduce spectral distortion. When the differences between different spectral bands are very small, the spectral constraint becomes stronger.
Specific embodiment 4: this embodiment differs from specific embodiment 3 in that the detailed process of Step 3 is as follows:
Substituting formulas (11), (13), and (15) into formula (10) and simplifying according to the monotonicity of the logarithmic function, many parameters can be dropped, and the objective function F(x) can finally be expressed as a regularized minimization problem. The expression for the estimated value x̂ of the fused image x is:
x̂ = argmin_x F(x),
F(x) = Σ_{k=1}^{K} Σ_{b=1}^{B_x} ‖y′_{k,b} − A_{y,k,b} x_b‖₂² + λ1 Σ_{n=1}^{N} Σ_{q=1}^{B_{z,n}} ω_{n,q} ‖z′_{n,q} − ψ_{n,q} C_{n,q} A_{z,n,q} x_q − τ_{n,q}‖₂² + λ2 Σ_{b=1}^{B_x} ρ(Q x_b) (16)
where argmin_x denotes the value of x at which F(x) is minimized, i.e., the estimated value x̂. The first term expresses the consistency constraint between x and Y′, the second term expresses the relationship between x and Z′, and the third term expresses the prior image. λ1 and λ2 are the weight coefficients of the respective terms; they are related to the noise variances and are used to adjust the proportions of the three parts. ω_{n,q} denotes the contribution of z′_{n,q} to the fused image x and is computed adaptively as ω_{n,q} = λ′_{n,q} U_n, where U_n is obtained from the correlation between z′_n and x, under the assumption that a larger correlation implies a larger contribution. The auxiliary parameter λ′_{n,q} adaptively balances the spatial detail of each band of the fused image and is expressed as:
λ′_{n,q} = f(z′_{n,q}, x) / min[ f(z′_{1,2}, x), ..., f(z′_{1,q}, x), ..., f(z′_{n,q}, x) ]
where f(z′_{n,q}, x) denotes the number of bands of z′_{n,q} involved in the fusion.
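For a single band, the regularized objective assembled in Step 3 can be sketched numerically; ψ, τ, C, and ω are folded into the operators and weights here, and all names are illustrative:

```python
import numpy as np

def huber(t, mu):
    # quadratic inside the threshold, linear outside
    t = np.abs(np.asarray(t, dtype=float))
    return np.where(t <= mu, t ** 2, 2 * mu * t - mu ** 2)

def objective(x_b, y_b, z_b, A_y, A_z, Q, lam1=1.0, lam2=0.1, mu=1.0):
    """Single-band sketch of the Step 3 objective: Y'-consistency term,
    weighted Z'-consistency term, and Huber-penalized prior on Q x_b."""
    term_y = np.sum((y_b - A_y @ x_b) ** 2)   # consistency with y'
    term_z = np.sum((z_b - A_z @ x_b) ** 2)   # relationship with z'
    prior = np.sum(huber(Q @ x_b, mu))        # prior image term
    return term_y + lam1 * term_z + lam2 * prior
```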
Specific embodiment 5: this embodiment differs from specific embodiment 4 in that the detailed process of Step 4 is as follows:
The process of solving the objective function F(x) by the conjugate gradient algorithm is as follows:
Differentiating the objective function F(x) with respect to x_b gives:
∇F(x_b) = −2 Σ_{k=1}^{K} A_{y,k,b}^T (y′_{k,b} − A_{y,k,b} x_b) − 2 λ1 Σ_{n=1}^{N} ω_{n,b} A_{z,n,b}^T C_{n,b}^T ψ_{n,b}^T (z′_{n,b} − ψ_{n,b} C_{n,b} A_{z,n,b} x_b − τ_{n,b}) + λ2 Q^T ρ′(Q x_b) (17)
where ∇F(x_b) denotes the derivative of the objective function F with respect to x_b, A_{y,k,b}^T is the transpose of A_{y,k,b}, and C_{n,b}^T is the transpose of the spectral correlation matrix corresponding to the b-th band of the fused image x.
The estimated value of the b-th band of the fused image x is obtained by successive iteration:
x_{b,d+1} = x_{b,d} + θ_d e_{b,d} (19)
where x_{b,d+1} denotes the b-th band of the fused image x after the (d+1)-th iteration, e_{b,d} denotes the search direction of the d-th iteration, and θ_d is the iteration step size. The initial search direction is e_{b,1} = −∇F(x_b)_1, i.e., for the 1st iteration the search direction is the negative gradient; each subsequent search direction is related to the gradient of the current iteration and the previous search direction.
F(x_b)_d is the objective function corresponding to the d-th iteration. The search direction e_{b,d+1} of the (d+1)-th iteration is then:
e_{b,d+1} = −∇F(x_b)_{d+1} + γ_d e_{b,d} (20)
with the intermediate variable γ_d = ‖∇F(x_b)_{d+1}‖₂² / ‖∇F(x_b)_d‖₂².
Using the search direction e_{b,d+1} of the (d+1)-th iteration, the b-th band x_{b,d+2} of the fused image x after the (d+2)-th iteration is computed, and x_{b,d+2} is used to compute F(x_b)_{d+2}. In the same way, the objective function corresponding to each iteration is computed. The detailed process of computing the fused image x from the solution of the objective function F(x) is as follows: the result of each iteration is substituted into formula (16), and the estimated value of the b-th band of the fused image x corresponding to each iteration is computed, until |F(x_b)_{d+1} − F(x_b)_d| ≤ ζ, at which point the iteration stops and x_{b,d+1} is taken as the estimated value of the b-th band of the fused image x, where ζ is the iteration termination threshold.
Similarly, the estimated values of the remaining bands of the fused image x are computed, yielding the complete fused image x.
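The iteration of Step 4 can be sketched as a small Fletcher-Reeves conjugate gradient loop; the fixed step size θ and the quadratic test problem are simplifying assumptions of the sketch (in the method itself, θ_d would be chosen per iteration):

```python
import numpy as np

def conjugate_gradient(f, grad, x0, theta=0.1, zeta=1e-12, max_iter=1000):
    """Minimize f: the first search direction is the negative gradient;
    then x_{d+1} = x_d + theta * e_d (eq. (19)), and the next direction is
    e_{d+1} = -grad_{d+1} + gamma_d * e_d with
    gamma_d = ||grad_{d+1}||^2 / ||grad_d||^2; iteration stops when the
    change in the objective falls below the termination threshold zeta."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    e = -g
    fx = f(x)
    for _ in range(max_iter):
        x_new = x + theta * e
        f_new = f(x_new)
        if abs(f_new - fx) <= zeta:   # |F_{d+1} - F_d| <= zeta
            return x_new
        g_new = grad(x_new)
        gamma = (g_new @ g_new) / (g @ g + 1e-30)  # intermediate variable
        e = -g_new + gamma * e
        x, g, fx = x_new, g_new, f_new
    return x
```

On the quadratic test objective ‖x − 1‖², the loop converges to the all-ones vector.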