CN109285133A - Detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data - Google Patents
- Publication number
- CN109285133A CN109285133A CN201811142766.8A CN201811142766A CN109285133A CN 109285133 A CN109285133 A CN 109285133A CN 201811142766 A CN201811142766 A CN 201811142766A CN 109285133 A CN109285133 A CN 109285133A
- Authority
- CN
- China
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
Abstract
A detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data, belonging to the field of remote sensing image fusion. The invention addresses the problem that existing remote sensing image data fusion methods smooth out image detail together with image noise. The method first applies the Laplacian operator to edge-enhance Y and Z, so that image detail is retained while noise is smoothed. It then uses the spatial degradation model between the fused image and Y′ and the temporal-spectral relational model between the fused image and Z′ to compute the consistency constraint of Y′ on the fused image, the consistency constraint of Z′ on the fused image, and the description of the image spatial relationship, from which an objective function is obtained. Finally, the objective function is solved by a conjugate gradient algorithm, and the fused image is computed from the solution. Compared with conventional methods, the method of the invention retains more than 95% of image detail while smoothing out image noise. The invention is applicable to the field of remote sensing image fusion.
Description
Technical field
The invention belongs to the field of remote sensing image fusion, and in particular relates to a detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data.
Background technique
Existing remote sensing image fusion methods operate at three different levels: the data level, the feature level and the decision level. According to the fusion target, remote sensing image fusion can be divided into multi-view spatial fusion, spatial-spectral fusion and spatio-temporal fusion.
Most existing remote sensing image data fusion methods can only integrate two of the three indices of spatial, temporal and spectral resolution, and therefore cannot produce a fused image with the highest temporal, spatial and spectral resolution simultaneously. In addition, some fusion methods are designed to merge supplementary information from only one or two sensors, and cannot fully exploit complementary observations from a larger number of sensors.
Domestic scholars have contributed substantially to the development of integrated fusion theory. Beginning in 2010, Wuhan University carried out related research under the support of the National Natural Science Foundation of China; it was the first to propose the concept of spatio-temporal-spectral integrated fusion together with a fusion framework based on maximum a posteriori (MAP) probability theory, and achieved notable research results. The Chinese University of Hong Kong further explored the temporal, spatial and spectral correlation models of remote sensing image data and, likewise based on MAP estimation, proposed a new spatio-temporal-spectral integrated fusion method, which is however still limited to fusing data from two sensors. Meng et al. and Wu et al. made impressive progress on three-sensor fusion, but without comprehensively considering temporal, spatial and spectral characteristics. Meng et al. later proposed a unified fusion framework that can effectively integrate temporal, spatial and spectral complementary information while removing the limitation on the number of sensors; however, the prior model embedded in this method is a Gauss-Markov model, which, although it smooths image noise effectively, also smooths out image detail, degrading the detail of the fused image and producing a poor visual effect.
Summary of the invention
The purpose of the present invention is to solve the problem that existing remote sensing image data fusion methods smooth out image detail while smoothing image noise.
The technical solution adopted by the present invention to solve the above problem is as follows.
A detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data, comprising the following steps:
Step 1: use the Laplacian operator to edge-enhance the multi-view image Y and the auxiliary multi-source observation image Z, obtaining the enhanced multi-view image Y′ and the enhanced auxiliary multi-source observation image Z′;
Step 2: establish the spatial degradation model between the fused image x and Y′ and the temporal-spectral relational model between the fused image x and Z′, and compute the consistency constraint p(Y′|x) of Y′ on the fused image x, the consistency constraint p(Z′|x) of Z′ on the fused image x, and the description p(x) of the image spatial relationship;
Step 3: obtain the expression of the objective function F(x) from p(Y′|x), p(Z′|x) and p(x) computed in Step 2;
Step 4: solve the objective function F(x) by the conjugate gradient algorithm, and compute the fused image x from the solution of F(x).
The beneficial effects of the present invention are as follows. The invention provides a detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data. It first uses the Laplacian operator to edge-enhance the multi-view image and the auxiliary multi-source observation image, so that image detail is retained while image noise is smoothed. It then uses the spatial degradation model between the fused image and the enhanced multi-view image and the temporal-spectral relational model between the fused image and the enhanced auxiliary multi-source observation image to compute the consistency constraint of the enhanced multi-view image on the fused image, the consistency constraint of the enhanced auxiliary multi-source observation image on the fused image, and the description of the image spatial relationship, from which the objective function is obtained. Finally, the objective function is solved by the conjugate gradient algorithm, and the fused image is computed from the solution. Compared with conventional methods, the method of the invention retains more than 95% of image detail while smoothing out image noise.
Description of the drawings
Fig. 1 is a flowchart of the detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data of the invention.
Specific embodiments
Embodiment 1: as shown in Fig. 1, the detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data described in this embodiment comprises the following steps:
Step 1: use the Laplacian operator to edge-enhance the multi-view image Y and the auxiliary multi-source observation image Z, obtaining the enhanced multi-view image Y′ and the enhanced auxiliary multi-source observation image Z′;
Step 2: establish the spatial degradation model between the fused image x and Y′ and the temporal-spectral relational model between the fused image x and Z′, and compute the consistency constraint p(Y′|x) of Y′ on the fused image x, the consistency constraint p(Z′|x) of Z′ on the fused image x, and the description p(x) of the image spatial relationship;
Step 3: obtain the expression of the objective function F(x) from p(Y′|x), p(Z′|x) and p(x) computed in Step 2;
Step 4: solve the objective function F(x) by the conjugate gradient algorithm, and compute the fused image x from the solution of F(x).
Embodiment 2: this embodiment differs from Embodiment 1 in the detailed process of Step 1, which is as follows.
The multi-view image is Y = {y₁, ..., y_k, ..., y_K}, where y_k is the k-th observed image and K is the total number of multi-view images. To obtain a high-resolution fused image at time k, an observed image y_k must be available at the given time k. In general, the image Y has high spectral and temporal resolution but low spatial resolution, so Y can be regarded as a spatially degraded image; the image x obtained by the corresponding enhancement-fusion process is the image with higher spatial resolution.
The auxiliary multi-source observation image is Z = {z₁, z₂, ..., z_n, ..., z_N}, where z_n is the n-th image and N is the total number of auxiliary multi-source observation images. Z is usually a panchromatic or multispectral (MS) image, with high spatial resolution but low spectral or temporal resolution. The integrated fusion framework can carry out different types of fusion tasks, including multi-view spatial fusion, spatial-spectral fusion, spatio-temporal fusion and temporal-spectral fusion.
The Laplacian operator is a second-order linear differential operator. Compared with a first-order differential operator, second-order differentiation has stronger edge-localization ability and a better sharpening effect. Assume the multi-view image Y is a two-dimensional image with function expression f(x, y), where x and y are the abscissa and ordinate of the image. The Laplacian operator is defined as
∇²f = ∂²f/∂x² + ∂²f/∂y²   (1)
Since differential operators of any order are linear, the second-order differential operator, like the first-order one, can be evaluated by generating a template and convolving with it. From the definition of the difference,
∂²f/∂x² = f(x+1, y) + f(x−1, y) − 2f(x, y)   (2)
∂²f/∂y² = f(x, y+1) + f(x, y−1) − 2f(x, y)   (3)
Combining formulas (2) and (3) with the definition of the Laplacian gives
∇²f = f(x+1, y) + f(x−1, y) + f(x, y+1) + f(x, y−1) − 4f(x, y)   (4)
For programming purposes the Laplacian operator is expressed as the template
 0  1  0
 1 −4  1
 0  1  0
and its extended template (including the diagonal neighbours) is
 1  1  1
 1 −8  1
 1  1  1
As can be seen from the template form, if a bright point appears in a darker region of the image, applying the Laplacian operation makes that bright point brighter. Because edges in an image are exactly the regions where the grey level jumps, the Laplacian sharpening template plays an important role in edge-detail enhancement.
Finally, the weighted difference between each pixel and its neighbouring pixels is computed and added back to the original image to obtain the detail-enhanced image. The function expression g(x, y) of the enhanced multi-view image Y′ is then
g(x, y) = f(x, y) + c·∇²f(x, y)   (5)
where c is a weight coefficient whose value depends on the template defined above: c = −1 when the template centre value is negative, and c = 1 when it is positive.
The enhanced auxiliary multi-source observation image Z′ is obtained in the same way.
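As a concrete illustration of Step 1, the 3×3 Laplacian sharpening described above can be sketched in pure NumPy. The kernel with centre −4 and the sign convention c = −1 follow the description; the function and variable names are illustrative, not taken from the patent:

```python
import numpy as np

def laplacian_enhance(f, c=-1.0):
    """Edge-enhance a 2-D image: g(x, y) = f(x, y) + c * Laplacian(f).

    Uses the 3x3 template with centre -4 (so c = -1); edge pixels are
    handled by replicate padding.
    """
    f = f.astype(float)
    p = np.pad(f, 1, mode="edge")
    # discrete Laplacian: f(x+1,y) + f(x-1,y) + f(x,y+1) + f(x,y-1) - 4 f(x,y)
    lap = p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4.0 * f
    return f + c * lap
```

Applying the same function band by band to Y and Z would yield the enhanced images Y′ and Z′; a bright point in a dark region becomes brighter, as the description notes.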
Embodiment 3: this embodiment differs from Embodiment 2 in the detailed process of Step 2, which is as follows.
The spatial degradation model between the fused image x and Y′ is expressed as
y′_{k,b} = D S_{k,b} M_k x_b + v_{k,b},  1 ≤ b ≤ B_x, 1 ≤ k ≤ K   (6)
where y′_{k,b} denotes the degraded observed image of the b-th band of the k-th image of Y′, x_b is the b-th band of the fused image x, B_x is the total number of bands, M_k is the motion matrix, S_{k,b} is the optical blur matrix, D is the down-sampling matrix, and v_{k,b} is zero-mean Gaussian noise caused by the sensor and the external environment.
Simplifying formula (6), the spatial degradation model is written as
y′_{k,b} = A_{y,k,b} x_b + v_{k,b}   (7)
where A_{y,k,b} = D S_{k,b} M_k is the spatial degradation matrix between the fused image x and Y′.
The temporal-spectral relational model between the fused image x and Z′ is expressed as
z′_{n,q} = ψ_{n,q} C_{n,q} A_{z,n,q} x_q + τ_{n,q} + v_{n,q},  1 ≤ q ≤ B_{z,n}, 1 ≤ n ≤ N   (8)
where z′_{n,q} denotes the degraded observed image of the q-th band of the n-th image of Z′, x_q is the q-th band of the fused image x, B_{z,n} is the total number of bands, A_{z,n,q} is the spatial degradation matrix between the fused image x and Z′, C_{n,q} is the spectral correlation matrix, ψ_{n,q} is the temporal correlation matrix, τ_{n,q} is the temporal offset, and v_{n,q} is zero-mean sensor noise.
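The structure of the observation model y′ = D S M x + v in formula (7) can be made concrete with a small forward-simulation sketch. This is illustrative only: the motion matrix M_k is taken as the identity, the blur S as a simple box filter, and all names are invented, so it demonstrates the form of the degradation, not the patent's actual operators:

```python
import numpy as np

def spatial_degrade(x_band, scale=2, blur=3, noise_std=0.0, seed=0):
    """Simulate y' = D S M x + v for one band: blur (S), decimate (D),
    add zero-mean Gaussian sensor noise (v); motion M omitted (identity)."""
    f = x_band.astype(float)
    k = blur // 2
    p = np.pad(f, k, mode="edge")
    # S: box blur via a sliding-window mean
    blurred = np.zeros_like(f)
    for di in range(-k, k + 1):
        for dj in range(-k, k + 1):
            blurred += p[k + di : k + di + f.shape[0], k + dj : k + dj + f.shape[1]]
    blurred /= blur * blur
    y = blurred[::scale, ::scale]                    # D: down-sample by decimation
    if noise_std > 0:
        rng = np.random.default_rng(seed)
        y = y + rng.normal(0.0, noise_std, y.shape)  # v: sensor noise
    return y
```

Running the fused image through this operator and comparing with the actual observation is exactly the consistency that the data terms of the objective function enforce.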
The estimate x̂ of the fused image x is then expressed as
x̂ = argmax_x p(x | Y′, Z′)   (9)
where p(x | Y′, Z′) is the consistency constraint of the fused image x with respect to Y′ and Z′, and argmax_x denotes the value of x at which p(x | Y′, Z′) attains its maximum.
By the Bayesian formula,
p(x | Y′, Z′) ∝ p(Y′ | x) p(Z′ | x) p(x)   (10)
where p(Y′ | x) is the consistency constraint of Y′ on the fused image x, p(Z′ | x) is the consistency constraint of Z′ on the fused image x, and p(x) is the description of the image spatial relationship.
Assuming that the zero-mean Gaussian noise v_{k,b} caused by the sensor and the external environment follows a random Gaussian distribution, p(Y′ | x) is expressed as
p(Y′ | x) = ∏_{k=1}^{K} ∏_{b=1}^{B_x} (2π a_{y,k,b})^{−φ₁φ₂/2} exp( −‖y′_{k,b} − A_{y,k,b} x_b‖² / (2 a_{y,k,b}) )   (11)
where each factor p(y′_{k,b} | x_b) is the consistency constraint of the degraded observed image of the b-th band of the k-th image of Y′ on the b-th band of the fused image x, a_{y,k,b} is the variance of v_{k,b}, B_x is the number of spectral bands, φ₁φ₂ is the spatial dimension of y′_{k,b}, and ‖·‖ denotes the 2-norm.
Assuming that v_{n,q} follows a random Gaussian distribution, p(Z′ | x) is expressed as
p(Z′ | x) = ∏_{n=1}^{N} ∏_{q=1}^{B_{z,n}} (2π a_{z,n,q})^{−H₁H₂/2} exp( −‖z′_{n,q} − ψ_{n,q} C_{n,q} A_{z,n,q} x_q − τ_{n,q}‖² / (2 a_{z,n,q}) )   (13)
where each factor p(z′_{n,q} | x_q) is the consistency constraint of the q-th band of the n-th image of Z′ on the q-th band of the fused image x, a_{z,n,q} is the variance of the noise v_{n,q}, and H₁H₂ is the spatial dimension of z′_{n,q}.
The third probability density function, p(x), is the image prior knowledge, which describes the spatial relationship of the image. A Huber-Markov prior model is introduced here. Compared with the Gauss-Markov model, this model can suppress the instability effects introduced during the fusion process while keeping image edges, i.e., it enhances detail. Using the adaptive weighting of the three-dimensional spatial-spectral structure based on the Laplacian operator, p(x) is expressed by formula (15), where a_{x,b} is the variance of the random Gaussian noise v_{x,b}, L₁L₂ is the spatial dimension, ρ(·) is the Huber function, and Q_{x_b} is the adaptively weighted three-dimensional Laplacian operator matrix.
The initial estimate of the fused image is obtained by resampling, and β is a parameter given by formula (21), in which μ is a threshold parameter. If the fused image has only a few spectral bands, the spectral curve is discontinuous, and setting β = 0 is feasible. Conversely, if the required image is a hyperspectral image, the spectral curve can be assumed continuous; the adaptively weighted prior term can then effectively preserve the spectral curve and reduce spectral distortion. When the differences between spectral bands are small, the spectral constraint is stronger. The gradient term appearing in formula (21) denotes the gradient of the b-th band along the spectral dimension.
Embodiment 4: this embodiment differs from Embodiment 3 in the detailed process of Step 3, which is as follows.
Substituting formulas (11), (13) and (15) into formula (10) and processing according to the monotonicity of the logarithmic function, many constant parameters can be dropped, and the objective function F(x) can be expressed as a regularized minimization problem. The estimate x̂ of the fused image x is
x̂ = argmin_x F(x)   (16)
where argmin_x denotes the value of x at which F(x) attains its minimum.
In F(x), the first term expresses the consistency constraint between x and Y′, the second term the relationship between x and Z′, and the third term the image prior. λ₁ and λ₂ are the weight coefficients of the respective parts; they are related to the noise variances and adjust the relative weight of the three parts. ω_{n,q} denotes the contribution of z′_{n,q} to the fused image x and is computed adaptively as ω_{n,q} = λ′_{n,q} U_n, where U_n is obtained from the correlation between z′_n and x; the larger the correlation, the larger the contribution. The auxiliary parameter λ′_{n,q} adaptively balances the spatial detail of each band of the fused image and is expressed as
λ′_{n,q} = f(z′_{n,q}, x) / min[ f(z′_{1,2}, x), ..., f(z′_{1,q}, x), ..., f(z′_{n,q}, x) ]
where f(z′_{n,q}, x) denotes the number of bands of z′_{n,q} involved in the fusion.
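Under illustrative assumptions (the linear operators supplied as plain functions, a common Huber form for ρ, scalar weights, and invented names), the three-term structure of the regularized objective in formula (16) can be sketched as:

```python
import numpy as np

def objective(x, y_obs, A_y, z_obs, A_z, Q,
              lam1=1.0, lam2=1.0, mu=1.0, omega=1.0, tau=0.0):
    """F(x) = ||y' - A_y x||^2 + lam1 * omega * ||z' - A_z x - tau||^2
             + lam2 * sum(rho(Q x)),  with rho a Huber penalty.
    A_y, A_z, Q are callables standing in for the patent's matrices."""
    rho = lambda t: np.where(np.abs(t) <= mu, t ** 2, 2 * mu * np.abs(t) - mu ** 2)
    data_y = np.sum((y_obs - A_y(x)) ** 2)                # consistency with Y'
    data_z = omega * np.sum((z_obs - A_z(x) - tau) ** 2)  # relation to Z'
    prior = np.sum(rho(Q(x)))                             # Huber-Markov prior
    return data_y + lam1 * data_z + lam2 * prior
```

Increasing lam2 strengthens the smoothing prior, while lam1 and omega trade off how much each auxiliary observation contributes, mirroring the adaptive weights described above.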
Embodiment 5: this embodiment differs from Embodiment 4 in the detailed process of Step 4, which is as follows.
The objective function F(x) is solved by the conjugate gradient algorithm as follows.
Taking the derivative of the objective function F(x) with respect to x_b gives formula (18), in which ∇F(x_b) denotes the derivative of the objective function with respect to x_b, A^T_{y,k,b} is the transpose of A_{y,k,b}, and the remaining matrix is the spectral correlation matrix corresponding to the b-th band of the fused image x.
The estimate of the b-th band of the fused image x is obtained by the iteration
x_{b,d+1} = x_{b,d} + θ_d e_{b,d}   (19)
where x_{b,d+1} denotes the b-th band of the fused image x after the (d+1)-th iteration, e_{b,d} is the search direction of the d-th iteration, and θ_d is the iteration step length. The initial search direction is the negative gradient, i.e. for the first iteration the search direction is −∇F(x_{b,0}); each subsequent search direction is related to the current gradient and the previous search directions.
F(x_b)_d is the objective function value of the d-th iteration; the search direction e_{b,d+1} of the (d+1)-th iteration is given by formula (20), together with an intermediate variable.
Using the search direction e_{b,d+1} of the (d+1)-th iteration, the b-th band x_{b,d+2} of the fused image x after the (d+2)-th iteration is computed, and from x_{b,d+2} the value F(x_b)_{d+2} is computed. Continuing in this way, the objective function value of every iteration is obtained.
The fused image x is computed from the solution of F(x) as follows: the objective function value of each iteration is substituted into formula (16) to compute the estimate of the b-th band of the fused image x for that iteration; iteration stops once the termination criterion with threshold ζ is satisfied, where ζ is the iteration-termination threshold, and the current iterate is taken as the estimate of the b-th band of the fused image x.
The estimates of the remaining bands of the fused image x are computed in the same way, yielding the fused image x.
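For a quadratic objective, the conjugate gradient iteration of Step 4 reduces to classic linear CG. A minimal sketch follows (A symmetric positive definite; the names echo the patent's step length θ_d and search direction e_{b,d}, but the code is otherwise an illustrative textbook implementation, not the patent's solver):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Minimise F(x) = 0.5 x^T A x - b^T x via x_{d+1} = x_d + theta_d e_d.

    The first search direction is the negative gradient; each later
    direction combines the current negative gradient with the previous
    direction, and iteration stops at the termination threshold (zeta).
    """
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x            # negative gradient at x
    e = r.copy()             # initial search direction e_0
    for _ in range(max_iter):
        rr = r @ r
        if np.sqrt(rr) < tol:            # termination threshold (zeta)
            break
        Ae = A @ e
        theta = rr / (e @ Ae)            # step length theta_d
        x = x + theta * e                # x_{d+1} = x_d + theta_d e_d
        r = r - theta * Ae
        e = r + ((r @ r) / rr) * e       # next direction e_{d+1}
    return x
```

In the patent's setting, A and b would be assembled band by band from the degradation operators, weights and prior of formula (16), and the loop would be run once per band of the fused image.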
Claims (5)
1. A detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data, characterized in that the method comprises the following steps:
Step 1: use the Laplacian operator to edge-enhance the multi-view image Y and the auxiliary multi-source observation image Z, obtaining the enhanced multi-view image Y′ and the enhanced auxiliary multi-source observation image Z′;
Step 2: establish the spatial degradation model between the fused image x and Y′ and the temporal-spectral relational model between the fused image x and Z′, and compute the consistency constraint p(Y′|x) of Y′ on the fused image x, the consistency constraint p(Z′|x) of Z′ on the fused image x, and the description p(x) of the image spatial relationship;
Step 3: obtain the expression of the objective function F(x) from p(Y′|x), p(Z′|x) and p(x) computed in Step 2;
Step 4: solve the objective function F(x) by the conjugate gradient algorithm, and compute the fused image x from the solution of F(x).
2. The detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data according to claim 1, characterized in that the detailed process of Step 1 is as follows:
the multi-view image is Y = {y₁, y₂, ..., y_k, ..., y_K}, where y_k is the k-th observed image and K is the total number of multi-view images; the auxiliary multi-source observation image is Z = {z₁, z₂, ..., z_n, ..., z_N}, where z_n is the n-th image and N is the total number of auxiliary multi-source observation images;
the Laplacian operator is a second-order linear differential operator; assuming the multi-view image Y is a two-dimensional image with function expression f(x, y), the Laplacian operator is defined as
∇²f = ∂²f/∂x² + ∂²f/∂y²   (1)
from the definition of the difference,
∂²f/∂x² = f(x+1, y) + f(x−1, y) − 2f(x, y)   (2)
∂²f/∂y² = f(x, y+1) + f(x, y−1) − 2f(x, y)   (3)
combining formulas (2) and (3) with the definition of the Laplacian gives
∇²f = f(x+1, y) + f(x−1, y) + f(x, y+1) + f(x, y−1) − 4f(x, y)   (4)
the function expression g(x, y) of the enhanced multi-view image Y′ is then
g(x, y) = f(x, y) + c·∇²f(x, y)   (5)
where c is a weight coefficient;
the enhanced auxiliary multi-source observation image Z′ is obtained in the same way.
3. The detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data according to claim 2, characterized in that the detailed process of Step 2 is as follows:
the spatial degradation model between the fused image x and Y′ is expressed as
y′_{k,b} = D S_{k,b} M_k x_b + v_{k,b},  1 ≤ b ≤ B_x, 1 ≤ k ≤ K   (6)
where y′_{k,b} denotes the degraded observed image of the b-th band of the k-th image of Y′, x_b is the b-th band of the fused image x, B_x is the total number of bands, M_k is the motion matrix, S_{k,b} is the optical blur matrix, D is the down-sampling matrix, and v_{k,b} is zero-mean Gaussian noise caused by the sensor and the external environment;
simplifying formula (6), the spatial degradation model is written as
y′_{k,b} = A_{y,k,b} x_b + v_{k,b}   (7)
where A_{y,k,b} = D S_{k,b} M_k is the spatial degradation matrix between the fused image x and Y′;
the temporal-spectral relational model between the fused image x and Z′ is expressed as
z′_{n,q} = ψ_{n,q} C_{n,q} A_{z,n,q} x_q + τ_{n,q} + v_{n,q},  1 ≤ q ≤ B_{z,n}, 1 ≤ n ≤ N   (8)
where z′_{n,q} denotes the degraded observed image of the q-th band of the n-th image of Z′, x_q is the q-th band of the fused image x, B_{z,n} is the total number of bands, A_{z,n,q} is the spatial degradation matrix between the fused image x and Z′, C_{n,q} is the spectral correlation matrix, ψ_{n,q} is the temporal correlation matrix, τ_{n,q} is the temporal offset, and v_{n,q} is zero-mean sensor noise;
the estimate x̂ of the fused image x is then expressed as
x̂ = argmax_x p(x | Y′, Z′)   (9)
where p(x | Y′, Z′) is the consistency constraint of the fused image x with respect to Y′ and Z′, and argmax_x denotes the value of x at which p(x | Y′, Z′) attains its maximum;
by the Bayesian formula,
p(x | Y′, Z′) ∝ p(Y′ | x) p(Z′ | x) p(x)   (10)
where p(Y′ | x) is the consistency constraint of Y′ on the fused image x, p(Z′ | x) is the consistency constraint of Z′ on the fused image x, and p(x) is the description of the image spatial relationship;
assuming that the zero-mean Gaussian noise v_{k,b} caused by the sensor and the external environment follows a random Gaussian distribution, p(Y′ | x) is expressed as formula (11), where p(y′_{k,b} | x_b) is the consistency constraint of the degraded observed image of the b-th band of the k-th image of Y′ on the b-th band of the fused image x, a_{y,k,b} is the variance of the noise v_{k,b}, φ₁φ₂ is the spatial dimension of y′_{k,b}, and ‖·‖₂ denotes the 2-norm;
assuming that v_{n,q} follows a random Gaussian distribution, p(Z′ | x) is expressed as formula (13), where p(z′_{n,q} | x_q) is the consistency constraint of the q-th band of the n-th image of Z′ on the q-th band of the fused image x, a_{z,n,q} is the variance of the noise v_{n,q}, and H₁H₂ is the spatial dimension of z′_{n,q};
based on the adaptive weighting of the three-dimensional spatial-spectral structure using the Laplacian operator, p(x) is expressed as formula (15), where a_{x,b} is the variance of the random Gaussian noise v_{x,b}, L₁L₂ is the spatial dimension, ρ(·) is the Huber function, and Q_{x_b} is the adaptively weighted three-dimensional Laplacian operator matrix.
4. The detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data according to claim 3, characterized in that the detailed process of Step 3 is as follows:
substituting formulas (11), (13) and (15) into formula (10), the objective function F(x) is expressed as the regularized minimization problem (16), in which argmin_x denotes the value of x at which F(x) attains its minimum, i.e. the estimate x̂ of the fused image;
λ₁ and λ₂ denote the weight coefficients of the respective parts, and ω_{n,q} denotes the contribution of z′_{n,q} to the fused image x.
5. The detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data according to claim 4, characterized in that the detailed process of Step 4 is as follows:
taking the derivative of the objective function F(x) with respect to x_b gives formula (18), in which ∇F(x_b) denotes the derivative of the objective function with respect to x_b, A^T_{y,k,b} is the transpose of A_{y,k,b}, and the remaining matrix is the spectral correlation matrix corresponding to the b-th band of the fused image x;
the estimate of the b-th band of the fused image x is obtained by the iteration
x_{b,d+1} = x_{b,d} + θ_d e_{b,d}   (19)
where x_{b,d+1} denotes the b-th band of the fused image x after the (d+1)-th iteration, e_{b,d} is the search direction of the d-th iteration, the initial search direction is the negative gradient, and θ_d is the iteration step length;
F(x_b)_d is the objective function value of the d-th iteration; the search direction e_{b,d+1} of the (d+1)-th iteration is given by formula (20), together with an intermediate variable;
using the search direction e_{b,d+1} of the (d+1)-th iteration, the b-th band x_{b,d+2} of the fused image x after the (d+2)-th iteration is computed, and from x_{b,d+2} the value F(x_b)_{d+2} is computed;
continuing in this way, the objective function value of every iteration is obtained; the objective function value of each iteration is substituted into formula (16) to compute the estimate of the b-th band of the fused image x for that iteration; iteration stops once the termination criterion with threshold ζ is satisfied, where ζ is the iteration-termination threshold, and the current iterate is taken as the estimate of the b-th band of the fused image x;
the estimates of the remaining bands of the fused image x are computed in the same way, yielding the fused image x.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811142766.8A CN109285133A (en) | 2018-09-28 | 2018-09-28 | Detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811142766.8A CN109285133A (en) | 2018-09-28 | 2018-09-28 | Detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109285133A true CN109285133A (en) | 2019-01-29 |
Family
ID=65182514
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811142766.8A Pending CN109285133A (en) | 2018-09-28 | 2018-09-28 | Detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109285133A (en) |
- 2018-09-28: Application CN201811142766.8A filed in China (CN), published as CN109285133A; status: Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130011078A1 (en) * | 2011-02-03 | 2013-01-10 | Massachusetts Institute Of Technology | Hyper-Resolution Imaging |
CN102915529A (en) * | 2012-10-15 | 2013-02-06 | 黄波 | Integrated fusion technique and system based on remote sensing of time, space, spectrum and angle |
CN105809148A (en) * | 2016-03-29 | 2016-07-27 | 中国科学院遥感与数字地球研究所 | Crop drought recognition and risk evaluation method based on remote sensing time-space-spectrum fusion |
Non-Patent Citations (3)
Title |
---|
HUANFENG SHEN et al.: "An Integrated Framework for the Spatio-Temporal-Spectral Fusion of Remote Sensing Images", IEEE Transactions on Geoscience and Remote Sensing * |
WU YU: "Digital Image Processing", Beijing University of Posts and Telecommunications Press, 31 October 2017 * |
MENG XIANGCHAO: "Variational fusion methods for multi-source spatial-temporal-spectral optical remote sensing images", China Doctoral Dissertations Full-text Database, Basic Sciences * |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110751036A (en) * | 2019-09-17 | 2020-02-04 | 宁波大学 | High spectrum/multi-spectrum image fast fusion method based on sub-band and blocking strategy |
CN111742329A (en) * | 2020-05-15 | 2020-10-02 | 安徽中科智能感知产业技术研究院有限责任公司 | Mining typical ground object dynamic monitoring method and platform based on multi-source remote sensing data fusion and deep neural network |
CN111742329B (en) * | 2020-05-15 | 2023-09-12 | 安徽中科智能感知科技股份有限公司 | Mining typical feature dynamic monitoring method and platform based on multi-source remote sensing data fusion and deep neural network |
CN111882511A (en) * | 2020-07-09 | 2020-11-03 | 广东海洋大学 | Multisource remote sensing image fusion method based on integral enhanced gradient iterative neural network |
CN112017135B (en) * | 2020-07-13 | 2021-09-21 | 香港理工大学深圳研究院 | Method, system and equipment for spatial-temporal fusion of remote sensing image data |
CN112017135A (en) * | 2020-07-13 | 2020-12-01 | 香港理工大学深圳研究院 | Method, system and equipment for spatial-temporal fusion of remote sensing image data |
CN112767292A (en) * | 2021-01-05 | 2021-05-07 | 同济大学 | Geographical weighting spatial mixed decomposition method for space-time fusion |
CN112767292B (en) * | 2021-01-05 | 2022-09-16 | 同济大学 | Geographic weighting spatial hybrid decomposition method for space-time fusion |
CN112906577A (en) * | 2021-02-23 | 2021-06-04 | 清华大学 | Fusion method of multi-source remote sensing image |
CN112906577B (en) * | 2021-02-23 | 2024-04-26 | 清华大学 | Fusion method of multisource remote sensing images |
CN113627357A (en) * | 2021-08-13 | 2021-11-09 | 哈尔滨工业大学 | High-spatial-high-spectral-resolution intrinsic decomposition method and system for remote sensing image |
CN115000107A (en) * | 2022-06-02 | 2022-09-02 | 广州睿芯微电子有限公司 | Multispectral imaging chip, multispectral imaging component, preparation method and mobile terminal |
CN115204314A (en) * | 2022-08-12 | 2022-10-18 | 西南交通大学 | Multi-source data fusion method based on vehicle-mounted OBU and vehicle-mounted OBU |
CN115204314B (en) * | 2022-08-12 | 2023-05-30 | 西南交通大学 | Multi-source data fusion method based on vehicle-mounted OBU and vehicle-mounted OBU |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109285133A (en) | A kind of remote sensing image data Temporal Spectral integral fusion method of details enhancing | |
CN109859147B (en) | Real image denoising method based on generation of antagonistic network noise modeling | |
Riegler et al. | A deep primal-dual network for guided depth super-resolution | |
CN107123089A (en) | Remote sensing images super-resolution reconstruction method and system based on depth convolutional network | |
CN114119444B (en) | Multi-source remote sensing image fusion method based on deep neural network | |
CN110210524B (en) | Training method of image enhancement model, image enhancement method and device | |
Lepcha et al. | A deep journey into image enhancement: A survey of current and emerging trends | |
CN104835130A (en) | Multi-exposure image fusion method | |
CN107507135A (en) | Image reconstructing method based on coding aperture and target | |
CN103530848A (en) | Double exposure implementation method for inhomogeneous illumination image | |
CN112837224A (en) | Super-resolution image reconstruction method based on convolutional neural network | |
CN103559684B (en) | Based on the image recovery method of smooth correction | |
AU2013258866B2 (en) | Reducing the dynamic range of image data | |
CN113870124B (en) | Weak supervision-based double-network mutual excitation learning shadow removing method | |
CN117197627B (en) | Multi-mode image fusion method based on high-order degradation model | |
CN104732492A (en) | Depth image denoising method | |
Li et al. | A self-learning image super-resolution method via sparse representation and non-local similarity | |
CN112184646A (en) | Image fusion method based on gradient domain oriented filtering and improved PCNN | |
Liu et al. | Facial image inpainting using attention-based multi-level generative network | |
CN110060226B (en) | Adaptive image fusion method based on human visual gradient transformation and total variation parameters | |
CN112927164B (en) | No-reference low-illumination image enhancement method based on deep convolutional neural network | |
CN112862684A (en) | Data processing method for depth map super-resolution reconstruction and denoising neural network | |
CN116452431A (en) | Weak light image enhancement method based on multi-branch progressive depth network | |
Diana Earshia et al. | A guided optimized recursive least square adaptive filtering based multi-variate dense fusion network model for image interpolation | |
Liu et al. | Research on image enhancement algorithm based on artificial intelligence |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20190129 |