CN109285133A - Detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data - Google Patents

Detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data Download PDF

Info

Publication number
CN109285133A
CN109285133A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811142766.8A
Other languages
Chinese (zh)
Inventor
林连雷
王建峰
杨京礼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN201811142766.8A priority Critical patent/CN109285133A/en
Publication of CN109285133A publication Critical patent/CN109285133A/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T5/70
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image

Abstract

A detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data, belonging to the field of remote sensing image fusion. The invention solves the problem that existing remote sensing image data fusion methods smooth out image detail information while smoothing image noise. The method first applies the Laplace operator to edge-enhance Y and Z, so that image detail information is retained while image noise is smoothed. It then uses the spatial degradation model between the fused image and Y′ and the spatio-temporal-spectral relation model between the fused image and Z′ to compute the consistency constraint of Y′ on the fused image, the consistency constraint of Z′ on the fused image, and the description of the image spatial relationship, from which the objective function is obtained. Finally, the objective function is solved by the conjugate gradient algorithm, and the fused image is computed from the solution. Compared with conventional methods, the method of the invention retains more than 95% of the image detail information while smoothing out image noise. The invention is applicable to the field of remote sensing image fusion.

Description

Detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data
Technical field
The invention belongs to the field of remote sensing image fusion, and in particular relates to a detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data.
Background technique
Existing remote sensing image fusion methods operate at three different levels: the data level, the feature level and the decision level. According to the fusion target, remote sensing image fusion can be divided into multi-view spatial fusion, spatial-spectral fusion and spatio-temporal fusion.
Most existing remote sensing image data fusion methods can integrate only two of the three indices of spatial, temporal and spectral resolution, and therefore cannot produce a fused image with the highest temporal, spatial and spectral resolution simultaneously. In addition, some methods are designed to merge the complementary information of only one or two sensors and cannot fully exploit complementary observations from more sensors. Domestic scholars have made major contributions to the development of integrated fusion theory. Starting in 2010, Wuhan University carried out related research under the support of the National Natural Science Foundation of China, first proposed the concept of spatio-temporal-spectral integrated fusion together with a fusion framework based on maximum a posteriori (MAP) probability theory, and achieved notable research results. The Chinese University of Hong Kong further explored the temporal, spatial and spectral correlation models of remote sensing image data and, likewise based on MAP estimation, proposed a new spatio-temporal-spectral integrated fusion method, which is however still limited to the fusion of two sensor data sources. Meng et al. and Wu et al. made impressive progress on three-sensor fusion, but without comprehensively considering temporal, spatial and spectral features. Meng et al. later proposed a unified fusion framework that effectively integrates temporal, spatial and spectral complementary information and removes the limitation on the number of sensors; however, the prior embedded in this method is a Gauss-Markov model, which, while effectively smoothing image noise, also smooths out image detail information, degrading the detail of the fused image and its visual quality.
Summary of the invention
The purpose of the present invention is to solve the problem that existing remote sensing image data fusion methods smooth out image detail information while smoothing image noise.
The technical solution adopted by the present invention to solve the above technical problem is as follows:
A detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data, the method comprising the following steps:
Step 1: apply the Laplace operator to edge-enhance the multi-view image Y and the auxiliary multi-source observation image Z, obtaining the enhanced multi-view image Y′ and the enhanced auxiliary multi-source observation image Z′;
Step 2: establish the spatial degradation model between the fused image x and Y′ and the spatio-temporal-spectral relation model between the fused image x and Z′, and compute the consistency constraint p(Y′ | x) of Y′ on the fused image x, the consistency constraint p(Z′ | x) of Z′ on the fused image x, and the description p(x) of the image spatial relationship;
Step 3: obtain the expression of the objective function F(x) from p(Y′ | x), p(Z′ | x) and p(x) computed in Step 2;
Step 4: solve the objective function F(x) by the conjugate gradient algorithm, and compute the fused image x from the solution of F(x).
The beneficial effects of the present invention are as follows. The invention provides a detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data. The method first applies the Laplace operator to edge-enhance the multi-view image and the auxiliary multi-source observation image, so image detail information is retained while image noise is smoothed. It then uses the spatial degradation model between the fused image and the enhanced multi-view image and the spatio-temporal-spectral relation model between the fused image and the enhanced auxiliary multi-source observation image to compute the consistency constraint of the enhanced multi-view image on the fused image, the consistency constraint of the auxiliary multi-source observation image on the fused image, and the description of the image spatial relationship, from which the objective function is further obtained. Finally, the objective function is solved by the conjugate gradient algorithm, and the fused image is computed from the solution. Compared with conventional methods, the method of the invention retains more than 95% of the image detail information while smoothing out image noise.
Detailed description of the invention
Fig. 1 is a flow diagram of the detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data of the invention.
Specific embodiment
Embodiment 1: as shown in Fig. 1, the detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data described in this embodiment comprises the following steps:
Step 1: apply the Laplace operator to edge-enhance the multi-view image Y and the auxiliary multi-source observation image Z, obtaining the enhanced multi-view image Y′ and the enhanced auxiliary multi-source observation image Z′;
Step 2: establish the spatial degradation model between the fused image x and Y′ and the spatio-temporal-spectral relation model between the fused image x and Z′, and compute the consistency constraint p(Y′ | x) of Y′ on the fused image x, the consistency constraint p(Z′ | x) of Z′ on the fused image x, and the description p(x) of the image spatial relationship;
Step 3: obtain the expression of the objective function F(x) from p(Y′ | x), p(Z′ | x) and p(x) computed in Step 2;
Step 4: solve the objective function F(x) by the conjugate gradient algorithm, and compute the fused image x from the solution of F(x).
Embodiment 2: this embodiment differs from Embodiment 1 in the detailed process of Step 1, which is as follows.
The multi-view image is Y = {y_1, ..., y_k, ..., y_K}, where y_k is the k-th observed image and K is the total number of multi-view images. To obtain a high-resolution fused image at time k, an observed image y_k must be given at each of the K times. In general, the image Y has high spectral and temporal resolution but low spatial resolution, so Y can be regarded as a spatially degraded image; the x obtained by the corresponding enhancing fusion process is an image with higher spatial resolution.
The auxiliary multi-source observation images are Z = {z_1, z_2, ..., z_n, ..., z_N}, where z_n denotes the n-th image and N is the total number of auxiliary multi-source observation images. Each is usually a panchromatic or multispectral (MS) image with higher spatial resolution but lower spectral or temporal resolution. The integrated fusion framework can complete different types of fusion task, including multi-view spatial fusion, spatial-spectral fusion, spatio-temporal fusion and spatio-temporal-spectral fusion.
The Laplace operator (Laplacian) is a second-order linear differential operator; compared with first-order differential operators, second-order differentiation has a stronger edge-localisation ability and a better sharpening effect. Assume the multi-view image Y is a two-dimensional image with function expression f(x, y), where x and y are the abscissa and ordinate of the two-dimensional image. The Laplace operator is defined as:
∇²f = ∂²f/∂x² + ∂²f/∂y²  (1)
Since differential operators of any order are linear, the second-order differential operator (like the first-order one) can be realised by generating a template and then convolving with it. From the definition of the differential:
∂²f/∂x² = f(x+1, y) + f(x−1, y) − 2f(x, y)  (2)
∂²f/∂y² = f(x, y+1) + f(x, y−1) − 2f(x, y)  (3)
Combining formulas (2) and (3) with the definition of the Laplace operator gives:
∇²f = f(x+1, y) + f(x−1, y) + f(x, y+1) + f(x, y−1) − 4f(x, y)  (4)
For programming purposes the Laplace operator is expressed in template form as
[0 1 0; 1 −4 1; 0 1 0],
and the expanded template of the Laplace operator is
[1 1 1; 1 −8 1; 1 1 1].
From the template form it can be seen that if a bright spot appears in a darker region of the image, applying the Laplace operator makes this bright spot brighter. Because the edges in an image are precisely the regions where the grey level jumps, the Laplacian sharpening template plays a major role in edge-detail enhancement.
Finally, the weighted difference between each pixel and its neighbouring pixels is added back to the original image to obtain the detail-enhanced image. The function expression g(x, y) of the enhanced multi-view image Y′ is then:
g(x, y) = f(x, y) + c∇²f(x, y)  (5)
where c is a weight coefficient whose value depends on the template defined above: c = −1 when the template centre coefficient is negative, and c = 1 when it is positive.
The enhanced auxiliary multi-source observation image Z′ is obtained in the same way.
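As a concrete illustration of the template convolution and the sharpening rule of formula (5), the following Python sketch applies the 4-neighbour Laplacian template; the function names, NumPy implementation and edge-replicated padding are illustrative assumptions, not part of the patent.

```python
import numpy as np

# 4-neighbour Laplacian template (centre -4); the expanded template would
# replace it with [[1, 1, 1], [1, -8, 1], [1, 1, 1]].
LAPLACE_KERNEL = np.array([[0,  1, 0],
                           [1, -4, 1],
                           [0,  1, 0]], dtype=float)

def laplacian(img):
    """Convolve img with the Laplacian template (edge-replicated padding)."""
    padded = np.pad(img, 1, mode="edge")
    out = np.empty(img.shape, dtype=float)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i + 3, j:j + 3] * LAPLACE_KERNEL)
    return out

def sharpen(img, c=-1.0):
    """g(x, y) = f(x, y) + c * Laplacian(f); c = -1 for the negative-centre
    template above, cf. formula (5)."""
    return img + c * laplacian(img)
```

With c = −1, an isolated unit-valued bright pixel in a dark region maps to a larger value, matching the qualitative behaviour described above; flat regions are left unchanged.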
Embodiment 3: this embodiment differs from Embodiment 2 in the detailed process of Step 2, which is as follows.
The spatial degradation model between the fused image x and Y′ is expressed as follows:
y′_{k,b} = D S_{k,b} M_k x_b + v_{k,b}, 1 ≤ b ≤ B_x, 1 ≤ k ≤ K  (6)
where y′_{k,b} denotes the degraded observed image of the b-th band of the k-th image of Y′, x_b denotes the b-th band of the fused image x, B_x is the total number of bands, M_k denotes the motion matrix, S_{k,b} denotes the optical blurring matrix, D is the down-sampling matrix, and v_{k,b} is the zero-mean Gaussian noise caused by the sensor and the external environment.
Simplifying formula (6), the spatial degradation model is expressed as:
y′_{k,b} = A_{y,k,b} x_b + v_{k,b}  (7)
where A_{y,k,b} = D S_{k,b} M_k is the spatial degradation matrix between the fused image x and Y′.
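To make formula (7) concrete, the sketch below builds a toy composite operator A_{y,k,b} = D S_{k,b} M_k for a 4×4 fused band observed at 2× lower resolution; the identity motion/blur matrices, the block-averaging form of D and all sizes are illustrative assumptions rather than the patent's actual operators.

```python
import numpy as np

n_side, m_side = 4, 2          # fused band is 4x4, observed band is 2x2
n, m = n_side ** 2, m_side ** 2

M = np.eye(n)                  # motion matrix M_k (identity: already registered)
S = np.eye(n)                  # optical blurring matrix S_{k,b} (identity here)

# Down-sampling matrix D: each observed pixel averages one 2x2 block.
D = np.zeros((m, n))
for bi in range(m_side):
    for bj in range(m_side):
        for di in range(2):
            for dj in range(2):
                D[bi * m_side + bj, (2 * bi + di) * n_side + (2 * bj + dj)] = 0.25

A = D @ S @ M                  # composite spatial degradation operator A_{y,k,b}

rng = np.random.default_rng(0)
x_b = rng.random(n)            # band b of the fused image, flattened
v = rng.normal(0.0, 0.01, m)   # zero-mean Gaussian sensor noise v_{k,b}
y_obs = A @ x_b + v            # formula (7): y'_{k,b} = A_{y,k,b} x_b + v_{k,b}
```

Each row of D averages one non-overlapping 2×2 block, so a constant image is down-sampled to the same constant.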
The spatio-temporal-spectral relation model between the fused image x and Z′ is expressed as follows:
z′_{n,q} = ψ_{n,q} C_{n,q} A_{z,n,q} x_q + τ_{n,q} + v_{n,q}, 1 ≤ q ≤ B_{z,n}, 1 ≤ n ≤ N  (8)
where z′_{n,q} denotes the degraded observed image of the q-th band of the n-th image of Z′, x_q denotes the q-th band of the fused image x, B_{z,n} is the total number of bands, A_{z,n,q} denotes the spatial degradation matrix between the fused image x and Z′, C_{n,q} denotes the spectral correlation matrix, ψ_{n,q} is the temporal correlation matrix, τ_{n,q} is the temporal offset, and v_{n,q} is the zero-mean sensor noise.
The estimate x̂ of the fused image x is then expressed as follows:
x̂ = arg max_x p(x | Y′, Z′)  (9)
where p(x | Y′, Z′) represents the consistency constraint of the fused image x with respect to Y′ and Z′, and arg max_x denotes the value of x at which p(x | Y′, Z′) is maximised.
According to the Bayesian formula:
p(x | Y′, Z′) ∝ p(Y′ | x) p(Z′ | x) p(x)  (10)
where p(Y′ | x) denotes the consistency constraint of Y′ on the fused image x, p(Z′ | x) denotes the consistency constraint of Z′ on the fused image x, and p(x) is the description of the image spatial relationship.
Assuming that the zero-mean Gaussian noise v_{k,b} caused by the sensor and the external environment follows a random Gaussian distribution, p(Y′ | x) is expressed as follows:
p(Y′ | x) = ∏_{k=1}^{K} ∏_{b=1}^{B_x} p(y′_{k,b} | x_b) = ∏_{k=1}^{K} ∏_{b=1}^{B_x} (2π a_{y,k,b})^{−φ1φ2/2} exp(−||y′_{k,b} − A_{y,k,b} x_b||_2² / (2 a_{y,k,b}))  (11)
where p(y′_{k,b} | x_b) denotes the consistency constraint of the degraded observed image of the b-th band of the k-th image of Y′ on the b-th band of the fused image x, a_{y,k,b} is the variance of v_{k,b}, B_x is the number of spectral bands, φ1φ2 denotes the spatial dimension of y′_{k,b}, and ||·||_2 denotes the 2-norm.
Assuming that v_{n,q} follows a random Gaussian distribution, p(Z′ | x) is expressed as follows:
p(Z′ | x) = ∏_{n=1}^{N} ∏_{q=1}^{B_{z,n}} p(z′_{n,q} | x_q) = ∏_{n=1}^{N} ∏_{q=1}^{B_{z,n}} (2π a_{z,n,q})^{−H1H2/2} exp(−||z′_{n,q} − ψ_{n,q} C_{n,q} A_{z,n,q} x_q − τ_{n,q}||_2² / (2 a_{z,n,q}))  (13)
where p(z′_{n,q} | x_q) denotes the consistency constraint of the q-th band of the n-th image of Z′ on the q-th band of the fused image x, a_{z,n,q} denotes the variance of the noise v_{n,q}, and H1H2 denotes the spatial dimension of z′_{n,q}.
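Under the Gaussian noise assumption, each per-band consistency term reduces, up to a normalising constant, to a squared residual scaled by the noise variance. A minimal sketch, assuming a generic linear observation operator A (function name and signature are illustrative):

```python
import numpy as np

def gaussian_loglike(y_obs, A, x_b, var):
    """Log of a per-band Gaussian consistency term such as p(y'_{k,b} | x_b),
    dropping the normalising constant: -||y' - A x||^2 / (2 * var)."""
    r = y_obs - A @ x_b
    return -0.5 * float(r @ r) / var
```

An exact fit (zero residual) gives log-likelihood 0; any mismatch lowers it, and a larger noise variance penalises the same residual less.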
The third probability density function p(x) is the image prior knowledge, used to describe the spatial relationship of the image. The Huber-Markov prior model is introduced here. Compared with the Gauss-Markov model, this model eliminates the instability brought by noise during fusion while keeping image edges, i.e., it enhances detail. With the adaptive weighting of the three-dimensional spatial spectrum based on the Laplace operator, p(x) is expressed as:
p(x) ∝ ∏_{b=1}^{B_x} exp(−ρ(Q x_b) / (2 a_{x,b}))  (15)
where a_{x,b} is the variance of the random Gaussian noise v_{x,b}, L1L2 denotes the spatial dimension, ρ(·) is the Huber function, and Q x_b denotes the adaptively weighted three-dimensional Laplace operator matrix applied to x_b.
The initial estimate of the fused image is obtained by resampling; β is a parameter defined by formula (21), in which μ is a threshold parameter. If the fused image has only a few spectral bands, the spectral curve is discontinuous, and setting β = 0 is feasible. Conversely, if the required image is a hyperspectral image, the spectral curve can be assumed continuous; the adaptively weighted prior term then effectively preserves the spectral curve and reduces spectral distortion. If the differences between spectral bands are assumed very small, the spectral constraint is stronger. The remaining symbol in formula (21) denotes the gradient of the spectral dimension of the b-th band.
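The Huber function ρ(·) behind the prior can be written down directly; the threshold argument mirrors the role of the threshold parameter μ above. A sketch using the standard piecewise Huber definition (assumed here, not quoted from the patent):

```python
import numpy as np

def huber(t, mu):
    """Standard Huber function: quadratic for |t| <= mu, linear beyond.
    Large gradients (edges) are penalised only linearly, which is why a
    Huber-Markov prior keeps edges that a Gauss-Markov (fully quadratic)
    prior would smooth away."""
    t = np.asarray(t, dtype=float)
    return np.where(np.abs(t) <= mu, t ** 2, 2.0 * mu * np.abs(t) - mu ** 2)
```

The two pieces agree at |t| = μ, so the penalty is continuous; beyond the threshold its slope stays constant instead of growing linearly as a quadratic penalty's would.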
Embodiment 4: this embodiment differs from Embodiment 3 in the detailed process of Step 3, which is as follows.
Substituting formulas (11), (13) and (15) into formula (10) and simplifying according to the monotonicity of the logarithmic function (many constant parameters can be dropped), the objective function F(x) is expressed as a regularised minimisation problem, and the estimate x̂ of the fused image x is:
x̂ = arg min_x F(x),
F(x) = Σ_{k=1}^{K} Σ_{b=1}^{B_x} ||y′_{k,b} − A_{y,k,b} x_b||_2² + λ1 Σ_{n=1}^{N} Σ_{q=1}^{B_{z,n}} ω_{n,q} ||z′_{n,q} − ψ_{n,q} C_{n,q} A_{z,n,q} x_q − τ_{n,q}||_2² + λ2 Σ_{b=1}^{B_x} ρ(Q x_b)  (16)
where arg min_x denotes the value of x at which F(x) is minimised, i.e. the estimate x̂.
The first term expresses the consistency constraint between x and Y′, the second term the relation between x and Z′, and the third term the image prior. λ1 and λ2 denote the weight coefficients of the respective terms; they are related to the noise variances and adjust the relative weight of the three parts. ω_{n,q} denotes the contribution of z′_{n,q} to the fused image x and is computed adaptively as ω_{n,q} = λ′_{n,q} U_n, where U_n is obtained from the correlation between z′_n and x, on the assumption that a larger correlation implies a larger contribution. The auxiliary parameter λ′_{n,q}, which adaptively adjusts the spatial-detail balance of each band of the fused image, is expressed as:
λ′_{n,q} = f(z′_{n,q}, x) / min[f(z′_{1,2}, x), ..., f(z′_{1,q}, x), ..., f(z′_{n,q}, x)]
where f(z′_{n,q}, x) denotes the number of bands of the fused z′_{n,q}.
Embodiment 5: this embodiment differs from Embodiment 4 in the detailed process of Step 4, which is as follows.
The process of solving the objective function F(x) by the conjugate gradient algorithm is as follows.
Taking the derivative of F(x) with respect to x_b gives formula (18), in which ∇F(x_b) denotes the derivative of the objective function F(x_b), the transpose of A_{y,k,b} appears, and the spectral correlation matrix corresponding to the b-th band of the fused image x appears.
The estimate of the b-th band of the fused image x is obtained by successive iterations:
x_{b,d+1} = x_{b,d} + θ_d e_{b,d}  (19)
where x_{b,d+1} denotes the b-th band of the fused image x after the (d+1)-th iteration and e_{b,d} denotes the search direction of the d-th iteration. The initial search direction is the negative gradient, e_{b,0} = −∇F(x_b)_0, i.e. for the first iteration the search direction is the negative derivative of the objective function; each subsequent search direction is related to the search direction of the current iteration and of the previous ones; θ_d is the iteration step length.
F(x_b)_d is the objective function of the d-th iteration. The search direction of the (d+1)-th iteration is then:
e_{b,d+1} = −∇F(x_b)_{d+1} + β_d e_{b,d}  (20)
with intermediate variable β_d = ||∇F(x_b)_{d+1}||_2² / ||∇F(x_b)_d||_2².
Using the search direction e_{b,d+1} of the (d+1)-th iteration, compute the b-th band x_{b,d+2} of the fused image x after the (d+2)-th iteration, and use x_{b,d+2} to compute F(x_b)_{d+2}.
Proceeding in this way, the objective function of each iteration is computed. The detailed process of computing the fused image x from the solution of the objective function F(x) is: substitute the objective function of each iteration into formula (16) and compute the estimate of the b-th band of the fused image x corresponding to each iteration, until the termination condition with iteration-termination threshold ζ is satisfied; the iteration then stops and the current iterate is taken as the estimate of the b-th band of the fused image x.
The estimates of the remaining bands of the fused image x are computed in the same way, yielding the fused image x.
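For the quadratic part of the objective (dropping the Huber term), the iteration of formulas (19)-(20) reduces to classical conjugate gradient on the normal equations. The sketch below is such a simplification under stated assumptions, not the patent's full solver; the matrix names, the Fletcher-Reeves coefficient and the residual-norm stopping rule are assumptions.

```python
import numpy as np

def solve_band(A, y, lam, Q, max_iter=500, zeta=1e-12):
    """Minimise ||A x - y||^2 + lam * ||Q x||^2 for one band by conjugate
    gradient, i.e. solve the normal equations H x = b with
    H = A^T A + lam * Q^T Q and b = A^T y."""
    H = A.T @ A + lam * (Q.T @ Q)
    b = A.T @ y
    x = np.zeros(H.shape[0])
    r = b - H @ x                 # residual = -(1/2) gradient of the objective
    e = r.copy()                  # first search direction: steepest descent
    for _ in range(max_iter):
        He = H @ e
        theta = (r @ r) / (e @ He)        # step length theta_d
        x = x + theta * e                 # update, cf. formula (19)
        r_new = r - theta * He
        if np.linalg.norm(r_new) < zeta:  # termination threshold, cf. zeta
            break
        beta = (r_new @ r_new) / (r @ r)  # Fletcher-Reeves coefficient
        e = r_new + beta * e              # new direction, cf. formula (20)
        r = r_new
    return x
```

Each band x_b is solved independently in this simplification; looping the solver over bands reassembles the fused image.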

Claims (5)

1. A detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data, characterised in that the method comprises the following steps:
Step 1: applying the Laplace operator to edge-enhance the multi-view image Y and the auxiliary multi-source observation image Z, obtaining the enhanced multi-view image Y′ and the enhanced auxiliary multi-source observation image Z′;
Step 2: establishing the spatial degradation model between the fused image x and Y′ and the spatio-temporal-spectral relation model between the fused image x and Z′, and computing the consistency constraint p(Y′ | x) of Y′ on the fused image x, the consistency constraint p(Z′ | x) of Z′ on the fused image x, and the description p(x) of the image spatial relationship;
Step 3: obtaining the expression of the objective function F(x) from p(Y′ | x), p(Z′ | x) and p(x) computed in Step 2;
Step 4: solving the objective function F(x) by the conjugate gradient algorithm, and computing the fused image x from the solution of F(x).
2. The detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data according to claim 1, characterised in that the detailed process of Step 1 is as follows:
the multi-view image is Y = {y_1, y_2, ..., y_k, ..., y_K}, where y_k is the k-th observed image and K is the total number of multi-view images; the auxiliary multi-source observation images are Z = {z_1, z_2, ..., z_n, ..., z_N}, where z_n denotes the n-th image and N is the total number of auxiliary multi-source observation images;
the Laplace operator is a second-order linear differential operator; assuming the multi-view image Y is a two-dimensional image with function expression f(x, y), the Laplace operator is defined as:
∇²f = ∂²f/∂x² + ∂²f/∂y²  (1)
from the definition of the differential:
∂²f/∂x² = f(x+1, y) + f(x−1, y) − 2f(x, y)  (2)
∂²f/∂y² = f(x, y+1) + f(x, y−1) − 2f(x, y)  (3)
combining formulas (2) and (3) with the definition of the Laplace operator gives:
∇²f = f(x+1, y) + f(x−1, y) + f(x, y+1) + f(x, y−1) − 4f(x, y)  (4)
the function expression g(x, y) of the enhanced multi-view image Y′ is then:
g(x, y) = f(x, y) + c∇²f(x, y)  (5)
where c is a weight coefficient;
the enhanced auxiliary multi-source observation image Z′ is obtained in the same way.
3. The detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data according to claim 2, characterised in that the detailed process of Step 2 is as follows:
the spatial degradation model between the fused image x and Y′ is expressed as follows:
y′_{k,b} = D S_{k,b} M_k x_b + v_{k,b}, 1 ≤ b ≤ B_x, 1 ≤ k ≤ K  (6)
where y′_{k,b} denotes the degraded observed image of the b-th band of the k-th image of Y′, x_b denotes the b-th band of the fused image x, B_x is the total number of bands, M_k denotes the motion matrix, S_{k,b} denotes the optical blurring matrix, D is the down-sampling matrix, and v_{k,b} is the zero-mean Gaussian noise caused by the sensor and the external environment;
simplifying formula (6), the spatial degradation model is expressed as:
y′_{k,b} = A_{y,k,b} x_b + v_{k,b}  (7)
where A_{y,k,b} = D S_{k,b} M_k is the spatial degradation matrix between the fused image x and Y′;
the spatio-temporal-spectral relation model between the fused image x and Z′ is expressed as follows:
z′_{n,q} = ψ_{n,q} C_{n,q} A_{z,n,q} x_q + τ_{n,q} + v_{n,q}, 1 ≤ q ≤ B_{z,n}, 1 ≤ n ≤ N  (8)
where z′_{n,q} denotes the degraded observed image of the q-th band of the n-th image of Z′, x_q denotes the q-th band of the fused image x, B_{z,n} is the total number of bands, A_{z,n,q} denotes the spatial degradation matrix between the fused image x and Z′, C_{n,q} denotes the spectral correlation matrix, ψ_{n,q} is the temporal correlation matrix, τ_{n,q} is the temporal offset, and v_{n,q} is the zero-mean sensor noise;
the estimate x̂ of the fused image x is then expressed as follows:
x̂ = arg max_x p(x | Y′, Z′)  (9)
where p(x | Y′, Z′) represents the consistency constraint of the fused image x with respect to Y′ and Z′, and arg max_x denotes the value of x at which p(x | Y′, Z′) is maximised;
according to the Bayesian formula:
p(x | Y′, Z′) ∝ p(Y′ | x) p(Z′ | x) p(x)  (10)
where p(Y′ | x) denotes the consistency constraint of Y′ on the fused image x, p(Z′ | x) denotes the consistency constraint of Z′ on the fused image x, and p(x) is the description of the image spatial relationship;
assuming that the zero-mean Gaussian noise v_{k,b} caused by the sensor and the external environment follows a random Gaussian distribution, p(Y′ | x) is expressed as follows:
p(Y′ | x) = ∏_{k=1}^{K} ∏_{b=1}^{B_x} p(y′_{k,b} | x_b) = ∏_{k=1}^{K} ∏_{b=1}^{B_x} (2π a_{y,k,b})^{−φ1φ2/2} exp(−||y′_{k,b} − A_{y,k,b} x_b||_2² / (2 a_{y,k,b}))  (11)
where p(y′_{k,b} | x_b) denotes the consistency constraint of the degraded observed image of the b-th band of the k-th image of Y′ on the b-th band of the fused image x, a_{y,k,b} is the variance of the noise v_{k,b}, φ1φ2 denotes the spatial dimension of y′_{k,b}, and ||·||_2 denotes the 2-norm;
assuming v_{n,q} follows a random Gaussian distribution, p(Z′ | x) is expressed as follows:
p(Z′ | x) = ∏_{n=1}^{N} ∏_{q=1}^{B_{z,n}} p(z′_{n,q} | x_q) = ∏_{n=1}^{N} ∏_{q=1}^{B_{z,n}} (2π a_{z,n,q})^{−H1H2/2} exp(−||z′_{n,q} − ψ_{n,q} C_{n,q} A_{z,n,q} x_q − τ_{n,q}||_2² / (2 a_{z,n,q}))  (13)
where p(z′_{n,q} | x_q) denotes the consistency constraint of the q-th band of the n-th image of Z′ on the q-th band of the fused image x, a_{z,n,q} denotes the variance of the noise v_{n,q}, and H1H2 denotes the spatial dimension of z′_{n,q};
with the adaptive weighting of the three-dimensional spatial spectrum based on the Laplace operator, p(x) is expressed as:
p(x) ∝ ∏_{b=1}^{B_x} exp(−ρ(Q x_b) / (2 a_{x,b}))  (15)
where a_{x,b} is the variance of the random Gaussian noise v_{x,b}, L1L2 denotes the spatial dimension, ρ(·) is the Huber function, and Q x_b denotes the adaptively weighted three-dimensional Laplace operator matrix applied to x_b.
4. The detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data according to claim 3, characterised in that the detailed process of Step 3 is as follows:
substituting formulas (11), (13) and (15) into formula (10), the objective function F(x) is expressed as a regularised minimisation problem:
x̂ = arg min_x F(x),
F(x) = Σ_{k=1}^{K} Σ_{b=1}^{B_x} ||y′_{k,b} − A_{y,k,b} x_b||_2² + λ1 Σ_{n=1}^{N} Σ_{q=1}^{B_{z,n}} ω_{n,q} ||z′_{n,q} − ψ_{n,q} C_{n,q} A_{z,n,q} x_q − τ_{n,q}||_2² + λ2 Σ_{b=1}^{B_x} ρ(Q x_b)  (16)
where arg min_x denotes the value of x at which F(x) is minimised, i.e. the estimate x̂;
λ1 and λ2 denote the weight coefficients of the respective terms, and ω_{n,q} denotes the contribution of z′_{n,q} to the fused image x.
5. The detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data according to claim 4, characterised in that the detailed process of Step 4 is as follows:
taking the derivative of F(x) with respect to x_b gives formula (18), in which ∇F(x_b) denotes the derivative of the objective function F(x_b), the transpose of A_{y,k,b} appears, and the spectral correlation matrix corresponding to the b-th band of the fused image x appears;
the estimate of the b-th band of the fused image x is obtained by successive iterations:
x_{b,d+1} = x_{b,d} + θ_d e_{b,d}  (19)
where x_{b,d+1} denotes the b-th band of the fused image x after the (d+1)-th iteration, e_{b,d} denotes the search direction of the d-th iteration, the initial search direction is e_{b,0} = −∇F(x_b)_0, and θ_d is the iteration step length;
F(x_b)_d is the objective function of the d-th iteration; the search direction of the (d+1)-th iteration is then:
e_{b,d+1} = −∇F(x_b)_{d+1} + β_d e_{b,d}  (20)
with intermediate variable β_d = ||∇F(x_b)_{d+1}||_2² / ||∇F(x_b)_d||_2²;
using the search direction e_{b,d+1} of the (d+1)-th iteration, compute the b-th band x_{b,d+2} of the fused image x after the (d+2)-th iteration, and use x_{b,d+2} to compute F(x_b)_{d+2};
proceeding in this way, compute the objective function of each iteration; substitute the objective function of each iteration into formula (16) and compute the estimate of the b-th band of the fused image x corresponding to each iteration, until the termination condition with iteration-termination threshold ζ is satisfied, then stop the iteration and take the current iterate as the estimate of the b-th band of the fused image x;
the estimates of the remaining bands of the fused image x are computed in the same way, yielding the fused image x.
CN201811142766.8A 2018-09-28 2018-09-28 Detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data Pending CN109285133A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811142766.8A CN109285133A (en) 2018-09-28 2018-09-28 Detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811142766.8A CN109285133A (en) 2018-09-28 2018-09-28 Detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data

Publications (1)

Publication Number Publication Date
CN109285133A true CN109285133A (en) 2019-01-29

Family

ID=65182514

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811142766.8A Pending CN109285133A (en) 2018-09-28 2018-09-28 Detail-enhanced spatio-temporal-spectral integrated fusion method for remote sensing image data

Country Status (1)

Country Link
CN (1) CN109285133A (en)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130011078A1 (en) * 2011-02-03 2013-01-10 Massachusetts Institute Of Technology Hyper-Resolution Imaging
CN102915529A (en) * 2012-10-15 2013-02-06 黄波 Integrated fusion technique and system based on remote sensing of time, space, spectrum and angle
CN105809148A (en) * 2016-03-29 2016-07-27 中国科学院遥感与数字地球研究所 Crop drought recognition and risk evaluation method based on remote sensing time-space-spectrum fusion

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HUANFENG SHEN et al.: "An Integrated Framework for the Spatio-Temporal-Spectral Fusion of Remote Sensing Images", IEEE Transactions on Geoscience and Remote Sensing *
吴娱 (Wu Yu): 《数字图像处理》 (Digital Image Processing), Beijing University of Posts and Telecommunications Press, 31 October 2017 *
孟祥超 (Meng Xiangchao): "多源时—空—谱光学遥感影像的变分融合方法" (Variational fusion methods for multi-source spatio-temporal-spectral optical remote sensing images), China Doctoral Dissertations Full-text Database, Basic Sciences *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110751036A (en) * 2019-09-17 2020-02-04 宁波大学 Fast hyperspectral/multispectral image fusion method based on a sub-band and blocking strategy
CN111742329A (en) * 2020-05-15 2020-10-02 安徽中科智能感知产业技术研究院有限责任公司 Dynamic monitoring method and platform for typical mining-area ground objects based on multi-source remote sensing data fusion and deep neural networks
CN111742329B (en) * 2020-05-15 2023-09-12 安徽中科智能感知科技股份有限公司 Dynamic monitoring method and platform for typical mining-area ground objects based on multi-source remote sensing data fusion and deep neural networks
CN112017135A (en) * 2020-07-13 2020-12-01 香港理工大学深圳研究院 Method, system and device for spatio-temporal fusion of remote sensing image data
CN112017135B (en) * 2020-07-13 2021-09-21 香港理工大学深圳研究院 Method, system and device for spatio-temporal fusion of remote sensing image data
CN112767292A (en) * 2021-01-05 2021-05-07 同济大学 Geographically weighted spatial unmixing method for spatio-temporal fusion
CN112767292B (en) * 2021-01-05 2022-09-16 同济大学 Geographically weighted spatial unmixing method for spatio-temporal fusion
CN112906577A (en) * 2021-02-23 2021-06-04 清华大学 Fusion method for multi-source remote sensing images
CN112906577B (en) * 2021-02-23 2024-04-26 清华大学 Fusion method for multi-source remote sensing images
CN113627357A (en) * 2021-08-13 2021-11-09 哈尔滨工业大学 Intrinsic decomposition method and system for remote sensing images with high spatial and high spectral resolution
CN115204314A (en) * 2022-08-12 2022-10-18 西南交通大学 Multi-source data fusion method based on a vehicle-mounted OBU, and vehicle-mounted OBU
CN115204314B (en) * 2022-08-12 2023-05-30 西南交通大学 Multi-source data fusion method based on a vehicle-mounted OBU, and vehicle-mounted OBU

Similar Documents

Publication Publication Date Title
CN109285133A (en) Detail-enhanced integrated temporal-spatial-spectral fusion method for remote sensing image data
CN109859147B (en) Real image denoising method based on generative adversarial network noise modeling
Shen et al. Generalized random walks for fusion of multi-exposure images
CN104835130A (en) Multi-exposure image fusion method
CN103559684B (en) Image restoration method based on smoothing correction
CN109872285A (en) Retinex-based low-light color image enhancement method using variational methods
CN110210524B (en) Training method of image enhancement model, image enhancement method and device
CN103530848A (en) Double-exposure implementation method for non-uniformly illuminated images
CN101930604B (en) Fusion method of panchromatic and multispectral images based on low-frequency correlation analysis
CN114119444A (en) Multi-source remote sensing image fusion method based on deep neural network
AU2013258866B2 (en) Reducing the dynamic range of image data
CN106169181A (en) Image processing method and system
CN104732492A (en) Depth image denoising method
Lepcha et al. A deep journey into image enhancement: A survey of current and emerging trends
CN112837224A (en) Super-resolution image reconstruction method based on convolutional neural network
CN113870124B (en) Weakly supervised dual-network mutual-excitation learning method for shadow removal
Li et al. A self-learning image super-resolution method via sparse representation and non-local similarity
Liu et al. Video frame interpolation via optical flow estimation with image inpainting
Liu et al. Facial image inpainting using attention-based multi-level generative network
CN112184646A (en) Image fusion method based on gradient domain oriented filtering and improved PCNN
CN104036481A (en) Multi-focus image fusion method based on depth information extraction
Xiao et al. Variational pansharpening based on coefficient estimation with nonlocal regression
CN112862684A (en) Data processing method for depth map super-resolution reconstruction and denoising neural network
CN112927164B (en) No-reference low-illumination image enhancement method based on deep convolutional neural network
CN110060226B (en) Adaptive image fusion method based on human visual gradient transformation and total variation parameters

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2019-01-29