CN103914815A - Image fusion method and device - Google Patents

Image fusion method and device

Info

Publication number
CN103914815A
Authority
CN
China
Prior art keywords
image
energy function
component
norm
total variation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201210596216.XA
Other languages
Chinese (zh)
Inventor
姜安
崔峰
訾树波
谢启伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Technologies Oy
Original Assignee
Nokia Oyj
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Oyj filed Critical Nokia Oyj
Priority to CN201210596216.XA priority Critical patent/CN103914815A/en
Priority to PCT/FI2013/051212 priority patent/WO2014102458A1/en
Publication of CN103914815A publication Critical patent/CN103914815A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10036 Multispectral image; Hyperspectral image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an image fusion method and a corresponding device. The method comprises the following steps: first, obtaining a plurality of images; and then, fusing the plurality of images based on a total variation-L1 norm energy function, wherein the total variation-L1 norm energy function comprises two components: one component is a total variation component, which expresses that the detail information of the high-resolution image is retained, while the other component is an L1 norm component, which expresses that the spectral information of the low-resolution multispectral image is retained. With the image fusion method and device, the detail information and spectral information of the fused image can be balanced.

Description

Image fusion method and device
Technical field
The present invention relates to image processing techniques, and more specifically to an image fusion method and device.
Background art
Image fusion refers to the process of applying image processing to several source images of the same target so as to retain, to the greatest extent, the useful information of each source image and to synthesize the several source images into a single high-quality image.
Applications of image fusion include remote sensing, medical imaging, quality and defect inspection, biometric identification, and so on. In mobile communication in particular, there is a strong demand to fuse multiple images in order to obtain a better description and interpretation of the sensed scene.
Image fusion should follow some fusion rule in order to build a composite image. How to set the fusion rule/algorithm is therefore a problem. Conventional algorithms can be categorized into three classes: 1) projection and substitution methods, for example intensity-hue-saturation (IHS) color fusion and principal component analysis (PCA) fusion; 2) band-ratio and arithmetic combination methods, for example multiplication and synthetic variable ratio (SVR); 3) wavelet-based fusion methods.
In the document "Multiresolution based image fusion with additive wavelet decomposition, IEEE Trans. Geosci. Rem. Sensing, 32(3), 1204-1211 (1999)", an additive wavelet (AW) algorithm was proposed for fusing a high-resolution image (SPOT) with a low-resolution multispectral image (a Landsat Thematic Mapper (TM) image). In that scheme, the à trous wavelet approximation of the SPOT image is replaced by the bands of the TM image. Li et al. proposed a wavelet-transform-based area maximum selection rule (AMS) in "Multisensor image fusion using the wavelet transform, Graphical Models and Image Processing, 57, 235-245 (1995)". Scheunders et al., in "A Multivalued Image Wavelet Representation Based on Multiscale Fundamental Forms, IEEE Transactions on Image Processing, 10(5), 568-575 (2002)" and "Fusion and merging of multispectral images using multiscale fundamental forms, Journal of the Optical Society of America A, Optics, Image Science, and Vision, 18(10), 2468-2477 (2001)", provided a novel algorithm based on the multiscale fundamental form (MFF) method, which fuses images through a multivalued image wavelet representation. Chen et al. proposed an improved weighted multiscale fundamental form (WMFF) method in "Image fusion using weighted multiscale fundamental form, Proceedings of the 2004 International Conference on Image Processing, 5, 3319-3322 (2004)", which avoids the problem that the MFF method amplifies wavelet coefficients.
However, the wavelet-based fusion methods described above have some common limitations: one limitation is that a predefined wavelet basis must be selected, and another is how to correctly choose the wavelet decomposition level. Moreover, AW and AMS retain more spectral information but ignore some spatial or detail information, whereas MFF and WMFF pay more attention to spatial or detail information but distort the spectral information.
To address the above problems, Chen et al. proposed a new adaptive image decomposition algorithm in "Fusion of color microscopic images based on bidimensional empirical mode decomposition, OPTICS EXPRESS, 18, 21757-21769 (2010)". However, the results of such image decomposition algorithms are also unstable.
Summary of the invention
According to a first aspect of the invention, a method is proposed, comprising: obtaining multiple images; and fusing said multiple images based on a total variation-L1 norm energy function, wherein said total variation-L1 norm energy function comprises two components: one component is a total variation component, which represents retaining the detail information of the images to be fused, and the other component is an L1 norm component, which represents retaining the spectral information of the images to be fused.
According to a second aspect of the invention, a device is proposed, comprising: an obtaining unit for obtaining multiple images; and a fusion unit for fusing said multiple images based on a total variation-L1 norm energy function, wherein said total variation-L1 norm energy function comprises two components: one component is a total variation component, which represents retaining the detail information of the images to be fused, and the other component is an L1 norm component, which represents retaining the spectral information of the images to be fused.
According to a third aspect of the invention, a device is proposed, comprising: means for obtaining multiple images; and means for fusing said multiple images based on a total variation-L1 norm energy function, wherein said total variation-L1 norm energy function comprises two components: one component is a total variation component, which represents retaining the detail information of the images to be fused, and the other component is an L1 norm component, which represents retaining the spectral information of the images to be fused.
According to a fourth aspect of the invention, a device is proposed, comprising: at least one processor and at least one memory containing computer program code; the processor and the memory being configured, with the processor, to cause the device at least to: obtain multiple images; and fuse said multiple images based on a total variation-L1 norm energy function, wherein said total variation-L1 norm energy function comprises two components: one component is a total variation component, which represents retaining the detail information of the images to be fused, and the other component is an L1 norm component, which represents retaining the spectral information of the images to be fused.
According to the present invention, the detail information and the spectral information of the fused image can be balanced.
Brief description of the drawings
Other objects and effects of the present invention will become clearer and easier to understand from the following description taken in conjunction with the accompanying drawings, and as the present invention is understood more comprehensively, wherein:
Fig. 1 shows a flow chart/block diagram of an image fusion method/device 100 according to an embodiment of the present invention;
Fig. 2 shows a flow chart of a numerical computation scheme according to an embodiment of the present invention;
Fig. 3 shows the fused images obtained by the solution of the present invention and by the existing AW, AMS and MFF methods;
Fig. 4 schematically shows a block diagram of an image fusion device according to an embodiment of the present invention.
In all of the above drawings, identical reference numerals indicate identical, similar or corresponding features or functions.
Embodiment
Preferred embodiments of the present disclosure are described in more detail below with reference to the accompanying drawings. Although preferred embodiments of the present disclosure are shown in the drawings, it should be appreciated that the disclosure can be implemented in various forms and should not be limited by the embodiments set forth here. Rather, these embodiments are provided so that the disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The present invention proposes a new image fusion scheme. The basic idea of the present invention is to fuse multiple images based on a total variation-L1 norm energy function, so as to balance the detail information and the spectral information of the fused image. The total variation-L1 norm energy function comprises two components: one component is a total variation component, which represents retaining the detail information of the images to be fused, and the other component is an L1 norm component, which represents retaining the spectral information of the images to be fused.
Fig. 1 shows a flow chart/block diagram of an image fusion method/device 100 according to an embodiment of the present invention.
As shown in Fig. 1, the method/device 100 comprises an obtaining step/obtaining unit 110 for obtaining multiple images, and a fusion step/fusion unit 120 for fusing said multiple images based on a total variation-L1 norm energy function, wherein said total variation-L1 norm energy function comprises two components: one component is a total variation component, which represents retaining the detail information of the images to be fused, and the other component is an L1 norm component, which represents retaining the spectral information of the images to be fused.
According to an embodiment of the present invention, a registration step/registration unit 115 is also included before the fusion step/fusion unit 120, for performing a registration operation on said multiple images.
When two images of the same scene are taken from different angles and different positions, each image reflects only certain aspects of the scene because the imaging conditions differ. To fuse the two images together, one of them has to be shifted and rotated so that it is aligned with the other. This alignment process is the registration process. The image kept fixed is called the reference image, and the image being transformed is called the floating image. Fusing the registered images then yields a fused image that reflects the whole scene.
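For illustration, a minimal Python/SciPy sketch of such a registration step is given below; it assumes the rotation angle and translation are already known, whereas in practice a registration algorithm would estimate them, and the function name and parameters are illustrative assumptions rather than part of this description.

```python
import numpy as np
from scipy import ndimage

def register_floating_to_reference(floating, angle_deg, shift_rc):
    """Align the floating image to the reference image with a known rotation
    (in degrees) followed by a translation (row, column).  In practice the
    transform parameters would be estimated by a registration algorithm;
    here they are assumed to be given."""
    rotated = ndimage.rotate(floating, angle_deg, reshape=False, order=1)
    return ndimage.shift(rotated, shift_rc, order=1)
```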
In the following description of the embodiments of the present invention, the number of images to be fused is two: one is a high-resolution image rich in detail information, and the other is a low-resolution multispectral image rich in spectral information.
Of course, those skilled in the art should appreciate that the present invention is also applicable to the case where the number of images to be fused is greater than two.
According to an embodiment of the present invention, for the high-resolution image G, the intensity (brightness) component IM of the multispectral image M, and the fused image R, the energy function takes the following form:
$\|\nabla(R-G)\|_{TV} + \lambda\|R-IM\|_1 \qquad (1)$
where $\|\nabla(R-G)\|_{TV}$ denotes the total variation component, $\|R-IM\|_1$ denotes the L1 norm component, R is the fused image, λ is a weight parameter, for example a positive constant, and ∇ is the gradient operator.
It will be appreciated that G, IM and R are images (they appear as matrices in the computation).
The total variation component retains the detail information, and the L1 norm component retains the spectral information. According to this energy function, a balance between detail information and multispectral information is achieved by selecting a suitable value of λ. A suitable value of λ can be obtained empirically.
According to an embodiment of the present invention, the fused image R is obtained by minimizing the above energy function (1).
This can be expressed as:
$\min_R E(R) = \min_R\{\|\nabla(R-G)\|_{TV} + \lambda\|R-IM\|_1\} \qquad (2)$
In the present invention, the total variation is the isotropic total variation defined as:
$\|\nabla u\|_{TV} = \sum_{i,j}\sqrt{(u_x(i,j))^2 + (u_y(i,j))^2} \qquad (3)$
where u is an L×Q image, L denoting the number of rows of image pixels and Q the number of columns; that is, u is a matrix.
$u_x$ and $u_y$ denote the partial derivatives of u, and (i, j) denotes the image pixel position.
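For illustration, the isotropic total variation of equation (3) and the energy of equation (1) can be evaluated with a few lines of Python/NumPy; the forward-difference discretization, the border handling and the function names below are assumptions rather than part of this description.

```python
import numpy as np

def isotropic_tv(u):
    """Isotropic total variation of a 2-D array u, as in equation (3):
    the sum over all pixels of sqrt(u_x^2 + u_y^2), here discretized with
    forward differences and replicated borders."""
    ux = np.diff(u, axis=1, append=u[:, -1:])  # horizontal partial derivative u_x
    uy = np.diff(u, axis=0, append=u[-1:, :])  # vertical partial derivative u_y
    return np.sum(np.sqrt(ux ** 2 + uy ** 2))

def tv_l1_energy(R, G, IM, lam):
    """Energy of equation (1): ||grad(R - G)||_TV + lam * ||R - IM||_1,
    where R is the candidate fused image, G the high-resolution image and
    IM the intensity component of the multispectral image."""
    return isotropic_tv(R - G) + lam * np.sum(np.abs(R - IM))
```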
The functional form shown in equation (1) can be simplified to:
$\min_K E(K) = \min_K\{\|\nabla K\|_{TV} + \lambda\|K-(IM-G)\|_1\} \qquad (6)$
where K = R − G. Since K and R are linearly related, minimizing the energy function of K corresponds to minimizing the energy function of R.
The functional in equation (6) is non-smooth, so solving for K is difficult and a smoothing treatment is required.
That is to say, according to an embodiment of the present invention, the image fusion method/image fusion device 100 also comprises a smoothing step/smoothing unit 130 for applying a smoothing treatment to equation (6) so as to solve for K.
Several papers have addressed this non-smoothness problem. For example, Brox, T. et al., in "High accuracy optical flow estimation based on a theory for warping, European Conference on Computer Vision, 2004", proposed approximating the non-smooth function with a smooth function, and Zach, C. et al., in "A duality based approach for realtime TV-L1 optical flow, The Annual Symposium of the German Association for Pattern Recognition, 2007", proposed using a duality-based method with a decoupling procedure.
According to an embodiment of the present invention, a decoupling procedure and a primal-dual method are used to compute the above equation (6).
Here a new variable N and a new parameter ρ are introduced, and equation (6) is rewritten as:
$\min_{K,N} E(K,N) = \min_{K,N}\{\|\nabla K\|_{TV} + \lambda\|N-(IM-G)\|_1 + \rho\|K-N\|_2^2\} \qquad (7)$
where ρ is a large constant, so that N is forced to be very close to K, and $\|\cdot\|_2^2$ denotes the square of the 2-norm.
The saddle-point form of equation (7) can be written as follows:
$\min_K \max_p\{\langle p, \nabla K\rangle + \lambda\|N-(IM-G)\|_1 + \rho\|K-N\|_2^2 - \delta_P(p)\} \qquad (8)$
where $\langle\cdot,\cdot\rangle$ is the Hilbert inner product, p is the dual variable with p ∈ P, the set P is given by $P = \{\|p\|_\infty \le 1\}$, i.e. the unit ball of the infinity norm $\|\cdot\|_\infty$, and $\|p\|_\infty$ is defined as:
$\|p\|_\infty = \max_{i,j}\sqrt{(p_{i,j}^1)^2 + (p_{i,j}^2)^2} \qquad (9)$
where i, j denote the image pixel position and $p_{i,j}^1$, $p_{i,j}^2$ are the components of the vector p.
The function $\delta_P(p)$ is the indicator function of the set P, i.e. it equals 0 for p ∈ P and +∞ otherwise.
To express the algorithm more clearly, equation (8) is expressed as the numerical computation scheme shown in Fig. 2, consisting of a dual step 210, a primal step 220 and a threshold step 230:
Dual step:
Fix K = K^i, where i is the iteration number, and solve the following:
$\max_p\{\langle p, \nabla K^i\rangle - \delta_P(p)\} \qquad (11)$
Taking the derivative with respect to p, p is updated to:
$p^{i+1} = \mathrm{prox}_{\tau^i\cdot\delta_P}(p^i + \tau^i\nabla K^i) \qquad (12)$
where $\mathrm{prox}$ is the proximal operator and $\tau^i$ is the dual step size; in this embodiment, $\mathrm{prox}_{\tau^i\cdot\delta_P}(x) = \frac{x}{\max\{\|x\|_\infty, 1\}}$.
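A minimal sketch of this dual step follows; it assumes the ascent point $p^i + \tau^i\nabla K^i$ of equation (12), a forward-difference gradient, and that the proximal map divides the whole field by the global infinity norm of equation (9), all of which are implementation assumptions.

```python
import numpy as np

def gradient(K):
    """Forward-difference gradient of K, returned as the pair (K_x, K_y)."""
    Kx = np.diff(K, axis=1, append=K[:, -1:])
    Ky = np.diff(K, axis=0, append=K[-1:, :])
    return Kx, Ky

def dual_step(p, K, tau):
    """One dual update (equation (12)): a gradient ascent step on p followed
    by the proximal map prox(x) = x / max(||x||_inf, 1), with the infinity
    norm of equation (9) taken over the whole dual field."""
    Kx, Ky = gradient(K)
    px = p[0] + tau * Kx
    py = p[1] + tau * Ky
    scale = max(np.max(np.sqrt(px ** 2 + py ** 2)), 1.0)  # equation (9)
    return px / scale, py / scale
```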
Primal step:
Fix p = p^{i+1} and N = N^i, and solve:
$\min_K E(K) = \min_K\{\langle p, \nabla K\rangle + \rho\|K-N\|_2^2\} \qquad (13)$
Equation (13) is then converted to:
$\min_K E(K) = \min_K\{-\langle K, \operatorname{div} p\rangle + \rho\|K-N\|_2^2\} \qquad (14)$
where div is the divergence operator; K is therefore updated to:
$K^{i+1} = K^i - \sigma^i\left(\frac{-\operatorname{div}(p^{i+1})}{\rho} + K^i - N^i\right) \qquad (15)$
where $\sigma^i$ is the primal step size.
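The primal update of equation (15) can be sketched as follows; the backward-difference discretization of the divergence and the boundary handling are assumptions, chosen as the adjoint (up to sign) of the forward-difference gradient used above.

```python
import numpy as np

def divergence(p):
    """Discrete divergence of the dual field p = (p_x, p_y), using backward
    differences (the adjoint, up to sign, of the forward-difference gradient)."""
    px, py = p
    dx = px - np.roll(px, 1, axis=1)
    dx[:, 0] = px[:, 0]
    dy = py - np.roll(py, 1, axis=0)
    dy[0, :] = py[0, :]
    return dx + dy

def primal_step(K, N, p, sigma, rho):
    """One primal update (equation (15)): K <- K - sigma*(-div(p)/rho + K - N)."""
    return K - sigma * (-divergence(p) / rho + K - N)
```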
Threshold step:
Fix K = K^{i+1} and solve the following:
$\min_N\{\lambda\|N-(IM-G)\|_1 + \rho\|K-N\|_2^2\} \qquad (16)$
N is then updated to the minimizer of equation (16).
After the threshold step is completed, it is judged whether the convergence condition is satisfied; if so, the iterative computation is exited, and if not, the procedure returns to the dual step. It should be understood that the order of the dual step, the primal step and the threshold step can be changed and is not necessarily the order shown in Fig. 2.
$\tau^i$ and $\sigma^i$ can be chosen as, for example, $\tau^i = 0.2 + 0.08i$, and the convergence condition can be chosen with a small constant ε as the threshold.
Once K is obtained, R is solved from R = K + G.
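Tying the three steps together, the following sketch iterates the dual, primal and threshold updates and then recovers R = K + G, reusing the helpers from the earlier sketches. The closed form used for the threshold step is the standard soft-thresholding minimizer of equation (16); the primal step size, the initialization and the exact convergence test are assumptions, since the corresponding formulas are not reproduced above. The values λ = 1 and ρ = 12.5 are those reported later for the experiments.

```python
import numpy as np

def threshold_step(K, a, lam, rho):
    """Solve equation (16) for N with K fixed, where a = IM - G.  The
    elementwise minimizer of lam*|n - a| + rho*(n - K)^2 is the
    soft-thresholding (shrinkage) update with threshold lam/(2*rho)."""
    d = K - a
    return a + np.sign(d) * np.maximum(np.abs(d) - lam / (2.0 * rho), 0.0)

def fuse(G, IM, lam=1.0, rho=12.5, n_iter=200, eps=1e-4):
    """Minimize equation (6) by iterating the dual, primal and threshold
    steps of Fig. 2, then recover the fused intensity image via R = K + G."""
    G = np.asarray(G, dtype=float)
    IM = np.asarray(IM, dtype=float)
    a = IM - G
    K = np.zeros_like(G)
    N = np.zeros_like(G)
    p = (np.zeros_like(G), np.zeros_like(G))
    for i in range(n_iter):
        tau = 0.2 + 0.08 * i      # dual step size, as given in the text
        sigma = tau               # primal step size (an assumed choice)
        p = dual_step(p, K, tau)
        K_new = primal_step(K, N, p, sigma, rho)
        N = threshold_step(K_new, a, lam, rho)
        if np.max(np.abs(K_new - K)) < eps:   # an assumed convergence test
            K = K_new
            break
        K = K_new
    return K + G                  # R = K + G
```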
Given the hue component HM and the saturation component SM of the image, and with R as the intensity component, the fused image can then be obtained.
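As an illustration of this recombination, the sketch below substitutes the fused intensity into the multispectral RGB image using a simple linear intensity model; this is a stand-in for the hue/saturation recombination mentioned above, whose exact transform is not spelled out here.

```python
import numpy as np

def recombine_intensity(ms_rgb, R_new):
    """Substitute the fused intensity R_new into the multispectral RGB image.
    With the common linear model I = (r + g + b) / 3, intensity substitution
    is equivalent to adding the intensity difference to every band."""
    ms_rgb = np.asarray(ms_rgb, dtype=float)
    I_old = ms_rgb.mean(axis=2)
    return ms_rgb + (R_new - I_old)[:, :, np.newaxis]
```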
Use the series of experiments of IKONOS (Yi Kenuosi) image is verified to the present invention.In order to assess quantitatively fusion results, use following 3 standards that propose.First, CM (correlated measure) (referring to in people document " Multiresolution based image fusion with additive wavelet decomposition; IEEE Transactions on Geoscience and Remote Sensing.; 1999; 32; (3); pp.1204-1211 ") calculate multispectral image and merge after the related coefficient of red, green, blue channel between the image of gained, its can be used for evaluating and merge after the reserving degree of spectral information of image of gained.The colour that the image of rear gained is merged in the larger expression of CM value distorts fewer.Secondly, (mutual information measure) (referring to document " Information measurefor performance of image fusion; Electronics Letters; 2002; 38; (7), pp.313-315 " of the people such as Qu) can reflect the total amount that the image of the rear gained of fusion comprises multispectral image and high-resolution image information.Mutual information measure is larger, and after instruction fusion, the image of gained comprises the more information of original image.The 3rd, parameter Q aB/F(referring to the people's such as Xydeas document " Objective Image Fusion Performance Measure; Electronics Letters; 2000; 36; (4), pp.308-309 ") instruction is from the amount to the marginal information of the image transfer that obtains after merging until fused images.Q aB/Fbe worth larger, instruction merge after the image of gained retain the more multiple edge information of fused images treated.
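In the spirit of the first criterion, per-channel correlation coefficients can be computed as sketched below; the exact CM definition of the cited paper may aggregate the channels differently, so this is only an illustrative approximation.

```python
import numpy as np

def correlation_measure(ms_rgb, fused_rgb):
    """Correlation coefficient of each of the red, green and blue channels
    between the multispectral image and the fused image."""
    return [np.corrcoef(np.asarray(ms_rgb)[:, :, c].ravel(),
                        np.asarray(fused_rgb)[:, :, c].ravel())[0, 1]
            for c in range(3)]
```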
The inventors applied, to IKONOS satellite images, the method proposed in the present invention, the AW method proposed in "Multiresolution based image fusion with additive wavelet decomposition, IEEE Transactions on Geoscience and Remote Sensing, 1999, 32(3), pp. 1204-1211", the AMS method proposed by Li et al. in "Multisensor image fusion using the wavelet transform, Graphical Models and Image Processing, 1995, 57, pp. 235-245", and the MFF method proposed by Scheunders in "A Multivalued Image Wavelet Representation Based on Multiscale Fundamental Forms, IEEE Transactions on Image Processing, 2002, 10(5), pp. 568-575", and compared the fused images obtained by the different methods using the three measures described above.
For the scheme proposed in the present invention, λ = 1 and ρ = 12.5. In addition, the wavelet decomposition level of the AW, AMS and MFF schemes is 4.
Fig. 3 shows the fused images obtained by the solution of the present invention and by the AW, AMS and MFF methods described above. Reference numeral 310 denotes the fused images obtained by the solution of the present invention and by the AW, AMS and MFF methods described above, reference numeral 320 denotes the multispectral image to be fused, and reference numeral 330 denotes the high-resolution image to be fused. Reference numeral 340 indicates a comparison of the spectral information in the respective images, and reference numeral 350 indicates a comparison of the detail information in the respective images.
Table 1 gives a comparison of the above three measures for the fused images obtained by the solution of the present invention and by the AW, AMS and MFF methods described above.
Table 1
The first three rows of Table 1 reflect that the solution of the present invention retains more spectral information than the other three existing methods. The fourth row of Table 1 reflects that the solution of the present invention obtains more information from the original images than the other three existing methods. The fifth row of Table 1 reflects that, compared with the other three existing methods, the image obtained by the solution of the present invention reproduces the edge information of the original images more accurately.
Fig. 4 schematically shows a block diagram of an image fusion device according to an embodiment of the present invention. As shown in Fig. 4, the image fusion device 400 comprises a data processor (DP) 401 and a memory (MEM) 403 coupled to the data processor 401. The memory 403 stores a program (PROG) 402.
The memory 403 may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, including but not limited to semiconductor-based memory devices, magnetic memory devices and systems, and optical memory devices and systems. Although only one memory unit is shown in Fig. 4, the image fusion device 400 may have multiple physically distinct memory units. The DP 401 may be of any type suitable to the local technical environment and may include, but is not limited to, one or more of general-purpose computers, special-purpose computers, microprocessors, digital signal processors (DSPs) and processor-based multi-core processor architectures. The image fusion device 400 may comprise multiple processors.
As shown in Fig. 4, the data processor 401 and the memory 403 are configured, with the data processor 401, to cause the image fusion device 400 at least to: obtain multiple images; and fuse said multiple images based on a total variation-L1 norm energy function, wherein said total variation-L1 norm energy function comprises two components: one component is a total variation component, which represents retaining the detail information of the images to be fused, and the other component is an L1 norm component, which represents retaining the spectral information of the images to be fused.
According to an embodiment of the present invention, said multiple images are fused by minimizing the total variation-L1 norm energy function.
According to an embodiment of the present invention, the energy function takes the following form:
$\|\nabla(R-G)\|_{TV} + \lambda\|R-IM\|_1$
where $\|\nabla(R-G)\|_{TV}$ denotes the total variation component, $\|R-IM\|_1$ denotes the L1 norm component, R is the fused image, G is the image richest in detail information among the images to be fused, IM is the intensity component of the image richest in spectral information among the images to be fused, λ is a weight parameter, and ∇ is the gradient operator; the fused image R is obtained by minimizing the above energy function.
According to an embodiment of the present invention, the data processor 401 and the memory 403 are further configured, with the data processor 401, to cause the image fusion device 400 at least to: apply a smoothing treatment to the above energy function so as to obtain the fused image R.
According to an embodiment of the present invention, a decoupling procedure and a primal-dual method are used to compute the above energy function.
According to an embodiment of the present invention, the data processor 401 and the memory 403 are further configured, with the data processor 401, to cause the image fusion device 400 at least to: perform a registration operation on said multiple images before fusing said multiple images.
According to an embodiment of the present invention, the number of images to be fused is two: one is a high-resolution image rich in detail information and the other is a low-resolution multispectral image rich in spectral information.
It should be noted that, in order to make the present invention easier to understand, the description above has omitted some more specific technical details that are known to those skilled in the art and that may be essential for implementing the present invention.
Therefore, the embodiments were chosen and described in order to better explain the principles of the present invention and its practical application, and so that those of ordinary skill in the art will understand that all modifications and changes that do not depart from the essence of the present invention fall within the protection scope of the present invention as defined by the claims.
In addition, those skilled in the art will appreciate that the steps of the various methods described above can be implemented by a programmed computer. Here, some embodiments are intended to cover a program storage device that is machine-readable or computer-readable and encodes a machine-executable or computer-executable program of instructions, wherein said instructions perform some or all of the steps of the methods described above. The program storage device may be, for example, a magnetic recording medium such as magnetic disks and tapes, a hard disk drive, or an optically readable digital data storage medium. The embodiments are also intended to cover computers programmed to perform said steps of the methods described above.

Claims (28)

1. A method, comprising:
obtaining multiple images; and
fusing said multiple images based on a total variation-L1 norm energy function,
wherein said total variation-L1 norm energy function comprises two components: one component is a total variation component, which represents retaining the detail information of the images to be fused, and the other component is an L1 norm component, which represents retaining the spectral information of the images to be fused.
2. The method according to claim 1,
wherein said multiple images are fused by minimizing the total variation-L1 norm energy function.
3. The method according to claim 2, wherein said energy function takes the following form:
$\|\nabla(R-G)\|_{TV} + \lambda\|R-IM\|_1$
where $\|\nabla(R-G)\|_{TV}$ denotes the total variation component, $\|R-IM\|_1$ denotes the L1 norm component, R is the fused image, G is the image richest in detail information among the images to be fused, IM is the intensity component of the image richest in spectral information among the images to be fused, λ is a weight parameter, and ∇ is the gradient operator; the fused image R is obtained by minimizing the above energy function.
4. The method according to claim 3, further comprising applying a smoothing treatment to the above energy function so as to obtain the fused image R.
5. The method according to claim 3, wherein a decoupling procedure and a primal-dual method are used to compute the above energy function.
6. The method according to claim 1, further comprising performing a registration operation on said multiple images before fusing said multiple images.
7. The method according to any one of claims 1-6, wherein the number of images to be fused is two, one of which is a high-resolution image rich in detail information and the other is a low-resolution multispectral image rich in spectral information.
8. A device, comprising:
an obtaining unit, for obtaining multiple images; and
a fusion unit, for fusing said multiple images based on a total variation-L1 norm energy function,
wherein said total variation-L1 norm energy function comprises two components: one component is a total variation component, which represents retaining the detail information of the images to be fused, and the other component is an L1 norm component, which represents retaining the spectral information of the images to be fused.
9. The device according to claim 8,
wherein said multiple images are fused by minimizing the total variation-L1 norm energy function.
10. The device according to claim 9, wherein said energy function takes the following form:
$\|\nabla(R-G)\|_{TV} + \lambda\|R-IM\|_1$
where $\|\nabla(R-G)\|_{TV}$ denotes the total variation component, $\|R-IM\|_1$ denotes the L1 norm component, R is the fused image, G is the image richest in detail information among the images to be fused, IM is the intensity component of the image richest in spectral information among the images to be fused, λ is a weight parameter, and ∇ is the gradient operator; the fused image R is obtained by minimizing the above energy function.
11. The device according to claim 10, further comprising:
a smoothing unit, for applying a smoothing treatment to the above energy function so as to obtain the fused image R.
12. The device according to claim 10, wherein a decoupling procedure and a primal-dual method are used to compute the above energy function.
13. The device according to claim 8, further comprising:
a registration unit, for performing a registration operation on said multiple images before fusing said multiple images.
14. The device according to any one of claims 8-13, wherein the number of images to be fused is two, one of which is a high-resolution image rich in detail information and the other is a low-resolution multispectral image rich in spectral information.
15. A device, comprising:
means for obtaining multiple images; and
means for fusing said multiple images based on a total variation-L1 norm energy function,
wherein said total variation-L1 norm energy function comprises two components: one component is a total variation component, which represents retaining the detail information of the images to be fused; the other component is an L1 norm component, which represents retaining the spectral information of the images to be fused.
16. The device according to claim 15,
wherein said multiple images are fused by minimizing the total variation-L1 norm energy function.
17. The device according to claim 16, wherein said energy function takes the following form:
$\|\nabla(R-G)\|_{TV} + \lambda\|R-IM\|_1$
where $\|\nabla(R-G)\|_{TV}$ denotes the total variation component, $\|R-IM\|_1$ denotes the L1 norm component, R is the fused image, G is the image richest in detail information among the images to be fused, IM is the intensity component of the image richest in spectral information among the images to be fused, λ is a weight parameter, and ∇ is the gradient operator; the fused image R is obtained by minimizing the above energy function.
18. The device according to claim 17, further comprising:
means for applying a smoothing treatment to the above energy function so as to obtain the fused image R.
19. The device according to claim 17, wherein a decoupling procedure and a primal-dual method are used to compute the above energy function.
20. The device according to claim 15, further comprising:
means for performing a registration operation on said multiple images before fusing said multiple images.
21. The device according to any one of claims 15-20, wherein the number of images to be fused is two, one of which is a high-resolution image rich in detail information and the other is a low-resolution multispectral image rich in spectral information.
22. A device, comprising:
at least one processor and at least one memory containing computer program code;
said processor and said memory being configured, with said processor, to cause said device at least to:
obtain multiple images; and
fuse said multiple images based on a total variation-L1 norm energy function,
wherein said total variation-L1 norm energy function comprises two components: one component is a total variation component, which represents retaining the detail information of the images to be fused, and the other component is an L1 norm component, which represents retaining the spectral information of the images to be fused.
23. The device according to claim 22,
wherein said multiple images are fused by minimizing the total variation-L1 norm energy function.
24. The device according to claim 23, wherein said energy function takes the following form:
$\|\nabla(R-G)\|_{TV} + \lambda\|R-IM\|_1$
where $\|\nabla(R-G)\|_{TV}$ denotes the total variation component, $\|R-IM\|_1$ denotes the L1 norm component, R is the fused image, G is the image richest in detail information among the images to be fused, IM is the intensity component of the image richest in spectral information among the images to be fused, λ is a weight parameter, and ∇ is the gradient operator; the fused image R is obtained by minimizing the above energy function.
25. The device according to claim 24, wherein said processor and said memory are further configured, with said processor, to cause said device at least to:
apply a smoothing treatment to the above energy function so as to obtain the fused image R.
26. The device according to claim 24, wherein a decoupling procedure and a primal-dual method are used to compute the above energy function.
27. The device according to claim 22, wherein said processor and said memory are further configured, with said processor, to cause said device at least to:
perform a registration operation on said multiple images before fusing said multiple images.
28. The device according to any one of claims 22-27, wherein the number of images to be fused is two, one of which is a high-resolution image rich in detail information and the other is a low-resolution multispectral image rich in spectral information.
CN201210596216.XA 2012-12-31 2012-12-31 Image fusion method and device Pending CN103914815A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201210596216.XA CN103914815A (en) 2012-12-31 2012-12-31 Image fusion method and device
PCT/FI2013/051212 WO2014102458A1 (en) 2012-12-31 2013-12-30 Method and apparatus for image fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210596216.XA CN103914815A (en) 2012-12-31 2012-12-31 Image fusion method and device

Publications (1)

Publication Number Publication Date
CN103914815A true CN103914815A (en) 2014-07-09

Family

ID=51019949

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210596216.XA Pending CN103914815A (en) 2012-12-31 2012-12-31 Image fusion method and device

Country Status (2)

Country Link
CN (1) CN103914815A (en)
WO (1) WO2014102458A1 (en)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109215006B (en) * 2018-09-11 2020-10-13 中国科学院长春光学精密机械与物理研究所 Image fusion method based on total significant variation model and L2-norm optimization
CN110084774B (en) * 2019-04-11 2023-05-05 江南大学 Method for minimizing fusion image by enhanced gradient transfer and total variation
CN112529827A (en) * 2020-12-14 2021-03-19 珠海大横琴科技发展有限公司 Training method and device for remote sensing image fusion model
CN113160099B (en) * 2021-03-18 2023-12-26 北京达佳互联信息技术有限公司 Face fusion method, device, electronic equipment, storage medium and program product
CN114018834A (en) * 2021-08-06 2022-02-08 中科联芯(广州)科技有限公司 Intelligent target identification method and detection device for silicon-based multispectral signals


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060291751A1 (en) * 2004-12-16 2006-12-28 Peyman Milanfar Robust reconstruction of high resolution grayscale images from a sequence of low-resolution frames (robust gray super-resolution)
CN102542549A (en) * 2012-01-04 2012-07-04 西安电子科技大学 Multi-spectral and panchromatic image super-resolution fusion method based on compressive sensing

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
DONG JIANG ET AL.: "Advances in multi-sensor data fusion: Algorithms and applications", 《SENSORS》, vol. 9, no. 10, 30 September 2009 (2009-09-30), pages 7771-7784 *
MRITYUNJAJ K. ET AL.: "A total variation-based algorithm for pixel-level image fusion", 《IEEE TRANSACTIONS ON IMAGE PROCESSING》, vol. 18, no. 9, 30 September 2009 (2009-09-30), pages 2137-2143 *
YUAN J. ET AL.: "Efficient Convex Optimization Approaches to Variational Image Fusion", 《UCLA CAM REPORT 11-86》, 13 January 2012 (2012-01-13), pages 1-2 *
ZACH C. ET AL.: "A Globally Optimal Algorithm for Robust TV-L1 Range Image Integration", 《IEEE 11TH INTERNATIONAL CONFERENCE ON COMPUTER VISION》, 31 October 2007 (2007-10-31), pages 1, XP031194472 *
ZHANG L. ET AL.: "Adjustable Model-Based Fusion Method for Multispectral and Panchromatic Images", 《IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS, PART B: CYBERNETICS》, vol. 42, no. 6, 14 November 2012 (2012-11-14), pages 2-4 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106296624A (en) * 2015-06-11 2017-01-04 联想(北京)有限公司 A kind of image interfusion method and device
CN104966108A (en) * 2015-07-15 2015-10-07 武汉大学 Visible light and infrared image fusion method based on gradient transfer
CN105957010A (en) * 2016-05-19 2016-09-21 沈祥明 Vehicle-mounted image splicing system

Also Published As

Publication number Publication date
WO2014102458A1 (en) 2014-07-03

Similar Documents

Publication Publication Date Title
Chen et al. Real-world single image super-resolution: A brief review
CN103914815A (en) Image fusion method and device
Han et al. SSF-CNN: Spatial and spectral fusion with CNN for hyperspectral image super-resolution
Fan et al. Hyperspectral image denoising with superpixel segmentation and low-rank representation
US9342760B1 (en) System and method for combining color information with spatial information in multispectral images
Chierchia et al. A nonlocal structure tensor-based approach for multicomponent image recovery problems
Gao et al. Seam-driven image stitching.
JP5555706B2 (en) High resolution video acquisition apparatus and method
US20140301659A1 (en) Panchromatic Sharpening Method of Spectral Image Based on Fusion of Overall Structural Information and Spatial Detail Information
Cui et al. Blind light field image quality assessment by analyzing angular-spatial characteristics
Huang et al. Removing reflection from a single image with ghosting effect
Rota et al. Video restoration based on deep learning: a comprehensive survey
Mishra et al. Self-FuseNet: data free unsupervised remote sensing image super-resolution
Wang et al. Learning an epipolar shift compensation for light field image super-resolution
Imani et al. Pansharpening optimisation using multiresolution analysis and sparse representation
Misra et al. SPRINT: Spectra Preserving Radiance Image Fusion Technique using holistic deep edge spatial attention and Minnaert guided Bayesian probabilistic model
Zhu et al. Stereoscopic image super-resolution with interactive memory learning
Tan et al. Low-light image enhancement with geometrical sparse representation
Zhang et al. Deep joint neural model for single image haze removal and color correction
Tang et al. MPCFusion: Multi-scale parallel cross fusion for infrared and visible images via convolution and vision Transformer
Zhang et al. Inpainting at modern camera resolution by guided patchmatch with auto-curation
Li et al. Pansharpening via subpixel convolutional residual network
Khan et al. Pansharpening of hyperspectral images using spatial distortion optimization
Van Vo et al. High dynamic range video synthesis using superpixel-based illuminance-invariant motion estimation
Frosio et al. Adaptive segmentation based on a learned quality metric.

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20160118

Address after: Espoo, Finland

Applicant after: Nokia Technologies Oy

Address before: Espoo, Finland

Applicant before: Nokia Oyj

WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20140709