CN106530277A - Image fusion method based on wavelet direction correlation coefficient - Google Patents

Image fusion method based on wavelet direction correlation coefficient

Info

Publication number
CN106530277A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610895138.1A
Other languages
Chinese (zh)
Other versions
CN106530277B (en)
Inventor
王敏
李维
严卫
施伟来
郭随平
贾赟
任尚书
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PLA University of Science and Technology
Original Assignee
PLA University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by PLA University of Science and Technology filed Critical PLA University of Science and Technology
Priority to CN201610895138.1A priority Critical patent/CN106530277B/en
Publication of CN106530277A publication Critical patent/CN106530277A/en
Application granted granted Critical
Publication of CN106530277B publication Critical patent/CN106530277B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Landscapes

  • Image Analysis (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to an image fusion method based on wavelet directional correlation coefficients. A db3 wavelet transform decomposes each image into three levels of low-frequency and high-frequency sub-bands. The low-frequency sub-band is fused with a pixel-level rule determined by the spatial-frequency correlation coefficient of cyclically shifted sub-blocks; for each high-frequency sub-band, the high-frequency coefficients are determined, according to the directional characteristic of the sub-band, by the difference of the normalized correlation coefficients of directional regional energy and gradient. The method takes into account both the correlation of the wavelet-coefficient spatial frequencies and the directional correlation between energy and gradient, so the coefficients taking part in fusion are more important and more accurate with respect to the subjective and objective quality of the fused image. The method offers high fusion accuracy and good practicability.

Description

An image fusion method based on wavelet directional correlation coefficients
Technical field
The present invention relates to an image fusion method, and in particular to an image fusion method based on wavelet directional correlation coefficients.
Background technology
Image fusion is a technique that combines multi-source images by means of advanced image processing. Multiple images of the same scene, acquired by different sensors (or by the same sensor at different times or from different viewing angles), are processed jointly by a specific algorithm to obtain a single image that meets given requirements and describes the target or scene more accurately, more completely and more reliably, which facilitates both observation by the human eye and further processing by a computer. Owing to these properties, image fusion technology has been widely applied in fields such as the military, remote sensing, computer vision and medicine.
Image fusion is generally carried out at three different levels: pixel-level fusion, feature-level fusion and decision-level fusion. Pixel-level fusion is the most basic and most intuitive level and has the widest range of applications, so it receives the most attention in the image fusion field. The pixel-level fusion methods in common use include the IHS transform, the wavelet transform, principal component analysis (PCA) and the Brovey transform.
For any image fusion method, the choice of the fusion rule and the fusion operator is the focus and the main difficulty of current research. The wavelet transform provides multi-resolution analysis of a signal, so image fusion methods based on the wavelet transform can produce results that agree well with the characteristics of human vision, and wavelet-based fusion algorithms have become a research hotspot in recent years. Existing wavelet-domain fusion algorithms take two main forms: image fusion based on weighted averaging, and image fusion based on the correlation of wavelet coefficients (local-window statistics such as variance, gradient and energy). The weighted-average method is simple and intuitive and suitable for real-time processing, but it weights each coefficient to be fused in isolation and ignores the regional correlation between neighboring wavelet coefficients, which lowers the fusion accuracy. The coefficient-correlation method computes the regional correlation coefficient of the coefficients to be fused and determines the fusion coefficients adaptively; it improves on the weighted-average method and achieves better fusion results. It still has a shortcoming, however: when the correlation coefficient of the coefficients to be fused is computed, only the eight-neighborhood coefficients take part, and the directional characteristics of the coefficient distribution in each high-frequency sub-band are not considered, which degrades the final fusion result.
Content of the invention
In view of the above technical problems, the object of the present invention is to provide an image fusion method with high accuracy and good real-time performance that makes full use of the directional characteristics of the wavelet transform.
The present invention adopts the following technical solution to solve the above technical problems: in the image fusion method based on wavelet directional correlation coefficients, a wavelet transform decomposes each image into low-frequency and high-frequency parts; the low-frequency sub-band is fused with a pixel-level rule determined by the spatial-frequency correlation coefficient of cyclically shifted sub-blocks; for each high-frequency sub-band, the high-frequency coefficients are determined, according to the directional characteristic of the sub-band, by the difference of the normalized correlation coefficients of directional regional energy and gradient.
The specific steps are as follows:
Step 001. The two images A and B to be fused are each subjected to denoising and image registration;
Step 002. A three-level db3 wavelet transform (db3 being the chosen wavelet basis) is applied to the denoised and registered images A and B, separating each image into high-frequency images in the horizontal, vertical and diagonal directions and a low-frequency image;
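As a minimal illustration of step 002 (a sketch only, assuming the PyWavelets package and two pre-registered, denoised grayscale images held as NumPy arrays; the helper name decompose is not taken from the patent):

```python
import numpy as np
import pywt

def decompose(img, wavelet="db3", levels=3):
    """Three-level 2-D wavelet decomposition: one low-frequency band plus, per
    level, horizontal (H), vertical (V) and diagonal (D) high-frequency bands."""
    coeffs = pywt.wavedec2(np.asarray(img, dtype=float), wavelet, level=levels)
    low = coeffs[0]      # approximation (low-frequency) sub-band
    highs = coeffs[1:]   # [(H3, V3, D3), (H2, V2, D2), (H1, V1, D1)], coarsest first
    return low, highs
```

pywt.wavedec2 returns the coarsest approximation first, followed by one (H, V, D) tuple per decomposition level, which matches the horizontal/vertical/diagonal split used in steps 004-006.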
Step 003. The low-frequency images of A and B are fused, which specifically includes:
Step 0031. The low-frequency components of A and B are decomposed, by cyclic shifting, into 3 × 3 sub-image blocks, denoted A_lk and B_lk respectively (l and k are the row and column indices of the sub-block);
Step 0032. The spatial frequencies of A_lk and B_lk are computed
(M and N are the numbers of rows and columns of the sub-block, i.e. M = 3, N = 3);
Step 0033. The correlation coefficient between the corresponding sub-blocks A_lk and B_lk is computed;
Step 0034. The low-frequency fusion rule is C_FK = ωC_AK + (1 − ω)C_BK, where ω is a weighting factor;
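A minimal sketch of the low-frequency fusion of step 003, assuming NumPy. The patent's exact expression for the weighting factor ω is not reproduced in this text, so the weight below (the relative spatial frequency of the two sub-blocks) is only an illustrative stand-in, and simple non-overlapping blocking stands in for the cyclic-shift blocking; the spatial frequency itself follows step 0032.

```python
import numpy as np

def spatial_frequency(block):
    """Spatial frequency of a sub-block: SF = sqrt(RF^2 + CF^2)."""
    m, n = block.shape
    rf = np.sqrt(np.sum((block[:, 1:] - block[:, :-1]) ** 2) / (m * n))  # row frequency
    cf = np.sqrt(np.sum((block[1:, :] - block[:-1, :]) ** 2) / (m * n))  # column frequency
    return np.sqrt(rf ** 2 + cf ** 2)

def fuse_lowpass(low_a, low_b, block=3):
    """Block-wise low-frequency fusion C_F = w * C_A + (1 - w) * C_B."""
    fused = np.zeros_like(low_a, dtype=float)
    h, w = low_a.shape
    for r in range(0, h, block):
        for c in range(0, w, block):
            a_blk = low_a[r:r + block, c:c + block]
            b_blk = low_b[r:r + block, c:c + block]
            sf_a = spatial_frequency(a_blk)
            sf_b = spatial_frequency(b_blk)
            # Illustrative weighting factor (assumption, not the patent's formula):
            # favour the sub-block with the larger spatial frequency.
            omega = sf_a / (sf_a + sf_b + 1e-12)
            fused[r:r + block, c:c + block] = omega * a_blk + (1 - omega) * b_blk
    return fused
```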
Step 004. Each level of the horizontal high-frequency images of A and B is fused, which specifically includes:
Step 0041. The horizontal-direction energies E_A^H and E_B^H of the 3 × 3 window region centered at (x, y) are computed from the horizontal high-frequency sub-band wavelet coefficients of A and B;
D_A^H(x, y) denotes the wavelet coefficient of image A at (x, y) in the horizontal direction H, and the horizontal weighting template W_H is the 3 × 3 mask with weights 1, 2, 1 on its middle row and zeros elsewhere;
Step 0042. The horizontal-direction energy correlation coefficient at (x, y) over the corresponding regions of the two images is computed;
Step 0043. Using the Sobel horizontal operator S_H and vertical operator S_V, the directional gradients G_A^H and G_B^H of the 3 × 3 window region centered at (x, y) are computed from the horizontal sub-band wavelet coefficients;
Step 0044. The directional-gradient correlation coefficient at (x, y) over the corresponding regions of the two horizontal high-frequency images is computed;
Step 0045. The horizontal energy correlation coefficient and the horizontal gradient correlation coefficient are normalized;
Step 0046. The difference m(x, y) of the normalized correlation coefficients is computed;
Step 0047. The horizontal high-frequency fusion rule is applied, specifically:
Step 00471. If |m(x, y)| ≥ T, the maximum of the local energy or gradient is taken as the fusion result;
Step 00472. If |m(x, y)| < T, the maximum or the weighted average of the local energy or gradient is taken as the fusion result;
T (0 < T < 1) is a threshold whose value ranges from 0.2 to 0.5.
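Combining the normalization of step 0045 with the difference of step 0046, the decision statistic can be written as a single ratio of the two directional correlation coefficients:

$$m(x,y)=Nr_{AB}^{EH}(x,y)-Nr_{AB}^{GH}(x,y)=\frac{r_{AB}^{EH}(x,y)-r_{AB}^{GH}(x,y)}{r_{AB}^{EH}(x,y)+r_{AB}^{GH}(x,y)}$$

so |m(x, y)| is large when the energy correlation and the gradient correlation disagree, which is when step 0047 selects the maximum rather than the weighted average.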
Step 005. Each level of the vertical high-frequency images of A and B is fused, which specifically includes:
Step 0051. The vertical-direction energies E_A^V and E_B^V of the 3 × 3 window region centered at (x, y) are computed from the vertical high-frequency sub-band wavelet coefficients of A and B;
D_A^V(x, y) denotes the wavelet coefficient of image A at (x, y) in the vertical direction V, and the vertical weighting template W_V is the 3 × 3 mask with weights 1, 2, 1 on its middle column and zeros elsewhere;
Step 0052. The vertical-direction energy correlation coefficient at (x, y) over the corresponding regions of the two images is computed;
Step 0053. Using the Sobel horizontal operator S_H and vertical operator S_V, the directional gradients G_A^V and G_B^V of the 3 × 3 window region centered at (x, y) are computed from the vertical sub-band wavelet coefficients;
Step 0054. The vertical-gradient correlation coefficient at (x, y) over the corresponding regions of the two vertical high-frequency images is computed;
Step 0055. The vertical energy correlation coefficient and the vertical gradient correlation coefficient are normalized;
Step 0056. The difference m(x, y) of the normalized correlation coefficients is computed;
Step 0057. The vertical high-frequency fusion rule is applied, specifically:
Step 00571. If |m(x, y)| ≥ T, the maximum of the local energy or gradient is taken as the fusion result;
Step 00572. If |m(x, y)| < T, the maximum or the weighted average of the local energy or gradient is taken as the fusion result;
T (0 < T < 1) is a threshold whose value ranges from 0.2 to 0.5 (a code sketch of this directional rule follows).
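A minimal sketch of the directional high-frequency rule of steps 004 and 005 (the diagonal sub-bands of step 006 below follow the same scheme with the diagonal template), assuming NumPy and SciPy. Per-pixel gradient magnitudes stand in for the 3 × 3 window sums of steps 0043-0044, the weighted-average branch uses an illustrative energy-based weight, and the function and variable names are not taken from the patent.

```python
import numpy as np
from scipy.signal import convolve2d

W_H = np.array([[0, 0, 0], [1, 2, 1], [0, 0, 0]], dtype=float)     # horizontal template
S_H = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float)  # Sobel horizontal operator
S_V = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # Sobel vertical operator

def fuse_highband(d_a, d_b, template, T=0.3, eps=1e-12):
    """Fuse two directional high-frequency sub-bands coefficient by coefficient."""
    # Directional window energy (step 0041 / 0051).
    e_a = convolve2d(d_a ** 2, template, mode="same")
    e_b = convolve2d(d_b ** 2, template, mode="same")
    # Energy correlation coefficient (step 0042 / 0052).
    r_e = 2 * convolve2d(d_a * d_b, template, mode="same") / (e_a + e_b + eps)
    # Directional gradient magnitudes via the Sobel operators (step 0043 / 0053).
    g_a = np.hypot(convolve2d(d_a, S_H, mode="same"), convolve2d(d_a, S_V, mode="same"))
    g_b = np.hypot(convolve2d(d_b, S_H, mode="same"), convolve2d(d_b, S_V, mode="same"))
    # Gradient correlation in the same 2*product/(sum) form (step 0044 / 0054).
    r_g = 2 * g_a * g_b / (g_a + g_b + eps)
    # Normalized correlation difference (steps 0045-0046 / 0055-0056).
    m = (r_e - r_g) / (r_e + r_g + eps)
    # Fusion rule (step 0047 / 0057): pick the coefficient with the larger local
    # energy when |m| >= T, otherwise take a weighted average.
    maxed = np.where(e_a >= e_b, d_a, d_b)
    weight = e_a / (e_a + e_b + eps)
    averaged = weight * d_a + (1 - weight) * d_b
    return np.where(np.abs(m) >= T, maxed, averaged)

# Usage (hypothetical sub-bands from the decomposition sketch above):
# fused_h = fuse_highband(h_a, h_b, W_H)
```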
Step 006. Each level of the diagonal high-frequency images of A and B is fused, which specifically includes:
Step 0061. The diagonal-direction energies E_A^d and E_B^d of the 3 × 3 window region centered at (x, y) are computed from the diagonal high-frequency sub-band wavelet coefficients of A and B;
D_A^d(x, y) denotes the wavelet coefficient of image A at (x, y) in the diagonal direction d, and the diagonal weighting template W_d is the 3 × 3 mask with weight 2 at the center, weight 1 at the four corners and zeros elsewhere;
Step 0062. The diagonal-direction energy correlation coefficient at (x, y) over the corresponding regions of the two images is computed;
Step 0063. Using the Sobel horizontal operator S_H and vertical operator S_V, the directional gradients G_A^d and G_B^d of the 3 × 3 window region centered at (x, y) are computed from the diagonal sub-band wavelet coefficients;
Step 0064. The diagonal-gradient correlation coefficient at (x, y) over the corresponding regions of the two diagonal high-frequency images is computed;
Step 0065. The diagonal energy correlation coefficient and the diagonal gradient correlation coefficient are normalized;
Step 0066. The difference m(x, y) of the normalized correlation coefficients is computed;
Step 0067. The diagonal high-frequency fusion rule is applied, specifically:
Step 00671. If |m(x, y)| ≥ T, the maximum of the local energy or gradient is taken as the fusion result;
Step 00672. If |m(x, y)| < T, the maximum or the weighted average of the local energy or gradient is taken as the fusion result;
T (0 < T < 1) is a threshold whose value ranges from 0.2 to 0.5.
Step 007. The fused low-frequency sub-image, fused horizontal high-frequency sub-image, fused vertical high-frequency sub-image and fused diagonal high-frequency sub-image obtained in steps 003, 004, 005 and 006 are combined, and the final sharp image is reconstructed by the inverse wavelet transform.
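A minimal sketch of the reconstruction in step 007, assuming PyWavelets and the fused sub-bands produced by the sketches above (fused_low plus a list of fused (H, V, D) tuples, coarsest level first):

```python
import pywt

def reconstruct(fused_low, fused_highs, wavelet="db3"):
    """Reassemble the fused sub-bands in the layout pywt.waverec2 expects and invert."""
    coeffs = [fused_low] + list(fused_highs)
    return pywt.waverec2(coeffs, wavelet)
```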
As a preferred embodiment, during cyclic blocking of the low-frequency image, edge pixels of a sub-block that fall outside the image boundary are zero-padded.
Beneficial effects: compared with the prior art, the image fusion method based on wavelet directional correlation coefficients of the present invention uses an algorithm that is simple, easy to implement and accurate. The cyclic-shift block operation automatically updates the low-frequency wavelet fusion coefficients; the correlation of the wavelet-coefficient spatial frequencies and the directional correlation between energy and gradient are taken fully into account, and different fusion rules are selected adaptively, so that the coefficients taking part in fusion are more realistic, more important and more salient. This effectively improves the accuracy of image fusion and, since fewer neighboring coefficients take part, also reduces the complexity of the fusion.
Description of the drawings
Fig. 1 is a flow chart of the image fusion method based on wavelet directional correlation coefficients designed by the present invention.
Specific embodiments
Specific embodiments of the present invention are described in further detail below with reference to the accompanying drawing.
As shown in Fig. 1, the present invention provides an image fusion method based on wavelet directional correlation coefficients. A wavelet transform decomposes the source images into low-frequency and high-frequency parts at different scales; the low-frequency sub-band is fused with a pixel-level rule determined by the spatial-frequency correlation coefficient of cyclically shifted sub-blocks; for each high-frequency sub-band, the high-frequency coefficients are determined, according to the directional characteristic of the sub-band, by the difference of the normalized correlation coefficients of directional regional energy and gradient.
The specific steps are as follows:
Step 001. The two images A and B to be fused are each subjected to denoising and image registration;
Step 002. A three-level db3 wavelet transform is applied to the denoised and registered images A and B, separating each image into high-frequency images in the horizontal, vertical and diagonal directions and a low-frequency image;
Step 003. The low-frequency images of A and B are fused, which specifically includes:
Step 0031. The low-frequency components of A and B are decomposed, by cyclic shifting, into 3 × 3 sub-image blocks, denoted A_lk and B_lk respectively (l and k are the row and column indices of the sub-block);
Step 0032. The spatial frequencies of A_lk and B_lk are computed
(M and N are the numbers of rows and columns of the sub-block, i.e. M = 3, N = 3);
Step 0033. The correlation coefficient between the corresponding sub-blocks A_lk and B_lk is computed;
Step 0034. The low-frequency fusion rule is C_FK = ωC_AK + (1 − ω)C_BK, where ω is a weighting factor;
Step 004. Each level of the horizontal high-frequency images of A and B is fused, which specifically includes:
Step 0041. The horizontal-direction energies E_A^H and E_B^H of the 3 × 3 window region centered at (x, y) are computed from the horizontal high-frequency sub-band wavelet coefficients of A and B;
D_A^H(x, y) denotes the wavelet coefficient of image A at (x, y) in the horizontal direction H, and the horizontal weighting template W_H is the 3 × 3 mask with weights 1, 2, 1 on its middle row and zeros elsewhere;
Step 0042. The horizontal-direction energy correlation coefficient at (x, y) over the corresponding regions of the two images is computed;
Step 0043. Using the Sobel horizontal operator S_H and vertical operator S_V, the directional gradients G_A^H and G_B^H of the 3 × 3 window region centered at (x, y) are computed from the horizontal sub-band wavelet coefficients;
Step 0044. The directional-gradient correlation coefficient at (x, y) over the corresponding regions of the two horizontal high-frequency images is computed;
Step 0045. The horizontal energy correlation coefficient and the horizontal gradient correlation coefficient are normalized;
Step 0046. The difference m(x, y) of the normalized correlation coefficients is computed;
Step 0047. The horizontal high-frequency fusion rule is applied, specifically:
Step 00471. If |m(x, y)| ≥ T, the maximum of the local energy or gradient is taken as the fusion result;
Step 00472. If |m(x, y)| < T, the maximum or the weighted average of the local energy or gradient is taken as the fusion result;
T (0 < T < 1) is a threshold whose value ranges from 0.2 to 0.5.
Step 005. Each level of the vertical high-frequency images of A and B is fused, which specifically includes:
Step 0051. The vertical-direction energies E_A^V and E_B^V of the 3 × 3 window region centered at (x, y) are computed from the vertical high-frequency sub-band wavelet coefficients of A and B;
D_A^V(x, y) denotes the wavelet coefficient of image A at (x, y) in the vertical direction V, and the vertical weighting template W_V is the 3 × 3 mask with weights 1, 2, 1 on its middle column and zeros elsewhere;
Step 0052. The vertical-direction energy correlation coefficient at (x, y) over the corresponding regions of the two images is computed;
Step 0053. Using the Sobel horizontal operator S_H and vertical operator S_V, the directional gradients G_A^V and G_B^V of the 3 × 3 window region centered at (x, y) are computed from the vertical sub-band wavelet coefficients;
Step 0054. The vertical-gradient correlation coefficient at (x, y) over the corresponding regions of the two vertical high-frequency images is computed;
Step 0055. The vertical energy correlation coefficient and the vertical gradient correlation coefficient are normalized;
Step 0056. The difference m(x, y) of the normalized correlation coefficients is computed;
Step 0057. The vertical high-frequency fusion rule is applied, specifically:
Step 00571. If |m(x, y)| ≥ T, the maximum of the local energy or gradient is taken as the fusion result;
Step 00572. If |m(x, y)| < T, the maximum or the weighted average of the local energy or gradient is taken as the fusion result;
T (0 < T < 1) is a threshold whose value ranges from 0.2 to 0.5.
Step 006. Each level of the diagonal high-frequency images of A and B is fused, which specifically includes:
Step 0061. The diagonal-direction energies E_A^d and E_B^d of the 3 × 3 window region centered at (x, y) are computed from the diagonal high-frequency sub-band wavelet coefficients of A and B;
D_A^d(x, y) denotes the wavelet coefficient of image A at (x, y) in the diagonal direction d, and the diagonal weighting template W_d is the 3 × 3 mask with weight 2 at the center, weight 1 at the four corners and zeros elsewhere;
Step 0062. The diagonal-direction energy correlation coefficient at (x, y) over the corresponding regions of the two images is computed;
Step 0063. Using the Sobel horizontal operator S_H and vertical operator S_V, the directional gradients G_A^d and G_B^d of the 3 × 3 window region centered at (x, y) are computed from the diagonal sub-band wavelet coefficients;
Step 0064. The diagonal-gradient correlation coefficient at (x, y) over the corresponding regions of the two diagonal high-frequency images is computed;
Step 0065. The diagonal energy correlation coefficient and the diagonal gradient correlation coefficient are normalized;
Step 0066. The difference m(x, y) of the normalized correlation coefficients is computed;
Step 0067. The diagonal high-frequency fusion rule is applied, specifically:
Step 00671. If |m(x, y)| ≥ T, the maximum of the local energy or gradient is taken as the fusion result;
Step 00672. If |m(x, y)| < T, the maximum or the weighted average of the local energy or gradient is taken as the fusion result;
T (0 < T < 1) is a threshold whose value ranges from 0.2 to 0.5.
Step 007. The fused low-frequency sub-image, fused horizontal high-frequency sub-image, fused vertical high-frequency sub-image and fused diagonal high-frequency sub-image obtained in steps 003, 004, 005 and 006 are combined, and the final sharp image is reconstructed by the inverse wavelet transform.
As a preferred embodiment, during cyclic blocking of the low-frequency image, edge pixels of a sub-block that fall outside the image boundary are zero-padded (see the sketch below).
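A minimal sketch of this zero-padding, assuming NumPy; the helper name and the padding to a multiple of the block size are illustrative choices, not the patent's wording:

```python
import numpy as np

def pad_for_blocks(low, block=3):
    """Zero-pad a low-frequency band so every extracted block is a full 3 x 3;
    pixels that would fall beyond the original border are set to zero."""
    h, w = low.shape
    pad_rows = (-h) % block   # rows needed to reach a multiple of the block size
    pad_cols = (-w) % block   # columns needed
    return np.pad(low, ((0, pad_rows), (0, pad_cols)), mode="constant", constant_values=0)
```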
The image fusion method of the present invention is concise and effective. After the wavelet decomposition of an image it exploits the approximation property of the low-frequency sub-band: the weighting factor is determined by the correlation coefficient of the spatial frequencies of the image blocks, and the cyclic-shift block operation automatically updates the wavelet fusion coefficients. Because the high-frequency sub-bands have different directional characteristics, a template of the corresponding direction is used for each sub-band, the difference of the normalized correlation coefficients of the regional directional energy and gradient is computed, and a fusion rule (take the maximum, or take the weighted average) is selected adaptively, as sketched below. Compared with traditional fusion rules, this makes the coefficients that determine the high-frequency fusion coefficients more realistic, more important and more salient, which improves the fusion accuracy; since fewer neighboring coefficients take part, the complexity of the fusion is also reduced.
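Putting the sketches together, a hedged end-to-end driver might look as follows (decompose, fuse_lowpass, fuse_highband and reconstruct are the illustrative helpers defined earlier; W_V and W_d are the vertical and diagonal templates from the claims):

```python
import numpy as np

W_V = np.array([[0, 1, 0], [0, 2, 0], [0, 1, 0]], dtype=float)  # vertical template
W_d = np.array([[1, 0, 1], [0, 2, 0], [1, 0, 1]], dtype=float)  # diagonal template

def fuse_images(a_img, b_img, T=0.3):
    """Decompose both registered images, fuse each sub-band, and reconstruct."""
    low_a, highs_a = decompose(a_img)
    low_b, highs_b = decompose(b_img)
    fused_low = fuse_lowpass(low_a, low_b)
    fused_highs = []
    for (h_a, v_a, d_a), (h_b, v_b, d_b) in zip(highs_a, highs_b):
        fused_highs.append((fuse_highband(h_a, h_b, W_H, T),
                            fuse_highband(v_a, v_b, W_V, T),
                            fuse_highband(d_a, d_b, W_d, T)))
    return reconstruct(fused_low, fused_highs)
```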
In summary, by establishing and implementing the image fusion method based on wavelet directional correlation coefficients designed by the present invention, the low-frequency spatial correlation and the high-frequency directional characteristics of the images can be exploited effectively, improving both the efficiency and the accuracy of image fusion, with broad market prospects and economic value.
The embodiments of the present invention have been explained in detail above with reference to the accompanying drawing, but the present invention is not limited to the embodiments described; a person of ordinary skill in the art can make various changes without departing from the concept of the invention.

Claims (2)

1. An image fusion method based on wavelet directional correlation coefficients, characterized in that a db3 wavelet transform decomposes each image into three levels of low-frequency and high-frequency sub-bands; the low-frequency sub-band is fused with a pixel-level rule determined by the spatial-frequency correlation coefficient of cyclically shifted sub-blocks; for each high-frequency sub-band, the high-frequency coefficients are determined, according to the directional characteristic of the sub-band, by the difference of the normalized correlation coefficients of directional regional energy and gradient; the specific steps are as follows:
Step 001. The two images A and B to be fused are each subjected to denoising and image registration;
Step 002. A wavelet transform is applied to the denoised and registered images A and B, separating each image into high-frequency images in the horizontal, vertical and diagonal directions and a low-frequency image;
Step 003. The low-frequency images of A and B are fused, which specifically includes:
Step 0031. The low-frequency components of A and B are decomposed, by cyclic shifting, into 3 × 3 sub-image blocks, denoted A_lk and B_lk respectively, where l and k are the row and column indices of the sub-block;
Step 0032. The spatial frequencies of A_lk and B_lk are computed:
$$RF_{lk}^{A}=\sqrt{\frac{1}{MN}\sum_{i=1}^{M}\sum_{j=2}^{N}\left[A(i,j)-A(i,j-1)\right]^{2}}$$
$$CF_{lk}^{A}=\sqrt{\frac{1}{MN}\sum_{i=2}^{M}\sum_{j=1}^{N}\left[A(i,j)-A(i-1,j)\right]^{2}}$$
$$SF_{lk}^{A}=\sqrt{\left(RF_{lk}^{A}\right)^{2}+\left(CF_{lk}^{A}\right)^{2}}$$
$$RF_{lk}^{B}=\sqrt{\frac{1}{MN}\sum_{i=1}^{M}\sum_{j=2}^{N}\left[B(i,j)-B(i,j-1)\right]^{2}}$$
$$CF_{lk}^{B}=\sqrt{\frac{1}{MN}\sum_{i=2}^{M}\sum_{j=1}^{N}\left[B(i,j)-B(i-1,j)\right]^{2}}$$
$$SF_{lk}^{B}=\sqrt{\left(RF_{lk}^{B}\right)^{2}+\left(CF_{lk}^{B}\right)^{2}}$$
(M and N are the numbers of rows and columns of the sub-block, i.e. M = 3, N = 3);
Step 0033. The correlation coefficient between the corresponding sub-blocks A_lk and B_lk is computed:
$$r_{lk}^{AB}=\frac{\sum_{i=1}^{M}\sum_{j=1}^{N}\left[\left(SF_{lk}^{A}-e_{A}\right)\left(SF_{lk}^{B}-e_{B}\right)\right]}{\sqrt{\left[\sum_{i=1}^{M}\sum_{j=1}^{N}\left(SF_{lk}^{A}-e_{A}\right)^{2}\right]\left[\sum_{i=1}^{M}\sum_{j=1}^{N}\left(SF_{lk}^{B}-e_{B}\right)^{2}\right]}}$$
Step 0034. The low-frequency fusion rule is $C_{FK}=\omega C_{AK}+(1-\omega)C_{BK}$, where ω is a weighting factor;
Step 004. Each level of the horizontal high-frequency images of A and B is fused, which specifically includes:
Step 0041. The horizontal-direction energies $E_{A}^{H}(x,y)$ and $E_{B}^{H}(x,y)$ of the 3 × 3 window region centered at (x, y) are computed from the horizontal high-frequency sub-band wavelet coefficients of A and B:
$$E_{A}^{H}(x,y)=\sum_{m=-1}^{1}\sum_{n=-1}^{1}W_{H}(m+2,n+2)\left[D_{A}^{H}(x+m,y+n)\right]^{2}$$
$$E_{B}^{H}(x,y)=\sum_{m=-1}^{1}\sum_{n=-1}^{1}W_{H}(m+2,n+2)\left[D_{B}^{H}(x+m,y+n)\right]^{2}$$
$D_{A}^{H}(x,y)$ denotes the wavelet coefficient of image A at (x, y) in the horizontal direction H, and the horizontal weighting template $W_{H}$ is taken as:
$$W_{H}=\begin{bmatrix}0&0&0\\1&2&1\\0&0&0\end{bmatrix}$$
Step 0042. The horizontal-direction energy correlation coefficient at (x, y) over the corresponding regions of the two images is computed:
$$r_{AB}^{EH}(x,y)=\frac{2\sum_{m=-1}^{1}\sum_{n=-1}^{1}W_{H}(m+2,n+2)\,D_{A}^{H}(x+m,y+n)\,D_{B}^{H}(x+m,y+n)}{E_{A}^{H}+E_{B}^{H}}$$
Step 0043. Using the Sobel horizontal operator $S_{H}$ and vertical operator $S_{V}$, the directional gradients of the 3 × 3 window region centered at (x, y) are computed for the horizontal sub-band wavelet coefficients, giving $G_{A}^{H}(x,y)$ and $G_{B}^{H}(x,y)$:
$$G_{A}^{H}(x,y)=\sum_{i=1}^{M}\sum_{j=1}^{N}\sqrt{\left(\Delta_{x}D_{A}^{H}(x,y)\right)^{2}+\left(\Delta_{y}D_{A}^{H}(x,y)\right)^{2}}$$
$$\Delta_{x}D_{A}^{H}(x,y)=S_{H}\cdot D_{A}^{H}(x,y)$$
$$\Delta_{y}D_{A}^{H}(x,y)=S_{V}\cdot D_{A}^{H}(x,y)$$
$$S_{H}=\begin{bmatrix}-1&-2&-1\\0&0&0\\1&2&1\end{bmatrix}\qquad S_{V}=\begin{bmatrix}-1&0&1\\-2&0&2\\-1&0&1\end{bmatrix}$$
Step 0044. The directional-gradient correlation coefficient at (x, y) over the corresponding regions of the two horizontal high-frequency images is computed:
$$r_{AB}^{GH}(x,y)=\frac{2\sum_{i=1}^{M}\sum_{j=1}^{N}\sqrt{\left(\Delta_{x}D_{A}^{H}(x,y)\right)^{2}+\left(\Delta_{y}D_{A}^{H}(x,y)\right)^{2}}\sqrt{\left(\Delta_{x}D_{B}^{H}(x,y)\right)^{2}+\left(\Delta_{y}D_{B}^{H}(x,y)\right)^{2}}}{G_{A}^{H}+G_{B}^{H}}$$
Step 0045. The horizontal energy correlation coefficient $r_{AB}^{EH}(x,y)$ and the horizontal gradient correlation coefficient $r_{AB}^{GH}(x,y)$ are normalized:
$$Nr_{AB}^{EH}(x,y)=\frac{r_{AB}^{EH}(x,y)}{r_{AB}^{EH}(x,y)+r_{AB}^{GH}(x,y)}$$
$$Nr_{AB}^{GH}(x,y)=\frac{r_{AB}^{GH}(x,y)}{r_{AB}^{EH}(x,y)+r_{AB}^{GH}(x,y)}$$
Step 0046. The difference of the normalized correlation coefficients is computed:
$$m(x,y)=Nr_{AB}^{EH}(x,y)-Nr_{AB}^{GH}(x,y)$$
Step 0047. The horizontal high-frequency fusion rule is applied, specifically:
Step 00471. If |m(x, y)| ≥ T, the maximum of the local energy or gradient is taken as the fusion result;
Step 00472. If |m(x, y)| < T, the maximum or the weighted average of the local energy or gradient is taken as the fusion result;
T (0 < T < 1) is a threshold whose value ranges from 0.2 to 0.5;
Step 005. Each level of the vertical high-frequency images of A and B is fused, which specifically includes:
Step 0051. The vertical-direction energies $E_{A}^{V}(x,y)$ and $E_{B}^{V}(x,y)$ of the 3 × 3 window region centered at (x, y) are computed from the vertical high-frequency sub-band wavelet coefficients of A and B:
$$E_{A}^{V}(x,y)=\sum_{m=-1}^{1}\sum_{n=-1}^{1}W_{V}(m+2,n+2)\left[D_{A}^{V}(x+m,y+n)\right]^{2}$$
$$E_{B}^{V}(x,y)=\sum_{m=-1}^{1}\sum_{n=-1}^{1}W_{V}(m+2,n+2)\left[D_{B}^{V}(x+m,y+n)\right]^{2}$$
$D_{A}^{V}(x,y)$ denotes the wavelet coefficient of image A at (x, y) in the vertical direction V, and the vertical weighting template $W_{V}$ is taken as:
$$W_{V}=\begin{bmatrix}0&1&0\\0&2&0\\0&1&0\end{bmatrix}$$
Step 0052. The vertical-direction energy correlation coefficient at (x, y) over the corresponding regions of the two images is computed:
$$r_{AB}^{EV}(x,y)=\frac{2\sum_{m=-1}^{1}\sum_{n=-1}^{1}W_{V}(m+2,n+2)\,D_{A}^{V}(x+m,y+n)\,D_{B}^{V}(x+m,y+n)}{E_{A}^{V}+E_{B}^{V}}$$
Step 0053. Using the Sobel horizontal operator $S_{H}$ and vertical operator $S_{V}$, the directional gradients of the 3 × 3 window region centered at (x, y) are computed for the vertical sub-band wavelet coefficients, giving $G_{A}^{V}(x,y)$ and $G_{B}^{V}(x,y)$:
$$G_{A}^{V}(x,y)=\sum_{i=1}^{M}\sum_{j=1}^{N}\sqrt{\left(\Delta_{x}D_{A}^{V}(x,y)\right)^{2}+\left(\Delta_{y}D_{A}^{V}(x,y)\right)^{2}}$$
$$\Delta_{x}D_{A}^{V}(x,y)=S_{H}\cdot D_{A}^{V}(x,y)$$
$$\Delta_{y}D_{A}^{V}(x,y)=S_{V}\cdot D_{A}^{V}(x,y)$$
$$S_{H}=\begin{bmatrix}-1&-2&-1\\0&0&0\\1&2&1\end{bmatrix}\qquad S_{V}=\begin{bmatrix}-1&0&1\\-2&0&2\\-1&0&1\end{bmatrix}$$
Step 0054. The vertical-gradient correlation coefficient at (x, y) over the corresponding regions of the two vertical high-frequency images is computed:
$$r_{AB}^{GV}(x,y)=\frac{2\sum_{i=1}^{M}\sum_{j=1}^{N}\sqrt{\left(\Delta_{x}D_{A}^{V}(x,y)\right)^{2}+\left(\Delta_{y}D_{A}^{V}(x,y)\right)^{2}}\sqrt{\left(\Delta_{x}D_{B}^{V}(x,y)\right)^{2}+\left(\Delta_{y}D_{B}^{V}(x,y)\right)^{2}}}{G_{A}^{V}+G_{B}^{V}}$$
Step 0055. The vertical energy correlation coefficient $r_{AB}^{EV}(x,y)$ and the vertical gradient correlation coefficient $r_{AB}^{GV}(x,y)$ are normalized:
$$Nr_{AB}^{EV}(x,y)=\frac{r_{AB}^{EV}(x,y)}{r_{AB}^{EV}(x,y)+r_{AB}^{GV}(x,y)}$$
$$Nr_{AB}^{GV}(x,y)=\frac{r_{AB}^{GV}(x,y)}{r_{AB}^{EV}(x,y)+r_{AB}^{GV}(x,y)}$$
Step 0056. The difference of the normalized correlation coefficients is computed:
$$m(x,y)=Nr_{AB}^{EV}(x,y)-Nr_{AB}^{GV}(x,y)$$
Step 0057. The vertical high-frequency fusion rule is applied, specifically:
Step 00571. If |m(x, y)| ≥ T, the maximum of the local energy or gradient is taken as the fusion result;
Step 00572. If |m(x, y)| < T, the maximum or the weighted average of the local energy or gradient is taken as the fusion result;
T (0 < T < 1) is a threshold whose value ranges from 0.2 to 0.5;
Step 006. Each level of the diagonal high-frequency images of A and B is fused, which specifically includes:
Step 0061. The diagonal-direction energies $E_{A}^{d}(x,y)$ and $E_{B}^{d}(x,y)$ of the 3 × 3 window region centered at (x, y) are computed from the diagonal high-frequency sub-band wavelet coefficients of A and B:
$$E_{A}^{d}(x,y)=\sum_{m=-1}^{1}\sum_{n=-1}^{1}W_{d}(m+2,n+2)\left[D_{A}^{d}(x+m,y+n)\right]^{2}$$
$$E_{B}^{d}(x,y)=\sum_{m=-1}^{1}\sum_{n=-1}^{1}W_{d}(m+2,n+2)\left[D_{B}^{d}(x+m,y+n)\right]^{2}$$
$D_{A}^{d}(x,y)$ denotes the wavelet coefficient of image A at (x, y) in the diagonal direction d, and the diagonal weighting template $W_{d}$ is taken as:
$$W_{d}=\begin{bmatrix}1&0&1\\0&2&0\\1&0&1\end{bmatrix}$$
Step 0062. The diagonal-direction energy correlation coefficient at (x, y) over the corresponding regions of the two images is computed:
$$r_{AB}^{Ed}(x,y)=\frac{2\sum_{m=-1}^{1}\sum_{n=-1}^{1}W_{d}(m+2,n+2)\,D_{A}^{d}(x+m,y+n)\,D_{B}^{d}(x+m,y+n)}{E_{A}^{d}+E_{B}^{d}}$$
Step 0063. Using the Sobel horizontal operator $S_{H}$ and vertical operator $S_{V}$, the directional gradients of the 3 × 3 window region centered at (x, y) are computed for the diagonal sub-band wavelet coefficients, giving $G_{A}^{d}(x,y)$ and $G_{B}^{d}(x,y)$:
$$G_{A}^{d}(x,y)=\sum_{i=1}^{M}\sum_{j=1}^{N}\sqrt{\left(\Delta_{x}D_{A}^{d}(x,y)\right)^{2}+\left(\Delta_{y}D_{A}^{d}(x,y)\right)^{2}}$$
$$\Delta_{x}D_{A}^{d}(x,y)=S_{H}\cdot D_{A}^{d}(x,y)$$
$$\Delta_{y}D_{A}^{d}(x,y)=S_{V}\cdot D_{A}^{d}(x,y)$$
$$S_{H}=\begin{bmatrix}-1&-2&-1\\0&0&0\\1&2&1\end{bmatrix}\qquad S_{V}=\begin{bmatrix}-1&0&1\\-2&0&2\\-1&0&1\end{bmatrix}$$
Step 0064. The diagonal-gradient correlation coefficient at (x, y) over the corresponding regions of the two diagonal high-frequency images is computed:
$$r_{AB}^{Gd}(x,y)=\frac{2\sum_{i=1}^{M}\sum_{j=1}^{N}\sqrt{\left(\Delta_{x}D_{A}^{d}(x,y)\right)^{2}+\left(\Delta_{y}D_{A}^{d}(x,y)\right)^{2}}\sqrt{\left(\Delta_{x}D_{B}^{d}(x,y)\right)^{2}+\left(\Delta_{y}D_{B}^{d}(x,y)\right)^{2}}}{G_{A}^{d}+G_{B}^{d}}$$
Step 0065. The diagonal energy correlation coefficient $r_{AB}^{Ed}(x,y)$ and the diagonal gradient correlation coefficient $r_{AB}^{Gd}(x,y)$ are normalized:
$$Nr_{AB}^{Ed}(x,y)=\frac{r_{AB}^{Ed}(x,y)}{r_{AB}^{Ed}(x,y)+r_{AB}^{Gd}(x,y)}$$
$$Nr_{AB}^{Gd}(x,y)=\frac{r_{AB}^{Gd}(x,y)}{r_{AB}^{Ed}(x,y)+r_{AB}^{Gd}(x,y)}$$
Step 0066. The difference of the normalized correlation coefficients is computed:
$$m(x,y)=Nr_{AB}^{Ed}(x,y)-Nr_{AB}^{Gd}(x,y)$$
Step 0067. The diagonal high-frequency fusion rule is applied, specifically:
Step 00671. If |m(x, y)| ≥ T, the maximum of the local energy or gradient is taken as the fusion result;
Step 00672. If |m(x, y)| < T, the maximum or the weighted average of the local energy or gradient is taken as the fusion result;
T (0 < T < 1) is a threshold whose value ranges from 0.2 to 0.5;
Step 007. The fused low-frequency sub-image, fused horizontal high-frequency sub-image, fused vertical high-frequency sub-image and fused diagonal high-frequency sub-image obtained in steps 003, 004, 005 and 006 are combined, and the final sharp image is reconstructed by the inverse wavelet transform.
2. The image fusion method based on wavelet directional correlation coefficients according to claim 1, characterized in that, during cyclic blocking of the low-frequency image, edge pixels of a sub-block that fall outside the image boundary are zero-padded.
CN201610895138.1A 2016-10-13 2016-10-13 Image fusion method based on wavelet directional correlation coefficient Active CN106530277B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610895138.1A CN106530277B (en) 2016-10-13 2016-10-13 Image fusion method based on wavelet directional correlation coefficient

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610895138.1A CN106530277B (en) 2016-10-13 2016-10-13 Image fusion method based on wavelet directional correlation coefficient

Publications (2)

Publication Number Publication Date
CN106530277A true CN106530277A (en) 2017-03-22
CN106530277B CN106530277B (en) 2019-09-10

Family

ID=58332043

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610895138.1A Active CN106530277B (en) 2016-10-13 2016-10-13 Image fusion method based on wavelet directional correlation coefficient

Country Status (1)

Country Link
CN (1) CN106530277B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107016640A (en) * 2017-04-06 2017-08-04 广州爱图互联网有限公司 Picture energy normalized processing method and system based on multi-resolution decomposition
CN109300096A (en) * 2018-08-07 2019-02-01 北京智脉识别科技有限公司 A kind of multi-focus image fusing method and device
CN109300097A (en) * 2018-08-16 2019-02-01 南京理工大学 Multi-sequence image fusion method based on adaptive piecemeal
CN113317793A (en) * 2021-06-11 2021-08-31 宁波大学 Magnetocardiogram high-frequency signal analysis method, storage medium, and electronic device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090326383A1 (en) * 2008-06-18 2009-12-31 Michael Barnes Systems and methods for hyperspectral imaging
CN102053051A (en) * 2009-10-30 2011-05-11 西门子公司 Body fluid analysis system as well as image processing device and method for body fluid analysis
CN102063713A (en) * 2010-11-11 2011-05-18 西北工业大学 Neighborhood normalized gradient and neighborhood standard deviation-based multi-focus image fusion method
CN102637297A (en) * 2012-03-21 2012-08-15 武汉大学 Visible light and infrared image fusion method based on Curvelet transformation
US20150288941A1 (en) * 2013-10-17 2015-10-08 Northrop Grumman Systems Corporation Converting an image from a dual-band sensor to a visible color image

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090326383A1 (en) * 2008-06-18 2009-12-31 Michael Barnes Systems and methods for hyperspectral imaging
CN102053051A (en) * 2009-10-30 2011-05-11 西门子公司 Body fluid analysis system as well as image processing device and method for body fluid analysis
CN102063713A (en) * 2010-11-11 2011-05-18 西北工业大学 Neighborhood normalized gradient and neighborhood standard deviation-based multi-focus image fusion method
CN102637297A (en) * 2012-03-21 2012-08-15 武汉大学 Visible light and infrared image fusion method based on Curvelet transformation
US20150288941A1 (en) * 2013-10-17 2015-10-08 Northrop Grumman Systems Corporation Converting an image from a dual-band sensor to a visible color image

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107016640A (en) * 2017-04-06 2017-08-04 广州爱图互联网有限公司 Picture energy normalized processing method and system based on multi-resolution decomposition
CN109300096A (en) * 2018-08-07 2019-02-01 北京智脉识别科技有限公司 A kind of multi-focus image fusing method and device
CN109300097A (en) * 2018-08-16 2019-02-01 南京理工大学 Multi-sequence image fusion method based on adaptive piecemeal
CN109300097B (en) * 2018-08-16 2022-04-01 南京理工大学 Multi-sequence image fusion method based on self-adaptive blocking
CN113317793A (en) * 2021-06-11 2021-08-31 宁波大学 Magnetocardiogram high-frequency signal analysis method, storage medium, and electronic device
CN113317793B (en) * 2021-06-11 2023-02-17 宁波大学 Magnetocardiogram high-frequency signal analysis method, storage medium, and electronic device

Also Published As

Publication number Publication date
CN106530277B (en) 2019-09-10

Similar Documents

Publication Publication Date Title
CN106339998B (en) Multi-focus image fusing method based on contrast pyramid transformation
Yang et al. Multifocus image fusion based on NSCT and focused area detection
CN105719263B (en) Visible ray and infrared image fusion method based on NSCT domains bottom visual signature
CN106886977A (en) A kind of many figure autoregistrations and anastomosing and splicing method
CN106846289B (en) A kind of infrared light intensity and polarization image fusion method
CN106530277A (en) Image fusion method based on wavelet direction correlation coefficient
CN104408700A (en) Morphology and PCA (principal component analysis) based contourlet fusion method for infrared and visible light images
CN105913407B (en) A method of poly focal power image co-registration is optimized based on differential chart
CN107341786A (en) The infrared and visible light image fusion method that wavelet transformation represents with joint sparse
CN105957063A (en) CT image liver segmentation method and system based on multi-scale weighting similarity measure
CN105551010A (en) Multi-focus image fusion method based on NSCT (Non-Subsampled Contourlet Transform) and depth information incentive PCNN (Pulse Coupled Neural Network)
CN105894483B (en) A kind of multi-focus image fusing method based on multi-scale image analysis and block consistency checking
CN108564597A (en) A kind of video foreground target extraction method of fusion gauss hybrid models and H-S optical flow methods
CN101980287B (en) Method for detecting image edge by nonsubsampled contourlet transform (NSCT)
CN109035172A (en) A kind of non-local mean Ultrasonic Image Denoising method based on deep learning
CN109064437A (en) Image fusion method based on guided filtering and online dictionary learning
CN106910179A (en) Multimode medical image fusion method based on wavelet transformation
CN108550145A (en) A kind of SAR image method for evaluating quality and device
CN108648174A (en) A kind of fusion method of multilayer images and system based on Autofocus Technology
Yang Multiresolution Image Fusion Based on Wavelet Transform By Using a Novel Technique for Selection Coefficients.
CN109559273A (en) A kind of quick joining method towards vehicle base map picture
Kanimozhi et al. Brain MR image segmentation using self organizing map
CN105825491A (en) Image fusion method based on hybrid model
Bhataria et al. A review of image fusion techniques
CN103065291A (en) Image fusion method based on promoting wavelet transform and correlation of pixel regions

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant