CN102289802B - Method for selecting threshold value of image fusion rule of unmanned aerial vehicle based on wavelet transformation - Google Patents


Info

Publication number
CN102289802B
CN102289802B · Application CN201110153692A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 201110153692
Other languages
Chinese (zh)
Other versions
CN102289802A (en)
Inventor
赵福立 (Zhao Fuli)
岳长松 (Yue Changsong)
丁文锐 (Ding Wenrui)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University
Priority to CN201110153692
Publication of CN102289802A
Application granted
Publication of CN102289802B


Abstract

The invention discloses a method, based on the wavelet transform, for selecting the threshold value of an image fusion rule for unmanned aerial vehicle (UAV) reconnaissance images. The method first fuses images of the same sequence by a region-energy measure: a threshold is preset, fusion operators are determined from the threshold, and the images are fused with those operators. A phase-correlation value of the fused image is then computed. Finally, the threshold range is traversed to find the optimal phase-correlation value, which yields the optimal threshold for fusing the sequence images. The method is simple to implement and optimizes the threshold automatically. Once the optimal threshold of a sequence is determined, fusing images of the same sequence with it achieves a good fusion result, making the method suitable for UAV image reconnaissance with large data volumes.

Description

A method for selecting the threshold value of a wavelet-transform-based UAV image fusion rule
Technical field
The invention belongs to the fields of satellite remote-control engineering and image fusion technology, and specifically relates to a wavelet-transform-based method for selecting the threshold value of a fusion rule for UAV reconnaissance images.
Background technology
Unmanned aerial vehicles play a significant role in modern warfare under informatized conditions, owing to their accurate, efficient, and agile capabilities in reconnaissance, jamming, deception, search, fire correction, and combat. To obtain more reliable image intelligence (IMINT), UAVs carry an increasing number of sensors. Faced with this wealth of image information, the information from multiple source images must be fused according to the characteristics of UAV imagery, in order to obtain a more accurate and comprehensive image description, eliminate redundancy and contradictions among heterogeneous information, enhance the clarity of the information in the image, and improve the precision, reliability, and utilization of interpretation. Accurate, applied research on UAV image fusion is therefore of great significance for UAV applications.
Multi-resolution fusion analysis based on the wavelet transform is a current research focus in the field of signal and image fusion. The wavelet transform decomposes an original image into a series of sub-images with different spatial resolutions and frequency-domain characteristics, fully reflecting the localized variation of the original image; the image is decomposed into a series of channels, and the pyramid structure after decomposition allows the features and details carried by each fused image to be merged by a fusion rule at multiple decomposition levels and frequency bands. The choice of fusion rule is an essential component of wavelet-transform-based fusion. Fusion rules based on pixel-region features are better suited to UAV heterogeneous source images, which cannot be registered accurately and whose spectral characteristics differ considerably; the existing wavelet region-energy-measure fusion rule is one example. However, rules of this kind usually fix the threshold empirically: the threshold cannot be adjusted automatically, the range of scenes it adapts to is limited, and a poorly chosen threshold leads to poor fusion quality and poor anti-interference capability of the UAV image fusion method.
Summary of the invention
Aiming at the problem that current UAV image fusion methods cannot autonomously select the optimal threshold, which causes poor fusion quality and poor anti-interference capability, the present invention proposes a wavelet-transform-based method for selecting the threshold value of a UAV image fusion rule.
The method for selecting the threshold value of a wavelet-transform-based UAV image fusion rule according to the present invention comprises the following steps:
Step 1: region-energy-measure fusion, comprising the following steps:

Step 1.1: read in UAV source images A and B.

Step 1.2: compute the local-region energies $E_{j,A}^{\varepsilon}(x,y)$ and $E_{j,B}^{\varepsilon}(x,y)$ of images A and B at each corresponding direction and resolution according to

$$E_{j}^{\varepsilon}(x,y) = \sum_{n \in L,\, m \in K} \omega^{\varepsilon}(n,m)\,\left[D_{j}^{\varepsilon}(x+n,\, y+m)\right]^{2}, \qquad \varepsilon = 1,2,3$$

where $j$ is the resolution level of the UAV image; $\varepsilon$ is the direction index, with $\varepsilon = 1, 2, 3$ denoting the horizontal, vertical, and diagonal directions respectively; $E_{j}^{\varepsilon}(x,y)$ is the local-region energy centered at coordinate point $(x,y)$ at resolution $j$ in direction $\varepsilon$; $D_{j}^{\varepsilon}$ is the high-frequency component at resolution $j$ in direction $\varepsilon$; $\omega^{\varepsilon}(n,m)$ is the weight function corresponding to $D_{j}^{\varepsilon}$; and $L$, $K$ define the size of the local region.

Step 1.3: compute the local-region matching degree $M_{j,AB}^{\varepsilon}(x,y)$ of images A and B at each corresponding direction and resolution:

$$M_{j,AB}^{\varepsilon}(x,y) = \frac{2 \sum_{n \in L,\, m \in K} \omega^{\varepsilon}(n,m)\, D_{j,A}^{\varepsilon}(x+n,\, y+m)\, D_{j,B}^{\varepsilon}(x+n,\, y+m)}{E_{j,A}^{\varepsilon}(x,y) + E_{j,B}^{\varepsilon}(x,y)}$$

where $D_{j,A}^{\varepsilon}$ and $D_{j,B}^{\varepsilon}$ are the high-frequency components of images A and B at resolution $j$ in direction $\varepsilon$.

Step 1.4: preset the threshold value $t$.
Step 2: determine the fusion operators $S_1$ and $S_2$ from the threshold $t$:

If $M_{j,AB}^{\varepsilon}(x,y) < t$, then

$$\begin{cases} S_1 = 1,\; S_2 = 0, & E_{j,A}^{\varepsilon}(x,y) \geq E_{j,B}^{\varepsilon}(x,y) \\ S_1 = 0,\; S_2 = 1, & E_{j,A}^{\varepsilon}(x,y) < E_{j,B}^{\varepsilon}(x,y) \end{cases} \qquad \varepsilon = 1,2,3$$

If $M_{j,AB}^{\varepsilon}(x,y) \geq t$, then

$$\begin{cases} S_1 = W_{j,\max}^{\varepsilon}(x,y),\; S_2 = W_{j,\min}^{\varepsilon}(x,y), & E_{j,A}^{\varepsilon}(x,y) \geq E_{j,B}^{\varepsilon}(x,y) \\ S_1 = W_{j,\min}^{\varepsilon}(x,y),\; S_2 = W_{j,\max}^{\varepsilon}(x,y), & E_{j,A}^{\varepsilon}(x,y) < E_{j,B}^{\varepsilon}(x,y) \end{cases} \qquad \varepsilon = 1,2,3$$

where $W_{j,\min}^{\varepsilon}(x,y)$ and $W_{j,\max}^{\varepsilon}(x,y)$ are defined as

$$W_{j,\min}^{\varepsilon}(x,y) = \frac{1}{2} - \frac{1}{2}\left[\frac{1 - M_{j,AB}^{\varepsilon}(x,y)}{1 - t}\right], \qquad W_{j,\max}^{\varepsilon}(x,y) = 1 - W_{j,\min}^{\varepsilon}(x,y), \qquad \varepsilon = 1,2,3.$$
Step 3: image fusion, comprising the following steps:

Step 3.1: apply the discrete wavelet transform to the two source images to build their wavelet pyramids.

Step 3.2: merge the resulting wavelet pyramids using the fusion operators:

$$D_{j,F}^{\varepsilon}(x,y) = S_1\, D_{j,A}^{\varepsilon}(x,y) + S_2\, D_{j,B}^{\varepsilon}(x,y)$$

where $D_{j,F}^{\varepsilon}(x,y)$ is the pixel value at $(x,y)$ of the component at resolution $j$ in direction $\varepsilon$ of the fused image F obtained after fusion.

Step 3.3: apply the inverse discrete wavelet transform to the fused image F.
Step 4: determine the phase-correlation value, comprising the following steps:

Step 4.1: apply the discrete Fourier transform to the fused image F to obtain the frequency-domain image $U(\xi)$:

$$U(\xi) = \sum_{x} F(x)\, e^{-\frac{2 i \pi}{n} \langle x, \xi \rangle}$$

where $n$ is the number of sampling points in the fused image, $\langle x, \xi \rangle$ is the inner product of $x$ and $\xi$, $x$ is the spatial-domain variable of the fused image F, and $\xi$ is the variable of the frequency-domain image.

The frequency-domain distribution of the fused image F after the discrete Fourier transform is

$$U(\xi) = |U(\xi)|\, e^{i \varphi(\xi)}$$

where $|U(\xi)|$ is the amplitude and $\varphi(\xi)$ is the phase of the discrete Fourier transform of F.

Step 4.2: generate N images with added random phase, where N is the number of Monte Carlo simulations. In each simulation the phase $\varphi(\xi)$ is offset by a random amount $\delta S$, giving a random phase

$$\psi(\xi) = \varphi(\xi) + \delta S$$

where $\delta$ is a fixed value and $S$ is an independent, identically distributed random variable uniformly distributed on $(-\pi, \pi)$. The spatial-domain reconstructed image $H_{\psi}(x)$, obtained by adding the random phase and applying the inverse discrete Fourier transform, is

$$H_{\psi}(x) = \frac{1}{n^2} \sum_{\xi} |U(\xi)| \cdot e^{\frac{2 i \pi}{n} \langle x, \xi \rangle + i \psi(\xi)}$$

Step 4.3: obtain the phase-correlation value G:

$$G = -\log_{10} \Phi\!\left(\frac{\mu - TV(H_i)}{\sigma}\right)$$

where $\Phi$ is the distribution function of the normal distribution, determined according to $\Phi(x) = (2\pi)^{-1/2} \int_{x}^{+\infty} e^{-t^2/2}\, dt$; $TV(H_i)$ is the total variation of the $i$-th image $H_i$ of the N reconstructed images,

$$TV(H_i) = \sum_{x} \left| \nabla H_i(x) \right|$$

where $\nabla$ is the gradient operator, $\mu$ is the mean of $TV(H_i)$, and $\sigma$ is the standard deviation of $TV(H_i)$.
Step 5: determine whether the currently obtained phase-correlation value is greater than the standard value. If so, set the standard phase-correlation value to the current value and then execute step 6; if not, go to step 6 directly. The standard phase-correlation value is initialized to 0.
Step 6: increase the threshold $t$ by a step of 0.01 and check whether $t$ exceeds 1. If not, return to step 2; if so, set the current threshold to the threshold corresponding to the standard phase-correlation value. This threshold is the optimal threshold for fusing the images in the sequence; once obtained, it can be used to fuse images of the same sequence.
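As a non-authoritative illustration, the traversal of steps 5 and 6 can be sketched in Python. Here `fuse` and `phase_corr` are hypothetical placeholders standing in for the fusion of steps 1-3 and the phase-correlation computation of step 4; the range 0.5-1 follows the value range given later for the threshold:

```python
import numpy as np

def select_threshold(A, B, fuse, phase_corr, t0=0.5, t1=1.0, step=0.01):
    """Steps 5-6: traverse the threshold range in 0.01 steps and keep
    the threshold whose fused image gives the largest phase-correlation
    value. `fuse(A, B, t)` stands for steps 1-3 and `phase_corr(F)` for
    step 4; both are hypothetical placeholders."""
    best_G, best_t = 0.0, t0            # standard value initialized to 0
    for t in np.arange(t0, t1 + step / 2, step):
        G = phase_corr(fuse(A, B, t))
        if G > best_G:                  # step 5: compare with standard value
            best_G, best_t = G, t
    return best_t, best_G
```

Once `select_threshold` returns, the remaining images of the same sequence are fused with `best_t` directly, without repeating the traversal.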
The advantages and beneficial effects of the inventive method are:

(1) Compared with the existing wavelet region-energy-measure fusion rule, the invention uses phase-correlation detection to optimize the threshold automatically and obtain the optimal threshold, so that images of the same sequence fused with this optimal threshold achieve a better fusion result.

(2) Compared with existing image fusion methods that fix the threshold empirically, the inventive method determines the optimal threshold of a sequence before fusing, which suits UAV reconnaissance imagery with large data volumes and achieves a good fusion result. Objective fusion-quality metrics such as entropy, Peak Signal-to-Noise Ratio (PSNR), and Root Mean Square Error (RMSE) are all better than those of conventional fusion methods such as the shift-invariant DWT (SiDWT) method, and the inventive method is also easy to implement.
Description of drawings
Fig. 1 is the overall flow diagram of the threshold selection method of the invention;
Fig. 2 is the flow diagram of region-energy-measure fusion in step 1 of the threshold selection method;
Fig. 3 is the flow diagram of determining the phase-correlation value in step 4 of the threshold selection method;
Fig. 4 shows the fusion result obtained after determining the optimal threshold with the inventive method: (a) is the UAV visible-light reconnaissance image, (b) is the UAV infrared reconnaissance image, and (c) is the fused image.
Embodiment
The present invention is further described below with reference to the drawings and an implementation example.
The proposed wavelet-transform-based method for selecting the threshold of a UAV image fusion rule builds on the wavelet-transform-based UAV image fusion rule and uses phase-correlation detection to optimize the threshold automatically, so that a better fusion result can be obtained. As shown in Fig. 1, the method comprises the following six steps.
Step 1: region-energy-measure fusion, comprising the following steps 1.1 to 1.4.

Step 1.1: read in the two UAV source images A and B.

Step 1.2: compute the local-region energies $E_{j,A}^{\varepsilon}(x,y)$ and $E_{j,B}^{\varepsilon}(x,y)$ of images A and B at each corresponding direction and resolution according to formula (1):

$$E_{j}^{\varepsilon}(x,y) = \sum_{n \in L,\, m \in K} \omega^{\varepsilon}(n,m)\,\left[D_{j}^{\varepsilon}(x+n,\, y+m)\right]^{2}, \qquad \varepsilon = 1,2,3 \qquad (1)$$

where $j$ is the resolution level of the UAV image; $\varepsilon$ is the direction index, with $\varepsilon = 1,2,3$ denoting the horizontal, vertical, and diagonal directions; $E_{j}^{\varepsilon}(x,y)$ is the local-region energy centered at $(x,y)$ at resolution $j$ in direction $\varepsilon$; $D_{j}^{\varepsilon}$ is the high-frequency component at resolution $j$ in direction $\varepsilon$; $\omega^{\varepsilon}(n,m)$ is the weight function corresponding to $D_{j}^{\varepsilon}$; and $L$, $K$ define the size of the local region, with $m$, $n$ ranging over $K$, $L$. Commonly used region sizes are 3×3, 5×5, or 7×7 pixels. Taking a certain type of UAV as an example, $L$ and $K$ may both be chosen as 5 pixels.
Step 1.3: determine the local-region matching degree $M_{j,AB}^{\varepsilon}(x,y)$ of images A and B at each corresponding direction and resolution:

$$M_{j,AB}^{\varepsilon}(x,y) = \frac{2 \sum_{n \in L,\, m \in K} \omega^{\varepsilon}(n,m)\, D_{j,A}^{\varepsilon}(x+n,\, y+m)\, D_{j,B}^{\varepsilon}(x+n,\, y+m)}{E_{j,A}^{\varepsilon}(x,y) + E_{j,B}^{\varepsilon}(x,y)} \qquad (2)$$

where $E_{j,A}^{\varepsilon}(x,y)$ and $E_{j,B}^{\varepsilon}(x,y)$ are obtained from formula (1), and $D_{j,A}^{\varepsilon}$, $D_{j,B}^{\varepsilon}$ are the high-frequency components of images A and B at resolution $j$ in direction $\varepsilon$.
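To make formulas (1) and (2) concrete, the sketch below implements them with NumPy under stated assumptions that go beyond the patent text: a uniform 5×5 weight window $\omega$ (the patent fixes only the window size, not the weights), reflective border padding, and a small constant guarding the division:

```python
import numpy as np

def local_energy(D, win=5):
    """Local-region energy E_j^eps of formula (1): weighted sum of the
    squared high-frequency coefficients over a win x win neighborhood.
    Uniform weights and reflective padding are assumptions."""
    pad = win // 2
    sq = np.pad(D.astype(float) ** 2, pad, mode="reflect")
    w = 1.0 / win**2                       # uniform omega(n, m) (assumed)
    E = np.zeros_like(D, dtype=float)
    for n in range(win):
        for m in range(win):
            E += w * sq[n:n + D.shape[0], m:m + D.shape[1]]
    return E

def matching_degree(DA, DB, win=5):
    """Matching degree M_{j,AB}^eps of formula (2) between the
    corresponding subbands of images A and B."""
    pad = win // 2
    prod = np.pad(DA.astype(float) * DB, pad, mode="reflect")
    w = 1.0 / win**2
    num = np.zeros_like(DA, dtype=float)
    for n in range(win):
        for m in range(win):
            num += w * prod[n:n + DA.shape[0], m:m + DA.shape[1]]
    EA, EB = local_energy(DA, win), local_energy(DB, win)
    return 2.0 * num / (EA + EB + 1e-12)   # small constant guards /0
```

For identical subbands the matching degree is 1 everywhere, consistent with formula (2) being a normalized correlation.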
Step 1.4: preset the threshold value $t$. In general $t$ is taken in the range 0.5 to 1; here the preset value is $t = 0.5$.
Step 2: determine the fusion operators $S_1$ and $S_2$:

If $M_{j,AB}^{\varepsilon}(x,y) < t$, then

$$\begin{cases} S_1 = 1,\; S_2 = 0, & E_{j,A}^{\varepsilon}(x,y) \geq E_{j,B}^{\varepsilon}(x,y) \\ S_1 = 0,\; S_2 = 1, & E_{j,A}^{\varepsilon}(x,y) < E_{j,B}^{\varepsilon}(x,y) \end{cases} \qquad \varepsilon = 1,2,3 \qquad (3)$$

If $M_{j,AB}^{\varepsilon}(x,y) \geq t$, then

$$\begin{cases} S_1 = W_{j,\max}^{\varepsilon}(x,y),\; S_2 = W_{j,\min}^{\varepsilon}(x,y), & E_{j,A}^{\varepsilon}(x,y) \geq E_{j,B}^{\varepsilon}(x,y) \\ S_1 = W_{j,\min}^{\varepsilon}(x,y),\; S_2 = W_{j,\max}^{\varepsilon}(x,y), & E_{j,A}^{\varepsilon}(x,y) < E_{j,B}^{\varepsilon}(x,y) \end{cases} \qquad \varepsilon = 1,2,3 \qquad (4)$$

where $S_1$ and $S_2$ are the weights of source images A and B in the fusion, and $W_{j,\min}^{\varepsilon}(x,y)$, $W_{j,\max}^{\varepsilon}(x,y)$ are defined as

$$W_{j,\min}^{\varepsilon}(x,y) = \frac{1}{2} - \frac{1}{2}\left[\frac{1 - M_{j,AB}^{\varepsilon}(x,y)}{1 - t}\right], \qquad W_{j,\max}^{\varepsilon}(x,y) = 1 - W_{j,\min}^{\varepsilon}(x,y), \qquad \varepsilon = 1,2,3 \qquad (5)$$
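A direct NumPy transcription of formulas (3)-(5) follows; the array shapes and the elementwise `np.where` selection are implementation choices, not prescribed by the patent:

```python
import numpy as np

def fusion_operators(M, EA, EB, t):
    """Per-pixel fusion operators S1, S2 of formulas (3)-(5).
    Where the match is poor (M < t) the higher-energy source wins
    outright; otherwise the two sources are blended with weights
    W_min, W_max. Assumes t < 1 so formula (5) is well defined."""
    W_min = 0.5 - 0.5 * (1.0 - M) / (1.0 - t)        # formula (5)
    W_max = 1.0 - W_min
    a_wins = EA >= EB
    S1 = np.where(M < t,
                  np.where(a_wins, 1.0, 0.0),        # formula (3): select
                  np.where(a_wins, W_max, W_min))    # formula (4): blend
    S2 = 1.0 - S1        # in every branch the two weights sum to 1
    return S1, S2
```

Note that $S_1 + S_2 = 1$ in every branch, so the fused coefficient of formula (6) is always a convex combination of the two source coefficients.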
Step 3: image fusion. The detailed procedure is: step 3.1, apply the discrete wavelet transform to the two source images to build their respective wavelet pyramids; step 3.2, merge the wavelet pyramids with the fusion operators to obtain the fused image F; step 3.3, apply the inverse discrete wavelet transform to F.

In step 3.2, the wavelet pyramids are merged according to formula (6):

$$D_{j,F}^{\varepsilon}(x,y) = S_1\, D_{j,A}^{\varepsilon}(x,y) + S_2\, D_{j,B}^{\varepsilon}(x,y) \qquad (6)$$

where $D_{j,A}^{\varepsilon}(x,y)$ and $D_{j,B}^{\varepsilon}(x,y)$ are the values at $(x,y)$ of the high-frequency components at resolution $j$ in direction $\varepsilon$ of the wavelet pyramids of images A and B, and $D_{j,F}^{\varepsilon}(x,y)$ is the corresponding pixel value of the fused image F.
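Steps 3.1-3.3 can be sketched with a single-level 2-D Haar transform. The patent does not fix the wavelet basis or decomposition depth, so Haar and one level are assumptions here; for brevity the three detail bands are merged by a pointwise maximum-magnitude rule standing in for the S1/S2 operators of formula (6), and the approximation band is averaged (also an assumption, since the patent's rule covers the high-frequency components):

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar DWT; returns (LL, (LH, HL, HH)).
    Image sides must be even."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    LL = (a + b + c + d) / 2.0   # approximation
    LH = (a - b + c - d) / 2.0   # horizontal detail
    HL = (a + b - c - d) / 2.0   # vertical detail
    HH = (a - b - c + d) / 2.0   # diagonal detail
    return LL, (LH, HL, HH)

def haar_idwt2(LL, details):
    """Inverse of haar_dwt2 (perfect reconstruction)."""
    LH, HL, HH = details
    out = np.zeros((2 * LL.shape[0], 2 * LL.shape[1]))
    out[0::2, 0::2] = (LL + LH + HL + HH) / 2.0
    out[0::2, 1::2] = (LL - LH + HL - HH) / 2.0
    out[1::2, 0::2] = (LL + LH - HL - HH) / 2.0
    out[1::2, 1::2] = (LL - LH - HL + HH) / 2.0
    return out

def fuse_haar(A, B):
    """Steps 3.1-3.3: decompose both sources, merge each detail band
    (eps = 1, 2, 3) by pointwise maximum magnitude as a stand-in for
    the S1/S2 rule of formula (6), average the LL band, and invert."""
    LLA, dA = haar_dwt2(A)
    LLB, dB = haar_dwt2(B)
    details = tuple(np.where(np.abs(DA) >= np.abs(DB), DA, DB)
                    for DA, DB in zip(dA, dB))
    return haar_idwt2(0.5 * (LLA + LLB), details)
```

Because the transform pair reconstructs perfectly, fusing an image with itself returns the image unchanged, which is a convenient sanity check for any implementation of step 3.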
Step 4: phase-correlation value calculation, comprising the following three steps.

Step 4.1: discrete Fourier transform. The discrete Fourier transform (DFT) of the fused image F gives the frequency-domain image $U(\xi)$:

$$U(\xi) = \sum_{x} F(x)\, e^{-\frac{2 i \pi}{n} \langle x, \xi \rangle} \qquad (7)$$

where $n$ is the number of sampling points in the fused image, $\langle x, \xi \rangle$ is the inner product of $x$ and $\xi$, $x$ is the spatial-domain variable of F, and $\xi$ is the variable of the image $U$ obtained after the DFT of F. The frequency-domain distribution of F after the DFT can then be written as $U(\xi) = |U(\xi)|\, e^{i \varphi(\xi)}$, where $|U(\xi)|$ is the amplitude and $\varphi(\xi)$ the phase of the DFT of F.

Step 4.2: generate N images with added random phase. The phase $\varphi(\xi)$ is offset by a random amount $\delta S$ to obtain a new phase function

$$\psi(\xi) = \varphi(\xi) + \delta S$$

where $\psi(\xi)$ is called the random phase, $\delta$ is a fixed value, and $S$ is an independent, identically distributed random variable uniformly distributed on $(-\pi, \pi)$. If the number of Monte Carlo simulations is N, the phase $\varphi(\xi)$ is offset N times, and the spatial-domain reconstructed image $H_{\psi}(x)$ obtained by the inverse discrete Fourier transform according to formula (8) is

$$H_{\psi}(x) = \frac{1}{n^2} \sum_{\xi} |U(\xi)| \cdot e^{\frac{2 i \pi}{n} \langle x, \xi \rangle + i \psi(\xi)} \qquad (8)$$
Step 4.3: obtain the phase-correlation value. The total variation (ROF model) $TV(H_i)$ of the $i$-th image $H_i$ of the N reconstructed images is

$$TV(H_i) = \sum_{x} \left| \nabla H_i(x) \right| \qquad (9)$$

where $\nabla$ is the gradient operator. From $TV(H_i)$ one further computes the mean $\mu$ and standard deviation $\sigma$ of $TV(H_i)$, and then obtains the phase-correlation value G according to formula (10):

$$G = -\log_{10} \Phi\!\left(\frac{\mu - TV(H_i)}{\sigma}\right) \qquad (10)$$

where $\Phi$ is the distribution function of the normal distribution:

$$\Phi(x) = (2\pi)^{-1/2} \int_{x}^{+\infty} e^{-t^{2}/2}\, dt \qquad (11)$$
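The Monte Carlo procedure of formulas (7)-(11) can be sketched as follows. Two caveats: the patent's wording leaves the argument of $\Phi$ in formula (10) slightly ambiguous, and this reading scores the total variation of the fused image F itself against the distribution of TVs of the N phase-perturbed reconstructions; also, NumPy's FFT pair stands in for the explicit sums and the $1/n^2$ factor of formula (8):

```python
import numpy as np
from math import erf, sqrt

def total_variation(H):
    """Formula (9): sum of gradient magnitudes, using forward
    differences on the real part of the (possibly complex) image."""
    Hr = np.real(H)
    gx = np.diff(Hr, axis=0, append=Hr[-1:, :])
    gy = np.diff(Hr, axis=1, append=Hr[:, -1:])
    return float(np.sum(np.sqrt(gx**2 + gy**2)))

def phase_correlation(F, N=20, delta=0.5, seed=0):
    """Formulas (7)-(11): perturb the DFT phase N times, reconstruct,
    and score TV(F) against the mean/std of the perturbed TVs.
    N, delta, and the seed are illustrative choices."""
    rng = np.random.default_rng(seed)
    U = np.fft.fft2(F)                                 # formula (7)
    mag, phase = np.abs(U), np.angle(U)
    tvs = []
    for _ in range(N):
        psi = phase + delta * rng.uniform(-np.pi, np.pi, size=U.shape)
        H = np.fft.ifft2(mag * np.exp(1j * psi))       # formula (8)
        tvs.append(total_variation(H))
    mu, sigma = np.mean(tvs), np.std(tvs) + 1e-12
    z = (mu - total_variation(F)) / sigma
    Phi = 0.5 * (1.0 - erf(z / sqrt(2.0)))             # upper tail, formula (11)
    return -np.log10(max(Phi, 1e-300))                 # formula (10)
```

Randomizing the phase of a structured image raises its total variation, so a well-fused (structured) image sits far in the tail of the Monte Carlo TV distribution and yields a large G.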
Step 5: determine whether the currently obtained phase-correlation value is greater than the standard value. If so, set the standard phase-correlation value to the current value and then execute step 6; if not, go to step 6 directly. The standard phase-correlation value is initialized to 0.
Step 6: increase the threshold $t$ by a step of 0.01 and check whether $t$ exceeds 1. If not, return to step 2; if so, the threshold corresponding to the current standard phase-correlation value is the best one, and the current threshold is set to that value. After the optimal threshold is obtained, images of the same sequence are fused according to it.
UAV imagery usually comes as a large image sequence. Once the inventive method has produced the optimal threshold while fusing one image of the sequence, that threshold applies to any image in the sequence, so the remaining images can be fused directly with it, without selecting the optimal threshold again. Fusion then proceeds by computing the matching degree according to steps 1.1 to 1.3, determining the fusion operators according to step 2, and performing the fusion according to step 3; the optimal threshold thus obtained can also be applied in existing wavelet fusion methods to fuse images of the same sequence.
In application scenarios with large UAV image sequences, obtaining the optimal threshold with the inventive method before fusing yields a better fusion result than fusing with an empirically chosen threshold, as in existing methods. A comparison between the inventive method and existing empirical-threshold methods is given below.
The three existing empirical-threshold fusion methods compared against the inventive method in the embodiment are: the LAP (Laplacian) pyramid fusion method, the MORPH (morphological) pyramid fusion method, and the SiDWT fusion method. The LAP pyramid method is a multi-scale, multi-resolution fusion method based on pyramid decomposition; fusion can be carried out separately at different scales, spatial resolutions, and decomposition levels. Its basic idea is: perform a LAP pyramid decomposition of each source image, build a fusion pyramid by selecting coefficients from the source-image pyramids, and apply the inverse transform to the fusion pyramid to obtain the fused image. The MORPH pyramid method is based on a morphological sampling strategy: the image point set is first pre-processed by morphological opening or closing, an image pyramid is then built by morphological sampling, a feature-selection strategy is applied and collapsed level by level, and the image is finally reconstructed by dual morphological operations. The SiDWT method obtains a wavelet transform with shift invariance (SiDWT) by omitting the down-sampling step. In all three methods the fusion threshold is set to the commonly chosen value 0.75. For the inventive method, after the optimal threshold is obtained, the matching degree is computed according to steps 1.1 to 1.3 above, the fusion operators are determined according to step 2, and the fusion is performed according to step 3.
Table 1 shows the fusion performance obtained by applying the inventive method and the three existing methods to the same sequence of UAV images. A higher peak signal-to-noise ratio (PSNR) indicates better fusion quality; a smaller root mean square error (RMSE) indicates that the fused image is closer to the ideal image and the fusion quality is better. As Table 1 shows, the fusion result of the inventive method has the highest PSNR and the lowest RMSE.
Table 1: Fusion performance of the four UAV image fusion methods

Fusion method     | PSNR    | RMSE
LAP               | 42.193  | 3.9776
MORPH             | 37.6076 | 6.7436
SiDWT             | 41.7474 | 4.187
Inventive method  | 43.0585 | 3.6004
As shown in Fig. 4, (a) is the UAV visible-light reconnaissance image, (b) is the UAV infrared reconnaissance image, and (c) is the fused image obtained by fusing the images of (a) and (b) with the method of the present invention. Fig. 4(c) shows that performing the fusion after obtaining the optimal threshold with the inventive method produces a fused image of good quality.

Claims (3)

1. A method for selecting the threshold value of a wavelet-transform-based unmanned aerial vehicle (UAV) image fusion rule, characterized in that it comprises the following steps:

Step 1: region-energy-measure fusion, comprising the following steps:

Step 1.1: read in UAV source images A and B;

Step 1.2: compute the local-region energies $E_{j,A}^{\varepsilon}(x,y)$ and $E_{j,B}^{\varepsilon}(x,y)$ of images A and B at each corresponding direction and resolution according to formula (1):

$$E_{j}^{\varepsilon}(x,y) = \sum_{n \in L,\, m \in K} \omega^{\varepsilon}(n,m)\,\left[D_{j}^{\varepsilon}(x+n,\, y+m)\right]^{2}, \qquad \varepsilon = 1,2,3 \qquad (1)$$

wherein $j$ is the resolution level of the UAV image; $\varepsilon$ is the direction index, with $\varepsilon = 1,2,3$ denoting the horizontal, vertical, and diagonal directions; $E_{j}^{\varepsilon}(x,y)$ is the local-region energy centered at coordinate point $(x,y)$ at resolution $j$ in direction $\varepsilon$; $D_{j}^{\varepsilon}$ is the high-frequency component at resolution $j$ in direction $\varepsilon$; $\omega^{\varepsilon}(n,m)$ is the weight function corresponding to $D_{j}^{\varepsilon}$; and $L$, $K$ define the size of the local region;

Step 1.3: compute the local-region matching degree $M_{j,AB}^{\varepsilon}(x,y)$ of images A and B at each corresponding direction and resolution:

$$M_{j,AB}^{\varepsilon}(x,y) = \frac{2 \sum_{n \in L,\, m \in K} \omega^{\varepsilon}(n,m)\, D_{j,A}^{\varepsilon}(x+n,\, y+m)\, D_{j,B}^{\varepsilon}(x+n,\, y+m)}{E_{j,A}^{\varepsilon}(x,y) + E_{j,B}^{\varepsilon}(x,y)} \qquad (2)$$

wherein $D_{j,A}^{\varepsilon}$ and $D_{j,B}^{\varepsilon}$ are the high-frequency components of images A and B at resolution $j$ in direction $\varepsilon$;

Step 1.4: preset the threshold value $t$;

Step 2: determine the fusion operators $S_1$ and $S_2$ from the threshold $t$:

if $M_{j,AB}^{\varepsilon}(x,y) < t$, then

$$\begin{cases} S_1 = 1,\; S_2 = 0, & E_{j,A}^{\varepsilon}(x,y) \geq E_{j,B}^{\varepsilon}(x,y) \\ S_1 = 0,\; S_2 = 1, & E_{j,A}^{\varepsilon}(x,y) < E_{j,B}^{\varepsilon}(x,y) \end{cases} \qquad \varepsilon = 1,2,3 \qquad (3)$$

if $M_{j,AB}^{\varepsilon}(x,y) \geq t$, then

$$\begin{cases} S_1 = W_{j,\max}^{\varepsilon}(x,y),\; S_2 = W_{j,\min}^{\varepsilon}(x,y), & E_{j,A}^{\varepsilon}(x,y) \geq E_{j,B}^{\varepsilon}(x,y) \\ S_1 = W_{j,\min}^{\varepsilon}(x,y),\; S_2 = W_{j,\max}^{\varepsilon}(x,y), & E_{j,A}^{\varepsilon}(x,y) < E_{j,B}^{\varepsilon}(x,y) \end{cases} \qquad \varepsilon = 1,2,3 \qquad (4)$$

wherein $W_{j,\min}^{\varepsilon}(x,y)$ and $W_{j,\max}^{\varepsilon}(x,y)$ in formula (4) are defined as

$$W_{j,\min}^{\varepsilon}(x,y) = \frac{1}{2} - \frac{1}{2}\left[\frac{1 - M_{j,AB}^{\varepsilon}(x,y)}{1 - t}\right], \qquad W_{j,\max}^{\varepsilon}(x,y) = 1 - W_{j,\min}^{\varepsilon}(x,y), \qquad \varepsilon = 1,2,3 \qquad (5)$$
Step 3: image fusion, comprising the following steps:

Step 3.1: apply the discrete wavelet transform to the two source images to build their wavelet pyramids;

Step 3.2: merge the resulting wavelet pyramids using the fusion operators:

$$D_{j,F}^{\varepsilon}(x,y) = S_1\, D_{j,A}^{\varepsilon}(x,y) + S_2\, D_{j,B}^{\varepsilon}(x,y) \qquad (6)$$

wherein $D_{j,F}^{\varepsilon}(x,y)$ is the pixel value at $(x,y)$ of the component at resolution $j$ in direction $\varepsilon$ of the fused image F obtained after fusion;

Step 3.3: apply the inverse discrete wavelet transform to the fused image F;

Step 4: determine the phase-correlation value, comprising the following steps:

Step 4.1: apply the discrete Fourier transform to the fused image F to obtain the frequency-domain image $U(\xi)$:

$$U(\xi) = \sum_{x} F(x)\, e^{-\frac{2 i \pi}{n} \langle x, \xi \rangle} \qquad (7)$$

wherein $n$ is the number of sampling points in the fused image, $\langle x, \xi \rangle$ is the inner product of $x$ and $\xi$, $x$ is the spatial-domain variable of the fused image F, and $\xi$ is the variable of the frequency-domain image;

the frequency-domain distribution of the fused image F after the discrete Fourier transform is $U(\xi) = |U(\xi)|\, e^{i \varphi(\xi)}$, wherein $|U(\xi)|$ is the amplitude and $\varphi(\xi)$ is the phase of the discrete Fourier transform of F;

Step 4.2: generate N images with added random phase, N being the number of Monte Carlo simulations: in each simulation the phase $\varphi(\xi)$ is offset by a random amount $\delta S$ to obtain a random phase $\psi(\xi) = \varphi(\xi) + \delta S$, wherein $\delta$ is a fixed value and $S$ is an independent, identically distributed random variable uniformly distributed on $(-\pi, \pi)$; the spatial-domain reconstructed image $H_{\psi}(x)$, obtained by adding the random phase and applying the inverse discrete Fourier transform, is

$$H_{\psi}(x) = \frac{1}{n^2} \sum_{\xi} |U(\xi)| \cdot e^{\frac{2 i \pi}{n} \langle x, \xi \rangle + i \psi(\xi)} \qquad (8)$$

Step 4.3: obtain the phase-correlation value G:

$$G = -\log_{10} \Phi\!\left(\frac{\mu - TV(H_i)}{\sigma}\right) \qquad (9)$$

wherein $\Phi$ is the distribution function of the normal distribution, determined according to $\Phi(x) = (2\pi)^{-1/2} \int_{x}^{+\infty} e^{-t^{2}/2}\, dt$; $TV(H_i)$ is the total variation of the $i$-th image $H_i$ of the N reconstructed images, $TV(H_i) = \sum_{x} |\nabla H_i(x)|$, wherein $\nabla$ is the gradient operator, $\mu$ is the mean of $TV(H_i)$, and $\sigma$ is the standard deviation of $TV(H_i)$;
Step 5: determine whether the currently obtained phase-correlation value is greater than the standard value; if so, set the standard phase-correlation value to the current value and then execute step 6; if not, execute step 6 directly; the standard phase-correlation value is initialized to 0;

Step 6: increase the threshold $t$ by a step of 0.01 and determine whether $t$ exceeds 1; if not, execute step 2; if so, the threshold corresponding to the standard phase-correlation value is the optimal threshold, which is then used to fuse the images of the same sequence.
2. The method for selecting a threshold value of an image fusion rule of an unmanned aerial vehicle based on wavelet transformation according to claim 1, characterized in that the sizes L and K of the local region described in step 1.2 are both 5 pixels.
3. The method for selecting a threshold value of an image fusion rule of an unmanned aerial vehicle based on wavelet transformation according to claim 1, characterized in that the threshold value t described in step 1.4 takes a value in the range 0.5 to 1.
CN 201110153692 2011-06-09 2011-06-09 Method for selecting threshold value of image fusion rule of unmanned aerial vehicle based on wavelet transformation Expired - Fee Related CN102289802B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110153692 CN102289802B (en) 2011-06-09 2011-06-09 Method for selecting threshold value of image fusion rule of unmanned aerial vehicle based on wavelet transformation


Publications (2)

Publication Number Publication Date
CN102289802A CN102289802A (en) 2011-12-21
CN102289802B true CN102289802B (en) 2013-01-30

Family

ID=45336194

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110153692 Expired - Fee Related CN102289802B (en) 2011-06-09 2011-06-09 Method for selecting threshold value of image fusion rule of unmanned aerial vehicle based on wavelet transformation

Country Status (1)

Country Link
CN (1) CN102289802B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103279940B * 2013-06-18 2016-04-20 Tsinghua University Method and device for multi-light image enhancement in medical imaging
CN107895360A * 2017-10-21 2018-04-10 Tianjin University Phase reconstruction method based on wavelet-domain regularization

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
CN1284975C * 2003-01-16 2006-11-15 Shanghai Jiao Tong University Remote sensing image optimization method combining bilinear interpolation and wavelet transformation
CN1316431C * 2004-11-05 2007-05-16 Beijing Normal University Adjustable remote sensing image fusion method based on wavelet transform

Also Published As

Publication number Publication date
CN102289802A (en) 2011-12-21

Similar Documents

Publication Publication Date Title
CN104459633B (en) Wavelet field InSAR interferometric phase filtering method in conjunction with local frequency estimation
Komarov et al. Sea ice motion tracking from sequential dual-polarization RADARSAT-2 images
CN102609701B (en) Remote sensing detection method based on optimal scale for high-resolution SAR (synthetic aperture radar)
CN102073873B (en) Method for selecting SAR (spaceborne synthetic aperture radar) scene matching area on basis of SVM (support vector machine)
CN102393958B (en) Multi-focus image fusion method based on compressive sensing
CN102629378B (en) Remote sensing image change detection method based on multi-feature fusion
CN101777181B (en) Ridgelet bi-frame system-based SAR image airfield runway extraction method
CN103472450B (en) Based on the nonuniform space configuration distributed SAR moving target three-D imaging method of compressed sensing
CN103679674A (en) Method and system for splicing images of unmanned aircrafts in real time
CN103149561A (en) Microwave imaging method based on scenario block sparsity
CN105528619A (en) SAR remote sensing image change detection method based on wavelet transform and SVM
CN106156758B Tidal flat segmentation method in SAR coastal images
CN103236063A (en) Multi-scale spectral clustering and decision fusion-based oil spillage detection method for synthetic aperture radar (SAR) images
CN102063715A Method for fusing typhoon cloud pictures based on NSCT (Nonsubsampled Contourlet Transformation) and particle swarm optimization algorithm
CN103106658A (en) Island or reef coastline rapid obtaining method
CN103218811A (en) Statistical distribution-based satellite multi-spectral image waveband registration method
CN107742133A Classification method for polarimetric SAR images
CN104200471A (en) SAR image change detection method based on adaptive weight image fusion
CN103473559A (en) SAR image change detection method based on NSCT domain synthetic kernels
CN102866260A (en) Non-contact river surface flow field imaging measuring method
CN105785369A (en) SAR image ice and snow coverage information extraction method based on InSAR technology
CN105184804A (en) Sea surface small target detection method based on airborne infrared camera aerially-photographed image
CN102289802B (en) Method for selecting threshold value of image fusion rule of unmanned aerial vehicle based on wavelet transformation
CN106097292A Fast algorithm for speckle noise suppression in sequential SAR images using spatio-temporal neighborhood Gaussian-weighted median filtering
CN108957479A Remote-sensing monitoring method for border infrastructure

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130130

Termination date: 20130609