CN109829874A - SAR image fusion method based on Frame Theory - Google Patents

SAR image fusion method based on Frame Theory

Info

Publication number: CN109829874A (application CN201910091037.2A)
Other versions: CN109829874B (granted publication)
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: image, matrix, SAR, row, sub
Inventors: 廖桂生, 刘洋, 李世东, 曾操, 朱圣棋, 杜佩鞠
Current assignee: Xidian University
Original assignee: Xidian University
Application filed by Xidian University; priority to CN201910091037.2A
Publication of CN109829874A: 2019-05-31
Application granted; publication of CN109829874B: 2023-06-30
Legal status: Granted, Active

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a SAR image fusion method based on frame theory, comprising the following steps: acquire n low-resolution SAR images of the same scene as the images to be fused, and apply bilinear interpolation to each of them to obtain intermediate images; select one image as the reference image and interleave all intermediate images according to the minimum-pixel-value-deviation principle to obtain an initial estimate of the high-resolution SAR image; construct the frame matrix H from the point spread function; and, using the frame matrix H, restore the initial estimate with a threshold-iteration Tikhonov regularization method to obtain the high-resolution SAR image. By adopting frame theory, the invention builds a more accurate, more detailed, and more robust model of the imaging system and then performs pixel-level fusion of several low-resolution SAR images, enriching the edge details of ground-object targets and improving the clarity of the SAR image.

Description

SAR image fusion method based on Frame Theory
Technical field
The invention belongs to the technical field of image processing, and in particular relates to a SAR image fusion method based on frame theory.
Background technique
Synthetic aperture radar (SAR) is an all-weather, day-and-night microwave remote-sensing imaging radar. Because SAR imaging is insensitive to weather, has strong penetration, and offers long detection range, it is widely used in civilian and military applications, and SAR images therefore have important research value.
SAR image acquisition inevitably introduces error sources such as non-ideal platform motion, atmospheric disturbance, data-acquisition errors, imaging-algorithm errors, and system noise, all of which degrade the clarity of the final SAR image. Image quality strongly affects downstream applications: the higher the pixel resolution, the clearer the image, the richer the target information that can be extracted, and the better the subsequent application performance. Improving SAR image quality is therefore of great significance.
With the development of SAR technology it has become feasible to acquire several SAR images of the same area. Slight jitter of the imaging platform produces sub-pixel offsets between these images, so a sequence of SAR images of the same scene contains redundant and complementary information. Image fusion exploits this redundancy and complementarity to combine several images into a single new image that describes the scene more completely and more clearly.
Multi-image fusion for quality improvement is mature in optical imaging, but research in the SAR domain is comparatively scarce and has concentrated on enhancing a single SAR image. Single-image processing adds no new image information compared with multi-image processing: in a SAR image the edge details of ground-object targets are carried by the high-frequency content, whereas single-image processing only adds low-frequency content and adds no high-frequency content, so it cannot improve the resolvability of distant ground-object targets whose details cannot otherwise be distinguished.
Conventional image fusion techniques are mainly wavelet-based. Wavelet bases only approximate the imaging-system model; the description is not fine enough and not robust enough, so the constructed model contains errors and the restored image remains unclear.
A frame is a generalization of a basis in a Hilbert space: a family of vectors for which the energy of any signal, measured through its inner products with the family, is bounded above and below is called a frame. Frame theory can describe a large class of signal- and image-processing models, and many signal- and image-processing problems are intrinsically frame problems.
The response of any linear sensor to its environment can be expressed as translations and linear combinations of a point scattering function, and a frame is precisely the set of all translated point scattering functions. In the physical world the frame is the most natural tool: modelling the imaging system with a frame is more faithful and more detailed than modelling it with a basis, and it enhances robustness.
Summary of the invention
To solve the above problems, the present invention proposes a SAR image fusion method based on frame theory. Frame theory is used to construct a more accurate, more detailed, and more robust model of the imaging system; on this basis, several low-resolution SAR images are fused at the pixel level, which enriches the edge details of ground-object targets and improves the clarity of the SAR image.
To achieve the above object, the present invention adopts the following technical scheme.
The SAR image fusion method based on frame theory comprises the following steps:
Step 1: in a SAR imaging system, acquire n low-resolution SAR images of the same scene as the images to be fused, n ∈ N, n ≥ 2.
Step 2: apply bilinear interpolation to each of the n images to be fused to obtain n intermediate images.
Step 3: select one of the n intermediate images as the reference image Iref and denote the remaining n−1 intermediate images Ik, k = 1, …, n−1. Taking the pixel values of the reference image as the benchmark, interleave the n intermediate images according to the minimum-pixel-value-deviation principle to obtain a single fused image, i.e. the initial estimate g of the high-resolution SAR image.
Step 4: obtain the point spread function of the SAR imaging system and, based on frame theory, construct the frame matrix H of the SAR imaging system from the point spread function.
Step 5: using the frame matrix H, restore the initial estimate g with a threshold-iteration Tikhonov regularization method to obtain the high-resolution SAR image.
Compared with the prior art, the invention has the following benefits:
First, the invention builds the frame matrix of the imaging model from the point spread function of the imaging system. The frame matrix describes the imaging model more accurately, in more detail, and more robustly, which solves the problem of low restored-image clarity caused by model errors during regularized deconvolution.
Second, the invention uses a threshold-iteration Tikhonov regularization method, which effectively suppresses the influence of system noise on image clarity during restoration.
Brief description of the drawings
The present invention is described in further detail below with reference to the drawings and specific embodiments.
Fig. 1 is a flow diagram of the SAR image fusion method based on frame theory.
Fig. 2 (a)-(d) are the four images to be fused in the embodiment of the present invention.
Fig. 3 is the intermediate image obtained by applying bilinear interpolation to Fig. 2 (a).
Fig. 4 is the initial estimate image obtained in the embodiment of the present invention.
Fig. 5 is a three-dimensional plot of the point-spread-function template of the SAR imaging system simulated in the embodiment of the present invention.
Fig. 6 is the high-resolution SAR image obtained in the embodiment of the present invention.
Specific embodiment
Embodiments of the present invention are described in detail below. Those skilled in the art will appreciate that the following embodiments merely illustrate the invention and are not to be construed as limiting its scope.
Fig. 1 is a flow diagram of the SAR image fusion method based on frame theory provided by an embodiment of the present invention. Referring to Fig. 1, the implementation comprises the following steps:
Step 1: in a SAR imaging system, acquire n low-resolution SAR images of the same scene as the images to be fused, n ∈ N, n ≥ 2.
The n images to be fused are low-resolution SAR images of the same scene whose inter-image motion errors have already been corrected.
Step 2: apply bilinear interpolation to each of the n images to be fused to obtain n intermediate images. Each pixel of every image to be fused is processed according to the bilinear interpolation formula, where F(x, y) denotes the interpolated pixel value at pixel (x, y), and F(x0, y0), F(x0, y1), F(x1, y0), F(x1, y1) are the values of the four pixels (x0, y0), (x0, y1), (x1, y0), (x1, y1) nearest to the point (x, y).
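As a concrete illustration of step 2, the following is a minimal NumPy sketch of 2x bilinear upsampling under the assumption of unit pixel spacing; the function name and array conventions are illustrative and are not taken from the patent.

```python
import numpy as np

def bilinear_upsample_2x(img):
    """2x bilinear upsampling of a single-channel SAR image.

    Each output pixel F(x, y) is a weighted average of the four nearest
    input pixels F(x0, y0), F(x0, y1), F(x1, y0), F(x1, y1).
    """
    m, n = img.shape
    # Coordinates of output pixels mapped back onto the input grid.
    ys = np.arange(2 * m) / 2.0
    xs = np.arange(2 * n) / 2.0
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.clip(y0 + 1, 0, m - 1)
    x1 = np.clip(x0 + 1, 0, n - 1)
    wy = (ys - y0)[:, None]          # vertical interpolation weights
    wx = (xs - x0)[None, :]          # horizontal interpolation weights
    out = (img[np.ix_(y0, x0)] * (1 - wy) * (1 - wx)
           + img[np.ix_(y0, x1)] * (1 - wy) * wx
           + img[np.ix_(y1, x0)] * wy * (1 - wx)
           + img[np.ix_(y1, x1)] * wy * wx)
    return out
```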
Step 3: select one of the n intermediate images as the reference image Iref and denote the remaining n−1 intermediate images Ik, k = 1, …, n−1. Taking the pixel values of the reference image as the benchmark, interleave the n intermediate images according to the minimum-pixel-value-deviation principle to obtain a single fused image, i.e. the initial estimate g of the high-resolution SAR image. This comprises the following sub-steps:
Sub-step 3.1: select any intermediate image as the reference image Iref; the remaining n−1 intermediate images are denoted Ik with sequence numbers k = 1, …, n−1.
Sub-step 3.2: partition the reference image Iref and each of the remaining n−1 intermediate images Ik into blocks:
every 2 × 2 group of pixels forms one pixel block (p, q), the (p, q)-th 2 × 2 block. The four pixels in block (p, q) are (m0, n0), (m1, n1), (m2, n2), (m3, n3), where (m0, n0) = (2p, 2q), (m1, n1) = (2p, 2q+1), (m2, n2) = (2p+1, 2q), (m3, n3) = (2p+1, 2q+1), p = 0, 1, …, M−1, q = 0, 1, …, N−1, and M × N is the size of a low-resolution SAR image.
Sub-step 3.3: compare each pixel value in block (p, q) of the reference image with the pixel value at the corresponding position in each of the remaining n−1 intermediate images, and record the sequence number of the intermediate image with the smallest difference;
here k1, k2, k3 are the sequence numbers of the selected intermediate images for positions (m1, n1), (m2, n2), (m3, n3) respectively; Iref(m1, n1) is the value of pixel (m1, n1) in block (p, q) of the reference image and Ik(m1, n1) is the value of pixel (m1, n1) in block (p, q) of intermediate image Ik; Iref(m2, n2) and Ik(m2, n2), and Iref(m3, n3) and Ik(m3, n3), are defined analogously.
Sub-step 3.4: according to the sequence numbers of the smallest-difference intermediate images, interleave the pixel values in the blocks of the reference image with the pixel values in the corresponding blocks of the remaining n−1 intermediate images Ik,
to obtain the initial estimate g of the high-resolution SAR image.
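For readability, a minimal NumPy sketch of one plausible reading of sub-steps 3.1-3.4 follows: the reference pixel at (2p, 2q) is kept, and each of the three remaining positions of every 2 x 2 block is filled from the intermediate image whose value there deviates least from the reference. Function and variable names are illustrative, not from the patent.

```python
import numpy as np

def interleave_fuse(ref, others):
    """Interleaved fusion of intermediate images (sub-steps 3.1-3.4).

    ref    : reference intermediate image Iref, shape (R, T) with even R, T
    others : list of remaining intermediate images Ik, each aligned with ref
    Returns the initial estimate g of the high-resolution image.
    """
    g = ref.astype(np.float64).copy()
    stack = np.stack([o.astype(np.float64) for o in others])  # (n-1, R, T)
    # Block positions (m1,n1)=(2p,2q+1), (m2,n2)=(2p+1,2q), (m3,n3)=(2p+1,2q+1)
    for dm, dn in [(0, 1), (1, 0), (1, 1)]:
        sub_ref = ref[dm::2, dn::2]               # reference values at this position
        sub_all = stack[:, dm::2, dn::2]          # candidate values from every Ik
        k_best = np.argmin(np.abs(sub_all - sub_ref), axis=0)   # smallest deviation
        g[dm::2, dn::2] = np.take_along_axis(sub_all, k_best[None], axis=0)[0]
    return g
```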
Step 4: obtain the point spread function of the SAR imaging system and, based on frame theory, construct the frame matrix H of the SAR imaging system from the point spread function. This comprises the following sub-steps:
Sub-step 4.1: obtain the point spread function of the SAR imaging system and determine its template matrix h. Zero-pad the template matrix h so that its dimensions match those of the initial estimate image, obtaining the padded template matrix h′, an R × T matrix, where R × T is the size of the initial estimate image.
Sub-step 4.2: apply cyclic-shift processing to the padded template matrix h′ to construct the frame matrix H of the SAR imaging system.
The cyclic-shift processing comprises the following sub-steps:
Sub-step 4.2.1: apply matrix expansion to the padded template matrix h′ to obtain a row vector of length R × T, which becomes row 1 of the frame matrix H. Cyclically shift the columns of h′ backward by one column unit and apply matrix expansion again to obtain row 2 of H. Continue cyclically shifting the columns of h′ backward by j−1 column units, j = 3, …, R; for each value of j, one matrix expansion yields one row of H. As j runs from 1 to R, the first R rows of H are obtained.
Sub-step 4.2.2: cyclically shift the rows of the padded template matrix h′ downward by one row unit and apply matrix expansion to obtain a row vector of length R × T, which becomes row R+1 of H. Again cyclically shift the columns backward by one column unit and apply matrix expansion to obtain row R+2 of H. Continue cyclically shifting the columns backward by j−1 column units, j = 3, …, R; for each value of j, one matrix expansion yields one row of H. As j runs from 1 to R, the second group of R rows of H is obtained.
Sub-step 4.2.3: cyclically shift the rows of the padded template matrix h′ downward by i−1 row units, i = 3, …, T, obtaining R rows of H for each row shift. As i runs from 1 to T, R × T row vectors of length R × T are obtained, i.e. an RT × RT square matrix, which is the frame matrix H of the SAR imaging system.
Here, matrix expansion means that rows 2 through the last row of a matrix are appended in order after row 1, producing a single row vector; R × T is the size of the initial estimate image.
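The cyclic-shift construction above amounts to building a block-circulant matrix from the zero-padded PSF template. The following is a minimal NumPy sketch under that reading; the function signature and the loop bounds (R row shifts by T column shifts) are assumptions made for clarity, and since H is RT × RT the explicit matrix is only practical for small images.

```python
import numpy as np

def frame_matrix(psf, shape):
    """Frame matrix H of the imaging model (sub-steps 4.1-4.2, sketch).

    Every row of H is the row-major flattening ("matrix expansion") of a
    cyclically shifted copy of the PSF template zero-padded to the image
    size R x T.
    """
    R, T = shape
    h_pad = np.zeros((R, T))
    r0, t0 = psf.shape
    h_pad[:r0, :t0] = psf                      # zero-padding of the PSF template
    H = np.empty((R * T, R * T))
    row = 0
    for i in range(R):                         # cyclic shift along image rows
        for j in range(T):                     # cyclic shift along image columns
            shifted = np.roll(np.roll(h_pad, i, axis=0), j, axis=1)
            H[row] = shifted.ravel()           # matrix expansion: row-major flattening
            row += 1
    return H
```

In practice the product H·f can be applied as a circular convolution (for example with FFTs) instead of forming H explicitly, which is the usual way to exploit this circulant structure.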
Step 5: using the frame matrix H, restore the initial estimate image with a threshold-iteration Tikhonov regularization method to obtain the high-resolution SAR image.
The Tikhonov threshold-iteration method is as follows:
Let the iterate be f and set f0 = 0; the Tikhonov threshold-iteration update then computes fr+1 from fr,
where S is the high-frequency component of the initial estimate image, Γ is the soft-threshold operator, and HT is the transpose of H;
I denotes the identity matrix of matching dimension, sgn(x) is the sign function, th is the threshold, γ is the threshold coefficient with 0 < γ < 1, and σn is the noise variance of the initial estimate image.
The termination condition of the iteration is ||fr+1 − fr||2 ≤ δ, where δ > 0 is a constant tending to zero and ||·||2 denotes the 2-norm.
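The explicit update equation is not reproduced above, so the following sketch should be read as an assumption: it uses a gradient (Landweber-type) step followed by the soft-threshold operator Γ, with f0 = 0, an assumed threshold th = γ·σn, and the stopping rule ||fr+1 − fr||2 < δ stated in the text. Only the quantities the text defines (H, HT, Γ, γ, σn, δ) are taken from the patent; everything else, including the function names, is illustrative.

```python
import numpy as np

def soft_threshold(x, th):
    """Soft-threshold operator: sgn(x) * max(|x| - th, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - th, 0.0)

def threshold_iteration_restore(H, g, gamma=0.5, sigma_n=1.0, delta=1e-5, max_iter=500):
    """Threshold-iteration restoration of the initial estimate g (step 5, sketch)."""
    g_vec = g.ravel()
    f = np.zeros_like(g_vec)                           # f0 = 0
    th = gamma * sigma_n                               # assumed form of the threshold
    step = 1.0 / max(np.linalg.norm(H, 2) ** 2, 1e-12) # step size keeps the update stable
    for _ in range(max_iter):
        f_next = soft_threshold(f + step * H.T @ (g_vec - H @ f), th)
        if np.linalg.norm(f_next - f, 2) < delta:      # termination condition
            f = f_next
            break
        f = f_next
    return f.reshape(g.shape)
```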
Illustratively, a fusion experiment was carried out with the method of the invention on four measured low-resolution SAR images of a certain area, taking n = 4. The experiment proceeds as follows:
(1) Acquire low-resolution SAR images as the images to be fused
Four measured low-resolution SAR images of the same scene, with inter-image motion errors already corrected, are selected as the images to be fused, namely the four images in Fig. 2 (a), (b), (c), (d).
(2) Obtain intermediate images
Bilinear interpolation is applied to each of the four measured low-resolution SAR images, where F(x, y) denotes the interpolated pixel value at pixel (x, y), and F(x0, y0), F(x0, y1), F(x1, y0), F(x1, y1) are the values of the four pixels (x0, y0), (x0, y1), (x1, y0), (x1, y1) nearest to the point (x, y). Four intermediate images are obtained; Fig. 3 shows the image obtained after bilinear interpolation of Fig. 2 (a).
Comparing Fig. 2 (a) and Fig. 3, the intermediate image has more pixels: bilinear interpolation doubles the number of pixels, which provides support for the subsequent interleaved fusion. As can be seen from Fig. 3, however, the interpolated intermediate image is blurred, mainly because interpolation of a single image only adds low-frequency information and adds no high-frequency information; the edge clarity associated with the high-frequency content of the image is therefore not greatly improved.
(3) Obtain the initial estimate of the high-resolution SAR image
One of the four intermediate images is arbitrarily selected as the reference image; the other three intermediate images are fused with the reference image according to the minimum-pixel-value-difference fusion rule, and the resulting fused image is taken as the initial estimate of the high-resolution SAR image.
Specifically, an intermediate image is arbitrarily chosen as the reference image Iref, and the remaining three intermediate images are denoted Ik. The reference image Iref and the remaining three intermediate images Ik are partitioned into blocks: every 2 × 2 group of pixels forms one pixel block (p, q), the (p, q)-th 2 × 2 block, whose four pixels are (m0, n0), (m1, n1), (m2, n2), (m3, n3), where (m0, n0) = (2p, 2q), (m1, n1) = (2p, 2q+1), (m2, n2) = (2p+1, 2q), (m3, n3) = (2p+1, 2q+1), p = 0, 1, …, M−1, q = 0, 1, …, N−1, and M × N is the size of a low-resolution SAR image. Each pixel in block (p, q) of the reference image is then compared with the pixel at the corresponding position in each of the remaining intermediate images, and the sequence number of the intermediate image with the smallest difference is recorded, where k1, k2, k3 are the sequence numbers of the selected intermediate images for positions (m1, n1), (m2, n2), (m3, n3), and Iref(·) and Ik(·) denote the pixel values at the corresponding positions of block (p, q) in the reference image and in intermediate image Ik.
Finally, according to the smallest-difference sequence numbers, the pixel values in the blocks of the reference image are interleaved with the pixel values in the blocks of the remaining three intermediate images Ik, yielding the initial estimate image of the high-resolution SAR image of size R × T.
Fig. 4 shows the initial estimate obtained after interleaving the four intermediate images. Because the four SAR images of the same scene exhibit sub-pixel offsets, they carry redundant and complementary information; after interleaved fusion the complementary information is merged onto a single high-resolution grid, improving the resolution to some extent. The image, however, still shows the blur caused by the point spread function of the SAR imaging system.
(4) Obtain the frame matrix of the SAR imaging system
The point spread function of the SAR imaging system is obtained and, based on frame theory, the frame matrix of the SAR imaging system is constructed from it, in the following sub-steps:
Sub-step 4.1: obtain the point spread function of the SAR imaging system and determine its template matrix h. Zero-pad the template matrix h so that its dimensions match those of the initial estimate image, obtaining the padded template matrix h′, an R × T matrix, where R × T is the size of the initial estimate image.
Sub-step 4.2: apply cyclic-shift processing to the padded template matrix h′ to construct the frame matrix H of the SAR imaging system.
The cyclic-shift processing is as follows. Sub-step 4.2.1: apply matrix expansion to h′ to obtain a row vector of length R × T, which becomes row 1 of H; cyclically shift the columns of h′ backward by one column unit and apply matrix expansion again to obtain row 2 of H; continue shifting the columns backward by j−1 column units, j = 3, …, R, performing one matrix expansion per value of j to obtain one row of H; as j runs from 1 to R, the first R rows of H are obtained.
Sub-step 4.2.2: cyclically shift the rows of h′ downward by one row unit and apply matrix expansion to obtain a row vector of length R × T, which becomes row R+1 of H; again shift the columns backward by one column unit and apply matrix expansion to obtain row R+2 of H; continue shifting the columns backward by j−1 column units, j = 3, …, R, performing one matrix expansion per value of j to obtain one row of H; as j runs from 1 to R, the second group of R rows of H is obtained.
Sub-step 4.2.3: cyclically shift the rows of h′ downward by i−1 row units, i = 3, …, T, obtaining R rows of H for each row shift; as i runs from 1 to T, R × T row vectors of length R × T are obtained, i.e. an RT × RT square matrix, which is the frame matrix H of the SAR imaging system.
In this cyclic-shift procedure, each downward row shift of h′ is followed by backward column shifts, each column shift is followed by one matrix expansion that yields one row of H, and each row shift therefore produces R row vectors of length R × T. After T successive downward row shifts, all R × T row vectors of length R × T have been produced, forming the RT × RT frame matrix H of the SAR imaging system.
The scheme of the invention requires the point spread function of the SAR imaging system to construct the frame matrix of the imaging model, and different SAR imaging systems have different point spread functions. In this embodiment the point spread function of the SAR imaging system is approximated from point-target data simulated with the radar parameters of a real SAR imaging system. Fig. 5 shows the point spread function of the SAR imaging system simulated in this embodiment.
(5) Obtain the high-resolution SAR image
Using the constructed frame matrix, the initial estimate image is solved iteratively with the threshold-iteration Tikhonov regularization method; the blur introduced by the system point spread function is removed, and a fused high-resolution SAR image is finally obtained.
The threshold-iteration Tikhonov regularization method is as follows:
Let the iterate be f and set f0 = 0; the Tikhonov threshold-iteration update then computes fr+1 from fr,
where S is the high-frequency component of the initial estimate image, Γ is the soft-threshold operator, and HT is the transpose of H;
I denotes the identity matrix of matching dimension, sgn(x) is the sign function, th is the threshold, γ is the threshold coefficient with 0 < γ < 1, and σn is the noise variance of the initial estimate image.
The termination condition of the iteration is ||fr+1 − fr||2 ≤ δ, where δ > 0 is a constant tending to zero and ||·||2 denotes the 2-norm.
It should be further noted that the concrete iterative procedure of the Tikhonov threshold-iteration method in this embodiment is as follows:
First, set δ = 10^-5 and f0 = 0.
Next, compute fr+1 from fr using the Tikhonov threshold-iteration formula.
Finally, when ||fr+1 − fr||2 ≤ δ, terminate the iteration and take the current iterate as the high-resolution SAR image obtained after restoration with the Tikhonov threshold-iteration method.
Fig. 6 shows the high-resolution SAR image obtained after fusion. Comparing Fig. 6 with the images in Fig. 2, the edge details of the ground-object targets in Fig. 6 are richer than those of the same targets in each image of Fig. 2, the image is clearer, and the contrast with the background is more pronounced. This shows that the method of the invention can fuse a sequence of low-resolution SAR images into a clear SAR image of high pixel resolution, achieving a marked improvement in SAR image quality.
For the images obtained in the above embodiment, the effect of the method on SAR image quality is illustrated by comparing the Brenner gradient function and the entropy of the images to be fused with those of the final reconstructed high-resolution SAR image.
1 Brenner gradient function
The Brenner gradient function is a common image-sharpness measure. In a SAR image, the ground-object targets of a clear image have sharper edges and richer edge details, so the Brenner gradient value is larger.
The Brenner gradient value D(f) of an image is
D(f) = Σy Σx |f(x+2, y) − f(x, y)|²,
where f(x, y) is the grey value of image f at pixel (x, y) and f(x+2, y) is the grey value at pixel (x+2, y).
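A small sketch of the Brenner gradient for a NumPy image follows; whether x indexes rows or columns of the array is an implementation choice not fixed by the text.

```python
import numpy as np

def brenner_gradient(img):
    """Brenner gradient D(f) = sum over x, y of |f(x+2, y) - f(x, y)|^2."""
    f = img.astype(np.float64)
    diff = f[2:, :] - f[:-2, :]      # difference between pixels two steps apart
    return float(np.sum(diff ** 2))
```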
2 Entropy
For a SAR image, a smaller entropy indicates that targets are more clearly distinguished from the background, i.e. the image is clearer.
The entropy H of an image is
H = −Σi pi log2 pi,
where pi is the probability that a pixel of the image has grey value i.
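A matching sketch of the entropy computation, assuming 8-bit grey levels (256 bins), follows.

```python
import numpy as np

def image_entropy(img, levels=256):
    """Image entropy H = -sum_i p_i * log2(p_i), p_i = frequency of grey level i."""
    hist, _ = np.histogram(img.ravel(), bins=levels, range=(0, levels))
    p = hist / hist.sum()
    p = p[p > 0]                     # empty grey levels contribute 0 * log 0 = 0
    return float(-np.sum(p * np.log2(p)))
```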
The Brenner gradient values and entropies of the images to be fused and of the final high-resolution SAR image in the embodiment are listed in Table 1.

Table 1. Results of the embodiment

Index                      Image to be fused    High-resolution SAR image
Brenner gradient function  1751.23              2517.72
Entropy                    6.97                 6.68

As Table 1 shows, after fusion with the method of the invention the Brenner gradient value of the image increases significantly, indicating that the method markedly improves the edge clarity of the SAR image, while the entropy decreases, indicating that the method enhances the separation between targets and background. The method of the invention therefore markedly improves image clarity.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiment can be implemented by hardware under the control of program instructions. The program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiment; the storage medium includes ROM, RAM, magnetic disks, optical disks, and other media capable of storing program code.
The above is only a specific embodiment of the present invention, but the scope of protection of the present invention is not limited thereto. Any change or substitution that can readily be conceived by those familiar with the art within the technical scope disclosed by the present invention shall be covered by the scope of protection of the present invention. The scope of protection of the present invention shall therefore be defined by the claims.

Claims (8)

1. A SAR image fusion method based on frame theory, characterized by comprising the following steps:
Step 1: in a SAR imaging system, acquiring n low-resolution SAR images of the same scene as images to be fused, n ∈ N, n ≥ 2;
Step 2: applying bilinear interpolation to each of the n images to be fused to obtain n intermediate images;
Step 3: selecting one of the n intermediate images as a reference image Iref, denoting the remaining n−1 intermediate images Ik, k = 1, …, n−1, and, taking the pixel values of the reference image as the benchmark, interleaving the n intermediate images according to the minimum-pixel-value-deviation principle to obtain a single fused image, i.e. an initial estimate g of the high-resolution SAR image;
Step 4: obtaining the point spread function of the SAR imaging system and, based on frame theory, constructing a frame matrix H of the SAR imaging system from the point spread function;
Step 5: according to the frame matrix H, restoring the initial estimate g with a threshold-iteration Tikhonov regularization method to obtain a high-resolution SAR image.
2. The SAR image fusion method based on frame theory according to claim 1, characterized in that in step 1 the images to be fused are images whose inter-image motion errors have been corrected.
3. The SAR image fusion method based on frame theory according to claim 1, characterized in that in step 2 the bilinear interpolation processes each pixel of every image to be fused according to the bilinear interpolation formula, where F(x, y) denotes the interpolated pixel value at pixel (x, y), and F(x0, y0), F(x0, y1), F(x1, y0), F(x1, y1) are the values of the four pixels (x0, y0), (x0, y1), (x1, y0), (x1, y1) nearest to the point (x, y).
4. The SAR image fusion method based on frame theory according to claim 1, characterized in that step 3 comprises the following sub-steps:
Sub-step 3.1: selecting any intermediate image as the reference image Iref, the remaining n−1 intermediate images being denoted Ik with sequence numbers k = 1, …, n−1;
Sub-step 3.2: partitioning the reference image Iref and the remaining n−1 intermediate images Ik into blocks,
every 2 × 2 group of pixels forming one pixel block (p, q), the (p, q)-th 2 × 2 block, whose four pixels are (m0, n0), (m1, n1), (m2, n2), (m3, n3), where (m0, n0) = (2p, 2q), (m1, n1) = (2p, 2q+1), (m2, n2) = (2p+1, 2q), (m3, n3) = (2p+1, 2q+1), p = 0, 1, …, M−1, q = 0, 1, …, N−1, and M × N is the size of a low-resolution SAR image;
Sub-step 3.3: comparing each pixel in block (p, q) of the reference image with the pixel value at the corresponding position in each of the remaining n−1 intermediate images, and obtaining the sequence number of the intermediate image with the smallest difference,
where k1, k2, k3 are the sequence numbers of the selected intermediate images for positions (m1, n1), (m2, n2), (m3, n3) respectively, Iref(m1, n1), Iref(m2, n2), Iref(m3, n3) are the values of pixels (m1, n1), (m2, n2), (m3, n3) in block (p, q) of the reference image, and Ik(m1, n1), Ik(m2, n2), Ik(m3, n3) are the values of the corresponding pixels in block (p, q) of intermediate image Ik;
Sub-step 3.4: according to the sequence numbers of the smallest-difference intermediate images, interleaving the pixel values in the blocks of the reference image with the pixel values in the blocks of the remaining n−1 intermediate images Ik,
to obtain the initial estimate g of the high-resolution SAR image.
5. The SAR image fusion method based on frame theory according to claim 1, characterized in that step 4 comprises the following sub-steps:
Sub-step 4.1: obtaining the point spread function of the SAR imaging system, determining the template matrix h of the point spread function, and zero-padding the template matrix h so that its dimensions match those of the initial estimate image, obtaining a padded template matrix h′, an R × T matrix, where R × T is the size of the initial estimate image;
Sub-step 4.2: applying cyclic-shift processing to the padded template matrix h′ to construct the frame matrix H of the SAR imaging system.
6. The SAR image fusion method based on frame theory according to claim 5, characterized in that in step 4 the cyclic-shift processing comprises the following sub-steps:
Sub-step 4.2.1: applying matrix expansion to the padded template matrix h′ to obtain a row vector of length R × T, which becomes row 1 of the frame matrix H; cyclically shifting the columns of h′ backward by one column unit and applying matrix expansion again to obtain row 2 of H; continuing to cyclically shift the columns of h′ backward by j−1 column units, j = 3, …, R, and performing one matrix expansion per value of j to obtain one row of H;
as j runs from 1 to R, the first R rows of the frame matrix H are obtained;
Sub-step 4.2.2: cyclically shifting the rows of the padded template matrix h′ downward by one row unit and applying matrix expansion to obtain a row vector of length R × T, which becomes row R+1 of H; again cyclically shifting the columns backward by one column unit and applying matrix expansion to obtain row R+2 of H; continuing to cyclically shift the columns of h′ backward by j−1 column units, j = 3, …, T, and performing one matrix expansion per value of j to obtain one row of H; as j runs from 1 to R, the second group of R rows of H is obtained;
Sub-step 4.2.3: cyclically shifting the rows of the padded template matrix h′ downward by i−1 row units, i = 3, …, T, each row shift yielding R rows of H; as i runs from 1 to T, R × T row vectors of length R × T are obtained, i.e. an RT × RT square matrix, which is the frame matrix H of the SAR imaging system.
7. The SAR image fusion method based on frame theory according to claim 6, characterized in that in sub-steps 4.2.1, 4.2.2 and 4.2.3 the matrix expansion appends rows 2 through the last row of the matrix in order after row 1, producing a single row vector.
8. The SAR image fusion method based on frame theory according to claim 4, characterized in that in step 5 the threshold-iteration Tikhonov regularization method is as follows:
letting the iterate be f and setting f0 = 0, the Tikhonov threshold-iteration update computes fr+1 from fr,
where S is the high-frequency component of the initial estimate image, Γ is the soft-threshold operator, and HT is the transpose of H;
I denotes the identity matrix of matching dimension, sgn(x) is the sign function, th is the threshold, γ is the threshold coefficient with 0 < γ < 1, and σn is the noise variance of the initial estimate image;
the termination condition of the iteration is ||fr+1 − fr||2 ≤ δ, where δ > 0 is a constant tending to zero and ||·||2 denotes the 2-norm.
CN201910091037.2A 2019-01-30 2019-01-30 SAR image fusion method based on frame theory Active CN109829874B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910091037.2A CN109829874B (en) 2019-01-30 2019-01-30 SAR image fusion method based on frame theory


Publications (2)

Publication Number Publication Date
CN109829874A true CN109829874A (en) 2019-05-31
CN109829874B CN109829874B (en) 2023-06-30

Family

ID=66863070

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910091037.2A Active CN109829874B (en) 2019-01-30 2019-01-30 SAR image fusion method based on frame theory

Country Status (1)

Country Link
CN (1) CN109829874B (en)



Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060279585A1 (en) * 2004-12-17 2006-12-14 Peyman Milanfar System and method for robust multi-frame demosaicing and color super resolution
CN101609549A (en) * 2009-07-24 2009-12-23 河海大学常州校区 The multi-scale geometric analysis super-resolution processing method of video blurred image
US20150181131A1 (en) * 2012-07-20 2015-06-25 Carl Zeiss Ag Method and apparatus for image reconstruction
US20150139560A1 (en) * 2013-11-21 2015-05-21 Bae Systems Information And Electronic Systems Integration Inc. Coded image system and method thereof
CN106157317A (en) * 2016-07-21 2016-11-23 武汉大学 The high-resolution remote sensing image fusion rules method guided based on dispersion tensor
CN106709877A (en) * 2016-11-11 2017-05-24 天津大学 Image deblurring method based on multi-parameter regular optimization model
CN106920213A (en) * 2017-01-19 2017-07-04 首都师范大学 The acquisition methods and system of a kind of high-definition picture

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
VEYSEL ASLANTAS et al.: "Multi Focus Image Fusion by Differential Evolution Algorithm", 2014 11th International Conference on Informatics in Control, Automation and Robotics *
梅金金: "Research on Image Restoration and Fusion Based on Regularization Methods" (基于正则化方法的图像复原与融合研究), China Doctoral Dissertations Full-text Database (Information Science and Technology) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113538306A (en) * 2021-06-15 2021-10-22 西安电子科技大学 Multi-image fusion method for SAR image and low-resolution optical image
CN113538306B (en) * 2021-06-15 2024-02-13 西安电子科技大学 SAR image and low-resolution optical image multi-image fusion method
CN115712118A (en) * 2022-11-07 2023-02-24 江苏省水利科学研究院 Pixel offset tracking monitoring and correcting method
CN115712118B (en) * 2022-11-07 2023-08-11 江苏省水利科学研究院 Pixel offset tracking monitoring and correcting method

Also Published As

Publication number Publication date
CN109829874B (en) 2023-06-30

Similar Documents

Publication Publication Date Title
Ozkan et al. POCS-based restoration of space-varying blurred images
Lertrattanapanich et al. High resolution image formation from low resolution frames using Delaunay triangulation
US8594464B2 (en) Adaptive super resolution for video enhancement
Li et al. Super resolution for remote sensing images based on a universal hidden Markov tree model
CN103093445B (en) Unified feature space image super-resolution reconstruction method based on joint sparse constraint
CN101027679B (en) System and method for representing a general two dimensional spatial transformation
CN113222825B (en) Infrared image super-resolution reconstruction method based on visible light image training and application
CN107133923A (en) A kind of blurred picture non-blind deblurring method based on self-adaption gradient sparse model
CN107292819A (en) A kind of infrared image super resolution ratio reconstruction method protected based on edge details
CN109829874A (en) SAR image fusion method based on Frame Theory
CN112184547B (en) Super resolution method of infrared image and computer readable storage medium
CN105513033A (en) Super-resolution reconstruction method based on non-local simultaneous sparse representation
Hung et al. Novel DCT-Based Image Up-Sampling Using Learning-Based Adaptive ${k} $-NN MMSE Estimation
Wang et al. SAR images super-resolution via cartoon-texture image decomposition and jointly optimized regressors
Katartzis et al. Robust Bayesian estimation and normalized convolution for super-resolution image reconstruction
Choi et al. Group-based bi-directional recurrent wavelet neural network for efficient video super-resolution (VSR)
Lu et al. Utilizing homotopy for single image superresolution
Qureshi et al. Investigating image super resolution techniques: What to choose?
Frucci et al. An automatic image scaling up algorithm
CN114387258B (en) Hyperspectral image reconstruction method based on regional dynamic depth expansion neural network
CN113382247B (en) Video compression sensing system and method based on interval observation, equipment and storage medium
CN114998138B (en) High dynamic range image artifact removal method based on attention mechanism
Ye et al. MRA-IDN: A Lightweight Super-Resolution Framework of Remote Sensing Images based on Multi-Scale Residual Attention Fusion Mechanism
장건운 Noise Modeling Generative Adversarial Network for Real-World Denoising
CN116309077A (en) Image blind super-resolution reconstruction method based on airspace variable fuzzy core estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant