CN116167956A - ISAR and VIS image fusion method based on asymmetric multi-layer decomposition - Google Patents
Publication number: CN116167956A (application CN202310313924.6A)
- Authority: CN (China)
- Prior art keywords: layer, image, fusion, sigma, spatial frequency
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T 5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T 5/20 - Image enhancement or restoration using local operators
- G06T 2207/10032 - Satellite or aerial image; Remote sensing
- G06T 2207/10044 - Radar image
- G06T 2207/20221 - Image fusion; Image merging
Abstract
The invention discloses an ISAR and VIS image fusion method based on asymmetric multi-layer decomposition. The method loads an inverse synthetic aperture radar (ISAR) image and a visible light (VIS) image of the same spatial resolution, compares their weighted spatial frequency variances, and labels the two images as a detail image I_a and a coarse image I_b. A multi-layer Gaussian side-window filter decomposition framework decomposes I_a and I_b separately, yielding the detail-preserving layer S_da, edge-preserving layer S_ea and basic energy layer S_ga of I_a, and the detail-preserving layer S_db, edge-preserving layer S_eb and basic energy layer S_gb of I_b. S_da guides the fusion of S_db to produce the asymmetric detail-preserving fusion layer S_fb of I_b. A discriminant built from local variance and spatial frequency fuses S_da and S_fb into the final detail-preserving fusion layer S_fd; a weight coefficient ω fuses S_ea and S_eb into the final edge-preserving fusion layer S_fe; S_ga and S_gb are fused into the final basic energy layer S_fg; and S_fd, S_fe and S_fg are added to obtain the final fused image I_f.
Description
Technical Field
The invention belongs to the field of image processing, and particularly relates to an ISAR and VIS image fusion method based on asymmetric multi-layer decomposition.
Background
Image fusion is an important branch of image processing and is widely applied in civil and military fields such as medical diagnosis, multi-source data decision making and multi-depth-of-field imaging. An image obtained from a single source captures only part of the information in a scene because of limits in sensor physics and environmental factors. A visible light sensor, for example, cannot recover the motion state or material information of an object within one frame. An inverse synthetic aperture radar (ISAR) image, by its imaging principle, can capture the motion information of a target, but it records relatively stationary objects poorly, which leaves large dark areas in the image. It is therefore necessary to fuse the key information from different types of sensors to understand the scene better.
Common image fusion methods fall largely into three categories: multi-scale transform methods, sparse representation methods and hybrid models. All of them fuse two source images by converting the images into a transform space with independent features: sparse representation methods represent the different components of the source images by sparse coefficients, hybrid models re-represent image features in other ways, and multi-scale transform methods decompose the images into layers containing different details. The amount of information usually differs between the source images, yet conventional decomposition methods apply the same treatment to both, which can lead to mismatched feature layers and degrade the final fusion performance.
Disclosure of Invention
Accordingly, the present invention is directed to an ISAR and VIS image fusion method based on asymmetric multi-layer decomposition.
In order to achieve the above purpose, the technical scheme of the invention is realized as follows:
the embodiment of the invention provides an ISAR and VIS image fusion method based on asymmetric multi-layer decomposition, which comprises the following steps:
Step one: load an inverse synthetic aperture radar image (ISAR) I_1 and a visible light image (VIS) I_2 with the same spatial resolution;
Step two: determine the spatial frequency F_1 of I_1 and, from F_1, the weighted spatial frequency variance σ_1 of I_1; determine the spatial frequency F_2 of I_2 and, from F_2, the weighted spatial frequency variance σ_2 of I_2;
Step three: compare σ_1 and σ_2; if σ_1 ≥ σ_2, record I_1 as the detail image I_a and I_2 as the coarse image I_b; if σ_1 < σ_2, record I_2 as the detail image I_a and I_1 as the coarse image I_b;
Step four: decompose I_a and I_b separately with the multi-layer Gaussian side-window filter decomposition framework to obtain the detail-preserving layer S_da, the edge-preserving layer S_ea and the basic energy layer S_ga of I_a, and the detail-preserving layer S_db, the edge-preserving layer S_eb and the basic energy layer S_gb of I_b;
Step five: use S_da to guide the fusion of S_db, obtaining the detail-preserving fusion layer S_fb of I_b;
Step six: determine the local variance L_da(k,o) and spatial frequency F_da of S_da, and the local variance L_fb(k,o) and spatial frequency F_fb of S_fb; build a discriminant from L_da(k,o), L_fb(k,o), F_da and F_fb to fuse S_da and S_fb into the final detail-preserving fusion layer S_fd;
Step seven: determine the weighted spatial frequency variance σ_ea of S_ea and the weighted spatial frequency variance σ_eb of S_eb; determine a weight coefficient ω from σ_ea and σ_eb, and use ω to fuse S_ea and S_eb by weighting into the final edge-preserving fusion layer S_fe;
Step eight: fuse S_ga and S_gb to obtain the final basic energy fusion layer S_fg;
Step nine: add S_fd, S_fe and S_fg to obtain the final fused image I_f.
In the above scheme, the second step is specifically implemented by the following steps:
(201) Determine the row frequency R_1 and the column frequency C_1 of I_1 as

R_1 = √( (1/(M·N)) Σ_{x=1..M} Σ_{y=2..N} [I_1(x,y) − I_1(x,y−1)]² ), C_1 = √( (1/(M·N)) Σ_{x=2..M} Σ_{y=1..N} [I_1(x,y) − I_1(x−1,y)]² )

where M and N respectively denote the length and width of I_1, I_1(x,y) denotes the gray value of the pixel in row x and column y, x ∈ {1,…,M}, y ∈ {1,…,N}, and Σ(·) denotes a summation operation;
(202) Determine the row frequency R_2 and the column frequency C_2 of I_2 in the same way, where M and N respectively denote the length and width of I_2 and I_2(x,y) denotes the gray value of the pixel in row x and column y;
(203) Determine the spatial frequency F_1 of I_1 and the spatial frequency F_2 of I_2 as

F_1 = √(R_1² + C_1²), F_2 = √(R_2² + C_2²)

Steps (201)–(203) complete the spatial frequency computation for one image;
(204) Determine the gray mean A_1 of I_1 and the gray mean A_2 of I_2 as

A_1 = (1/(M·N)) Σ_{x=1..M} Σ_{y=1..N} I_1(x,y), A_2 = (1/(M·N)) Σ_{x=1..M} Σ_{y=1..N} I_2(x,y);

(205) Determine the gray value variance V_1 of I_1 and the gray value variance V_2 of I_2 as

V_1 = (1/(M·N)) Σ_{x=1..M} Σ_{y=1..N} [I_1(x,y) − A_1]², V_2 = (1/(M·N)) Σ_{x=1..M} Σ_{y=1..N} [I_2(x,y) − A_2]²;

(206) Determine the weighted spatial frequency variance σ_1 of I_1 from F_1, A_1 and V_1, and the weighted spatial frequency variance σ_2 of I_2 from F_2, A_2 and V_2 (the defining formula is an image in the source and is not reproduced here).
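Steps (201)–(203) and the step-three classification can be sketched in a few lines of NumPy. The `weighted_sf_variance` function below is a stand-in: the patent's exact formula for σ is an image that is not reproduced in the source, so as a labeled assumption the gray value variance V is simply weighted by the spatial frequency F, which preserves the intended ordering (a richer image gets a larger σ).

```python
import numpy as np

def spatial_frequency(img):
    """Row frequency R, column frequency C and spatial frequency F of a
    gray-level image, per the standard definitions in steps (201)-(203)."""
    img = np.asarray(img, dtype=np.float64)
    m, n = img.shape
    r = np.sqrt(np.sum((img[:, 1:] - img[:, :-1]) ** 2) / (m * n))  # row frequency
    c = np.sqrt(np.sum((img[1:, :] - img[:-1, :]) ** 2) / (m * n))  # column frequency
    return r, c, np.sqrt(r ** 2 + c ** 2)

def weighted_sf_variance(img):
    """Placeholder for the patent's weighted spatial frequency variance sigma
    (exact formula not reproduced in the source): variance weighted by F."""
    _, _, f = spatial_frequency(img)
    return f * np.var(np.asarray(img, dtype=np.float64))

def classify(i1, i2):
    """Step three: the image with the larger sigma becomes the detail image
    I_a, the other becomes the coarse image I_b."""
    if weighted_sf_variance(i1) >= weighted_sf_variance(i2):
        return i1, i2
    return i2, i1
```

A flat image has zero spatial frequency and zero variance, so any textured image is classified as the detail image against it.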
In the above scheme, the fourth step is specifically implemented by the following steps:
(301) Decompose I_a and I_b with the Gaussian high-frequency extraction branch of the multi-layer Gaussian side-window filter decomposition framework; determine the i-th Gaussian-filtered low-frequency information I_gi of I_a and the m-th Gaussian-filtered low-frequency information I_fm of I_b as

I_gi = GF(I_g(i−1)), I_fm = GF(I_f(m−1))

where i ∈ {1,2,3}, m ∈ {1,2,3}, I_g0 = I_a, I_f0 = I_b, and GF(·) denotes performing a Gaussian filtering operation;
(302) Decompose I_a and I_b with the side-window high-frequency extraction branch of the framework; determine the j-th side-window-filtered low-frequency information I_dj of I_a and the n-th side-window-filtered low-frequency information I_hn of I_b as

I_dj = SWF(I_d(j−1), r, e), I_hn = SWF(I_h(n−1), r, e)

where j ∈ {1,2,3}, n ∈ {1,2,3}, I_d0 = I_a, I_h0 = I_b, SWF(·) denotes performing a side-window filtering operation, r denotes the radius of the filtering window, and e denotes the number of filter iterations;
(303)–(305) Determine the detail-preserving layers S_da and S_db, the edge-preserving layers S_ea and S_eb, and the basic energy layers S_ga and S_gb of I_a and I_b from the outputs of the two branches (the defining formula images are not reproduced in the source).
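The three-way split of step four can be sketched as follows. This is a simplified stand-in, not the patent's filter bank: a box mean filter replaces both GF(·) and SWF(·) (a true side-window filter has no stock NumPy implementation), and the layers are defined so that detail + edge + base reconstructs the input exactly, matching the additive recombination in step nine.

```python
import numpy as np

def box_blur(img, r=1):
    """Mean filter with edge padding; a crude stand-in for GF(.) and SWF(.)."""
    img = np.asarray(img, dtype=np.float64)
    p = np.pad(img, r, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w))
    k = 2 * r + 1
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + h, dx:dx + w]
    return out / (k * k)

def decompose(img, levels=3):
    """Sketch of step four: return (S_d, S_e, S_g) with S_d + S_e + S_g = img."""
    img = np.asarray(img, dtype=np.float64)
    g = img
    for _ in range(levels):       # weak-blur branch, as in (301): I_gi = GF(I_g(i-1))
        g = box_blur(g, r=1)
    s = img
    for _ in range(levels):       # strong-blur branch standing in for SWF in (302)
        s = box_blur(s, r=2)
    detail = img - g              # high frequency removed by the weak blur
    edge = g - s                  # band between the two branches
    base = s                      # remaining low-frequency energy
    return detail, edge, base
```

The exact additive reconstruction is the design point: whatever the fusion rules do per layer, summing unfused layers returns the original image.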
In the above scheme, the fifth step is specifically implemented by the following steps:
(401) Using a guided filtering operation, guide S_db with S_da to obtain the asymmetric guiding reinforcement layer G as

G = GUF(S_da, S_db) (12)

where GUF(·) denotes performing a guided filtering operation;
(402) Determine the asymmetric detail-preserving fusion layer S_fb of I_b as

S_fb = S_db + G (13).
In the above scheme, the sixth step is specifically implemented by the following steps:
(501) Determine the local variance L_da(k,o) of S_da and the local variance L_fb(k,o) of S_fb as

L_da(k,o) = (1/(P·Q)) Σ_w Σ_z [S_da(w,z) − μ_da]², L_fb(k,o) = (1/(P·Q)) Σ_w Σ_z [S_fb(w,z) − μ_fb]²

where a region of length P and width Q is randomly generated in S_da and a region of the same length and width is generated at the same position in S_fb; k denotes the abscissa and o the ordinate of the region centre, w and z denote the abscissa and ordinate pixel indices within the region, and μ_da and μ_fb denote the mean pixel gray values of S_da and S_fb over this region;
(502) Determine the spatial frequency F_da of S_da and the spatial frequency F_fb of S_fb as

F_da = √(R_da² + C_da²), F_fb = √(R_fb² + C_fb²)

where C_da and R_da denote the column and row frequencies of S_da, and C_fb and R_fb denote the column and row frequencies of S_fb;
(503) Determine the final detail-preserving fusion layer S_fd (the discriminant formula image is not reproduced in the source; in the worked example, S_fd = S_da when both F_da ≥ F_fb and L_da(k,o) ≥ L_fb(k,o)).
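The discriminant of step (503) survives only as an unreproduced formula image, so the sketch below is one plausible reading consistent with the worked example in the description (S_fd = S_da when S_da wins on both spatial frequency and local variance); the averaging tie-break is an assumption.

```python
import numpy as np

def spatial_frequency(img):
    img = np.asarray(img, dtype=np.float64)
    m, n = img.shape
    r = np.sum((img[:, 1:] - img[:, :-1]) ** 2) / (m * n)
    c = np.sum((img[1:, :] - img[:-1, :]) ** 2) / (m * n)
    return np.sqrt(r + c)

def local_variance(img, k, o, p=3, q=3):
    """Variance of the p-by-q window centred at (k, o), as in step (501)."""
    img = np.asarray(img, dtype=np.float64)
    win = img[k - p // 2:k + p // 2 + 1, o - q // 2:o + q // 2 + 1]
    return np.var(win)

def detail_fusion(s_da, s_fb, k, o):
    """Plausible reading of step (503): keep the layer that wins on both
    criteria; average otherwise (assumed tie-break)."""
    fa, fb = spatial_frequency(s_da), spatial_frequency(s_fb)
    la, lb = local_variance(s_da, k, o), local_variance(s_fb, k, o)
    if fa >= fb and la >= lb:
        return s_da
    if fb > fa and lb > la:
        return s_fb
    return 0.5 * (np.asarray(s_da, float) + np.asarray(s_fb, float))
```

Against a flat competitor a textured layer wins on both criteria and is kept unchanged, mirroring the worked example where S_fd = S_da.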
In the above scheme, the seventh step is specifically implemented by the following steps:
(601) Determine the weighted spatial frequency variance σ_ea of S_ea from F_ea and V_ea, and the weighted spatial frequency variance σ_eb of S_eb from F_eb and V_eb, as in step (206), where F_ea and V_ea denote the spatial frequency and gray value variance of S_ea, and F_eb and V_eb denote the spatial frequency and gray value variance of S_eb;
(602) Determine the weight coefficient ω as

ω = σ_ea / (σ_ea + σ_eb) (18)

(reconstructed from the worked example, where σ_ea = 0.000245, σ_eb = 0.0001 and ω = 0.71);
(603) Determine the final edge-preserving fusion layer S_fe as

S_fe = ω × S_ea + (1 − ω) × S_eb (19).
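The weight of step (602) can be recovered from the worked values reported later in the description: with σ_ea = 0.000245 and σ_eb = 0.0001, σ_ea/(σ_ea + σ_eb) ≈ 0.710, matching the reported ω = 0.71. A direct sketch under that reading:

```python
import numpy as np

def edge_fusion(s_ea, s_eb, sigma_ea, sigma_eb):
    """Steps (602)-(603): omega = sigma_ea / (sigma_ea + sigma_eb);
    S_fe = omega * S_ea + (1 - omega) * S_eb."""
    omega = sigma_ea / (sigma_ea + sigma_eb)
    s_fe = omega * np.asarray(s_ea, float) + (1 - omega) * np.asarray(s_eb, float)
    return omega, s_fe
```

The convex weighting guarantees the fused edge layer stays within the range spanned by the two input layers.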
In the above scheme, the step eight specifically includes: determine the final basic energy fusion layer S_fg by fusing S_ga and S_gb according to the following formula (the formula image is not reproduced in the source).
In the above scheme, the step nine specifically includes: determine the final fused image I_f as

I_f = S_fd + S_fe + S_fg (21).
Compared with the prior art, the method uses the weighted spatial frequency variance as a criterion of image information richness to divide the two images into a detail image and a coarse image, decomposes the two images asymmetrically to prevent loss of image information, and uses the detail-preserving layer of the detail image to guide the fusion of the detail-preserving layer of the coarse image so as to strengthen the latter's details; three different fusion strategies then fuse the different types of layers, and the fusion results are finally added to obtain the final fusion result.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is an inverse synthetic aperture radar image of an input of the present invention;
FIG. 3 is a visible light image of an input of the present invention;
FIG. 4 is a detail preserving layer of a coarse image of the present invention;
FIG. 5 is a guiding reinforcement layer of the present invention;
FIG. 6 is an asymmetric detail preserving fusion layer of a coarse image in the present invention;
FIG. 7 is a final detail-preserving fusion layer in the present invention;
FIG. 8 is a final edge preserving fusion layer in the present invention;
FIG. 9 is a final basic energy fusion layer of the present invention;
FIG. 10 is a graph of the final fusion results in the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
The embodiment of the invention provides an ISAR and VIS image fusion method based on asymmetric multi-layer decomposition which, as shown in FIG. 1, comprises the following steps:
Step one: load an inverse synthetic aperture radar image (ISAR) I_1 and a visible light image (VIS) I_2 with the same spatial resolution.
Specifically, FIG. 2 is the loaded ISAR image I_1, which contains certain aircraft speed information, and FIG. 3 is the loaded VIS image I_2, which contains certain scene information.
Step two: determine the spatial frequency F_1 of I_1 and, from F_1, the weighted spatial frequency variance σ_1 of I_1; determine the spatial frequency F_2 of I_2 and, from F_2, the weighted spatial frequency variance σ_2 of I_2.
(201) Determine the row frequency R_1 and the column frequency C_1 of I_1 as

R_1 = √( (1/(M·N)) Σ_{x=1..M} Σ_{y=2..N} [I_1(x,y) − I_1(x,y−1)]² ), C_1 = √( (1/(M·N)) Σ_{x=2..M} Σ_{y=1..N} [I_1(x,y) − I_1(x−1,y)]² )

where M and N respectively denote the length and width of I_1, I_1(x,y) denotes the gray value of the pixel in row x and column y, x ∈ {1,…,M}, y ∈ {1,…,N}, and Σ(·) denotes a summation operation.
Specifically, the length M of I_1 is 100, the width N of I_1 is 100, the row frequency R_1 of I_1 is 0.0456, and the column frequency C_1 of I_1 is 0.0275.
(202) Determine the row frequency R_2 and the column frequency C_2 of I_2 in the same way, where M and N respectively denote the length and width of I_2 and I_2(x,y) denotes the gray value of the pixel in row x and column y.
Specifically, the length M of I_2 is 100, the width N of I_2 is 100, the row frequency R_2 of I_2 is 0.0572, and the column frequency C_2 of I_2 is 0.0684.
(203) Determine the spatial frequency F_1 of I_1 and the spatial frequency F_2 of I_2 as

F_1 = √(R_1² + C_1²), F_2 = √(R_2² + C_2²)

Steps (201)–(203) complete the spatial frequency computation for one image.
Specifically, the spatial frequency F_1 of I_1 is 0.0533 and the spatial frequency F_2 of I_2 is 0.0892.
(204) Determine the gray mean A_1 of I_1 and the gray mean A_2 of I_2 as

A_1 = (1/(M·N)) Σ_{x=1..M} Σ_{y=1..N} I_1(x,y), A_2 = (1/(M·N)) Σ_{x=1..M} Σ_{y=1..N} I_2(x,y)

Specifically, A_1 is 0.0463 and A_2 is 0.0459.
(205) Determine the gray value variance V_1 of I_1 and the gray value variance V_2 of I_2 as

V_1 = (1/(M·N)) Σ_{x=1..M} Σ_{y=1..N} [I_1(x,y) − A_1]², V_2 = (1/(M·N)) Σ_{x=1..M} Σ_{y=1..N} [I_2(x,y) − A_2]²

Specifically, V_1 is 0.000107 and V_2 is 0.000269.
(206) Determine the weighted spatial frequency variance σ_1 of I_1 from F_1, A_1 and V_1, and the weighted spatial frequency variance σ_2 of I_2 from F_2, A_2 and V_2 (the defining formula is an image in the source and is not reproduced here).
Specifically, σ_1 is 0.000107 and σ_2 is 0.000268; the weighted spatial frequency variance is a global evaluation parameter, and the larger it is, the richer the information carried by the image.
Step three: compare σ_1 and σ_2; if σ_1 ≥ σ_2, record I_1 as the detail image I_a and I_2 as the coarse image I_b; if σ_1 < σ_2, record I_2 as the detail image I_a and I_1 as the coarse image I_b.
Specifically, since σ_1 (0.000107) is less than σ_2 (0.000268), I_2 is recorded as the detail image I_a and I_1 as the coarse image I_b.
Step four: decompose I_a and I_b separately with the multi-layer Gaussian side-window filter decomposition framework to obtain the detail-preserving layer S_da, the edge-preserving layer S_ea and the basic energy layer S_ga of I_a, and the detail-preserving layer S_db, the edge-preserving layer S_eb and the basic energy layer S_gb of I_b.
(301) Decompose I_a and I_b with the Gaussian high-frequency extraction branch of the framework; determine the i-th Gaussian-filtered low-frequency information I_gi of I_a and the m-th Gaussian-filtered low-frequency information I_fm of I_b as

I_gi = GF(I_g(i−1)), I_fm = GF(I_f(m−1))

where i ∈ {1,2,3}, m ∈ {1,2,3}, I_g0 = I_a, I_f0 = I_b, and GF(·) denotes performing a Gaussian filtering operation.
(302) Decompose I_a and I_b with the side-window high-frequency extraction branch of the framework; determine the j-th side-window-filtered low-frequency information I_dj of I_a and the n-th side-window-filtered low-frequency information I_hn of I_b as

I_dj = SWF(I_d(j−1), r, e), I_hn = SWF(I_h(n−1), r, e)

where j ∈ {1,2,3}, n ∈ {1,2,3}, I_d0 = I_a, I_h0 = I_b, SWF(·) denotes performing a side-window filtering operation, r denotes the radius of the filtering window, and e denotes the number of filter iterations.
Specifically, the radius r of the filtering window is 1 and the number of filter iterations e is 7.
(303)–(305) Determine the detail-preserving layers S_da and S_db, the edge-preserving layers S_ea and S_eb, and the basic energy layers S_ga and S_gb of I_a and I_b from the outputs of the two branches (the defining formula images are not reproduced in the source).
Step five: using a guided filter, guide the fusion of S_db with S_da to obtain the detail-preserving fusion layer S_fb of I_b.
(401) Using a guided filtering operation, guide S_db with S_da to obtain the asymmetric guiding reinforcement layer G as

G = GUF(S_da, S_db) (12)

where GUF(·) denotes performing a guided filtering operation.
(402) Determine the asymmetric detail-preserving fusion layer S_fb of I_b as

S_fb = S_db + G (13).

Specifically, FIG. 4 is the detail-preserving layer S_db of I_b, FIG. 5 is the guiding reinforcement layer G obtained by guiding S_db with S_da through the guided filtering operation, and FIG. 6 is the final detail-preserving fusion layer S_fb of I_b.
Step six: determine the local variance L_da(k,o) and the spatial frequency F_da of S_da, and the local variance L_fb(k,o) and the spatial frequency F_fb of S_fb; build a discriminant from L_da(k,o), L_fb(k,o), F_da and F_fb to fuse S_da and S_fb into the final detail-preserving fusion layer S_fd.
(501) Determine the local variance L_da(k,o) of S_da and the local variance L_fb(k,o) of S_fb as

L_da(k,o) = (1/(P·Q)) Σ_w Σ_z [S_da(w,z) − μ_da]², L_fb(k,o) = (1/(P·Q)) Σ_w Σ_z [S_fb(w,z) − μ_fb]²

where a region of length P and width Q is randomly generated in S_da and a region of the same length and width is generated at the same position in S_fb; k denotes the abscissa and o the ordinate of the region centre, w and z denote the abscissa and ordinate pixel indices within the region, and μ_da and μ_fb denote the mean pixel gray values of S_da and S_fb over this region.
Specifically, the randomly generated region has length P = 3 and width Q = 3, k = 25, o = 37, μ_da = 0.0215, μ_fb = 0.0201, L_da(k,o) = 0.000175 and L_fb(k,o) = 0.000154.
(502) Determine the spatial frequency F_da of S_da and the spatial frequency F_fb of S_fb as

F_da = √(R_da² + C_da²), F_fb = √(R_fb² + C_fb²)

where C_da and R_da denote the column and row frequencies of S_da, and C_fb and R_fb denote the column and row frequencies of S_fb.
Specifically, C_da = 0.0472, R_da = 0.0602, C_fb = 0.0432, R_fb = 0.0253, F_da = 0.0765 and F_fb = 0.0501.
(503) Determine the final detail-preserving fusion layer S_fd (the discriminant formula image is not reproduced in the source).
Specifically, F_fb is less than F_da and L_fb(k,o) is less than L_da(k,o), so S_fd equals S_da; FIG. 7 is the final detail-preserving fusion layer S_fd.
Step seven: determine the weighted spatial frequency variance σ_ea of S_ea and the weighted spatial frequency variance σ_eb of S_eb; determine the weight coefficient ω from σ_ea and σ_eb, and use ω to fuse S_ea and S_eb by weighting into the final edge-preserving fusion layer S_fe.
(601) Determine σ_ea from F_ea and V_ea and σ_eb from F_eb and V_eb, as in step (206), where F_ea and V_ea denote the spatial frequency and gray value variance of S_ea, and F_eb and V_eb denote the spatial frequency and gray value variance of S_eb.
Specifically, F_ea = 0.0762, V_ea = 0.000246, F_eb = 0.0489, V_eb = 0.000101, σ_ea = 0.000245 and σ_eb = 0.0001.
(602) Determine the weight coefficient ω as

ω = σ_ea / (σ_ea + σ_eb) (18)

Specifically, the value of ω is 0.71.
(603) Determine the final edge-preserving fusion layer S_fe as

S_fe = ω × S_ea + (1 − ω) × S_eb (19).

Specifically, FIG. 8 is the final edge-preserving fusion layer S_fe.
Step eight: fuse S_ga and S_gb according to the following formula to obtain the final basic energy fusion layer S_fg (the formula image is not reproduced in the source).
Specifically, FIG. 9 is the final basic energy fusion layer S_fg.
Step nine: determine the final fused image I_f as the sum

I_f = S_fd + S_fe + S_fg (21)

Specifically, FIG. 10 is the final fusion result; the red frame marks detail retained after fusing ISAR and VIS, and the blue frame marks the visible background area. The clearer the red-framed area and the brighter the blue-framed area, the better the fusion result.
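Putting the steps together, the whole pipeline of FIG. 1 can be sketched end to end. Everything here is an illustrative approximation, not the patented method: box blurs stand in for the Gaussian and side-window filters, σ is approximated as variance times spatial frequency, step six is reduced to a global spatial-frequency comparison, the step-five guidance is a crude blur of the guiding layer, and step eight (whose formula is not reproduced in the source) is assumed to average the two basic energy layers.

```python
import numpy as np

def blur(img, r):
    """Box mean with edge padding; stand-in for GF(.) and SWF(.)."""
    p = np.pad(img, r, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w))
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out += p[dy:dy + h, dx:dx + w]
    return out / (2 * r + 1) ** 2

def sf(img):
    """Spatial frequency, steps (201)-(203)."""
    m, n = img.shape
    r = np.sum((img[:, 1:] - img[:, :-1]) ** 2) / (m * n)
    c = np.sum((img[1:, :] - img[:-1, :]) ** 2) / (m * n)
    return np.sqrt(r + c)

def fuse(i1, i2):
    i1 = np.asarray(i1, dtype=np.float64)
    i2 = np.asarray(i2, dtype=np.float64)
    # steps two/three: sigma approximated as variance * spatial frequency
    ia, ib = (i1, i2) if np.var(i1) * sf(i1) >= np.var(i2) * sf(i2) else (i2, i1)
    layers = {}
    for name, img in (("a", ia), ("b", ib)):
        g, s = img, img
        for _ in range(3):                   # step four: two filtering branches
            g, s = blur(g, 1), blur(s, 2)
        layers[name] = (img - g, g - s, s)   # detail, edge, base
    s_da, s_ea, s_ga = layers["a"]
    s_db, s_eb, s_gb = layers["b"]
    s_fb = s_db + blur(s_da, 2)              # step five: crude guidance stand-in
    s_fd = s_da if sf(s_da) >= sf(s_fb) else s_fb  # step six, global reading
    num = np.var(s_ea) * sf(s_ea)
    w = num / (num + np.var(s_eb) * sf(s_eb) + 1e-12)
    s_fe = w * s_ea + (1 - w) * s_eb         # step seven, eq. (19)
    s_fg = 0.5 * (s_ga + s_gb)               # step eight (assumed average)
    return s_fd + s_fe + s_fg                # step nine, eq. (21)
```

A useful property of the additive design is that the output always has the shape of the inputs and stays finite whenever the inputs do.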
The foregoing description is only of the preferred embodiments of the present invention, and is not intended to limit the scope of the present invention.
Claims (8)
1. An ISAR and VIS image fusion method based on asymmetric multi-layer decomposition, characterized in that the method comprises the following steps:
Step one: load an inverse synthetic aperture radar image (ISAR) I_1 and a visible light image (VIS) I_2 with the same spatial resolution;
Step two: determine the spatial frequency F_1 of I_1 and, from F_1, the weighted spatial frequency variance σ_1 of I_1; determine the spatial frequency F_2 of I_2 and, from F_2, the weighted spatial frequency variance σ_2 of I_2;
Step three: compare σ_1 and σ_2; if σ_1 ≥ σ_2, record I_1 as the detail image I_a and I_2 as the coarse image I_b; if σ_1 < σ_2, record I_2 as the detail image I_a and I_1 as the coarse image I_b;
Step four: decompose I_a and I_b separately with the multi-layer Gaussian side-window filter decomposition framework to obtain the detail-preserving layer S_da, the edge-preserving layer S_ea and the basic energy layer S_ga of I_a, and the detail-preserving layer S_db, the edge-preserving layer S_eb and the basic energy layer S_gb of I_b;
Step five: use S_da to guide the fusion of S_db, obtaining the detail-preserving fusion layer S_fb of I_b;
Step six: determine the local variance L_da(k,o) and spatial frequency F_da of S_da, and the local variance L_fb(k,o) and spatial frequency F_fb of S_fb; build a discriminant from L_da(k,o), L_fb(k,o), F_da and F_fb to fuse S_da and S_fb into the final detail-preserving fusion layer S_fd;
Step seven: determine the weighted spatial frequency variance σ_ea of S_ea and the weighted spatial frequency variance σ_eb of S_eb; determine a weight coefficient ω from σ_ea and σ_eb, and use ω to fuse S_ea and S_eb by weighting into the final edge-preserving fusion layer S_fe;
Step eight: fuse S_ga and S_gb to obtain the final basic energy fusion layer S_fg;
Step nine: add S_fd, S_fe and S_fg to obtain the final fused image I_f.
2. The ISAR and VIS image fusion method based on asymmetric multi-layer decomposition according to claim 1, characterized in that the second step is specifically implemented by:
(201) Determine the row frequency R_1 and the column frequency C_1 of I_1 as

R_1 = √( (1/(M·N)) Σ_{x=1..M} Σ_{y=2..N} [I_1(x,y) − I_1(x,y−1)]² ), C_1 = √( (1/(M·N)) Σ_{x=2..M} Σ_{y=1..N} [I_1(x,y) − I_1(x−1,y)]² )

where M and N respectively denote the length and width of I_1, I_1(x,y) denotes the gray value of the pixel in row x and column y, x ∈ {1,…,M}, y ∈ {1,…,N}, and Σ(·) denotes a summation operation;
(202) Determine the row frequency R_2 and the column frequency C_2 of I_2 in the same way, where M and N respectively denote the length and width of I_2 and I_2(x,y) denotes the gray value of the pixel in row x and column y;
(203) Determine the spatial frequency F_1 of I_1 and the spatial frequency F_2 of I_2 as

F_1 = √(R_1² + C_1²), F_2 = √(R_2² + C_2²)

Steps (201)–(203) complete the spatial frequency computation for one image;
(204) Determine the gray mean A_1 of I_1 and the gray mean A_2 of I_2 as

A_1 = (1/(M·N)) Σ_{x=1..M} Σ_{y=1..N} I_1(x,y), A_2 = (1/(M·N)) Σ_{x=1..M} Σ_{y=1..N} I_2(x,y);

(205) Determine the gray value variance V_1 of I_1 and the gray value variance V_2 of I_2 as

V_1 = (1/(M·N)) Σ_{x=1..M} Σ_{y=1..N} [I_1(x,y) − A_1]², V_2 = (1/(M·N)) Σ_{x=1..M} Σ_{y=1..N} [I_2(x,y) − A_2]²;

(206) Determine the weighted spatial frequency variance σ_1 of I_1 from F_1, A_1 and V_1, and the weighted spatial frequency variance σ_2 of I_2 from F_2, A_2 and V_2 (the defining formula is an image in the source and is not reproduced here).
3. The method for image fusion between ISAR and VIS based on asymmetric multi-layer decomposition according to claim 1 or 2, wherein said step four is specifically implemented by:
(301) Gauss high frequency extraction method in decomposition framework of multilayer Gauss side window filter a And I b Respectively decomposing, and determining I according to the following formula a Ith low frequency information I through gaussian filter gi ,I b Mth low frequency information I through gaussian filter fm Is that
Wherein I is {1,2,3}, m is {1,2,3}, I g0 =I a ,I f0 =I b GF (·) represents performing a gaussian filtering operation;
(302) Edge window high frequency extraction method in decomposition frame of multi-layer Gaussian edge window filter a And I b Respectively decomposing, and determining I according to the following formula a Jth low frequency information I through side window filter dj And I b Nth low frequency information I through side window filter hn Is that
Where j is {1,2,3}, n is {1,2,3}, I d0 =I a ,I h0 =I b SWF (·) represents performing a side window filtering operation, r represents the radius of the filtering window, e represents the number of filter iterations;
(303) The detail-preserving layer S_da of I_a and the detail-preserving layer S_db of I_b are determined according to the following formula:
(304) The edge-preserving layer S_ea of I_a and the edge-preserving layer S_eb of I_b are determined according to the following formula:
(305) The base energy layer S_ga of I_a and the base energy layer S_gb of I_b are determined according to the following formula:
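The formulas for (303)-(305) are not reproduced in this text. One reading consistent with the additive reconstruction I_f = S_fd + S_fe + S_fg of formula (21) — an assumption, not the patent's confirmed definition — takes the base energy layer as the deepest Gaussian low-frequency image, the edge-preserving layer as the side-window residue above it, and the detail-preserving layer as the remainder:

```python
import numpy as np

def build_layers(img: np.ndarray, g_low: np.ndarray, d_low: np.ndarray):
    """Split an image into three additive layers (assumed forms of formulas
    (9)-(11)): g_low is the deepest Gaussian low-frequency image (I_g3 or
    I_f3), d_low the deepest side-window low-frequency image (I_d3 or I_h3)."""
    s_d = img - d_low     # detail-preserving layer S_da / S_db
    s_e = d_low - g_low   # edge-preserving layer S_ea / S_eb
    s_g = g_low           # base energy layer S_ga / S_gb
    return s_d, s_e, s_g
```

By construction the three layers sum back to the input image, so summing the three fused layers in step nine reconstructs a complete fused image.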
4. The ISAR and VIS image fusion method based on asymmetric multi-layer decomposition according to claim 3, wherein said step five is specifically implemented as follows:
(401) Using a guided filtering operation, S_da is taken as the guide for S_db, obtaining the asymmetric guided reinforcement layer G as
G = GUF(S_da, S_db) (12)
wherein GUF(·) denotes a guided filtering operation;
(402) The asymmetric detail-preserving fusion layer S_fb of I_b is determined according to the following formula:
S_fb = S_db + G (13).
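GUF(·) in step (401) is a guided filtering operation; a minimal single-channel guided filter in the style of He et al., with box means computed by a uniform filter, can serve as a sketch (the radius and eps values are assumptions, not parameters given in the claims):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide: np.ndarray, src: np.ndarray,
                  radius: int = 4, eps: float = 1e-3) -> np.ndarray:
    """Single-channel guided filter: output is locally a linear function
    a*guide + b, with a, b fit over box windows of width 2*radius+1."""
    I, p = guide.astype(np.float64), src.astype(np.float64)
    size = 2 * radius + 1
    m_I = uniform_filter(I, size)           # box mean of guide
    m_p = uniform_filter(p, size)           # box mean of source
    m_Ip = uniform_filter(I * p, size)
    m_II = uniform_filter(I * I, size)
    var_I = m_II - m_I ** 2                 # guide variance per window
    cov_Ip = m_Ip - m_I * m_p               # guide/source covariance
    a = cov_Ip / (var_I + eps)
    b = m_p - a * m_I
    return uniform_filter(a, size) * I + uniform_filter(b, size)

# Steps (401)-(402): S_da guides S_db, and the result reinforces S_db:
#   G = guided_filter(S_da, S_db);  S_fb = S_db + G
```

Because the output follows the guide's edges, guiding the VIS detail layer with the ISAR detail layer transfers ISAR structure into S_fb, which is the asymmetric reinforcement the claim describes.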
5. The ISAR and VIS image fusion method based on asymmetric multi-layer decomposition according to claim 4, wherein said step six is specifically implemented as follows:
(501) The local variance L_da(k, o) of S_da and the local variance L_fb(k, o) of S_fb are determined according to the following formula:
wherein a region of length P and width Q is randomly generated in S_da, and a region of length P and width Q is generated at the same position in S_fb; k denotes the abscissa of the region center, o denotes the ordinate of the region center, w denotes the abscissa pixel index within the region, z denotes the ordinate pixel index within the region, μ_da denotes the mean of the pixel gray values of S_da in this region, and μ_fb denotes the mean of the pixel gray values of S_fb in this region;
(502) The spatial frequency F_da of S_da and the spatial frequency F_fb of S_fb are determined according to the following formula:
wherein C_da denotes the column frequency of S_da, R_da denotes the row frequency of S_da, C_fb denotes the column frequency of S_fb, and R_fb denotes the row frequency of S_fb;
(503) The final detail-preserving fusion layer S_fd is determined according to the following formula:
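The selection rule of formula (16) is not reproduced here; steps (501)-(502) suggest comparing the activity of corresponding regions of S_da and S_fb. The block-wise sketch below keeps, per region, the layer with the larger local variance — the choice of activity measure and the block size are assumptions:

```python
import numpy as np

def fuse_detail(s_da: np.ndarray, s_fb: np.ndarray,
                P: int = 8, Q: int = 8) -> np.ndarray:
    """Region-wise choice between the two detail layers (an assumed form
    of formula (16)): each PxQ block comes from whichever layer has the
    larger local variance there."""
    out = np.empty_like(s_da, dtype=np.float64)
    H, W = s_da.shape
    for k in range(0, H, P):
        for o in range(0, W, Q):
            a = s_da[k:k + P, o:o + Q]
            b = s_fb[k:k + P, o:o + Q]
            out[k:k + P, o:o + Q] = a if a.var() >= b.var() else b
    return out
```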
6. The ISAR and VIS image fusion method based on asymmetric multi-layer decomposition according to claim 5, wherein said step seven is specifically implemented as follows:
(601) The weighted spatial frequency variance σ_ea of S_ea and the weighted spatial frequency variance σ_eb of S_eb are determined according to the following formula:
wherein F_ea denotes the spatial frequency of S_ea, V_ea denotes the gray-value variance of S_ea, F_eb denotes the spatial frequency of S_eb, and V_eb denotes the gray-value variance of S_eb;
(602) The weight coefficient ω is determined according to the following formula:
(603) The final edge-preserving fusion layer S_fe is determined according to the following formula:
S_fe = ω × S_ea + (1 − ω) × S_eb (19).
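Formula (19) is a convex combination of the two edge-preserving layers. The expression for ω in formula (18) is not shown in this excerpt, so the normalization ω = σ_ea / (σ_ea + σ_eb) below is an assumption:

```python
import numpy as np

def fuse_edges(s_ea: np.ndarray, s_eb: np.ndarray,
               sigma_ea: float, sigma_eb: float) -> np.ndarray:
    """Formula (19): S_fe = w*S_ea + (1-w)*S_eb. The weight
    w = sigma_ea / (sigma_ea + sigma_eb) is an assumed form of formula (18);
    the small constant guards against a zero denominator."""
    w = sigma_ea / (sigma_ea + sigma_eb + 1e-12)
    return w * s_ea + (1.0 - w) * s_eb
```

With this normalization, the edge layer with the larger weighted spatial-frequency variance dominates the fused result, which matches the role σ plays throughout the claims.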
8. The ISAR and VIS image fusion method based on asymmetric multi-layer decomposition according to claim 7, wherein said step nine specifically comprises: the final fused image I_f is determined according to the following formula:
I_f = S_fd + S_fe + S_fg (21).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310313924.6A CN116167956B (en) | 2023-03-28 | 2023-03-28 | ISAR and VIS image fusion method based on asymmetric multi-layer decomposition |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116167956A true CN116167956A (en) | 2023-05-26 |
CN116167956B CN116167956B (en) | 2023-11-17 |
Family
ID=86416492
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310313924.6A Active CN116167956B (en) | 2023-03-28 | 2023-03-28 | ISAR and VIS image fusion method based on asymmetric multi-layer decomposition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116167956B (en) |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100172567A1 (en) * | 2007-04-17 | 2010-07-08 | Prokoski Francine J | System and method for using three dimensional infrared imaging to provide detailed anatomical structure maps |
DE102010051207A1 (en) * | 2010-11-12 | 2012-05-16 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method for three-dimensional imaging e.g. weapon hidden under cloth of people, involves producing three-dimensional radar image of object from radar data for image representation in e.g. correct position on three-dimensional surface model |
CN109035188A (en) * | 2018-07-16 | 2018-12-18 | 西北工业大学 | A kind of intelligent image fusion method based on target signature driving |
CN109063729A (en) * | 2018-06-20 | 2018-12-21 | 上海电力学院 | A kind of Multisensor Image Fusion Scheme based on PSO-NSCT |
US20190066266A1 (en) * | 2016-03-11 | 2019-02-28 | Bertin Technologies | Method for processing images |
CN109816618A (en) * | 2019-01-25 | 2019-05-28 | 山东理工大学 | A kind of region energy photon counting Image Fusion based on adaptive threshold |
CN110175970A (en) * | 2019-05-20 | 2019-08-27 | 桂林电子科技大学 | Based on the infrared and visible light image fusion method for improving FPDE and PCA |
AU2020100199A4 (en) * | 2020-02-08 | 2020-03-19 | Cao, Sihua MR | A medical image fusion method based on two-layer decomposition and improved spatial frequency |
CN113920047A (en) * | 2021-09-30 | 2022-01-11 | 广东双电科技有限公司 | Infrared and visible light image fusion method based on mixed curvature filter |
US20220044375A1 (en) * | 2019-12-17 | 2022-02-10 | Dalian University Of Technology | Saliency Map Enhancement-Based Infrared and Visible Light Fusion Method |
KR102388831B1 (en) * | 2021-02-09 | 2022-04-21 | 인천대학교 산학협력단 | Apparatus and Method for Fusing Intelligent Multi Focus Image |
CN114418913A (en) * | 2022-01-03 | 2022-04-29 | 中国电子科技集团公司第二十研究所 | ISAR and infrared image pixel level fusion method based on wavelet transformation |
CN115330653A (en) * | 2022-08-16 | 2022-11-11 | 西安电子科技大学 | Multi-source image fusion method based on side window filtering |
CN115345909A (en) * | 2022-10-18 | 2022-11-15 | 无锡学院 | Hyperspectral target tracking method based on depth space spectrum convolution fusion characteristics |
Non-Patent Citations (5)
Title |
---|
JIAJIA ZHANG et al.: "An ISAR and Visible Image Fusion Algorithm Based on Adaptive Guided Multi-Layer Side Window Box Filter Decomposition", Advances and Challenges on Multisource Remote Sensing Image Fusion: Datasets, New Technologies, and Applications, pages 1 - 29 * |
XIA Mingge, HE You, HUANG Xiaodong, XIA Shichang: "A review of multi-sensor image fusion applications", Shipboard Electronic Countermeasure, no. 05, pages 38 - 44 * |
XU Danping, WANG Haimei: "Infrared and visible image fusion based on bilateral filtering and NSST", Computer Measurement & Control, no. 04, pages 201 - 204 * |
LUO Yirong: "Research on image fusion technology based on multi-layer wavelet analysis", Computer Applications and Software, no. 12, pages 108 - 112 * |
ZHENG Rui, PANG Quan: "Multi-focus image fusion based on neighborhood-variance weighted averaging", Machinery Manufacturing, no. 09, pages 33 - 36 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Chen et al. | Median filtering forensics based on convolutional neural networks | |
CN109035172B (en) | Non-local mean ultrasonic image denoising method based on deep learning | |
CN113870335A (en) | Monocular depth estimation method based on multi-scale feature fusion | |
Xiao et al. | Single image dehazing based on learning of haze layers | |
CN111915486B (en) | Confrontation sample defense method based on image super-resolution reconstruction | |
CN117079139B (en) | Remote sensing image target detection method and system based on multi-scale semantic features | |
Shu et al. | LVC-Net: Medical image segmentation with noisy label based on local visual cues | |
CN116596792B (en) | Inland river foggy scene recovery method, system and equipment for intelligent ship | |
CN112052877A (en) | Image fine-grained classification method based on cascade enhanced network | |
CN115018748A (en) | Aerospace remote sensing image fusion method combining model structure reconstruction and attention mechanism | |
Shit et al. | An encoder‐decoder based CNN architecture using end to end dehaze and detection network for proper image visualization and detection | |
CN116167956B (en) | ISAR and VIS image fusion method based on asymmetric multi-layer decomposition | |
CN113487530A (en) | Infrared and visible light fusion imaging method based on deep learning | |
Zhang et al. | Iterative multi‐scale residual network for deblurring | |
CN115358962B (en) | End-to-end visual odometer method and device | |
Kumar et al. | Underwater image enhancement using deep learning | |
Liu et al. | Joint dehazing and denoising for single nighttime image via multi-scale decomposition | |
CN116228537A (en) | Attack image defense method based on denoising and super-resolution reconstruction fusion | |
CN113344110B (en) | Fuzzy image classification method based on super-resolution reconstruction | |
Ang et al. | Noise-aware zero-reference low-light image enhancement for object detection | |
Hu et al. | Pyramid feature boosted network for single image dehazing | |
CN113159158A (en) | License plate correction and reconstruction method and system based on generation countermeasure network | |
Cahill et al. | Exploring the viability of bypassing the image signal processor for CNN-based object detection in autonomous vehicles | |
Liu et al. | Deep multi-scale network for single image dehazing with self-guided maps | |
CN116342660B (en) | Multi-scale analysis fusion weighting filtering bionic compound eye optical flow field estimation method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
Effective date of registration: 20240411 Address after: Room 1007, Building 1, No. 688, Zhenze Road, Taihu Street, Wuxi Economic Development Zone, Jiangsu Province, 214000 Patentee after: Yuanchi (Jiangsu) Information Technology Co., Ltd. Country or region after: China Address before: No. 333 Xishan Avenue, Wuxi City, Jiangsu Province Patentee before: Wuxi University Country or region before: China |