CN115797244A - Image fusion method based on multi-scale direction co-occurrence filter and intensity transmission - Google Patents

Image fusion method based on multi-scale direction co-occurrence filter and intensity transmission Download PDF

Info

Publication number
CN115797244A
CN115797244A CN202310069737.8A CN202310069737A CN115797244A CN 115797244 A CN115797244 A CN 115797244A CN 202310069737 A CN202310069737 A CN 202310069737A CN 115797244 A CN115797244 A CN 115797244A
Authority
CN
China
Prior art keywords
image
scale
layer sub
infrared
base layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310069737.8A
Other languages
Chinese (zh)
Inventor
韩玺钰
李宁
曹立华
李焱
兰太吉
周渝人
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Original Assignee
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changchun Institute of Optics Fine Mechanics and Physics of CAS filed Critical Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority to CN202310069737.8A priority Critical patent/CN115797244A/en
Publication of CN115797244A publication Critical patent/CN115797244A/en
Pending legal-status Critical Current

Links

Images

Landscapes

  • Image Processing (AREA)

Abstract

An image fusion method based on a multi-scale direction co-occurrence filter and intensity transfer relates to the field of image fusion, and comprises the following steps: acquiring and preprocessing source infrared and visible light image data; iteratively using a co-occurrence filter to carry out multi-scale decomposition on the preliminarily decomposed image; carrying out multidirectional decomposition on the multi-scale infrared basic layer sub-image and the multi-scale visible light basic layer sub-image by adopting discrete tight support shear wave transformation; fusing detail layer sub-images; fusing the base layer sub-images; and reconstructing the fused multi-scale detail layer sub-image and the multi-scale multi-direction basic layer sub-images in different directions to obtain a fused image. The invention can effectively retain the remarkable information and more effective detail information of the heat radiation target and reduce the edge blurring; the invention effectively distinguishes the detail information of different scales and has the direction representation characteristic; the invention obviously improves the integral contrast and definition of the fused image and improves the integral fusion effect.

Description

Image fusion method based on multi-scale direction co-occurrence filter and intensity transmission
Technical Field
The invention relates to the technical field of image fusion, in particular to an image fusion method based on a multi-scale direction co-occurrence filter and intensity transmission.
Background
In recent years, the demand for multi-sensor image fusion technology, especially for infrared and visible light image fusion, is increasing rapidly in various fields, such as fields of reconnaissance, auxiliary driving, geological monitoring and the like. The image fusion aims to combine a plurality of groups of source images of the multi-modal sensor to form a single image, so that the aims of comprehensively displaying image information and enhancing scene understanding are fulfilled. In the infrared and visible light image fusion technology, an infrared sensor can effectively reflect heat radiation target information in a weak light or shielded environment, which is not possessed by a visible light sensor; meanwhile, the high resolution of the visible light sensor is also an effective supplement of the infrared sensor. The infrared and visible light image fusion method can effectively utilize the complementary information of the two and can fully remove redundant information. According to the research of relevant scholars, the current infrared and visible light image fusion method can be divided into a traditional method and a deep learning-based method. The fusion algorithm based on the coupled impulse neural network is a representative of deep learning, and related learners perform migration application to obtain good effect, but the fusion algorithm is only suitable for specific application at present due to the characteristics of large requirements on training original data, network structures which are difficult to explain and the like. Whereas conventional algorithms can solve the above problems. In the traditional algorithm, the effect of a multi-scale transformation fusion method represented by a wavelet family and multi-scale geometric analysis is the most outstanding, and the fusion method is the currently generally adopted fusion method because the fusion mechanism of the fusion method accords with the vision of human eyes and can fully express the inherent characteristics of images. However, the traditional algorithm does not distinguish high-frequency detail information, and is still difficult to distinguish texture information in a large-scale contour and a small gradient change internal area. For example, the document "Aishwarya, N., and C. Bennla Thangamma." visual and associated image fusion DTCTCW and adaptive combined fused dictionary. "Infrered Physics & Technology 93 (2018): 300-309" discloses the fusion of Infrared and Visible light images using a modified dual-tree complex wavelet transform (DTCTWT), which is a typical representative algorithm of multi-scale geometric analysis, which solves the problem of fusion result undertargeting by modifying and using DTCTWT to decompose a source initial image and applying a targeted fusion strategy to decomposed high and low frequency sub-bands. However, in this method, since the detailed information of the image is entirely decomposed into high-frequency subbands, and the burden of direction representation and transmission of the subsequent directional filter is increased, the final fusion result is not only loss of the detailed information and edge blurring, but also the overall fusion speed is slow. Also, as the chinese patent "infrared and visible light image fusion method under non-downsampling shear wave transform domain" with publication number CN114549379a, by designing a low-frequency fusion strategy for extracting detail information remaining in a low-frequency subband and a high-frequency strategy for guiding filtering optimization weighting, the problems of partial artifacts and detail transmission are solved, but the situations of detail information loss and edge blurring still occur because the adopted non-downsampling shear wave transform fails to effectively distinguish different types of detail information. The problems of loss of detail information, edge blurring and low fusion speed of a fusion result are effectively solved by using a co-occurrence filter-based multi-scale fusion method, but the method is lack of directional description capability and cannot adjust the contrast of a final fusion image, so that the overall fusion effect is poor, and meanwhile, the problems of loss of detail direction characteristics and difficulty in highlighting a target in a complex scene are not well improved, and further solution is still needed.
Disclosure of Invention
The invention provides an image fusion method based on a multi-scale direction co-occurrence filter and strength transmission, aiming at solving the problems that the existing multi-scale fusion method based on the co-occurrence filter is lack of directional description capability, cannot adjust the contrast of a final fusion image, is poor in integral fusion effect, loses the characteristics of detailed directions and is not prominent in a remarkable target after fusion.
The technical scheme adopted by the invention for solving the technical problem is as follows:
the invention discloses an image fusion method based on a multi-scale direction co-occurrence filter and intensity transmission, which comprises the following steps of:
s1, acquiring and preprocessing source infrared and visible light image data;
s2, filtering the preprocessed source infrared and visible light images by using a co-occurrence filter to realize preliminary decomposition; iteratively using a co-occurrence filter to carry out multi-scale decomposition on the preliminarily decomposed image to obtain a multi-scale infrared detail layer sub-image, a multi-scale infrared basic layer sub-image, a multi-scale visible light detail layer sub-image and a multi-scale visible light basic layer sub-image;
s3, carrying out multi-directional decomposition on the multi-scale infrared basic layer sub-image and the multi-scale visible light basic layer sub-image by adopting discrete tight support shear wave transformation to obtain a multi-scale multi-directional infrared basic layer sub-image and a multi-scale multi-directional visible light basic layer sub-image;
s4, constructing a fusion weight strategy by adopting a fusion strategy based on phase consistency and intensity measurement to perform detail layer sub-image fusion to obtain a fused multi-scale detail layer sub-image;
s5, designing a self-adaptive intensity transfer function as a fusion criterion to perform base layer sub-image fusion to obtain fused multi-scale and multi-direction base layer sub-images in different directions;
and S6, reconstructing the fused multi-scale detail layer sub-image and the multi-scale multi-direction basic layer sub-images in different directions to obtain a final fused image.
Further, the specific operation steps of step S1 are as follows:
s1.1, respectively acquiring a source infrared image and a source visible light image from an infrared camera and a visible light camera;
s1.2, respectively carrying out image denoising operation and image enhancement operation on the source infrared image and the source visible light image.
Further, the image denoising operation adopts a denoising algorithm based on filtering; the image enhancement operation adopts an image enhancement algorithm based on scene fitting.
Further, the specific operation steps of step S2 are as follows:
s2.1, performing primary decomposition;
distributing pixel weights by utilizing co-occurrence information in the preprocessed source infrared and visible light images, and respectively adopting a co-occurrence filter to carry out filtering operation on the preprocessed source infrared and visible light images to realize preliminary decomposition to obtain a 0-level infrared basic layer sub-image and a 0-level visible light basic layer sub-image; subtracting the preprocessed source infrared image from the 0-level infrared basic layer sub-image to obtain a 0-level infrared detail layer sub-image, and simultaneously subtracting the preprocessed source visible light image from the 0-level visible light basic layer sub-image to obtain a 0-level visible light detail layer sub-image;
s2.2, carrying out multi-scale decomposition;
respectively and iteratively using a co-occurrence filter filtering level subtraction operator to carry out multi-scale decomposition on the 0-level infrared basic layer sub-image and the 0-level visible light basic layer sub-image to obtain a multi-scale infrared basic layer sub-image
Figure SMS_1
Multi-scale visible light base layer sub-image
Figure SMS_2
Multi-scale infrared detail layer sub-image
Figure SMS_3
And multi-scale visible detail layer subimages
Figure SMS_4
I =1 … k, k represents the decomposition order.
Further, the specific operation steps of step S3 are as follows:
respectively utilizing the constructed horizontal discrete shear wave transformation and vertical discrete shear wave transformation to carry out multi-scale infrared basic layer sub-image
Figure SMS_7
And a multiscale visible light base layer sub-image
Figure SMS_9
Are carried out horizontally and verticallyDecomposing in K directions to obtain K multi-scale multi-direction infrared base layer sub-images in the horizontal direction
Figure SMS_11
Perpendicular K multi-scale multi-directional infrared base layer sub-images
Figure SMS_6
Horizontal K multi-scale multi-directional visible light base layer sub-images
Figure SMS_8
Perpendicular K multi-scale multi-directional visible light base layer subimages
Figure SMS_10
K multi-scale infrared detail layer sub-images
Figure SMS_12
And K multiple multi-scale visible detail layer sub-images
Figure SMS_5
Further, the calculation formula of the horizontal discrete shear wave transformation operation is shown in formula (6), and the calculation formula of the vertical discrete shear wave transformation operation is shown in formula (7);
Figure SMS_13
(6)
Figure SMS_14
Figure SMS_15
(7)
in the formula, k represents the decomposition order; n is 1 And n 2 Which represents two translation factors, the number of which,
Figure SMS_16
Figure SMS_19
represents the square of an integer field; s represents a direction factor, s =1,0, -1,s =1 represents a 45 ° direction, s =0 represents a 90 ° direction, and s = -1 represents a 180 ° direction;
Figure SMS_22
represents a round-down operation;
Figure SMS_17
and
Figure SMS_20
respectively representing a horizontal discrete shear wave transformation and a vertical discrete shear wave transformation;
Figure SMS_23
the infrared basic layer sub-image adopts translation transformation when the kth decomposition is expressed,
Figure SMS_25
representing the equivalent transformation of the infrared base layer sub-image subjected to horizontal discrete shear wave transformation at the k-th decomposition,
Figure SMS_18
the visible light base layer sub-image adopts translation transformation when the kth level decomposition is represented,
Figure SMS_21
representing the equivalent transformation of the visible light base layer sub-image subjected to the horizontal discrete shear wave transformation at the k-th decomposition,
Figure SMS_24
and (4) representing the equivalent transformation of the visible light base layer sub-image subjected to vertical discrete shear wave transformation at the k-th decomposition level.
Further, the specific operation steps of step S4 are as follows:
s4.1, the K multi-scale infrared detail layer sub-images obtained in the step S2
Figure SMS_26
And K multiple multi-scale visible detail layer sub-images
Figure SMS_27
Obtaining a phase consistency measure operator by adopting a phase consistency operation; the phase congruency measure operator
Figure SMS_28
The calculation formula of (a) is as follows;
Figure SMS_29
(8)
Figure SMS_30
(9)
in the formula, theta k Representing the direction angle at the decomposition level k,
Figure SMS_31
the n-th direction angle is theta k The magnitude of the Fourier series, [ e ] n,θk ,o n,θk Is the result of the convolution of the image with a log-Gabor filter at position (x, y); sigma k Represents the summation operation, sigma, on variable k n Representing a summation operation on a variable n;
Figure SMS_32
is a small positive constant;
s4.2, designing a strength measurement strategy by a windowing method, wherein the strength measurement strategy is shown as a formula (10);
Figure SMS_33
(10)
in the formula (I), the compound is shown in the specification,
Figure SMS_34
(. Represents an intensity measurement operator at the decomposition level l, I l (x, y) represents the multi-scale detail layer sub-image pixel value at a position (x, y) below the decomposition level l, I l (x 0 ,y 0 ) Expressed at a position (x) below the decomposition level l 0 ,y 0 ) The sub-image pixel values of the multi-scale detail layer, omega, are expressed in terms of position
Figure SMS_35
A partially windowed area that is a center of the window,
Figure SMS_36
representing a center pixel within the windowed area;
s4.3, a fusion weight strategy is constructed by combining a phase consistency method and an intensity measurement strategy, multi-scale detail layer sub-image fusion is realized, and a series of fused multi-scale detail layer sub-images are obtained; the fusion weight strategy is shown as formula (11);
Figure SMS_37
(11)
in the formula (I), the compound is shown in the specification,
Figure SMS_38
representing the fused multi-scale detail layer sub-image, D IR (x, y) denotes an infrared detail layer sub-image, D VI (x, y) represents a visible light segment layer sub-image,
Figure SMS_39
(. Cndot.) represents the phase consistency measure operator at the decomposition level l.
Further, the specific operation steps of step S5 are as follows:
s5.1, calculating local energy of the image pixel by pixel according to the K multi-scale multi-direction infrared base layer sub-images in the horizontal direction and the K multi-scale multi-direction infrared base layer sub-images in the vertical direction obtained in the step S3;
Figure SMS_40
(12)
in the formula, E (x, y) represents local energy of an image, and W le (i, j) represents a local energy window of size i × j, i ≦ 1 ≦ 3,1 ≦ j ≦ 3,B IR (x + i, y + j) represents performing a pixel-by-pixel traversal operation on the image;
s5.2, carrying out normalization operation on the local energy of the image to obtain a feature distribution expression operator P of the subimage of the basic layer;
Figure SMS_41
(13)
in the formula (I), the compound is shown in the specification,
Figure SMS_42
,E max (x, y) represents a local energy maximum value calculated according to the local energy window, and E (x + i, y + j) represents traversing operation on the local energy value in the local energy window;
s5.3, introducing a proportion parameter into the base layer sub-image feature distribution expression operator P, constructing an adaptive intensity transfer function, and fusing the multi-scale multi-direction base layer sub-images in the horizontal direction and the vertical direction respectively to obtain a fused multi-scale multi-direction infrared base layer sub-image in the horizontal direction, a multi-scale multi-direction visible light base layer sub-image in the horizontal direction, a multi-scale multi-direction infrared base layer sub-image in the vertical direction and a multi-scale multi-direction visible light base layer sub-image in the vertical direction; the calculation formula of the fusion operation is shown as formula (14);
Figure SMS_43
(14)
in the formula (I), the compound is shown in the specification,
Figure SMS_44
representing multi-scale multi-directional base layer sub-image fusion weights; gamma represents the introduced ratio parameter; arctan (·) represents an arctangent operator; p represents a base layer sub-image feature distribution representation operator;
s5.4 the expression of the final fused multi-scale multi-direction base layer subimages in different directions is as follows:
Figure SMS_45
(15)
in the formula (I), the compound is shown in the specification,
Figure SMS_46
representing the fused multi-scale multi-directional base layer sub-image in different directions whenWhen the value is not less than 0, the reaction time is not less than 0,
Figure SMS_47
a multi-scale multi-directional base layer sub-image representing the fused horizontal direction,
Figure SMS_48
a multi-scale multi-directional infrared base layer sub-image representing the fused horizontal direction,
Figure SMS_49
a multi-scale multi-directional visible light base layer sub-image representing the fused horizontal direction; when a =1, the number of the bits is set to a =1,
Figure SMS_50
a multi-scale multi-directional base layer sub-image representing the fused vertical direction,
Figure SMS_51
a multi-scale multi-directional infrared base layer sub-image representing the fused vertical direction,
Figure SMS_52
a multi-scale multi-directional visible light base layer sub-image representing the fused vertical direction; k represents a direction factor.
Further, the specific operation steps of step S6 are as follows:
reconstructing the fused multi-scale and multi-directional base layer sub-images in different directions by utilizing inverse operation of discrete tight support shear wave transformation to obtain a fused base layer image
Figure SMS_53
Fusing the base layer image
Figure SMS_54
And adding the fused multi-scale detail layer sub-image to obtain a final fused image F.
Further, the fusion base layer image
Figure SMS_55
The formula (2) is shown in formula (16); the describedThe calculation formula of the fusion image F is shown as formula (17);
Figure SMS_56
(16)
Figure SMS_57
(17)
in the formula (I), the compound is shown in the specification,
Figure SMS_58
representing the inverse operator of the discrete tightly-supported shear wave transformation,
Figure SMS_59
representing the fused ith-level multi-scale detail layer sub-image;
Figure SMS_60
a multi-scale, multi-directional base layer sub-image representing the fused horizontal direction;
Figure SMS_61
a multi-scale, multi-directional base layer sub-image representing the fused vertical direction.
The invention has the beneficial effects that:
1) The invention can effectively retain the remarkable information and more effective detail information of the heat radiation target and remarkably reduce the situation of edge blurring; the invention effectively balances the optimal relationship among the differentiation of detail information of different scales, the intensity adjustment and the salient information of the heat radiation target;
2) According to the invention, the multi-scale direction co-occurrence filter is designed through the joint co-occurrence filter and the discrete tight support shear wave transformation in the decomposition stage, the advantages of the co-occurrence filter and the directional shear wave transformation are fully combined, the detail information of different scales is effectively distinguished, and meanwhile, the direction representation characteristic is provided, and the edge blurring is reduced while more effective detail information is transmitted;
3) According to the invention, by designing the self-adaptive intensity transfer function as the fusion strategy of the basic layer, the overall contrast and definition of the fused image can be obviously improved, the effective fusion of the infrared and visible light images is realized, the image quality is enhanced, and the overall fusion effect is improved;
4) The fusion result of the invention has outstanding effects on the reservation of detail information and the adjustment of the whole contrast ratio, and provides a high-quality enhanced fusion image for a subsequent advanced visual image processing system;
5) The invention can also accelerate the processing efficiency of the whole advanced visual image processing system.
Drawings
FIG. 1 is a flow chart of an image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to the present invention;
FIG. 2 is a schematic diagram of an embodiment of the image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to the present invention;
FIG. 3 is a result of fusing a pair of source infrared and visible light image sequences describing a real urban highway scene in a RoadScreen public dataset using the image fusion method based on multi-scale direction co-occurrence filters and intensity transfer of the present invention;
FIG. 4 is a result of fusing a pair of source infrared and visible light image sequences describing a road scene in real rain in a RoadScreen public data set using the image fusion method based on multi-scale direction co-occurrence filter and intensity transfer of the present invention;
fig. 5 is a result of fusing a source infrared and visible light image sequence pair describing a real field defense scene in a TNO public data set by using the image fusion method based on the multi-scale direction co-occurrence filter and intensity transfer of the present invention.
Detailed Description
The present invention will be described in further detail below with reference to the accompanying drawings.
The invention relates to an image fusion method based on a multi-scale direction co-occurrence filter and intensity transmission, which comprises the following steps of: acquiring and preprocessing source infrared and visible light image data → performing filtering operation on the preprocessed source infrared and visible light images by adopting a co-occurrence filter to realize preliminary decomposition; iteratively performing multi-scale decomposition on the preliminarily decomposed image by using a co-occurrence filter to obtain a multi-scale infrared detail layer sub-image, a multi-scale infrared base layer sub-image, a multi-scale visible light detail layer sub-image and a multi-scale visible light base layer sub-image → performing multi-direction decomposition on the multi-scale infrared base layer sub-image and the multi-scale visible light base layer sub-image by using discrete tight support shear wave transformation to obtain a multi-scale multi-direction infrared base layer sub-image and a multi-scale multi-direction visible light base layer sub-image → constructing a fusion weight strategy based on phase consistency and intensity measurement to perform detail layer sub-image fusion to obtain a fused multi-scale detail layer sub-image → designing an adaptive intensity transfer function as a fusion criterion to perform base layer sub-image fusion to obtain a fused multi-scale multi-direction base layer sub-image in different directions → reconstructing the fused multi-scale detail layer sub-image and the multi-scale base layer sub-image in different directions to obtain a final fused image.
The invention discloses an image fusion method based on a multi-scale direction co-occurrence filter and intensity transmission, which comprises the following specific operation processes:
s1, acquiring and preprocessing source infrared and visible light image data; the specific operation steps are as follows:
s1.1, acquiring a source initial image, specifically acquiring a source infrared image from an infrared camera, and acquiring a source visible light image from a visible light camera;
s1.2, respectively carrying out preprocessing operation on the acquired source infrared image and the acquired source visible light image; the preprocessing operation comprises: image denoising operation and image enhancement operation; when the image denoising operation is performed, the image denoising can be specifically performed by adopting a denoising algorithm based on filtering, and when the image enhancement operation is performed, the image enhancement can be specifically performed by adopting an image enhancement algorithm based on scene fitting; by carrying out image denoising and image enhancement operations on the source initial image, partial noise of the source initial image can be eliminated, and the contrast and the definition of the source initial image are initially improved; specifically, the preprocessing operation of the source initial image can be realized through the formula (1), including the preliminary image denoising and image enhancement operation;
Figure SMS_62
(1)
in formula (1), I represents a source infrared and visible light image data matrix after preprocessing operation, T (right) represents preprocessing operation, and I Representing source infrared and visible light images.
S2, filtering the preprocessed source infrared and visible light images by using a co-occurrence filter to realize preliminary decomposition; iteratively using a co-occurrence filter to carry out multi-scale decomposition on the preliminarily decomposed image to obtain a multi-scale infrared detail layer sub-image, a multi-scale infrared basic layer sub-image, a multi-scale visible light detail layer sub-image and a multi-scale visible light basic layer sub-image; the specific operation steps are as follows:
s2.1, performing primary decomposition;
distributing pixel weights by utilizing co-occurrence information in the preprocessed source infrared and visible light images, respectively adopting a co-occurrence filter to carry out filtering operation on the preprocessed source infrared and visible light images, realizing preliminary decomposition, and further obtaining a 0-level infrared basic layer sub-image
Figure SMS_63
And a 0-level visible base layer sub-image
Figure SMS_64
As shown in formula (2) and formula (3); preprocessing a source infrared image I IR And 0-level infrared base layer sub-image
Figure SMS_65
Subtracting to obtain a sub-image of the 0-level infrared detail layer
Figure SMS_66
Simultaneously, the preprocessed source visible light image I is processed as shown in formula (2) VI And 0-level visible base layer sub-image
Figure SMS_67
Subtracting to obtain 0-level visible light detail layer sub-image
Figure SMS_68
As shown in formula (3);
Figure SMS_69
=CoF(I IR ) ,
Figure SMS_70
=CoF(I VI ) (2)
Figure SMS_71
=I IR -
Figure SMS_72
=I VI -
Figure SMS_73
(3)
in equations (2) and (3), coF (.) represents a filtering operation performed on an image using a co-occurrence filter,
Figure SMS_74
representing a level 0 infrared base layer sub-image,
Figure SMS_75
representing a level 0 visible base layer sub-image,
Figure SMS_76
representing a level 0 infrared detail layer sub-image,
Figure SMS_77
representing a visible detail layer sub-image of level 0, I IR Representing the pre-processed source infrared image, I VI Representing the pre-processed source visible light image;
s2.2, carrying out multi-scale decomposition;
then the obtained 0-level infrared basic layer sub-image is subjected to
Figure SMS_78
And a 0-level visible base layer sub-image
Figure SMS_79
Respectively and iteratively using the filter stage subtraction operator of the co-occurrence filter to carry out multi-scale decomposition, and finally obtaining the sub-image of the multi-scale infrared basic layer
Figure SMS_80
Multi-scale visible base layer sub-image
Figure SMS_81
Multi-scale infrared detail layer sub-image
Figure SMS_82
(i =1 … k) and multi-scale visible detail layer subimages
Figure SMS_83
(i =1 … k), k represents the decomposition order;
for the i-1 st level infrared base layer sub-image, taking k-level decomposition as an example
Figure SMS_89
Carrying out co-occurrence filtering operation to obtain the ith-level infrared basic layer sub-image
Figure SMS_85
For the i-1 st visible light base layer sub-image at the same time
Figure SMS_94
Carrying out co-occurrence filtering operation to obtain the i-th level visible light basic layer sub-image
Figure SMS_88
(ii) a Then, the i-1 level infrared basic layer sub-image is processed
Figure SMS_92
And ith level infrared base layer sub-image
Figure SMS_100
Performing subtraction operation to obtain the ith-level infrared detail layer sub-image
Figure SMS_102
For the i-1 st visible light base layer sub-image at the same time
Figure SMS_87
And ith visible light base layer sub-image
Figure SMS_98
Performing subtraction operation to obtain the i-th level visible light detail layer sub-image
Figure SMS_91
(ii) a Repeating the steps k times to finally obtain the multi-scale infrared basic layer sub-image
Figure SMS_99
Multi-scale visible light base layer sub-image
Figure SMS_84
Multi-scale infrared detail layer sub-image
Figure SMS_95
(i =1 … k) and multi-scale visible detail layer subimages
Figure SMS_93
(i =1 … k); wherein, the i-1 level infrared basic layer sub-image
Figure SMS_101
Obtaining an i-level infrared basic layer sub-image after k-level decomposition
Figure SMS_86
And ith visible light base layer sub-image
Figure SMS_97
The formula (4) shows, the multi-scale infrared detail layer sub-image
Figure SMS_90
And the multi-scale visible light detail layer sub-image
Figure SMS_96
Is calculated asFormula (5);
Figure SMS_103
(4)
Figure SMS_104
(5)
in the formulae (4) and (5),
Figure SMS_105
representing the ith-level infrared base layer sub-image,
Figure SMS_106
representing an i-1 th level infrared base layer sub-image,
Figure SMS_107
representing the ith-level visible base layer sub-image,
Figure SMS_108
representing the i-1 st level visible base layer sub-image,
Figure SMS_109
representing the ith level infrared detail layer sub-image,
Figure SMS_110
to represent
Figure SMS_111
S3, further adopting discrete tight support shear wave transformation to carry out multi-scale infrared base layer sub-image
Figure SMS_112
And a multiscale visible light base layer sub-image
Figure SMS_113
Performing multi-directional decomposition to obtain a multi-scale multi-directional infrared basic layer sub-image and a multi-scale multi-directional visible light basic layer sub-image; aiming at the defect of lack of directivity of multi-scale decomposition in the step S2, the discrete tight support shear wave is adoptedCarrying out multi-directional decomposition on the multi-scale base layer sub-image by transformation, and extracting the large-scale contour edge of the image; the specific operation steps are as follows:
s3.1 respectively utilizing the constructed horizontal discrete shear wave transformation and vertical discrete shear wave transformation to the multi-scale infrared basic layer sub-image obtained in the step S2
Figure SMS_114
And a multiscale visible light base layer sub-image
Figure SMS_115
Performing decomposition in each of K (K represents a direction factor) directions horizontally and vertically; wherein, the calculation formula of the horizontal discrete shear wave conversion operation is shown as the formula (6), and the calculation formula of the vertical discrete shear wave conversion operation is shown as the formula (7);
Figure SMS_116
(6)
Figure SMS_117
Figure SMS_118
(7)
in the formulae (6) and (7), k represents the number of decomposition stages; n is 1 And n 2 Which represents two translation factors, the number of which,
Figure SMS_121
Figure SMS_123
represents the square of an integer field; s represents a direction factor, s =1,0, -1,s =1 represents a 45 ° direction, s =0 represents a 90 ° direction, and s = -1 represents a 180 ° direction;
Figure SMS_125
represents a round-down operation;
Figure SMS_120
and
Figure SMS_122
respectively representing a horizontal discrete shear wave transformation and a vertical discrete shear wave transformation;
Figure SMS_124
the infrared basic layer sub-image adopts translation transformation when the kth decomposition is expressed,
Figure SMS_126
representing the equivalent transformation of the infrared base layer sub-image subjected to horizontal discrete shear wave transformation at the k-th decomposition,
Figure SMS_119
the visible light base layer sub-image adopts translation transformation when representing the k-th decomposition,
Figure SMS_127
representing the equivalent transformation of the visible light base layer sub-image subjected to the horizontal discrete shear wave transformation at the k-th decomposition level,
Figure SMS_128
representing the equivalent transformation of the visible light base layer sub-image subjected to vertical discrete shear wave transformation in the k-th decomposition;
s3.2, after horizontal discrete shear wave transformation and vertical discrete shear wave transformation, multi-scale infrared basic layer sub-images
Figure SMS_129
And a multiscale visible light base layer sub-image
Figure SMS_130
Is divided into 2K directions, namely a horizontal direction and a vertical direction; wherein the multi-scale multi-directional base layer sub-image data set obtained after the multi-directional decomposition can be represented as
Figure SMS_131
Wherein, in the step (A),
Figure SMS_132
individual watchShowing the multi-scale multi-direction infrared base layer sub-image obtained after horizontal discrete shear wave transformation and vertical discrete shear wave transformation,
Figure SMS_133
respectively representing the sub-images of the multi-scale and multi-direction visible light base layer obtained after horizontal discrete shear wave transformation and vertical discrete shear wave transformation;
s3.3 horizontal and vertical, respectively, three directions, i.e. the direction factor K =3,s =1,0, -1; after multidirectional decomposition, K multi-scale multidirectional infrared base layer sub-images in the horizontal direction are obtained
Figure SMS_134
Perpendicular K multi-scale multi-directional infrared base layer sub-images
Figure SMS_135
Horizontal K multi-scale multi-directional visible light base layer sub-images
Figure SMS_136
And K multi-scale multi-direction visible light base layer sub-images in the vertical direction
Figure SMS_137
And K multi-scale infrared detail layer sub-images
Figure SMS_138
S4, fusing the detail layer sub-images;
performing fusion strategy based on phase consistency and intensity measurement on the K multi-scale infrared detail layer sub-images
Figure SMS_139
And K multiple multi-scale visible detail layer sub-images
Figure SMS_140
Performing fusion operation to obtain a series of fused multi-scale detail layer sub-images so as to transfer more texture details; the specific operation steps are as follows:
s4.1, firstly, the K multi-scale infrared detail layer sub-images obtained in the way are processed
Figure SMS_141
And K multiple multi-scale visible detail layer sub-images
Figure SMS_142
Adopting phase consistency operation of engineering application to obtain a phase consistency measure operator for designing a subsequent fusion strategy; wherein the phase coincidence measure operator
Figure SMS_143
The calculation formula of (a) is as follows;
Figure SMS_144
(8)
Figure SMS_145
(9)
in formulae (8) and (9), θ k Representing the direction angle at the decomposition level k,
Figure SMS_146
the n-th direction angle is theta k The magnitude of the Fourier series, [ e ] n,θ k ,o n,θ k Is an image (multiscale infrared detail layer sub-image or
Figure SMS_147
) Convolution results with a log-Gabor filter at position (x, y); sigma k Represents the summation operation, sigma, on the variable k n Representing a summation operation on a variable n;
Figure SMS_148
is a small positive number for preventing the denominator from being 0;
s4.2, designing a strength measurement strategy by a windowing method, wherein the strength measurement strategy is shown as a formula (10);
Figure SMS_149
(10)
in the formula (10), N l (. Represents an intensity measurement operator at the decomposition level l, I l (x, y) represents the multi-scale detail layer sub-image pixel value at a position (x, y) below the decomposition level l, I l (x 0 ,y 0 ) Expressed at the position (x) below the decomposition level l 0 ,y 0 ) The multi-scale detail layer sub-image pixel values of (a), omega, are expressed in terms of position (x) 0 ,y 0 ) (x) a central partially windowed area 0 ,y 0 ) Representing a center pixel within the windowed area;
s4.3, finally, a fusion weight strategy is constructed by combining a phase consistency method and an intensity measurement strategy, so that the fusion of the sub-images of the multi-scale detail layer is realized, a series of fused sub-images of the multi-scale detail layer are obtained, and more texture details are transmitted; the fusion weight strategy of the multi-scale detail layer sub-image is shown as a formula (11);
Figure SMS_150
(11)
in formula (11), D F (x, y) denotes the fused multi-scale detail layer sub-image, D IR (x, y) denotes an infrared detail layer sub-image, D VI (x, y) represents a visible light segment layer sub-image, P l (. Cndot.) represents the phase consistency measure operator at the decomposition level l.
S5, fusing the base layer sub-images;
by designing a self-adaptive intensity transfer function as a fusion criterion of the multi-scale multi-direction base layer subimages, adopting the infrared image base layer pixel intensity distribution information to adjust the multi-scale multi-direction base layer subimage intensity distribution, highlighting the heat radiation target significant information, completing the fusion of the multi-scale multi-direction base layer subimages in different directions, guiding the final fusion result by utilizing the intensity distribution of the infrared base layer subimages, and ensuring that the fused base layer subimages have high-contrast characteristics; the specific operation steps are as follows:
s5.1, firstly, aiming at K multi-scale in the horizontal direction obtained in the step S3Directional infrared base layer subimage
Figure SMS_151
And K multi-scale multi-direction infrared base layer sub-images in vertical direction
Figure SMS_152
Calculating the local energy of the image pixel by pixel;
Figure SMS_153
(12)
in the formula (12), E (x, y) represents the local energy of the image, W le (i, j) represents a local energy window of size i × j, i ≦ 1 ≦ 3,1 ≦ j ≦ 3,B IR (x + i, y + j) represents performing a pixel-by-pixel traversal operation on the image;
s5.2, carrying out normalization operation on the local energy obtained by calculation to obtain a base layer sub-image feature distribution expression operator P;
Figure SMS_154
(13)
in the formula (13), the reaction mixture is,
Figure SMS_155
,E max (x, y) represents a local energy maximum value calculated according to the local energy window, and E (x + i, y + j) represents traversing operation on the local energy value in the local energy window;
s5.3, introducing a proportion parameter into the base layer sub-image feature distribution expression operator P, constructing an adaptive intensity transfer function to fuse the multi-scale multi-direction base layer sub-images, adjusting the intensity distribution of the multi-scale multi-direction base layer sub-images, namely respectively fusing the multi-scale multi-direction base layer sub-images in the horizontal direction and the vertical direction to obtain a fused multi-scale multi-direction infrared base layer sub-image in the horizontal direction, a fused multi-scale multi-direction visible light base layer sub-image in the horizontal direction, a fused multi-scale multi-direction infrared base layer sub-image in the vertical direction and a fused multi-scale multi-direction visible light base layer sub-image in the vertical direction; the specific calculation formula of the fusion operation is shown as formula (14);
Figure SMS_156
(14)
in the formula (14), the compound represented by the formula (I),
Figure SMS_157
representing multi-scale multi-directional base layer sub-image fusion weights; gamma represents an introduced proportion parameter and is used for better regulating and controlling the fusion weight of the multi-scale multidirectional base layer sub-image; arctan (·) represents an arctangent operator; p represents a base layer sub-image feature distribution representation operator;
s5.4, the expression of the finally fused multi-scale and multi-direction base layer sub-images in different directions is as follows:
Figure SMS_158
(15)
in the formula (15), the reaction mixture is,
Figure SMS_159
multi-scale, multi-directional base layer subimages representing the fused different directions are merged, when a =0,
Figure SMS_160
a multi-scale multi-directional base layer sub-image representing the fused horizontal direction,
Figure SMS_161
a multi-scale multi-directional infrared base layer sub-image representing the fused horizontal direction,
Figure SMS_162
a multi-scale multi-directional visible light base layer sub-image representing the fused horizontal direction; when a =1, the number of the bits is set to a =1,
Figure SMS_163
a multi-scale multi-directional base layer sub-image representing the fused vertical direction,
Figure SMS_164
a multi-scale multi-directional infrared base layer sub-image representing the fused vertical direction,
Figure SMS_165
a multi-scale multi-directional visible light base layer sub-image representing the fused vertical direction; k represents a direction factor.
S6, reconstructing the fused multi-scale detail layer sub-image (the result obtained in the step four) and the fused multi-scale multi-direction basic layer sub-image (the result obtained in the step five) in different directions to obtain a final fused image F; the specific operation steps are as follows:
reconstructing the fused multi-scale and multi-direction base layer sub-images in different directions by utilizing the inverse operation of discrete tight support shear wave transformation to obtain a fused base layer image B F Finally, the base layer image B is fused F Performing simple addition operation on the fused multi-scale detail layer sub-image to obtain a final fused image F, wherein a specific calculation formula is shown as follows;
Figure SMS_166
(16)
Figure SMS_167
(17)
in the formulae (16) and (17),
Figure SMS_168
representing the inverse operator of the discrete tightly-supported shear wave transformation,
Figure SMS_169
representing the fused ith-level multi-scale detail layer sub-image;
Figure SMS_170
a multi-scale, multi-directional base layer sub-image representing the fused horizontal direction;
Figure SMS_171
a multi-scale, multi-directional base layer sub-image representing the fused vertical direction.
In order to verify the effectiveness of the image fusion method based on the multi-scale direction co-occurrence filter and the intensity transfer, the following verification test is performed.
As shown in fig. 2, according to the image fusion method based on multi-scale direction co-occurrence filter and intensity transfer of the present invention, (1) a source infrared image is first acquired from an infrared camera (a in fig. 2), and a source visible light image is simultaneously acquired from a visible light camera (b in fig. 2); (2) Performing filtering operation on the preprocessed source infrared and visible light images by using a co-occurrence filter to realize preliminary decomposition, and then performing multi-scale decomposition on the preliminarily decomposed images by iteratively using the co-occurrence filter to obtain a multi-scale infrared basic layer sub-image (c in figure 2), a multi-scale visible light basic layer sub-image (d in figure 2), a multi-scale infrared detail layer sub-image (e in figure 2) and a multi-scale visible light detail layer sub-image (f in figure 2); (3) Carrying out multi-directional decomposition on the multi-scale infrared basic layer sub-image and the multi-scale visible light basic layer sub-image by adopting discrete tight support shear wave transformation to obtain a multi-scale multi-directional infrared basic layer sub-image and a multi-scale multi-directional visible light basic layer sub-image; (4) Constructing a fusion weight strategy by adopting a fusion strategy based on phase consistency and intensity measurement to perform detail layer sub-image fusion on a multi-scale infrared detail layer sub-image (e in figure 2) and a multi-scale visible light detail layer sub-image (f in figure 2) to obtain a fused multi-scale detail layer sub-image (h in figure 2); (5) Designing a self-adaptive intensity transfer function as a fusion criterion to perform base layer sub-image fusion on the multi-scale infrared base layer sub-image (c in figure 2) and the multi-scale visible light base layer sub-image (d in figure 2) to obtain fused multi-scale multi-direction base layer sub-images (g in figure 2) in different directions; (6) And reconstructing the fused multi-scale detail layer sub-image and the multi-scale multi-direction base layer sub-images in different directions to obtain a final fused image (F in fig. 2).
The image fusion method based on the multi-scale direction co-occurrence filter and the intensity transfer is utilized to fuse the source infrared and visible light image sequence pair describing the real urban highway scene in the RoadScreen public data set, and the result is shown in figure 3, wherein the first row is the source infrared image describing the real urban highway scene, the second row is the source visible light image describing the real urban highway scene, and the third row is the final fused image.
The image fusion method based on the multi-scale direction co-occurrence filter and the intensity transfer is utilized to fuse the source infrared and visible light image sequence pair describing the real road scene in the rain in the RoadScreen public data set, and as a result, as shown in FIG. 4, the first row is the source infrared image describing the real road scene in the rain, the second row is the source visible light image describing the real road scene in the rain, and the third row is the final fused image.
The image fusion method based on the multi-scale direction co-occurrence filter and the intensity transfer is utilized to fuse the source infrared and visible light image sequence pair describing the real field defense scene in the TNO public data set, and the result is shown in figure 5, wherein the first row is the source infrared image describing the real field defense scene, the second row is the source visible light image describing the real field defense scene, and the third row is the final fused image.
In conclusion, the multi-scale multi-direction decomposition is carried out on the source initial image by designing the multi-direction co-occurrence filter, so that large-scale contour information and small-gradient texture information in an image area can be distinguished, and the method has good direction representation capability; the whole contrast and definition of the image can be effectively adjusted by designing a self-adaptive intensity transfer function and fusing the multi-direction base layer sub-image; in the aspect of detail layer sub-image fusion, a strategy based on phase consistency and intensity measurement is adopted to construct fusion weight, and more texture details are transmitted to a final fusion result; in addition, the method of the invention is simple and easy to operate, can balance the quality and efficiency of the algorithm by only adjusting a few parameters, and can provide necessary technical support for the subsequent advanced visual processing process.
The above description is only a preferred embodiment of the present invention, and does not limit the present invention in any way; any person skilled in the art can make any equivalent substitutions or modifications on the technical solutions and technical contents disclosed in the present invention without departing from the scope of the technical solutions of the present invention, and still fall within the protection scope of the present invention without departing from the technical solutions of the present invention.

Claims (10)

1. The image fusion method based on the multi-scale direction co-occurrence filter and the intensity transmission is characterized by comprising the following steps of:
s1, acquiring and preprocessing source infrared and visible light image data;
s2, filtering the preprocessed source infrared and visible light images by using a co-occurrence filter to realize preliminary decomposition; iteratively using a co-occurrence filter to carry out multi-scale decomposition on the preliminarily decomposed image to obtain a multi-scale infrared detail layer sub-image, a multi-scale infrared basic layer sub-image, a multi-scale visible light detail layer sub-image and a multi-scale visible light basic layer sub-image;
s3, carrying out multi-directional decomposition on the multi-scale infrared basic layer sub-image and the multi-scale visible light basic layer sub-image by adopting discrete tight support shear wave transformation to obtain a multi-scale multi-directional infrared basic layer sub-image and a multi-scale multi-directional visible light basic layer sub-image;
s4, constructing a fusion weight strategy by adopting a fusion strategy based on phase consistency and intensity measurement to perform detail layer sub-image fusion to obtain a fused multi-scale detail layer sub-image;
s5, designing a self-adaptive intensity transfer function as a fusion criterion to perform base layer sub-image fusion to obtain fused multi-scale and multi-direction base layer sub-images in different directions;
and S6, reconstructing the fused multi-scale detail layer sub-image and the multi-scale multi-direction basic layer sub-images in different directions to obtain a final fused image.
2. The image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to claim 1, wherein the specific operation steps of step S1 are as follows:
s1.1, respectively acquiring a source infrared image and a source visible light image from an infrared camera and a visible light camera;
s1.2, respectively carrying out image denoising operation and image enhancement operation on the source infrared image and the source visible light image.
3. The method for image fusion based on multi-scale direction co-occurrence filter and intensity transfer as claimed in claim 2, wherein the image denoising operation adopts a filter-based denoising algorithm; the image enhancement operation adopts an image enhancement algorithm based on scene fitting.
4. The image fusion method based on multi-scale direction co-occurrence filter and intensity transfer as claimed in claim 2, wherein the specific operation steps of step S2 are as follows:
s2.1, performing primary decomposition;
distributing pixel weights by utilizing co-occurrence information in the preprocessed source infrared and visible light images, and respectively adopting a co-occurrence filter to carry out filtering operation on the preprocessed source infrared and visible light images to realize preliminary decomposition to obtain a 0-level infrared basic layer sub-image and a 0-level visible light basic layer sub-image; subtracting the preprocessed source infrared image from the 0-level infrared basic layer sub-image to obtain a 0-level infrared detail layer sub-image, and simultaneously subtracting the preprocessed source visible light image from the 0-level visible light basic layer sub-image to obtain a 0-level visible light detail layer sub-image;
s2.2, carrying out multi-scale decomposition;
respectively and iteratively using a co-occurrence filter filtering level subtraction operator to carry out multi-scale decomposition on the 0-level infrared basic layer sub-image and the 0-level visible light basic layer sub-image to obtain a multi-scale infrared basic layer sub-image
Figure QLYQS_1
Multi-scale visible light base layer sub-image
Figure QLYQS_2
Multi-scale infraredDetail layer sub-images
Figure QLYQS_3
And multi-scale visible detail layer subimages
Figure QLYQS_4
I =1 … k, k represents the decomposition order.
5. The image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to claim 4, wherein the specific operation steps of step S3 are as follows:
respectively utilizing the constructed horizontal discrete shear wave transformation and vertical discrete shear wave transformation to carry out multi-scale infrared basic layer sub-image
Figure QLYQS_6
And a multiscale visible light base layer sub-image
Figure QLYQS_8
Decomposing the horizontal direction and the vertical direction in K directions to obtain K multi-scale multi-direction infrared basic layer sub-images in the horizontal direction
Figure QLYQS_10
Perpendicular K multi-scale multi-directional infrared base layer sub-images
Figure QLYQS_7
K multi-scale multi-directional visible light base layer subimages in horizontal direction
Figure QLYQS_9
Perpendicular K multi-scale multi-directional visible light base layer sub-images
Figure QLYQS_11
K multi-scale infrared detail layer sub-images
Figure QLYQS_12
And K multi-scale visible detail layer sub-images
Figure QLYQS_5
6. The image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to claim 5, wherein the calculation formula of the horizontal discrete shear wave transformation operation is shown in formula (6), and the calculation formula of the vertical discrete shear wave transformation operation is shown in formula (7);
Figure QLYQS_13
(6)
Figure QLYQS_14
Figure QLYQS_15
(7)
in the formula, k represents the decomposition order; n is 1 And n 2 Which represents two translation factors, the number of which,
Figure QLYQS_18
Figure QLYQS_20
represents the square of the integer field; s represents a direction factor, s =1,0, -1,s =1 represents a 45 ° direction, s =0 represents a 90 ° direction, and s = -1 represents a 180 ° direction;
Figure QLYQS_23
represents a round-down operation;
Figure QLYQS_17
and
Figure QLYQS_19
representing horizontal and vertical discrete shear wave transformations, respectivelyTransforming;
Figure QLYQS_22
the infrared basic layer sub-image adopts translation transformation when the kth decomposition is expressed,
Figure QLYQS_25
representing the equivalent transformation of the infrared base layer sub-image subjected to horizontal discrete shear wave transformation at the k-th decomposition,
Figure QLYQS_16
the visible light base layer sub-image adopts translation transformation when representing the k-th decomposition,
Figure QLYQS_21
representing the equivalent transformation of the visible light base layer sub-image subjected to the horizontal discrete shear wave transformation at the k-th decomposition,
Figure QLYQS_24
representing the equivalent transformation of the visible light base layer sub-image subjected to the vertical discrete shear wave transformation at the k-th decomposition.
7. The image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to claim 5, wherein the specific operation steps of step S4 are as follows:
s4.1, the K multi-scale infrared detail layer sub-images obtained in the step S2
Figure QLYQS_26
And K multi-scale visible detail layer sub-images
Figure QLYQS_27
Obtaining a phase consistency measure operator by adopting phase consistency operation; the phase congruency measure operator
Figure QLYQS_28
The calculation formula of (a) is as follows;
Figure QLYQS_29
(8)
Figure QLYQS_30
(9)
in the formula, theta k Representing the direction angle at the decomposition level k,
Figure QLYQS_31
the n-th direction angle is theta k The magnitude of the Fourier series, [ e ] n,θk ,o n,θk Is the result of the convolution of the image with a log-Gabor filter at position (x, y); sigma k Represents the summation operation, sigma, on the variable k n Representing a summation operation on a variable n;
Figure QLYQS_32
is a small positive constant;
s4.2, designing a strength measurement strategy by a windowing method, wherein the strength measurement strategy is shown as a formula (10);
Figure QLYQS_33
(10)
in the formula (I), the compound is shown in the specification,
Figure QLYQS_34
(. Represents an intensity measurement operator at the decomposition level l, I l (x, y) represents the multi-scale detail layer sub-image pixel value at a position (x, y) below the decomposition level l, I l (x 0 ,y 0 ) Expressed at the position (x) below the decomposition level l 0 ,y 0 ) The sub-image pixel values of the multi-scale detail layer, omega, are expressed in terms of position
Figure QLYQS_35
A partially windowed area that is a center of the window,
Figure QLYQS_36
indicating in windowed areasA center pixel;
s4.3, a fusion weight strategy is constructed by combining a phase consistency method and an intensity measurement strategy, multi-scale detail layer sub-image fusion is realized, and a series of fused multi-scale detail layer sub-images are obtained; the fusion weight strategy is shown as a formula (11);
Figure QLYQS_37
(11)
in the formula (I), the compound is shown in the specification,
Figure QLYQS_38
representing the fused multi-scale detail layer sub-image, D IR (x, y) denotes an infrared detail layer sub-image, D VI (x, y) represents a visible light detail layer sub-image,
Figure QLYQS_39
(. Cndot.) represents the phase consistency measure operator at the decomposition level l.
8. The image fusion method based on multi-scale direction co-occurrence filter and intensity transfer according to claim 7, wherein the specific operation steps of step S5 are as follows:
s5.1, calculating local energy of the image pixel by pixel according to the K multi-scale multi-direction infrared base layer sub-images in the horizontal direction and the K multi-scale multi-direction infrared base layer sub-images in the vertical direction obtained in the step S3;
Figure QLYQS_40
(12)
in the formula, E (x, y) represents local energy of an image, and W le (i, j) represents a local energy window of size i × j, i ≦ 1 ≦ 3,1 ≦ j ≦ 3,B IR (x + i, y + j) represents performing a pixel-by-pixel traversal operation on the image;
s5.2, carrying out normalization operation on the local energy of the image to obtain a feature distribution expression operator P of the subimage of the basic layer;
Figure QLYQS_41
(13)
in the formula (I), the compound is shown in the specification,
Figure QLYQS_42
,E max (x, y) represents a local energy maximum value calculated according to the local energy window, and E (x + i, y + j) represents traversing operation on the local energy value in the local energy window;
s5.3, introducing a proportion parameter into the characteristic distribution expression operator P of the base layer subimage, constructing an adaptive intensity transfer function, and fusing the multi-scale and multi-direction base layer subimages in the horizontal direction and the vertical direction respectively to obtain a fused multi-scale and multi-direction infrared base layer subimage in the horizontal direction, a multi-scale and multi-direction visible light base layer subimage in the horizontal direction, a multi-scale and multi-direction infrared base layer subimage in the vertical direction and a multi-scale and multi-direction visible light base layer subimage in the vertical direction; the calculation formula of the fusion operation is shown as formula (14);
Figure QLYQS_43
(14)
in the formula (I), the compound is shown in the specification,
Figure QLYQS_44
representing multi-scale multi-directional base layer sub-image fusion weights; gamma represents the introduced ratio parameter; arctan (·) represents an arctangent operator; p represents a base layer sub-image feature distribution representation operator;
s5.4, the expression of the finally fused multi-scale and multi-direction base layer sub-images in different directions is as follows:
Figure QLYQS_45
(15)
in the formula (I), the compound is shown in the specification,
Figure QLYQS_46
means after fusionThe co-directional, multi-scale, multi-directional base layer sub-image, when a =0,
Figure QLYQS_47
a multi-scale multi-directional base layer sub-image representing the fused horizontal direction,
Figure QLYQS_48
a multi-scale multi-directional infrared base layer sub-image representing the fused horizontal direction,
Figure QLYQS_49
a multi-scale multi-directional visible light base layer sub-image representing the fused horizontal direction; when a =1, the number of the bits is set to a =1,
Figure QLYQS_50
a multi-scale multi-directional base layer sub-image representing the fused vertical direction,
Figure QLYQS_51
a multi-scale multi-directional infrared base layer sub-image representing the fused vertical direction,
Figure QLYQS_52
a multi-scale multi-directional visible light base layer sub-image representing the fused vertical direction; k represents a direction factor.
9. The method for image fusion based on multi-scale direction co-occurrence filter and intensity transfer according to claim 8, wherein the specific operation steps of step S6 are as follows:
reconstructing the fused multi-scale and multi-directional base layer sub-images in different directions by utilizing inverse operation of discrete tight support shear wave transformation to obtain a fused base layer image
Figure QLYQS_53
Fusing the base layer image
Figure QLYQS_54
And adding the fused multi-scale detail layer sub-image to obtain a final fused image F.
10. The method of claim 9, wherein the fusing base layer images is based on image fusion of multi-scale direction co-occurrence filter and intensity transfer
Figure QLYQS_55
The formula (2) is shown in formula (16); the calculation formula of the fusion image F is shown as a formula (17);
Figure QLYQS_56
(16)
Figure QLYQS_57
(17)
in the formula (I), the compound is shown in the specification,
Figure QLYQS_58
representing the inverse operator of the discrete tightly-supported shear wave transformation,
Figure QLYQS_59
representing the fused ith-level multi-scale detail layer sub-image;
Figure QLYQS_60
a multi-scale, multi-directional base layer sub-image representing the fused horizontal direction;
Figure QLYQS_61
a multi-scale, multi-directional base layer sub-image representing the fused vertical direction.
CN202310069737.8A 2023-02-07 2023-02-07 Image fusion method based on multi-scale direction co-occurrence filter and intensity transmission Pending CN115797244A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310069737.8A CN115797244A (en) 2023-02-07 2023-02-07 Image fusion method based on multi-scale direction co-occurrence filter and intensity transmission

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310069737.8A CN115797244A (en) 2023-02-07 2023-02-07 Image fusion method based on multi-scale direction co-occurrence filter and intensity transmission

Publications (1)

Publication Number Publication Date
CN115797244A true CN115797244A (en) 2023-03-14

Family

ID=85430098

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310069737.8A Pending CN115797244A (en) 2023-02-07 2023-02-07 Image fusion method based on multi-scale direction co-occurrence filter and intensity transmission

Country Status (1)

Country Link
CN (1) CN115797244A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104809734A (en) * 2015-05-11 2015-07-29 中国人民解放军总装备部军械技术研究所 Infrared image and visible image fusion method based on guide filtering
CN112017139A (en) * 2020-09-14 2020-12-01 南昌航空大学 Infrared and visible light image perception fusion method
CN113222877A (en) * 2021-06-03 2021-08-06 北京理工大学 Infrared and visible light image fusion method and application thereof in airborne photoelectric video
CN114897751A (en) * 2022-04-12 2022-08-12 北京理工大学 Infrared and visible light image perception fusion method based on multi-scale structural decomposition

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104809734A (en) * 2015-05-11 2015-07-29 中国人民解放军总装备部军械技术研究所 Infrared image and visible image fusion method based on guide filtering
CN112017139A (en) * 2020-09-14 2020-12-01 南昌航空大学 Infrared and visible light image perception fusion method
CN113222877A (en) * 2021-06-03 2021-08-06 北京理工大学 Infrared and visible light image fusion method and application thereof in airborne photoelectric video
CN114897751A (en) * 2022-04-12 2022-08-12 北京理工大学 Infrared and visible light image perception fusion method based on multi-scale structural decomposition

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
韩玺钰: "基于多尺度及显著区域分析的红外与可见光图像融合算法研究" *

Similar Documents

Publication Publication Date Title
CN110119780B (en) Hyper-spectral image super-resolution reconstruction method based on generation countermeasure network
CN112767251B (en) Image super-resolution method based on multi-scale detail feature fusion neural network
CN110443768B (en) Single-frame image super-resolution reconstruction method based on multiple consistency constraints
CN104318569B (en) Space salient region extraction method based on depth variation model
CN113284051B (en) Face super-resolution method based on frequency decomposition multi-attention machine system
CN107123089A (en) Remote sensing images super-resolution reconstruction method and system based on depth convolutional network
CN108830796A (en) Based on the empty high spectrum image super-resolution reconstructing method combined and gradient field is lost of spectrum
CN112001843B (en) Infrared image super-resolution reconstruction method based on deep learning
CN109272447A (en) A kind of depth map super-resolution method
CN111738948B (en) Underwater image enhancement method based on double U-nets
CN110675462A (en) Gray level image colorizing method based on convolutional neural network
CN102243711A (en) Neighbor embedding-based image super-resolution reconstruction method
CN109472743A (en) The super resolution ratio reconstruction method of remote sensing images
Yang et al. Image super-resolution based on deep neural network of multiple attention mechanism
CN114581560A (en) Attention mechanism-based multi-scale neural network infrared image colorizing method
CN112163998A (en) Single-image super-resolution analysis method matched with natural degradation conditions
Zhu et al. Super-resolving commercial satellite imagery using realistic training data
CN101739670B (en) Non-local mean space domain time varying image filtering method
CN115326809A (en) Apparent crack detection method and detection device for tunnel lining
CN117593188B (en) Super-resolution method based on unsupervised deep learning and corresponding equipment
CN117953310A (en) Remote sensing multi-mode image classification method based on continuous scale feature network
CN112686804B (en) Image super-resolution reconstruction method and device for mine low-light environment
CN113724139A (en) Unsupervised infrared single-image hyper-resolution for generation of countermeasure network based on dual discriminators
CN111652826B (en) Method for homogenizing multiple/hyperspectral remote sensing images based on Wallis filtering and histogram matching
CN115797244A (en) Image fusion method based on multi-scale direction co-occurrence filter and intensity transmission

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20230314

RJ01 Rejection of invention patent application after publication